than nvidia 7800/7900? I'm toying with the idea of upgrading my 7800GT to the 1900XTX, so can you really tell the difference on an LCD? I'll use it for both gaming and DVD movies.
You'll get a bunch of guys saying there is no difference & a bunch of guys saying it's huge. Really the only way is to see for yourself, but from my experience it's the latter ;)
Personally I could never tell the difference between ATi and nVidia cards :shrug:
Radeon 7200, GF4 TI 4600, 9800 Pro, 6800GT, X850XTPE
Wow, and I just ordered a 7900GT, I never noticed my alternating pattern :P
I really overestimated the X850XTPE when I bought it. Barely faster than my 6800GT, ATi Tool consistently crashes my computer when I tell it to find the max clocks, and I don't like any of the drivers. I ended up liking my old AGP 6800GT a lot more than my X850. Kinda pushed me in the nVidia direction this round.
I'm in the exact same boat - and I'm trying to decide if having 512MB of video RAM is really a benefit over the 256MB cards. I am trying to choose between an XT or XTX and a 7900 GT / GTX.
I have a 19" LCD that runs 1280x1024 right now, but am also thinking about / planning an upgrade to a 24" widescreen (likely the Dell 2407 when it arrives). It seems like maybe I should just hold on to my 7800 GT until I get the bigger LCD or the best route to go becomes more obvious.
If you like AF then defo go Ati, it is a lot better. AA is closer but Ati still has the edge.
However it depends a lot on what sort of game you play. If you are going to play a slow-paced game like Oblivion it might matter more than in a fast-paced game like most FPSes.
Regards
Andy
Quote:
Originally Posted by gr8golf
I too have the 19" Samsung LCD and am stuck at a lower resolution (1280x1024 max). The reason I'd upgrade is not for the extra FPS but mostly for the IQ, if the ATI is that much better than Nvidia. I figured... if I'm about to drop a couple hundred bucks on a dang video card I might as well go the extra mile and get the XTX instead of the XT. I play BF2 and Quake4 and use the pc to watch DVD movies quite often. I also do some graphic/photo/video editing on this pc as well.
It may not be noticeable, but if you check any of the 7900 reviews, you'll find the observation in most.
Look for a detailed explanation there. Historically, this isn't the type of question that exudes friendliness at Xtreme 3d :rolleyes:
Perkam
intel integrated (pentium 90mhz), intel integrated (pentium mmx 200), geforce2 ultra, radeon 9700, radeon 9800pro, radeon 9800pro, radeon 9800pro (yeah I fried a few), radeon 9800pro (more than a few...), 9600XT (ran out of money there), 9200pro (finally received the alienware desktop I ordered a year and a half before), 6800gt (couldn't stand the 9200pro), 6800u, x800xt pe, 7800GT SLI.
I have to say, in 2D, the integrated pentium mmx is just perfect. The problem is:
1) finding one
2) not too good in 3D
Otherwise, in 3D ATI looks better (the textures just look sharper).
That's why you use the image sharpening setting in the NVidia control panel... It'll match up with ATi's "better texture quality" with ease.
ATi has better AF.
NVidia has better AA (SuperSampling > Multisampling, don't argue. Also 8xAA vs 6xAA)
ATi has HDR + AA in games using FP16 HDR (ONLY USABLE IN SS2 AND FAR CRY! Beyond that it's chop city, as the card can't run it at playable speeds)
NVidia has Digital Vibrance (makes colors MUCH more vivid in games!)
In the end, it comes down to preference.
that's an excellent point you made, I almost forgot about the Digital Vibrance feature that our card has...dang it. Decisions, decisions....
Oh SHOOT i forgot all about digital vibrance. Man i miss the 6800GT, digi vibe on a 2001FP looked so nice.
Everyone forgets about Digital Vibrance... It makes for a VERY vivid picture.
Digital vibrance is what I missed most :(
I'm surprised ATI hasn't integrated their own version of this digital vibrance. I have never witnessed it myself, though I have heard about it, and it seems like a nice feature. I wish I had something like that on my X1900.
digital vibrance is good for gaming and watching movies, but for photo editing I always turn it off. From some of the image comparisons I've seen, the ATI tends to look a bit warm, whereas the nvidia is a bit on the cold side; can't say for sure which I like most.
Lol, I thought DV was a joke. And yes I tried it, on SLI'd 7800GTs. Reminds me of a combination of gamma, contrast, and color controls.
Anyway, Ati IQ is much better than NV. Although you wouldn't be able to walk into a room and say, "Hey, that is an Ati card you're running." Unless you were playing a game such as Guild Wars, which shows Nvidia's shimmering problem all too easily...
That's pretty much what it is, nvidia made it easier for the end user to simply adjust one parameter instead of messing around with a bunch of settings. Cartoons (Nemo, Monsters Inc, etc.) look amazing with Digital Vibrance on. ;)
Quote:
Originally Posted by Plywood99
Quote:
Originally Posted by cantankerous
It's called Avivo :p:
I went directly from a 7800GT to an X1900XTX. I'm much happier with it, but can't really notice a difference; side by side I'm sure it would be easier to tell. But from what I've seen, no one can reference any examples that are not side by side in a comparison.
I would say that the IQ of Ati Cards is much better.
Yesterday I used my girlfriend's computer with my BenQ TFT and thought, "why is the screen so sharp?" I reconnected my own computer with my 7800GTX card and couldn't believe it. The picture wasn't as sharp as on my girlfriend's computer with a 9250 SE from ATI.
So what??
same monitor? I agree the IQ is probably better but it's very hard to notice for me.
Quote:
Originally Posted by qujck
Yep, same monitor, and the screen with the ATI looks much clearer and more brilliant than the screen with the NVidia.
It's true.
I thought the "shimmering" thing was all in my head, I have a notebook with a 6600GT and a desktop with an x850xt.
I had some downtime for my desktop while my vga was off for RMA and in world of warcraft it is unbelievably noticeable. To the point where it is distracting. Glad to have my x850xt back :) I will be upgrading the ati route simply because WoW is my antijob. :stick:
To clarify, the shimmering is when the textures (particularly on the ground) shimmer and ladder when you move... I liken it to the intro of FFX on PS2 when using only a composite video connection :p:
I wasn't doubting you, just curious. I wonder if this IQ comparison is still true when comparing say an X1900 vs the 7900 series cards. I would like to see a review comparing 7900GTX vs X1900XTX IQ with some kind of visual reference.
Quote:
Originally Posted by qujck
[H] found shimmering on both NVidia and ATi's latest offerings.
Sorry if this link isn't allowed, I'll post it anyways...
http://www.gamespot.com/features/6145814/index.html
If it gets deleted then basically Gamespot have a pretty good system for comparing image quality, even if the article is a bit iffy. The image quality difference is most noticeable in 3DMark where the Geforces put jaggies on everything. I was planning on getting a 7900SLI setup, but after realising the IQ really wasn't that impressive I think I'm gonna hold off for a while till the next gen, doesn't seem like much point in having the ultimate rig if it can't even AA with HDR.
That review shows nearly identical image quality on EVERY game, aerlix. The only reason 3dmark looked better on the X1900 was because of AA vs no AA.
Again I say it by the way, AA + HDR is only usable in 2 games, Far Cry and Serious Sam 2. On everything else that uses FP16 HDR it's too slow to work.
HL2 and AOE3 don't use FP16 HDR, which is why HDR works on NVidia cards with AA as well.
That review DOES put to rest the whole "OMGz0r ATi IQ IS TH3 PWN@GE!"... They both produce an image that cannot be chosen as better over the other.
No more said, Case closed, END OF DISCUSSION.
Ok fair do's, there really isn't much difference at all, certainly not as much as people make out. But it was still there when you look at the zooms. Anyway I guess if you are thinking about a 7900GT then why bother caring? I do think the difference was big enough to take into account when considering an $800 investment though.
Ditto.
FWIW, I'm b!tching more now that I've moved from 7800GTX to an X1900XT. Then I find out each one of my IQ issues was being duplicated by another guy on rage3d forums with both cards in tow. :p
My only conclusion was that I'm not b!tching about framerates as much with the ATI part, so now I'm pickin' on IQ!
Also, ATI's CCC has a slider called "Saturation" under AVIVO that finally does the equivalent of DV in NV's panel. Here's a SS. Funny thing is, I don't need it. The default value looks as good as NV's DV up to medium or so.
http://img.photobucket.com/albums/v2...Saturation.jpg
Having moved from a 7800gt to an x1800xt (not the same thing as 79gt -> x19xt, I know) I would say Ati has better IQ, but I must admit I may not have had a full fiddle in the Nvidia CP.
All I can really say is, both are more than sufficient; however Ati on High Quality in the driver looks far sharper on textures than Nvidia set to high quality (again, in drivers).
Isn't AVIVO 2D, not 3D? In fact, according to ATi's website, it IS for 2D and not 3D.
Quote:
Originally Posted by Jodiuh
I am in the same anguish as the rest of you. I'm going to upgrade very soon for Oblivion, but I'm so unsure. I know I've liked all my ATI cards (that actually worked), but I haven't had one since the 9600. I know I'm not too fancy about the IQ of my current 7800GTX SLI configuration, however it's the best performing solution (well duh!) I've had. If I go 7900GTX I'm 99% sure I'll go SLI, however with the X1900XTX I'm probably not going CF (since I currently have an NF4 mobo I just bought lol)...
One thing I've heard a lot about the IQ on ATI is that it's most visible in actual motion and thus it's very hard to appreciate just from a screen shot or two. Can anyone verify this? Thanks!
~ Kris
I loved the saturation slider when using CCC when I first got my X1800 card. I am now using the Omega drivers, however, and I see no options in that driver for AVIVO, so I guess I have to do without the saturation slider as it appears to only be present in CCC. I do find it made the desktop etc much clearer and colours had more punch.
there's not much of an upgrade going from a 7800GT to a 7900GT other than a few extra FPS, which I'm not really lacking with the resolution maxed out at only 1280x1024, so I decided to give ATI a try and ordered the retail version of the SAPPHIRE X1900XTX from newegg. It will be here tomorrow, so I'll see how I like it :D :D
If you haven't sold your XFX already, please give us an IQ comparison! I have XFX cards too and I was thinking of Sapphire as well (or HIS for its slightly better cooler). :)
Quote:
Originally Posted by ben805
no I haven't sold it, so I definitely can compare them ;)
Excellent! I'm eagerly awaiting your results!
Quote:
Originally Posted by ben805
2 words for you...
Quote:
Originally Posted by krille
LAN Party
There's a 250 person party here in 10 days. I'll try to arrange something like that with another lanner.
One of the lamest, most biased, most misleading comments I've seen on this board. :stick: :stick: :stick:
Quote:
Originally Posted by DilTech
Here is a quote to clear up the slop Diltech just implied.
"Shimmering, Oh My!
There you go, yes, I said it, “shimmering.” Specifically “texture crawling,” caused by either aggressive filtering or really bad LOD. I notice it using the Dell 2405FPW LCD. I believe the brighter contrast and crisper image coupled with the fact that the screen is just physically larger at a higher resolution all amplifies the problem and makes it extremely visible. I don’t notice it in all games, but there are a couple games in which it did negatively impact my overall level of gaming immersion. While this is another raging Green Vs. Red argument found many places on the Net, we have never specifically addressed it as we have never truly seen it impact our gameplay, but that is simply not the case with our 24” widescreen display.
We found in World of Warcraft there was horrible texture crawling on the ground as you walk through the game. It is most notable on cobblestone or dirt paths through forests. Comparatively, texture crawling is much worse on NVIDIA GPUs than on ATI GPUs from our experience, but rest assured this is a problem that is present in both teams’ technologies. I noticed distinctly that moving from a NVIDIA-based GPU to the ATI Radeon X1900 XTX or XL very much reduced this “shimmering” problem in World of Warcraft as textures had less crawling. However, there was still some noticeable shimmering. Texture crawling is definitely worse on the GeForce 7800/7900 GPUs though. We were using the default driver settings for both NVIDIA and ATI.
We also noticed texture crawling in EverQuest II. Again on the ground, though this game wasn’t as bad as World of Warcraft. I didn’t really notice it or find anything distracting in games other than those mentioned above. It seems that World of Warcraft is the worst of them all when it comes to shimmering, and it was quite distracting on the large widescreen LCD."
Now Skeeter, he didn't mean any trouble...
Anyway, I would like to see one of us here at XS compare them from an unbiased standpoint. As I said in my first post, personally I've never noticed a difference, but I haven't done direct side by side comparisons. I just max games out and enjoy them.
:)
so your opinion is the only one that counts and there is no debate :)
Quote:
Originally Posted by DilTech
how come some (a decent proportion of) people (me included) see a difference?
because we know how to adjust gamma and contrast on our display without needing digital vibrance... as we can see the difference between FP16 and non-FP16 (nil in current games).
And the verdict is?
Quote:
Originally Posted by STEvil
Is this > ATI do not have better image quality than nVidia. It's just something ATI fanboys like to say (but never back up)
Quote:
Originally Posted by krille
The Nvidia IQ issue has been noted in every nvidia review for years now, but it's only visible at high resolutions. I have recommended the 7900GT to multiple people, as IQ is only one aspect of the issue, and Nvidia has features ATI doesn't.
Quote:
Originally Posted by Pronstar
My hat is off to Diltech for singlehandedly trying to give rebuttals to people's answers. He may have his perspective on the issue, but he is trying his best to answer everyone's (one-sided) questions, so please don't be too fast in criticizing him.
I'm sure Diltech will agree that Nvidia IQ IS an issue... an issue that will probably be resolved by better drivers, and therefore, since there is no inherent proof that it is caused by the hardware, no one can say the issue will be there tomorrow. Here is the most recent take on the Nvidia IQ in Bit Tech's roundup of 7900GTXs:
http://www.bit-tech.net/hardware/200...oundup/17.html
Quote:
Originally Posted by Bit Tech
Is this being used as an issue to undermine Nvidia and its sales? Yes it is, and it shouldn't be. OPP, Kingpin, Shamino, Kinc and many others of the top OCers are playing with these cards at around -85C levels and finding them a blast to OC and break WRs with, and if you were to ask them about this they would tell you they are awesome cards worthy of praise even from ATI.
But there comes a time when someone buying a $300, $400, $500 or a $600 card begins to think of these things when making their buying decisions and that is where Nvidia must address the issue now or in the future before it does become a factor...right now, ATI and Nvidia both have room to improve in this area and it isn't a factor big enough to affect the overall performance and value of the cards...yet.
Perkam
Quote:
Originally Posted by Plywood99
They say, right there, it happens on both... Right along with what I said. So, what's the problem?
I'm afraid that in today's world of reading headlines, Dil, you need to point out where in the review it says it, otherwise people don't read it ;)
Quote:
Originally Posted by DilTech
Perkam
Quote:
It is most notable on cobblestone or dirt paths through forests. Comparatively, texture crawling is much worse on NVIDIA GPUs than on ATI GPUs from our experience, but rest assured this is a problem that is present in both teams’ technologies.
Ah, thanx perkam... I forget that a lot of people don't read entire articles (I'll admit, I've been guilty of it too), they just skim to find what helps their argument and end it there.
Apparently WoW is the game that's trouble for the 7900's though.
I'm still trying to get familiar with this ATI card and overclock it, haven't paid any attention to the IQ yet, but the text looks like shiit during POST and in the DOS environment. Reminds me of those cheap 8MB PCI video cards back in the day; the text is very choppy... maybe it has something to do with my Venus.
Flashed the new BIOS for the Venus, but damn man.... the DOS environment still looks like crap!! I don't get it.... how can a $500 video card display such sucky text? Everything looks normal as soon as Winblows loads up, but man, this is a turn-off :stick:
I have noticed no issues with text quality during POST, but really, I must ask, is the quality of your POST screen really such a big issue? I would like to think the visual quality of your actual working environment and of course games is the important factor. We don't spend $600 on videocards to admire our screen during an FDISK.
Quote:
Originally Posted by ben805
It's a BIOS issue. DFI just haven't fixed it yet. As proof, look at the latest update to the Biostar TForce. It lists that the ATI BOOT screen problem is fixed, proof that it's a BIOS issue.
The thing is that it's been here since the geforce FX (earlier games looked so crappy there isn't much difference to make), and what guarantees me it'll be here tomorrow is called the FPS race. As long as a sufficient number of people can't tell/don't care, there really isn't that much of a problem to deal with, if having a slightly lesser IQ can get you that +3% needed to be king of the hill...
Quote:
Originally Posted by perkam
Both companies have IQ problems in different applications, been that way since the dawn of videocards.
Anyone who had a 9700pro/9800pro will remember the washed out textures in UT2003. Anyone who had an FX will remember the sheer pain of looking at Far Cry on it.
With a little tweaking of either card's drivers, all IQ issues can be, and are, FIXED. Just takes a little work.
Rivatuner is your friend.
What specifically can RIVA help with? More specifically... can it get rid of those damn moving shadows in BF2 on guns and far-off textures?
it definitely can't fix the angle-dependent AF nvidia is using -> hardware limitation, confirmed by nvidia engineers.
How about you show before you say. I have an x1900xtx here and I haven't seen any shimmering.
Quote:
Originally Posted by DilTech
I believe what he was pointing out to you was the entirety of the quote, which states nVidia suffers from the problem more so than ATi.
Quote:
Originally Posted by perkam
What annoyed me the most back when image quality really was an issue was the SMD's used to filter the VGA output.. good old matrox ;)
Quote:
Originally Posted by DilTech
I read the article start to finish the day it came out. You well know the reason I quoted you. You think I didn't read the quote I took from the article??? Come on man, be a little smarter than that.
As I mentioned in my original post, it is what you implied, i.e., that they both have an equal problem with shimmering. That is not true.
Don't play word games...:slapass:
Just download the videos from here http://www.thetechlounge.com/article...id=232&page=11
And you will notice the difference between them.
Shimmering only shows at extremely high resolutions, and on high-quality LCDs, mainly widescreens. Says it right there in that review.
Quote:
Originally Posted by ahmad
Also, nowhere did I say it was equal on both, I merely said shimmering occurs on both brands. Read my post and say where I said it was equal.
As for high quality AF, I never said it could be done on NVidia cards; it was on the FX series, but since then NVidia has used the same AF method ATi was using up until the X1K series.
As for rivatuner, you can play with the L.O.D. and usually get a much better, shimmer-free picture. That plus High Quality = no shimmering on either brand. It's at default settings that it occurs for both brands.
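Since the L.O.D. knob comes up a lot in this thread: what those rivatuner/driver sliders adjust is the texture LOD bias. Here's a minimal OpenGL sketch in C of the same idea; my own illustration of the concept, not what rivatuner literally does internally:
Code:
/* Sketch of the LOD-bias idea behind the rivatuner/driver sliders.
   A positive bias selects blurrier mipmap levels (less shimmer,
   softer look); a negative bias selects sharper levels (crisper,
   but more texture crawling). */
#include <GL/gl.h>

void apply_lod_bias(float bias, int clamp_negative)
{
    /* The "clamp" option in driver panels refuses negative biases,
       which are a common cause of texture crawling/shimmering. */
    if (clamp_negative && bias < 0.0f)
        bias = 0.0f;

    /* Core since OpenGL 1.4 (GL_EXT_texture_lod_bias before that). */
    glTexEnvf(GL_TEXTURE_FILTER_CONTROL, GL_TEXTURE_LOD_BIAS, bias);
}
A small positive bias (say +0.5) usually trades a little sharpness for a noticeably calmer picture.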
That was when NVidia had a nasty bug where the LOD wasn't being clamped with AF involved, which brought new jaggies into the picture when textures moved. It has since been fixed, and now the shimmering only shows up at verrrry high rez, just like with ATi cards.
Quote:
Originally Posted by TheVaLVe
Like I said before, both brands have their ups and downs, it's merely a matter of preference at this point. :fact:
Ok, I ran some 3DMark05/06 and played BF2 for a little while. IQ-wise the ATI is only slightly better and hardly noticeable; I don't know how to explain it, but graphics look slightly smoother and contrast a bit more pronounced. As expected, overall the ATI looks a little warmer than my 7800GT, slightly more orange/red hue in BF2. I have a 19" LCD so maybe the IQ will be more obvious on a CRT or a better/bigger screen. Performance over the 7800GT is impressive: I set everything on High at 1280x1024 @75hz and 6X AA, and if I remember correctly fraps reports about ~85 FPS... gameplay is butter smooth, didn't notice any shimmering yet. Overall I do like this card, it's fun to play with, but the gain I get isn't jaw-dropping or anything. The stock fan is loud at 100% load but that's what fan control is for, and since I'm running phase change with the compressor humming along, the fan noise of this card doesn't bother me too much. For the games I play and the LCD I use... the 7800GT does just fine; if I had to build a brand new machine solely for gaming instead of benching I would definitely get the 7900GT instead and use the leftover $200 for something else. :D :D
Maybe you have your red sunglasses on? [H] is a respectable site, there's no reason to doubt them.
Quote:
Originally Posted by ahmad
That's debatable :p: [H] isn't the most enlightened of hw communities, though it does have a big following. Not to mention they just LOVE using different settings for different videocards in their reviews and can many times sound very biased towards one side. All that has been known to happen at [H]. But then again there isn't one single site out there that does reviews exceptionally well consistently.
Quote:
Originally Posted by HaLDoL
The point being you can't look at one review's statements in a vacuum, they need to be compared with reviews by other sites to see if there is a common problem.
Perkam
A lot of ATI fans discredit [H] because they were the ones that came out with the first Doom 3 benchmarks and showed the 6800U beating the hell out of every ATi card. Although other sites confirmed [H]'s results, every ati fan was pissed at [H] and accused them of nvidia favouritism.
Quote:
Originally Posted by perkam
To me, [H] is a great site, they introduced this whole new concept of benchmarking by comparing the max playable setting and using min/max FPS graphs.
I like [H]'s way of reviewing as well, they're the ONLY ones who tell you the max playable settings for all cards in the review, and personally I too like to know (and honestly usually care MORE about) the minimum framerate...
If 2 cards can both go over 60 fps, wouldn't you prefer to know which one doesn't go below 30?
That's ten times better on ATI...
Quote:
Originally Posted by TheVaLVe
Anyone know of any videos comparing ATI HQ vs NVIDIA HQ with shimmering bug fixed?
I'm actually leaning towards 7900GTX SLI a bit (basically because of noise, and I'm experienced with SLI), however I always like trying new things and I'd love to judge for myself if ATI IQ really is that much better... Hmm... better wait till the Oblivion benchies are released (real and official ones). Should be Monday, no?
Monday should definitely be the day.
Imo... they should release benchmarks before the game's released, so you can prepare for the game in advance (i.e. make important hardware upgrade decisions such as a video card), know what to expect etc. As it is now, you're gonna be missing out on a day or three of actual gaming just to read reviews, look at benchmarks and purchase hardware. :rolleyes:
Well, Monday it is then.
Ok, after messing around with the voltages this card is now rock solid at 700/830. GPU won't clock higher than 700 probably due to excessive heat. I'm only using 1.475. :D
By the way, I did a thorough inspection of all the text in the DOS environment and holy smoke Batman..... I was wrong!! Actually that's the detail of the text; the pixels are much sharper, which is why they look choppy. I'm not sure if this is good or bad :lol:
Quote:
Originally Posted by DilTech
Wrong about a few things in your list:
1. nVidia doesn't have better AA. ATi also supports super sampling via adaptive anti-aliasing (alpha textures anti-aliased) and matches nVidia's transparent super sampling algorithm. The 8xS mode that nVidia supports is technically a bit better than ATi's 6xAAA but it also incurs a much larger penalty thus making it useless for anything but old titles at high resolutions. With Crossfire, ATi cards support modes greater than 8x and the performance is way beyond SLi.
2. ATi has color saturation controls now in CCC that do nearly the same thing digital vibrance does.
ATi's angle independent AF cannot be matched by nVidia hardware no matter how much tweaking you do since it's an inherent hardware limitation.
SLi goes 16xSSAA, Crossfire goes 14xAA... SLi DEFINITELY has better AA, and generally better performance.
Quote:
Originally Posted by 5150 Joker
Adaptive AA only uses SS on 2D textures, not the whole image. Also, 6xAAA does NOT match 8xSAA, no way no how. Either way, 6xAAA and 8xSAA are both useless for new titles, as you stated, but 4xSSAA will still take 4xAAA any day of the week. :fact:
SuperSampling does every pixel on NVidia cards, there's a difference.
Angle-independent AF I already said was ATi only, it's their big thing over nvidia right now honestly.
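Since the SS vs MS distinction keeps coming up, here's a toy C sketch of the difference (the shade/coverage functions are made up purely to illustrate; real hardware does this per fragment):
Code:
/* Toy illustration of why supersampling (SS) smooths textures and
   shader output while multisampling (MS) only smooths polygon edges.
   All names here are hypothetical stand-ins. */
#include <stdio.h>

#define W   4  /* tiny 4x4 "screen" */
#define SUB 2  /* 2x2 subsamples per pixel -> "4x" AA */

/* Stand-in for the pixel shader: a high-frequency pattern. */
static float shade(float x, float y)
{
    return (float)(((int)(x * 8) ^ (int)(y * 8)) & 1);
}

/* Stand-in for triangle coverage: a diagonal edge. */
static int covered(float x, float y) { return x + y < 4.0f; }

int main(void)
{
    for (int py = 0; py < W; py++) {
        for (int px = 0; px < W; px++) {
            float ss = 0.0f, ms = 0.0f;
            /* MS shades ONCE per pixel, at the center... */
            float center = shade(px + 0.5f, py + 0.5f);
            for (int sy = 0; sy < SUB; sy++) {
                for (int sx = 0; sx < SUB; sx++) {
                    float x = px + (sx + 0.5f) / SUB;
                    float y = py + (sy + 0.5f) / SUB;
                    if (covered(x, y)) {
                        ss += shade(x, y); /* ...SS shades EVERY subsample */
                        ms += center;
                    }
                }
            }
            printf("pixel(%d,%d): ss=%.2f ms=%.2f\n",
                   px, py, ss / (SUB * SUB), ms / (SUB * SUB));
        }
    }
    return 0;
}
Both columns smooth the polygon edge, but only the SS column filters the pattern inside the triangle, and that interior texture/shader aliasing is exactly the "shimmer" everyone is arguing about.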
I read in an article, I forget where, that when you use SM3, no matter whether with ATI or Nvidia, you have no AA. Is this true? The article also said that at higher resolutions with no AA there were a lot of jagged edges. I always game at at least 1600x1200 and will of course only want to play at higher resolutions with any future upgrades. WTF? :confused:
These results would disagree with your assertion: http://www.firingsquad.com/hardware/...nce/page12.asp
Quote:
Originally Posted by DilTech
http://www.xbitlabs.com/articles/vid...crossfire.html
http://www.hothardware.com/viewartic...&articleid=791
SLI gets slaughtered at antialiasing modes above 4x and no it doesn't have "better" AA.
Adaptive Antialiasing does exactly what nVidia's Transparent Antialiasing does, and that is work on an object or polygon that passes the alpha test and then has edge antialiasing done on it. However, ATi's adaptive antialiasing (quality mode) takes a much smaller hit than Nvidia's transparent super sampling.
Quote:
Adaptive AA only uses SS on 2d textures, not the whole image.
Nobody said it matches it, but the difference between the two isn't very pronounced, nor is 8xS very practical for 90% of the new games out there at a decent resolution. Even 8xS isn't full supersampling, it's 4X multi-sample with a 2X super-sample (4xRGMS + 2xOGSS; see the rough arithmetic sketch below).
Quote:
Also, 6xAAA does NOT match 8xSAA, no way no how.
Actually 6xAAA takes a very minimal hit on ATi cards: http://www.beyond3d.com/reviews/ati/r580/index.php?p=13
Quote:
Either way, both 6xAAA and 8xSAA are both useless for new titles, as you stated, but 4xSSAA will still take 4xAAA any day of the week. :fact:
At 1600x1200 4xAAA/8x AF X1900 gets: 93 fps
At 1600x1200 6xAAA/16x AF (increased AF!) X1900 gets: 88.3 fps
Only in 8xS mode, and unfortunately that mode takes a substantial performance penalty.
Quote:
SuperSampling does every pixel on NVidia cards, there's a difference.
HQ AF, much better AA performance in Crossfire vs SLi above 4xAA/A, HDR+AA.
Quote:
Angle-independent AF I already said was ATi only, it's their big thing over nvidia right now honestly.
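To put rough numbers on the 8xS description above (4xRGMS + 2xOGSS), here's the back-of-the-envelope arithmetic; illustration only, not benchmark data:
Code:
/* Rough arithmetic behind hybrid modes like nVidia's 8xS
   (4x rotated-grid multisample + 2x ordered-grid supersample,
   per the post above). */
#include <stdio.h>

int main(void)
{
    int ms_coverage = 4; /* coverage samples per shaded pixel */
    int ss_factor   = 2; /* scene rendered 2x and downsampled  */

    /* Edges effectively see ms * ss coverage points: the "8x". */
    printf("edge samples per final pixel:       %d\n", ms_coverage * ss_factor);

    /* But textures/shaders are only oversampled by the SS part... */
    printf("shader evaluations per final pixel: %d\n", ss_factor);

    /* ...and fill-rate cost also scales with the SS factor, which
       is why 8xS is so much slower than plain 4x MSAA. */
    return 0;
}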
Very nice post, 5150 Joker. Someone doing research on the subject is very refreshing...
I had a 7800 GTX, 7800 GTX 512, x1800 XT and now 7800 GT SLI
I can tell you, ATi IQ enhancement was little to none, and the driver DE-enhancement was enough to shove me back to NV within 3 weeks of my purchase of the XT.
You mean you dislike CCC? Or did ATI provide worse IQ in some area? Sorry!
Quote:
Originally Posted by CompGeek
He surely meant ati drivers not being as slick as NVIDIA's. Well, this is a good topic and I would like some more in-game comparisons. Thank you all.
Quote:
Originally Posted by HaLDoL
They're some of the biggest trolls around, that site is crap and has been for a LOOONG time.
SLI 16x isn't even playable with counterstrike 1.6... (I tried :( )
Quote:
Originally Posted by DilTech
ATI has temporal AA, which helps a lot but doesn't show up in screenshots, and only works with an FPS higher than the refresh rate.
As for the shimmering, I personally can see shimmering on both ATI and nvidia cards; it's there. The thing, though, is that in certain cases it's ALL OVER with nvidia cards and is a real pain, while with ati cards you'll only every once in a while find a texture that shimmers. The shimmering isn't the same either.
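For anyone wondering how temporal AA manages that, here's a toy C sketch of the idea (the offsets and fallback rule are made-up illustrative values): the card alternates between two complementary sample patterns on even/odd frames so your eye blends them into something like a pattern twice as dense; below the refresh rate the alternation reads as flicker instead of blending, so it falls back to a fixed pattern.
Code:
/* Sketch of the temporal AA idea: alternate two cheap sample
   patterns on even/odd frames so the eye averages them into the
   quality of a denser pattern. Values are hypothetical. */
#include <stdio.h>

typedef struct { float dx, dy; } Offset;

/* Two complementary 2-sample patterns covering 4 positions total. */
static const Offset pattern_a[2] = { {0.25f, 0.25f}, {0.75f, 0.75f} };
static const Offset pattern_b[2] = { {0.75f, 0.25f}, {0.25f, 0.75f} };

static const Offset *pick_pattern(int frame, float fps, float refresh_hz)
{
    /* Below the refresh rate each frame lingers on screen, so the
       alternation shows as flicker -- fall back to one fixed pattern.
       This is the "only works with FPS above refresh" caveat. */
    if (fps < refresh_hz)
        return pattern_a;
    return (frame & 1) ? pattern_b : pattern_a;
}

int main(void)
{
    for (int frame = 0; frame < 4; frame++) {
        const Offset *p = pick_pattern(frame, 120.0f, 60.0f);
        printf("frame %d: samples at (%.2f,%.2f) (%.2f,%.2f)\n",
               frame, p[0].dx, p[0].dy, p[1].dx, p[1].dy);
    }
    return 0;
}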
If you want to see what shimmering looks like, and you have both an ATI and an nvidia card, do the following:
- Start with the ATI card, install fresh drivers, take a game with rather plain textures (I took cs 1.6; the colors are plain and the textures are often stretched, so if there's any anomaly you'll see it...), set IQ to max, and move around.
- Set IQ to max performance and move around. The textures will look a bit blurry, and every once in a while you'll be able to spot a texture that slightly shimmers.
- Set IQ back to max, move around.
LOOK AROUND YOU!!!!!! It's funny, but if you actually look around (your room/outside... something real!!), it does make a difference.
- Install the nvidia graphics card.
- Start with high performance settings, and move around. Normally everything shimmers like hell; if you don't see it now you'd better go see an eye doctor, your cones aren't working properly.
- Progressively set the quality modes higher, and move around. Shimmering will slowly go away, and before you know it, you'll be looking for that "ultra mega high quality mode" that takes away the remaining shimmering in high quality mode.
- Reinstall the ATI graphics card, set to the high quality setting, to make sure your eyes aren't playing tricks on you.
hope this helps.
Quote:
Originally Posted by 5150 Joker
Nice post, but why waste your energy; denial is not the sort of thing that the truth fixes ;)
nice trick for cs 1.6 (all Half-Life engine games actually): gl_texturemode "GL_LINEAR_MIPMAP_LINEAR"
changes from bilinear filtering to trilinear (which does help even when using anisotropic)
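In raw OpenGL terms (which is what the Half-Life engine's GL renderer sets under the hood), that cvar picks the texture minification filter. A minimal sketch:
Code:
/* The bilinear-vs-trilinear switch at the OpenGL level.
   Minimal sketch; assumes a bound texture with mipmaps uploaded. */
#include <GL/gl.h>

void set_filtering(int trilinear)
{
    /* Bilinear blends 4 texels within ONE mipmap level and snaps
       between levels; the snapping shows up as visible bands. */
    GLint min_filter = trilinear
        ? GL_LINEAR_MIPMAP_LINEAR   /* also blend between the two
                                       nearest mipmap levels */
        : GL_LINEAR_MIPMAP_NEAREST; /* bilinear with mipmaps */

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, min_filter);
    /* Magnification has no mipmaps to blend; plain linear is typical. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}
Anisotropic filtering sits on top of whichever base filter you pick, which is why trilinear still helps even with AF enabled.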
Shimmering is present at both 1280x1024 and 1280x768 with 16xAF and 4xAA, to a MUCH greater degree on my 6600GT than my X850XT. I'm using either the cheap LCD built into my lappy or my Sony HS94P, which isn't really considered a high-quality LCD; it's about 2 years old.
Quote:
Originally Posted by DilTech
Like I said, I never really noticed it til I played World of Warcraft, but now I notice it in any game that uses AF on a plane that has LOD. If you just care how quick 3DMark runs, then I guess go for whichever card is fastest for your $, but if you play a lot of WoW like me then you might want to reconsider.
Not meaning to offend anyone but I'm seeing a lot of qualified reviews being quoted, technological limitations cited, image comparisons, specific situations and even videos to back up the ATI side of things, but just assurances from the nvidia camp... Any videos or images that show a superior NV image?
I came from a 7800gtx, and I have to say that NV's digital vibrance was always sweet for desktop/2D, and I thought I would miss it.
with the x1900 I turn up the hue and saturation a bit and it looks better than the digital vibrance ever did, much sharper picture and less cartoonish looking in 2d and 3d.
I have to say IMHO the 7800gtx IQ is inferior to the x1900 in many ways.
On the other hand I thought x850 and x800 looked like crap compared to my 6800gt....
it strikes me funny how people with LCD monitors complain about the video card when it's probably the monitor causing a lot of the IQ disparity.
be a real gamer and use a CRT :D
***edit***
Mitsubishi Diamond Plus 200 22" Natural Flat
...and get cancer.
rather die of cancer than eye strain from a slow LCD
I can't work on CRT's at all. I'm very light sensitive and even 85hz on a nice CRT will give me a headache in under 10 minutes. The constant repainting of the image just kills me.
I had a boss that would view dual 21's...at 60hz. It never bugged him, lol! I only wish...
Quote:
Originally Posted by Jodiuh
Lol, not as sensitive as you. I find 85hz to be the best number for me. I so want to get a wide lcd but am waiting for a good deal.
Another good test for shimmering is Guild Wars. Especially Ascalon City after the Searing. With SLI'd 7800GTs I could see shimmering everywhere, on the ground and on buildings and monuments. Turning the drivers to high quality helped just a little but did not get rid of the shimmering, only lessened it a bit; it was still very noticeable.
The X1900XT is much better. There is a slight amount of shimmer on the ground textures, none on buildings and objects. The difference is quite drastic to me. I truly think NV built their drivers/hardware with too many shortcuts to quality to improve performance...
ummm... the reason people notice these problems on LCD is because LCDs are a lot sharper than CRTs :stick:
Quote:
Originally Posted by iboomalot
And what makes you think that a "real gamer" can't use an LCD?
2ms 17" and 19" LCDs are starting to show up, and a 1000:1 contrast ratio on a VA panel is pretty darn black.
And with the clarity and precision of an LCD screen, doesn't it make it easier to spot your enemies? :rolleyes:
Most people don't even notice motion blur and ghosting on a pure 8ms response time. And if they do notice some motion blur, they usually get used to it to the point that they can't see any ghosting or motion blur.
A pure 8ms response time doesn't mean any LCD monitor that has an 8ms rating on it. It all depends on the company's measuring system, the panel's technology, and the transition between colors. Some companies will say their monitor runs at an average 8ms, but the truth could be that it's an 8ms TN panel, which is likely to jump from 8ms to a response time that's a lot slower, like 20ms.
So if you saw an LCD < 8ms and you noticed a lot of ghosting, then you either saw a cr@%$y monitor or you have the superhuman ability to notice ghosting at that response time without a CRT next to the LCD.
I've got both a nice LCD and a couple of nice CRTs... don't have any nVidia cards to compare to my x1600xt at present though.
http://www.gamespot.com/features/6145814/index.html
I think the IQ of the 1900XTX is better than the IQ of the 7900GTX, anyway.
Quote:
Originally Posted by bakalu
fear
"Both sets of cards rendered the game very well, and we'd be hard-pressed to tell the difference between the two. Even when zoomed in, the two setups produce nearly identical images."
hl2
"The images differ slightly due to inconsistent lighting, and the antialiasing implementations differ a bit. Overall, we're not likely to pick one picture over the other."
quake4
"Like the other tests, we cranked the settings all the way up to 1600x1200, with 4xAA, 16xAF, and high quality all around. It looks like we're staring at identical sets of pictures. If there are differences, they're small enough for us not to care."
SPCT
"The pictures differ slightly in lighting levels, but this was about as close as we could get with respect to brightness. We suspect that the Nvidia cards look a bit grainy specifically because of the extra light cast onto the bricks. We're sure that the ATI cards would look the same if they had the same light levels."
Overall not much difference according to them.
After playing with this card for a few days, today I swapped the 7800GT back in just to see what would happen, and I must say.... I missed the damn X1900XTX! :lol: I came to the conclusion that the ATI card produces sharper and more defined images, be it gaming in BF2, Q4, watching DVDs, or photo editing. I am now a believer that ATI has superior IQ, and no, I DO NOT miss the digital vibrance.... the 7800GT got yanked out after about an hour :D :D Though when it comes to driver support I think ATI still has a long way to go; I tried the Omega CAT 6.2 driver and that crap jacked up the OS, left two missing "unknown drives" in device manager, which I later found were the WDM drivers that were not included in the Omega package. ATITool could definitely use some improvement too.
I did run into a gaming problem with the ATI: the movement key seems to stick once in a while so the player just moves on his own, or while I was moving forward, even after I'd released the key it just kept going. It's not a keyboard problem, as I had already swapped in a different keyboard with the same result, and wiping the hard drive and doing a fresh install was to no avail either....
Wait, you're saying ATI drivers need work because you used an unsupported third party modified driver??? :hm: :nuts:
:rolleyes: WTF?? don't read between the lines :slapass: that's not what I was saying. The official CAT 6.3 is alright I guess, but they can definitely do away with the bloated CCC and other crap. I gave the Omega a try and thought maybe it was worth a shot, as someone might have improved something, but it doesn't work on mine... although I could just rip the WDM drivers out of CAT 6.3 and use the Omega for the display driver, I did not see any differences or improvement.
Quote:
Originally Posted by Bar81
I guess I'm not the only one this happens to? I thought it was my G15.
Quote:
Originally Posted by ben805
doesn't help.
Quote:
Originally Posted by STEvil
works in WoW too.
It happens to me too on my G15 and my MS Natural Elite. I read around that it has to do with ATITool. When that is installed and has temp monitoring/logging as well as fan speed adjustments enabled, it sometimes does this.
Quote:
Originally Posted by l337x3r1cx
Omega 6.2 Cats here too :) I've been an omega user since the 4.12 Cat versions from them ;) If only because ATI will continue to make drivers (and good ones), but third-party drivers need our support :up:
Quote:
Originally Posted by ben805
Plus they're great when BIOS flashing :D
Perkam