Nvidia has always had a way to fully disable the optimizations, though. It's only now that AMD is allowing its customers the same option.
The global setting isn't well done, okay, but it's no big deal and nothing new.
I've seen this many times: new graphics cards just need the latest driver.
...And then this happens. Again, drivers.
Everything is fine, but drivers need to be written well: small and fast as a bullet. (AMD doesn't practice this.)
Then the driver expands with per-game settings, more bugs get fixed each day, and a huge driver is hard to fix :)
Patches for games fix the game's graphics too.
Guys, you've got nothing to worry about here; just relax and wait for the new driver :)
This is what the translation says, so I will go with that. You should edit your subject to fall in line with the context of the article. I'm not seeing anything that goes into detail on your subject of "AMD's AF fix proven false", nor have you provided any proof of that.
Quote:
Yes, we carried out detailed examinations today in Half-Life 2, Oblivion and Trackmania and came to the conclusion that the Radeon HD 6800 filters worse at driver defaults than its predecessor... (next page) ...than with Catalyst 10.9 or earlier versions, which we re-examined in Half-Life 2, Oblivion and Trackmania.
The drivers may very well no longer be properly optimized for 6+ year old games, which is why I think the subject of this thread is misleading. Also, I think we should all ask why no current games were used for the AF comparisons. If we are going to examine IQ, as a rule of thumb it should only be done with current games. We can't compare the AF quality of games dating back 6+ years against a recently released game.
This is what Trackmania from 2003 looks like... :brick:
http://img.brothersoft.com/screensho...l-67610-1.jpeg
That's really not relevant to the context of my replies to this thread as a whole. The gist of my post(s) is that we can't make any comparisons between games that go back several years and currently released games. This is not a good example of how to do IQ comparisons, and I (as well as others) can't draw anything from it since the games used are so old. :up:
I kind of miss Trackmania 2003... at least it runs well on my GTX 480, whereas the current version can't. My 4870X2 runs it at about three times the speed with all options at full :(
Things have turned around; Nvidia has had the better IQ for a while now. It was only back in the 9700 Pro and GF4 days that ATI was thought to have better IQ. They both look the same to me, though; the experts seem to disagree.
Anything to steal some buzz. I guess 1 site can be the mother of all truth.
TBH, I think people need to get over these comparisons trying to look for an "apples to apples" approach. It [IQ optimisation] has been going on for years, and the arguments haven't contributed a great deal besides altering the results of said tests from driver to driver. Given the options, though, I can almost guarantee most cards will output a similar result; it's just that some of us want IQ and some of us want FPS, and it's hard to get both.
Maybe we need to complain a bit more about what options we have to play with as 'advanced' users, because TBH the extra options we get beyond the 'basic' settings are pretty limited at best. Sure, I know they introduce problems, which AMD/nVIDIA would really rather not have to provide extra support for, but all they need to do is put up a disclaimer about the extra options not being supported [which they already do in part] and voila, problem solved. From there we can choose to utilize these features or not...
Here are some of the options I'm using in the Catalyst 10.10a drivers. Yes, you'll note I've manually put in some extra settings, and no, I'm not using MLAA. I'm still uncertain whether I need that feature, but I welcome it for those who can't use 'normal' AA in-game.
I think the whole thing started here - http://www.forum-3dcenter.org/vbulle...494193&page=11
Too bad; it looks like English-language sites don't care about IQ anymore. Even the last problem wasn't covered by them at all.
So long as you're getting a decent visual experience, it's not cheating or underhanded to mess with your default IQ settings. Crank it up if you want it. Beyond that, who gives a crap? (I know many of you do, but you shouldn't.)
It would only be lame if AMD and nVidia were in a slow race to see how much IQ they could sacrifice to get speed, notching back and forth as they each go a bit further until the "default" no longer serves as a decent level of quality but instead a downmixed 4x4 pixel matrix representing the entire screen. That just isn't the case here. They drop bits of IQ that people don't generally notice if it helps them achieve other goals, but other than that there are improvements going on. Occasionally there are mistakes (missing rocks and such in Crysis for instance) that have to be addressed. Missing geometry is one thing if used as an intentional default (bad practice). It's another if it's a bug (forgivable & normal). It's another as well if it's a non-default but available setting for users to choose for themselves (perfectly legit).
You have to understand that yes, image quality is being sacrificed for speed. However, the settings are still there to improve the image quality.
There was an improvement, though still not as good as it could have been.
Honestly, we are dealing with very slight changes in image quality; unless you are actively looking for them, I doubt the majority of people would be able to notice.
It's pretty simple: either things look like crap in a specific newer game and you should submit a bug report to the company (either one), or it's so insignificant that you shouldn't notice it.
OP, I find your thread title very misleading. :down:
lol @ the drama
1. Nvidia's default settings ARE optimized; they already represent a trade-off between speed and image quality. That's why you can tweak them, and the same is true for ATI (see the sketch after this list for how a game's AF request reaches the driver in the first place).
2. This is some site that tested some old, obscure games, and it is their OPINION that the default Nvidia settings are like ATI's high quality setting.
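To put point 1 in concrete terms, here is a minimal, generic sketch (not from the article, and not tied to either vendor) of how a game asks for anisotropic filtering through OpenGL's EXT_texture_filter_anisotropic extension. The application only requests an AF level; how thoroughly the driver actually samples is governed by the control panel's quality/performance setting, which is why IQ can shift between driver versions without a single line of game code changing.

/* Minimal sketch: an application requesting anisotropic filtering in OpenGL.
 * The driver's quality/performance setting decides how faithfully this
 * request is honored, so results can differ from driver to driver. */
#include <GL/glew.h>

void setup_af_texture(GLuint tex)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    /* Ask for up to 16x AF, clamped to whatever the hardware reports. */
    GLfloat max_aniso = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                    max_aniso < 16.0f ? max_aniso : 16.0f);
}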
Image quality is highly subjective. The last time I noticed anything was Crysis looking better on my x1900 xtx vs my 8800GT. Speed went up going to the 8800GT, but image quality went way down. In switching from the 5870 to the 470 gtx to the gtx 460 I haven't noticed anything in terms of image quality, but I don't play Crysis anymore either.
That's true, unfortunately. When there are no clearly defined rules of "playing fair" or "cheating", as is in the case of image quality, the whole thing becomes a playground for fanboys.
It is certain that AMD lowered the default AF quality on the 6800 series cards, and that this added a 5% performance boost. I can't even imagine the responses if Nvidia had done that.
I think those peeps didn't play Quake 1, Doom TNT, Hexen...