Quote Originally Posted by ***Deimos*** View Post
The deaf man following the blind man?
1. There is NO standard. Nada. None. WHQL ensures your Start menu popup is rendered correctly... not Crysis. Sharper = better!? Assuming the game developer intended maximum sharpness on everything?

Bottom line:
3D rendering is the wild west. You think programming "hello world" is so different in COBOL, Java and Python!? Video drivers are expected to work perfectly with 10,000+ games running on DX9, DX10, DX11 and OGL across dozens of OS versions, not to mention all the video and GPGPU extensions, and each hardware generation works differently!
There is a standard: it's called the API specs, and drivers are supposed to follow them. Now let's say developer A creates custom mipmaps and writes the rendering engine assuming no LOD bias, as per the SPECS, and the guys at ATI decide they obviously know better and introduce a driver-level LOD bias after the game has shipped. Whose responsibility is it to fix the problem: the developer who followed the API specs, or the driver team that knew better?
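To make the LOD bias point concrete, here's a minimal D3D9-style sketch (my own illustration, not anything from a shipping game or driver). It assumes a valid IDirect3DDevice9 pointer already exists; the point is that the documented default bias is 0.0, so an engine built around hand-authored mipmaps is entitled to assume no bias unless it sets one itself:

Code:
#include <d3d9.h>

// Hypothetical helper: set the mipmap LOD bias explicitly, the way the spec
// expects the application (not the driver) to do it. D3DSAMP_MIPMAPLODBIAS
// takes a float reinterpreted as a DWORD.
void SetMipLodBias(IDirect3DDevice9* device, DWORD sampler, float bias)
{
    device->SetSamplerState(sampler, D3DSAMP_MIPMAPLODBIAS,
                            *reinterpret_cast<DWORD*>(&bias));
}

// A driver that quietly adds, say, -0.5 on top of whatever the game set changes
// which mip level gets sampled -- that's exactly where the "free" sharpness
// (and the shimmering) comes from.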

3D rendering is not the wild west! The whole graphics pipeline is quite simple: the developer has control over most key stages, except for the finer points of triangle setup/traversal/texture filtering/clipping and, to an extent, blending. The driver team just needs to ensure that the driver behaves according to the API specs.
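For illustration (a rough sketch of my own, again assuming a working IDirect3DDevice9; none of this is from a real engine), this is the kind of pipeline state the developer spells out through the API, with the spec defining what each setting must do:

Code:
#include <d3d9.h>

// The application declares the state it cares about; the driver's job is to
// honour these documented settings, not to second-guess them.
void SetupPipelineState(IDirect3DDevice9* dev)
{
    dev->SetRenderState(D3DRS_ZENABLE, D3DZB_TRUE);             // depth test on
    dev->SetRenderState(D3DRS_CULLMODE, D3DCULL_CCW);           // back-face culling
    dev->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);          // alpha blending
    dev->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
    dev->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
}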

From your post, I'm assuming you don't really know what LOD bias does or where the supposed sharpness comes from.

Quote Originally Posted by SKYMTL View Post
One question: why is an optimization being called a "cheat"?

As long as the effect is invisible to the end user and has a positive impact on performance, I personally don't give two hoots whether it increases framerates by two percent or two hundred percent. I tried to allude to the same thing when talking about the substitution of FP16 render targets since, while it has a positive impact on performance, a person actually PLAYING a game (instead of staring at comparison screens) will likely NEVER see the difference. Me, I see a slight difference in some isolated cases, but that's just because I play some games like DoW 2 A LOT, so I can spot the minor differences with ATI's implementation in that game.

The same thing goes (IMO) for anything above 8xMSAA. Other than old Myst-style point-and-move games, 99% of today's apps involve paying attention to a moving picture, not glorified screenshots. So why would someone even stop to care about a few jagged lines on a fence 200 feet away? On the other hand, if higher IQ modes can be enabled through higher-end hardware, I'm all for that as well.

Naturally, decreasing overall image quality for higher scores in reviews isn't "ethical", but I booted Crysis on both ATI and NVIDIA hardware over the weekend and couldn't see any difference. Yes, the ATI cards do have an odd issue where some edges shimmer a bit, but between the 9.12 drivers and the newest 10.3a, I saw no differences in either performance or overall IQ.
The problem with your logic is that the performance boost from the optimization will often mislead consumers into thinking the card is faster than it actually is. When one card is doing 16xAF and the other is effectively doing trilinear, it really isn't fair to compare performance numbers, and I think that is the gist of the problem. Yeah, the gamer will probably not notice, but then he will also go around spouting nonsense about how awesome his newfangled card is until he hits a game that isn't "optimized" for his card, and then he will obviously start complaining about how terrible the game engine is.
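To spell out why that comparison is lopsided, here's a hedged D3D9 sketch (purely my illustration, assuming a valid device pointer) of the two sampler setups the benchmark would effectively be comparing:

Code:
#include <d3d9.h>

// Plain trilinear: linear min/mag filtering with linear blending between mips.
void UseTrilinear(IDirect3DDevice9* dev, DWORD s)
{
    dev->SetSamplerState(s, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
    dev->SetSamplerState(s, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    dev->SetSamplerState(s, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
}

// 16x anisotropic: up to 16 taps along the axis of anisotropy per lookup,
// which costs real bandwidth and filtering work.
void UseAniso16x(IDirect3DDevice9* dev, DWORD s)
{
    dev->SetSamplerState(s, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
    dev->SetSamplerState(s, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    dev->SetSamplerState(s, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
    dev->SetSamplerState(s, D3DSAMP_MAXANISOTROPY, 16);
}

// If the driver silently runs the first path while the app (and the reviewer)
// asked for the second, the two cards are no longer running the same workload.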

If NVIDIA disabled their AF in Crysis you could expect a large boost in FPS as well, but then the internet would run red with nerd rage!

AF is a pretty standard thing, and they should implement it correctly instead of attempting to fake it. They are trading quality for performance, and I can see how that is perfectly acceptable from their viewpoint, but I still think they should leave that choice to the developer and the end user. If I find that AF kills performance, let me turn it off; don't do it for me and then lie to me about it.
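And "leave that choice to the user" is trivial to do properly. A quick sketch (again just my illustration, assuming a valid device): read the hardware limit from the caps, clamp the user's menu setting to it, and apply exactly that, no more and no less.

Code:
#include <d3d9.h>
#include <algorithm>

// Hypothetical helper: apply the anisotropy level the user actually picked,
// clamped to what the hardware reports. A driver that overrides this behind
// the app's back defeats the whole point of exposing the setting.
void ApplyUserAniso(IDirect3DDevice9* dev, DWORD sampler, DWORD userSetting)
{
    D3DCAPS9 caps = {};
    dev->GetDeviceCaps(&caps);

    DWORD aniso = std::min<DWORD>(userSetting, caps.MaxAnisotropy);
    if (aniso > 1)
    {
        dev->SetSamplerState(sampler, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
        dev->SetSamplerState(sampler, D3DSAMP_MAXANISOTROPY, aniso);
    }
    else
    {
        // User turned AF off: fall back to plain trilinear.
        dev->SetSamplerState(sampler, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
    }
    dev->SetSamplerState(sampler, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    dev->SetSamplerState(sampler, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
}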