One question: why is an optimization being called a "cheat"?
As long as the effect is invisible to the end user and has a positive impact on performance, I personally don't give two hoots whether it increases framerates by two percent or two hundred percent. I tried to allude to the same thing when talking about the substitution of FP16 render targets: while it has a positive impact on performance, a person actually PLAYING a game (instead of staring at comparison screenshots) will likely NEVER see the difference. Me, I see a slight difference in some isolated cases, but that's only because I play some games like DoW 2 a lot, so I can spot the minor differences with ATI's implementation in that game.
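To make the FP16 substitution concrete, here's a minimal sketch of the kind of format demotion I'm talking about. This is NOT any vendor's actual driver code; pick_rt_format is a hypothetical helper, though the DXGI formats are real D3D11 enums. The app requests a 64-bit FP16 HDR target and the "optimization" hands back a 32-bit packed-float one with half the bandwidth:

// Illustrative only: the kind of render-target format demotion a driver
// "optimization" could perform behind an application's back.
#include <d3d11.h>

DXGI_FORMAT pick_rt_format(bool allow_demotion)
{
    // The app asks for a 64-bit FP16 HDR render target...
    DXGI_FORMAT requested = DXGI_FORMAT_R16G16B16A16_FLOAT;

    // ...but the driver could transparently substitute a 32-bit packed-float
    // target instead: half the bandwidth, reduced precision and no alpha,
    // and a difference most players will never notice in motion.
    if (allow_demotion)
        return DXGI_FORMAT_R11G11B10_FLOAT;

    return requested;
}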
The same thing goes (IMO) for anything above 8xMSAA. Other than old Myst-style point-and-move games, 99% of today's games involve paying attention to a moving picture, not glorified screenshots. So why would anyone stop to care about a few jagged lines on a fence 200 feet away? On the other hand, if higher IQ modes can be enabled through higher-end hardware, I'm all for that as well.
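On that "higher IQ modes through higher-end hardware" point: under D3D11 an application can simply ask the device which MSAA levels the hardware actually supports and expose whatever it finds. A minimal sketch, assuming an already-created ID3D11Device; highest_msaa is a hypothetical helper name, but CheckMultisampleQualityLevels is the real API call:

// Finds the highest MSAA sample count the device supports for a format.
#include <d3d11.h>

UINT highest_msaa(ID3D11Device* device, DXGI_FORMAT format)
{
    // D3D11 caps out at 32 samples; most 2010-era hardware tops out at 8.
    for (UINT samples = D3D11_MAX_MULTISAMPLE_SAMPLE_COUNT; samples > 1; --samples)
    {
        UINT quality_levels = 0;
        if (SUCCEEDED(device->CheckMultisampleQualityLevels(format, samples,
                                                            &quality_levels))
            && quality_levels > 0)
            return samples; // supported at this sample count
    }
    return 1; // no MSAA support for this format
}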
Naturally, decreasing overall image quality for higher scores in reviews isn't ethical, but I booted up Crysis on both ATI and NVIDIA hardware over the weekend and couldn't see any difference. Yes, the ATI cards do have an odd issue where some edges shimmer a bit, but between the 9.12 drivers and the newest 10.3a, I saw no change in either performance or overall IQ.