NVIDIA Says AMD Reduced Image Quality Settings w/ Radeon HD 6800 Series For Better Performance
Link http://www.legitreviews.com/news/9482/
Source: Legit Reviews
This is real
http://img202.imageshack.us/img202/2527/80269813.jpg
Quote:
On the subject of anisotropic filtering, our detailed review unfortunately has bad news to report. On the one hand, texture filtering on the Radeon HD 6000 generation has been improved (the banding problem has largely been fixed); on the other hand, textures now flicker more intensely. That is because AMD has lowered the standard anisotropic filtering to the level of the previous generation's "AI Advanced" setting. An incomprehensible step in our view, because modern graphics cards deliver more than enough performance to improve image quality instead.
While some games barely show the difference, others suffer badly from flickering textures, which spoils the fun. At least the "High Quality" option makes it possible to (usually) restore the previous AF quality. In other words: with a manual switch, the Radeon HD 6800 still delivers the previous generation's quality, but the default quality is now worse!
Since we will not support such practices in 2010, we have decided to test every Radeon HD 6000 card with the roughly five percent slower High Quality setting from now on, so that the final result is roughly comparable with NVIDIA's default setting.
http://www.computerbase.de/artikel/g...on-hd-6800/15/
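ComputerBase's decision above rests on simple arithmetic: if High Quality costs roughly five percent of performance, a default-setting score can be discounted to approximate an apples-to-apples number. A minimal sketch of that adjustment (the FPS figure and the flat 5% penalty are illustrative assumptions, not measured data):

```python
def hq_fps(default_fps, hq_cost=0.05):
    """Estimate High Quality FPS from a default-setting result,
    assuming a flat ~5% performance penalty (hypothetical figure)."""
    return default_fps * (1.0 - hq_cost)

# Hypothetical default-quality benchmark result for a Radeon HD 6000 card:
radeon_default = 60.0
print(hq_fps(radeon_default))  # roughly 57 FPS after the HQ penalty
```

In practice the penalty varies per game, which is why the reviewers simply re-benchmark at the HQ setting rather than scaling numbers.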
:down:
Why don't they just test with quality options maxed out on both cards if "full quality" is so important?
btw, I agree that stepping back on quality is bad.
The point is the amount of shimmer regardless of the setting. But I agree: if you are too lazy or stupid to keep Catalyst and the NVIDIA control panel constantly at HQ, you have no right to :banana::banana::banana::banana::banana:. The performance gain has always been too small to warrant the degradation in quality.
Sounds like the 6*** series has Nv worried :yepp: :D
Do we need another thread on this?
Original: http://www.xtremesystems.org/forums/...=261588&page=5
Furthermore it has been proven that it depends mainly on games used, where AF quality can be higher on HD 6000 in newer titles and lower on NV. :shrug:
Starting with Catalyst 10.10 (and also 10.11), IQ is significantly reduced relative to previous ATI driver releases. The reduction only affects HD 5800-series and up GPUs, including the HD 6800 series, and it gives those GPUs a significant performance increase. For an apples-to-apples comparison against NVIDIA GPUs, either NVIDIA's IQ settings need to be dropped or, ideally, AMD's need to be raised. Even raised, AMD's IQ does not seem to match NVIDIA's default IQ.
Only video can really illustrate the quality difference, but it is discernible: http://www.tweakpc.de/hardware/tests...d_6850/s09.php The videos are split-frame, with the left side depicting what the GPU produces at a specific setting and the right side showing what should be generated.
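Split-frame videos are the clearest evidence, but the divergence between a rendered frame and a reference frame can also be quantified numerically, for example with PSNR. A minimal sketch (pure Python; the tiny grayscale pixel lists are made-up stand-ins for real screenshots, not actual capture data):

```python
import math

def psnr(reference, rendered, max_val=255):
    """Peak signal-to-noise ratio (dB) between two equal-sized pixel lists.
    Higher means closer to the reference; identical frames give infinity."""
    if len(reference) != len(rendered):
        raise ValueError("frames must have the same number of pixels")
    mse = sum((a - b) ** 2 for a, b in zip(reference, rendered)) / len(reference)
    if mse == 0:
        return math.inf
    return 10 * math.log10(max_val ** 2 / mse)

# Hypothetical frames: a "ground truth" strip and a filtered render that is slightly off.
reference = [10, 50, 90, 130, 170, 210]
rendered  = [10, 52, 88, 130, 174, 206]
print(round(psnr(reference, rendered), 1))  # → 39.9
```

A single still can't capture shimmering, though; that is a frame-to-frame effect, which is why the reviewers insist on video comparisons.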
NVIDIA has posted about this on their blog: http://blogs.nvidia.com/ntersect/201...e-quality.html but it's not their own work; it's the finding of several major European tech sites.
Amorphous
Oh for god's sake, not this :banana::banana::banana::banana: again... The reality is, if no one pointed this out, no one would even notice it, so why make a big deal out of something 99% of people won't even notice? If you're such a purist, you already know what CCC is and how to turn all optimizations OFF. So do that and stop complaining. Optimizations you cannot circumvent are one thing; optimizations you can circumvent are another. In this case you can. So do that. Facepalm.
nvidia did this way before ati, so now that it hurts, suddenly IQ became important? and lowering IQ is unethical? lol
Look at the videos and tell me you wouldn't notice the difference between the HD 6870's and the GTX 470's IQ. It'll be extremely obvious in every title. Even cranked up, the HD 6800 doesn't match the GTX 400's default setting.
AMD reducing the default IQ means benchmarkers will need to adjust their testing procedures to generate an apples-to-apples result, or it's no longer a remotely fair comparison. They might as well benchmark with wildly different AA settings.
Users can and should make their own determination about what level of IQ they desire, and adjust their settings appropriately for their desired gaming experience.
Amorphous
Ohh, NVIDIA's private fanboy army is here to reveal the truth... :rolleyes:
Save our ignorant souls so all money is spent on the one true company that never optimized its drivers. Ohh wait?
On a more serious note - Original AF thread is here: http://www.xtremesystems.org/forums/...=261588&page=5
This one should be locked.
fail news
IQ on my 6850 with the new HQ setting is noticeably higher than on my old 5850
AMD = FAIL!
so amd cheats again
but it doesn't matter when it's AMD who cheats???:mad:
amd has failed in so many ways recently
they are much worse than nVidia has ever been
Why can't they just admit they have also lost this round and move forward:yepp:
i fail to see how they fail: they offer their users a higher quality setting and the possibility of higher performance if you don't notice any difference, yet people claim that amd screws its customers?????
what they should do is make the HQ setting the default and offer the high-performance setting as an option, but your comment is still a mountain of fail
It's always the same story... Fanboys immediately attack the other side and exaggerate :) Sensible people wouldn't really believe what they're saying, so it just feels like they need to encourage themselves
i hope this topic stirs up a lot of official debate and name calling from both companies, so that both put IQ at the top of their priorities and stop worrying so much about only fps and releasing pointless technology like 3d.
well, for my part i don't expect people working for nvidia to be unbiased. even unrelated people have biases, so if you work for them i think that comes with the job, no?
anyway, if the new drivers lower default IQ, then reviewers should be aware, and when cayman is out they should perform all tests with a high-quality baseline.
And the puppet masters start pulling the strings. The story is contradictory and configuration-dependent; the point being that, as usual, ATI's default image quality is better than it was on the 5870, just as the 5870's was better than the 4870's.
Both sides have IQ issues depending on driver, OS, game, and API.
If anything, NVIDIA may be opening a can of worms for themselves here.
When I installed Cat 10.10e, the first thing I did was move the texture quality slider to High Quality, but I kept the Optimize surface feature enabled, and I play all games with 16x AF and MLAA. I can't really complain about image quality because I don't see any reason to. I don't have anything against the NVIDIA settings I was familiar with in the past. Some optimizations exhibited a shimmering effect on textures, but other than that, if an optimization gives a significant boost and you can only notice the difference in side-by-side image comparisons, I think the optimization is well justified. But maybe both NVIDIA and AMD should release drivers with a big red button that says "TURN EVERYTHING MAXXXXXXXX!!!!!!!!11111" to make all the whiners happy.