NVIDIA Says AMD Reduced Image Quality Settings w/ Radeon HD 6800 Series For Better Performance
Link http://www.legitreviews.com/news/9482/
Source: Legit Reviews
Intel Core i5 6600K + ASRock Z170 OC Formula + Galax HOF 4000 (8GBx2) + Antec 1200W OC Version
EK SupremeHF + BlackIce GTX360 + Swiftech 655 + XSPC ResTop
Macbook Pro 15" Late 2011 (i7 2760QM + HD 6770M)
Samsung Galaxy Note 10.1 (2014) , Huawei Nexus 6P
[history system]80286 80386 80486 Cyrix K5 Pentium133 Pentium II Duron1G Athlon1G E2180 E3300 E5300 E7200 E8200 E8400 E8500 E8600 Q9550 QX6800 X3-720BE i7-920 i3-530 i5-750 Semp140@x2 955BE X4-B55 Q6600 i5-2500K i7-2600K X4-B60 X6-1055T FX-8120 i7-4790K
This is real
Regarding anisotropic filtering, our detailed review unfortunately has to deliver a mixed verdict. On the one hand, texture filtering on the Radeon HD 6000 generation has been improved (the banding problem has largely been fixed); on the other hand, textures now flicker more intensely. That is because AMD has lowered the standard anisotropic filtering of the previous generation to the level of "A.I. Advanced". To us this is an incomprehensible step, because modern graphics cards provide enough performance to improve image quality instead.
While some games hardly show the difference, others suffer badly from flickering textures, which spoils the fun. At least the "High Quality" option (usually) restores the previous AF quality. In other words: with manual switching the Radeon HD 6800 still delivers the quality of the previous generation, but the default quality is now worse.
Since we will not support such practices in 2010, we have decided to test every Radeon HD 6000 card in future with the roughly five percent slower High Quality setting, so that the final results are roughly comparable with NVIDIA's default setting.
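To make ComputerBase's stated policy concrete, here is a minimal sketch (my own illustration, not their actual methodology) of what the roughly-five-percent High Quality penalty means for a benchmark number:

```python
# Hypothetical illustration of ComputerBase's policy: test Radeon HD 6000
# cards at "High Quality", which they estimate costs about 5% of the frame
# rate measured at the (lower-IQ) default setting.
def comparable_fps(default_fps, hq_penalty=0.05):
    """Estimate the frame rate at High Quality, given the frame rate
    measured at the default setting and an assumed relative penalty."""
    return default_fps * (1.0 - hq_penalty)

print(comparable_fps(60.0))  # roughly 57 fps
```

The point of the adjustment is only to make the Radeon numbers comparable with NVIDIA's default-quality results; the 5% figure is ComputerBase's estimate, not a fixed constant.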
http://www.computerbase.de/artikel/g...on-hd-6800/15/
Last edited by cold2010; 11-19-2010 at 09:48 PM.
Why dont they just test with quality options maxed out on both cards if "full quality" is so important?
btw, I agree that stepping back on quality is bad.
All along the watchtower the watchmen watch the eternal return.
5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi
The point is the amount of shimmer regardless of the setting. But I agree: if you are too lazy or stupid to keep both Catalyst and the NVIDIA control panel constantly at HQ, you have no right to complain. The performance gain has always been too small to warrant the degradation in quality.
5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi
Sounds like the 6*** series have Nv worried
lots and lots of cores and lots and lots of tuners,HTPC's boards,cases,HDD's,vga's,DDR1&2&3 etc etc all powered by Corsair PSU's
Do we need another thread on this?
Original: http://www.xtremesystems.org/forums/...=261588&page=5
Furthermore, it has been shown that it depends mainly on the games used: AF quality can be higher on the HD 6000 in newer titles and lower on NV.
Starting with Catalyst 10.10 (and also including 10.11), the IQ is significantly reduced from previous ATI driver releases. The IQ reduction only affects HD 5800 series and up and HD 6800 series GPUs. This reduction gives a significant performance increase to the affected AMD GPUs. For an apples-to-apples comparison against NVIDIA GPUs, either NVIDIA's IQ settings need to be dropped or, ideally, AMD's need to be raised. Even raised, AMD's IQ cannot seem to match NVIDIA's default IQ.
Only video can illustrate the quality difference, but it's discernible: http://www.tweakpc.de/hardware/tests...d_6850/s09.php The videos are split-frame, with the left side showing what the GPU produces at a specific setting and the right side showing what should be generated.
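For those curious how such a split-frame comparison could be turned into a number, here is a minimal sketch (my own illustration, not the testers' actual method) of scoring a rendered frame against a reference frame pixel by pixel:

```python
# Hypothetical example: quantify the filtering difference between a frame
# the GPU actually rendered and a reference frame of what it should look
# like. Frames are assumed here to be flat lists of grayscale intensities.
def mean_abs_error(rendered, reference):
    """Average per-pixel intensity difference between two frames."""
    if len(rendered) != len(reference):
        raise ValueError("frames must have the same number of pixels")
    return sum(abs(a - b) for a, b in zip(rendered, reference)) / len(rendered)

# Toy frames: identical except for two "shimmering" pixels.
ref = [10, 20, 30, 40]
out = [10, 24, 30, 36]
print(mean_abs_error(out, ref))  # 2.0
```

A score of zero means the GPU output matches the reference exactly; shimmer or filtering shortcuts would push the error up. Real comparisons would of course work on full-resolution video frames rather than toy lists.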
NVIDIA has posted about this on their blog: http://blogs.nvidia.com/ntersect/201...e-quality.html But it's not their work; it's the finding of several major European tech sites.
Amorphous
Last edited by Amorphous; 11-20-2010 at 01:11 AM.
NVIDIA Forums Administrator
Oh for god's sake, not this again... The reality is, if no one pointed this out, no one would even notice it, so why make a big deal out of something 99% of people won't even notice? If you're such a purist, you already know what CCC is and how to adjust things to turn all optimizations OFF. So do that and stop complaining. Optimizations you cannot circumvent are one thing; optimizations you can circumvent are another. In this case you can. So do that. Facepalm.
Last edited by RejZoR; 11-20-2010 at 01:21 AM.
Intel Core i7 920 4 GHz | 18 GB DDR3 1600 MHz | ASUS Rampage II Gene | GIGABYTE HD7950 3GB WindForce 3X | WD Caviar Black 2TB | Creative Sound Blaster Z | Altec Lansing MX5021 | Corsair HX750 | Lian Li PC-V354
Super silent cooling powered by (((Noiseblocker)))
Look at the videos and tell me you wouldn't notice the difference between HD 6870 and GTX 470's IQ. It'll be extremely obvious in every title. Even cranked up, the HD 6800 doesn't compare to the GTX 400's default setting.
AMD reducing the default IQ means benchmarkers are going to need to adjust their testing procedures to generate an apples-to-apples result, or it's no longer a remotely fair comparison. Might as well benchmark with widely different AA settings.
Users can and should make their own determination about what level of IQ they desire, and adjust their settings appropriately for their desired gaming experience.
Amorphous
NVIDIA Forums Administrator
Ohh, Nvidia's private fanboy army is here to reveal the truth...
Save our ignorant souls, so all money is spent on the only true company that never optimized their drivers. Oh wait?
On a more serious note - Original AF thread is here: http://www.xtremesystems.org/forums/...=261588&page=5
This one should be locked.
Last edited by Shadov; 11-20-2010 at 01:45 AM.
fail news
IQ on my 6850 with the new HQ setting is noticeably higher than on my old 5850
Core i7 2600k|HD 6950|8GB RipJawsX|2x 128gb Samsung SSD 830 Raid0|Asus Sabertooth P67
Seasonic X-560|Corsair 650D|2x WD Red 3TB Raid1|WD Green 3TB|Asus Xonar Essence STX
Core i3 2100|HD 7770|8GB RipJawsX|128gb Samsung SSD 830|Asrock Z77 Pro4-M
Bequiet! E9 400W|Fractal Design Arc Mini|3x Hitachi 7k1000.C|Asus Xonar DX
Dell Latitude E6410|Core i7 620m|8gb DDR3|WXGA+ Screen|Nvidia Quadro NVS3100
256gb Samsung PB22-J|Intel Wireless 6300|Sierra Aircard MC8781|WD Scorpio Blue 1TB
Harman Kardon HK1200|Vienna Acoustics Brandnew|AKG K240 Monitor 600ohm|Sony CDP 228ESD
AMD = FAIL!
so AMD cheats again
but it doesn't matter when it's AMD who cheats???
AMD has failed in so many ways recently
they are much worse than NVIDIA has ever been
Why can't they just admit they have also lost this round and move forward?
Intel i7 2600K 5GHZ Watercooled. 2x Asus DirectCU II TOP GTX670 SLI @1250/7000/Watercooled. Asus Maximus IV Extreme. PCI Express X-Fi Titanium Fatal1ty Champion Series.
8GB Corsair 2000Mhz Ram. 4x OCZ Vertex3 120GB SSD. .3xSamsung F1 1TB All in A Lian li Tyr PC-X2000 Chassi. Logitech diNovo Edge keybord
MX Revolution mouse and Z-5500 Digital 5.1 speakers Corsair HX-1200W PSU Samsung 244T 24"+ 3xPhilips 24Ļin nVidia Surround
I fail to see how they fail. They offer their users a higher quality setting and the possibility of higher performance if you don't notice any difference, yet people claim that AMD screws its customers?????
What they should do is make the HQ setting the default and offer the high-performance setting as an option, but your comment is still a mountain of fail.
Core i7 2600k|HD 6950|8GB RipJawsX|2x 128gb Samsung SSD 830 Raid0|Asus Sabertooth P67
Seasonic X-560|Corsair 650D|2x WD Red 3TB Raid1|WD Green 3TB|Asus Xonar Essence STX
Core i3 2100|HD 7770|8GB RipJawsX|128gb Samsung SSD 830|Asrock Z77 Pro4-M
Bequiet! E9 400W|Fractal Design Arc Mini|3x Hitachi 7k1000.C|Asus Xonar DX
Dell Latitude E6410|Core i7 620m|8gb DDR3|WXGA+ Screen|Nvidia Quadro NVS3100
256gb Samsung PB22-J|Intel Wireless 6300|Sierra Aircard MC8781|WD Scorpio Blue 1TB
Harman Kardon HK1200|Vienna Acoustics Brandnew|AKG K240 Monitor 600ohm|Sony CDP 228ESD
It's always the same story... fanboys immediately attack the other side and exaggerate. Sensible people wouldn't really believe what they are saying, so it just feels like they need to encourage themselves.
I hope this topic stirs up a lot of official debate and name-calling from both companies, so that both put IQ at the top of their priorities and stop worrying so much about FPS alone and releasing pointless technology like 3D.
Gigabyte Z77X-UD5H
G-Skill Ripjaws X 16Gb - 2133Mhz
Thermalright Ultra-120 eXtreme
i7 2600k @ 4.4Ghz
Sapphire 7970 OC 1.2Ghz
Mushkin Chronos Deluxe 128Gb
Well, for my part I don't expect people working for NVIDIA to be unbiased. Even unrelated people have biases, so if you work for them I think that comes with the job, no?
Anyway, if the new drivers lower the default IQ, then reviewers should be aware, and when Cayman is out they should perform all tests with a High Quality baseline.
Gigabyte Z77X-UD5H
G-Skill Ripjaws X 16Gb - 2133Mhz
Thermalright Ultra-120 eXtreme
i7 2600k @ 4.4Ghz
Sapphire 7970 OC 1.2Ghz
Mushkin Chronos Deluxe 128Gb
And the puppet masters start pulling the strings. The story is contradictory and configuration dependent; the point being that, as usual, ATI's image quality at default is better than it was on the 5870, just as the 5870's was better than the 4870's.
Both sides have IQ issues depending on the driver, OS, game, and API.
If anything, NVIDIA may be opening a can of worms for themselves here.
When I installed Cat 10.10e, the first thing I did was move the texture quality slider to High Quality, but I kept the Optimize Surface feature enabled, and I play all games with 16x AF and MLAA. I can't really complain about image quality because I don't see any reason to. I don't have anything against the NVIDIA settings I was familiar with in the past. Some optimizations exhibited a shimmering effect on textures, but other than that, if an optimization can give a significant boost and you can only notice the difference in side-by-side image comparisons, I think the optimization is well justified. But maybe both NVIDIA and AMD should release drivers with a big red button that says "TURN EVERYTHING MAXXXXXXXX!!!!!!!!11111" to make all the whiners happy.
Intel Core i7 920 4 GHz | 18 GB DDR3 1600 MHz | ASUS Rampage II Gene | GIGABYTE HD7950 3GB WindForce 3X | WD Caviar Black 2TB | Creative Sound Blaster Z | Altec Lansing MX5021 | Corsair HX750 | Lian Li PC-V354
Super silent cooling powered by (((Noiseblocker)))