
Thread: NVIDIA Says AMD Reduced Image Quality Settings on HD 6800 Series For Better Performance

  1. #1
    Xtreme Mentor
    Join Date
    Feb 2009
    Location
    Bangkok,Thailand (DamHot)
    Posts
    2,693

    NVIDIA Says AMD Reduced Image Quality Settings on HD 6800 Series For Better Performance

    NVIDIA Says AMD Reduced Image Quality Settings w/ Radeon HD 6800 Series For Better Performance

    Link http://www.legitreviews.com/news/9482/

    Source: Legit Reviews
    Intel Core i5 6600K + ASRock Z170 OC Formula + Galax HOF 4000 (8GBx2) + Antec 1200W OC Version
    EK SupremeHF + BlackIce GTX360 + Swiftech 655 + XSPC ResTop
    Macbook Pro 15" Late 2011 (i7 2760QM + HD 6770M)
    Samsung Galaxy Note 10.1 (2014) , Huawei Nexus 6P
    [history system]80286 80386 80486 Cyrix K5 Pentium133 Pentium II Duron1G Athlon1G E2180 E3300 E5300 E7200 E8200 E8400 E8500 E8600 Q9550 QX6800 X3-720BE i7-920 i3-530 i5-750 Semp140@x2 955BE X4-B55 Q6600 i5-2500K i7-2600K X4-B60 X6-1055T FX-8120 i7-4790K

  2. #2
    Xtreme Enthusiast
    Join Date
    Apr 2010
    Posts
    514
    This is real



    Unfortunately, the anisotropic filtering receives a poor grade in our detailed review. On the one hand, texture filtering on the Radeon HD 6000 generation has been improved (the banding problem has largely been fixed); on the other hand, the textures now flicker more intensely. That is because AMD has lowered the standard anisotropic filtering to the level of the previous generation's A.I. Advanced setting. An incomprehensible step in our view, because modern graphics cards deliver enough performance to improve image quality.

    While there are games that barely show the difference, others suffer badly from flickering textures that spoil the fun. At least it is (usually) possible to get the previous AF quality back with the "High Quality" setting. In other words, after manual switching the Radeon HD 6800 still delivers the quality of the previous generation, but the default quality is now worse!

    Since we will not support such practices in 2010, we have decided to test every Radeon HD 6000 card in the future with the roughly five percent slower High Quality setting, so that the final result is approximately comparable with Nvidia's default setting.

    http://www.computerbase.de/artikel/g...on-hd-6800/15/
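    As a rough illustration of the "about five percent slower" figure quoted above, here is a minimal sketch with hypothetical FPS numbers. It is not from the ComputerBase article; the site actually re-benchmarks every card at the High Quality setting rather than applying a numeric correction, and the helper name below is made up for this example.

    # Hypothetical sketch of the ~5% High Quality performance cost quoted above.
    # Assumes the cost applies uniformly; reviewers actually re-measure at the HQ setting.
    HQ_PERFORMANCE_COST = 0.05  # "about five percent slower" per the ComputerBase quote

    def estimate_hq_fps(default_fps: float, cost: float = HQ_PERFORMANCE_COST) -> float:
        """Estimate the High Quality frame rate from a default-setting measurement."""
        return default_fps * (1.0 - cost)

    # Hypothetical example: a card measured at 60 FPS with the default driver setting
    # would land at roughly 57 FPS with the High Quality setting.
    print(estimate_hq_fps(60.0))  # prints 57.0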

    Last edited by cold2010; 11-19-2010 at 09:48 PM.

  3. #3
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    Why don't they just test with quality options maxed out on both cards if "full quality" is so important?

    btw, I agree that stepping back on quality is bad.

    All along the watchtower the watchmen watch the eternal return.

  4. #4
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
    AI off = quality, I don't see the point. I guess it serves them right for adding in a higher setting that is not hidden like it always was. And NV does not start at the highest setting either, so I don't get the point.
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  5. #5
    Xtreme Enthusiast
    Join Date
    Jan 2007
    Location
    QLD
    Posts
    942
    Quote Originally Posted by zanzabar View Post
    AI off = quality, I don't see the point. I guess it serves them right for adding in a higher setting that is not hidden like it always was. And NV does not start at the highest setting either, so I don't get the point.
    The point is the amount of shimmer regardless of setting. But I agree, if you are too lazy or stupid to keep Catalyst and the Nvidia control panel constantly at HQ, you have no right to complain. The performance gain has always been too little to warrant the degradation in quality.

  6. #6
    Xtreme Enthusiast
    Join Date
    Apr 2010
    Posts
    514
    Quote Originally Posted by zanzabar View Post
    AI off = quality, I don't see the point. I guess it serves them right for adding in a higher setting that is not hidden like it always was. And NV does not start at the highest setting either, so I don't get the point.
    Sorry, very sorry. A.I. Quality is bad, very bad. Like it or not, A.I. Quality is bad. Good luck.
    Last edited by cold2010; 11-19-2010 at 11:32 PM.

  7. #7
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
    Quote Originally Posted by cold2010 View Post
    Sorry, very sorry. A.I. Quality is bad, very bad. Like it or not, A.I. Quality is bad. Good luck.
    I meant the settings are equal, not that A.I. off is good quality. Stop reading into things.


    Also, look at the screenshots from the cards that don't have them; they still look fine.
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  8. #8
    Xtreme Addict
    Join Date
    Apr 2007
    Location
    canada
    Posts
    1,886
    another misleading thread title


    Quote Originally Posted by cold2010 View Post
    Sorry, very sorry. A.I. Quality is bad, very bad. Like it or not, A.I. Quality is bad. Good luck.

    Other than saying "bad, very bad"... explain to us why it's bad...
    Last edited by Sn0wm@n; 11-20-2010 at 12:10 AM.
    WILL CUDDLE FOR FOOD

    Quote Originally Posted by JF-AMD View Post
    Dual proc client systems are like sex in high school. Everyone talks about it but nobody is really doing it.

  9. #9
    Xtreme Addict
    Join Date
    Feb 2005
    Location
    OZtralia
    Posts
    2,051
    Sounds like the 6*** series has NV worried
    lots and lots of cores and lots and lots of tuners,HTPC's boards,cases,HDD's,vga's,DDR1&2&3 etc etc all powered by Corsair PSU's

  10. #10
    Do we need another thread on this?

    Original: http://www.xtremesystems.org/forums/...=261588&page=5

    Furthermore, it has been shown that it depends mainly on the games used; AF quality can be higher on the HD 6000 series in newer titles and lower on NV.

  11. #11
    Registered User
    Join Date
    Feb 2010
    Location
    NVIDIA HQ
    Posts
    76
    Starting with Catalyst 10.10 (and also including 10.11), the IQ is significantly reduced compared to previous ATI driver releases. The IQ reduction only affects HD 5800 series and up GPUs, including the HD 6800 series. This reduction gives a significant performance increase to the affected AMD GPUs. For an apples-to-apples comparison against NVIDIA GPUs, either NVIDIA's IQ settings need to be dropped or, ideally, AMD's need to be raised. Even raised, AMD's IQ cannot seem to match NVIDIA's default IQ.

    Only video can properly illustrate the quality difference, but it's discernible: http://www.tweakpc.de/hardware/tests...d_6850/s09.php The videos are split-frame, with the left side depicting what a GPU produces at a specific setting and the right side showing what should be generated.

    While NVIDIA has posted about this on their blog (http://blogs.nvidia.com/ntersect/201...e-quality.html), it's not their work; it's the findings of several major European tech sites.


    Amorphous
    Last edited by Amorphous; 11-20-2010 at 01:11 AM.
    NVIDIA Forums Administrator

  12. #12
    Xtreme Addict
    Join Date
    May 2007
    Location
    Europe/Slovenia/Ljubljana
    Posts
    1,540
    Oh for god's sake, not this again... The reality is, if no one had pointed this out, no one would even notice it, so why make a big deal out of something 99% of people won't even notice? If you're such a purist, you already know what CCC is and how to adjust things to turn all optimizations OFF. So do that and stop complaining. Optimizations you cannot circumvent are one thing; optimizations you can circumvent are another. In this case you can, so do that. Facepalm.
    Last edited by RejZoR; 11-20-2010 at 01:21 AM.
    Intel Core i7 920 4 GHz | 18 GB DDR3 1600 MHz | ASUS Rampage II Gene | GIGABYTE HD7950 3GB WindForce 3X | WD Caviar Black 2TB | Creative Sound Blaster Z | Altec Lansing MX5021 | Corsair HX750 | Lian Li PC-V354
    Super silent cooling powered by (((Noiseblocker)))

  13. #13
    Xtreme Addict
    Join Date
    Nov 2007
    Posts
    1,195
    Nvidia did this way before ATI, so now that it hurts, suddenly IQ becomes important? And lowering IQ is unethical? lol
    Quote Originally Posted by LesGrossman View Post
    So for the last 3 months Nvidia talked about Uniengine and then Uniengine and more Uniengine and finally Uniengine. And then takes the best 5 seconds from all the benchmark run, makes a graph and then proudly shows it everywhere.

  14. #14
    Registered User
    Join Date
    Feb 2010
    Location
    NVIDIA HQ
    Posts
    76
    Look at the videos and tell me you wouldn't notice the difference between HD 6870 and GTX 470's IQ. It'll be extremely obvious in every title. Even cranked up, the HD 6800 doesn't compare to the GTX 400's default setting.

    AMD reducing the default IQ means benchmarkers are going to need to adjust their testing procedures to generate an apples-to-apples result, or it's no longer a remotely fair comparison. Might as well benchmark with widely different AA settings.

    Users can and should make their own determination about what level of IQ they desire, and adjust their settings appropriately for their desired gaming experience.


    Amorphous

    Quote Originally Posted by RejZoR View Post
    Oh for god's sake, not this again... The reality is, if no one had pointed this out, no one would even notice it, so why make a big deal out of something 99% of people won't even notice? If you're such a purist, you already know what CCC is and how to adjust things to turn all optimizations OFF. So do that and stop complaining. Optimizations you cannot circumvent are one thing; optimizations you can circumvent are another. In this case you can, so do that. Facepalm.
    NVIDIA Forums Administrator

  15. #15


    Quote Originally Posted by Amorphous View Post
    Look at the videos and tell me you wouldn't notice the difference between HD 6870 and GTX 470's IQ. It'll be extremely obvious in every title. Even cranked up, the HD 6800 doesn't compare to the GTX 400's default setting.

    AMD reducing the default IQ means benchmarkers are going to need to adjust their testing procedures to generate an apples-to-apples result, or it's no longer a remotely fair comparison. Might as well benchmark with widely different AA settings.

    Users can and should make their own determination about what level of IQ they desire, and adjust their settings appropriately for their desired gaming experience.


    Amorphous
    Ohh, Nvidia's private fanboy army is here to reveal the truth...

    Save our ignorant souls, so all money is spent on the one true company that never optimized their drivers. Oh wait?

    On a more serious note - Original AF thread is here: http://www.xtremesystems.org/forums/...=261588&page=5

    This one should be locked.
    Last edited by Shadov; 11-20-2010 at 01:45 AM.

  16. #16
    Xtreme Addict
    Join Date
    Feb 2006
    Location
    Vienna, Austria
    Posts
    1,940
    fail news

    IQ on my 6850 with the new HQ setting is noticeably higher than on my old 5850
    Core i7 2600k|HD 6950|8GB RipJawsX|2x 128gb Samsung SSD 830 Raid0|Asus Sabertooth P67
    Seasonic X-560|Corsair 650D|2x WD Red 3TB Raid1|WD Green 3TB|Asus Xonar Essence STX


    Core i3 2100|HD 7770|8GB RipJawsX|128gb Samsung SSD 830|Asrock Z77 Pro4-M
    Bequiet! E9 400W|Fractal Design Arc Mini|3x Hitachi 7k1000.C|Asus Xonar DX


    Dell Latitude E6410|Core i7 620m|8gb DDR3|WXGA+ Screen|Nvidia Quadro NVS3100
    256gb Samsung PB22-J|Intel Wireless 6300|Sierra Aircard MC8781|WD Scorpio Blue 1TB


    Harman Kardon HK1200|Vienna Acoustics Brandnew|AKG K240 Monitor 600ohm|Sony CDP 228ESD

  17. #17
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    112
    AMD = FAIL!
    So AMD cheats again,
    but it doesn't matter when it's AMD who cheats???
    AMD has failed in so many ways recently;
    they are much worse than nVidia has ever been.

    Why can't they just admit they have also lost this round and move forward?
    Intel i7 2600K 5GHZ Watercooled. 2x Asus DirectCU II TOP GTX670 SLI @1250/7000/Watercooled. Asus Maximus IV Extreme. PCI Express X-Fi Titanium Fatal1ty Champion Series.
    8GB Corsair 2000Mhz Ram. 4x OCZ Vertex3 120GB SSD. .3xSamsung F1 1TB All in A Lian li Tyr PC-X2000 Chassi. Logitech diNovo Edge keybord
    MX Revolution mouse and Z-5500 Digital 5.1 speakers Corsair HX-1200W PSU Samsung 244T 24" + 3x Philips 24" in nVidia Surround

  18. #18
    Xtreme Addict
    Join Date
    Feb 2006
    Location
    Vienna, Austria
    Posts
    1,940
    Quote Originally Posted by E30M3 View Post
    AMD = FAIL!
    So AMD cheats again,
    but it doesn't matter when it's AMD who cheats???
    AMD has failed in so many ways recently;
    they are much worse than nVidia has ever been.

    Why can't they just admit they have also lost this round and move forward?
    I fail to see how they fail: they offer their users a higher quality setting and the possibility of higher performance if you don't notice any difference, yet people claim that AMD screws its customers?????

    What they should do is make the HQ setting the default and offer the high-performance setting as an option, but your comment is still a mountain of fail.
    Core i7 2600k|HD 6950|8GB RipJawsX|2x 128gb Samsung SSD 830 Raid0|Asus Sabertooth P67
    Seasonic X-560|Corsair 650D|2x WD Red 3TB Raid1|WD Green 3TB|Asus Xonar Essence STX


    Core i3 2100|HD 7770|8GB RipJawsX|128gb Samsung SSD 830|Asrock Z77 Pro4-M
    Bequiet! E9 400W|Fractal Design Arc Mini|3x Hitachi 7k1000.C|Asus Xonar DX


    Dell Latitude E6410|Core i7 620m|8gb DDR3|WXGA+ Screen|Nvidia Quadro NVS3100
    256gb Samsung PB22-J|Intel Wireless 6300|Sierra Aircard MC8781|WD Scorpio Blue 1TB


    Harman Kardon HK1200|Vienna Acoustics Brandnew|AKG K240 Monitor 600ohm|Sony CDP 228ESD

  19. #19
    Xtreme Member
    Join Date
    Sep 2009
    Location
    London
    Posts
    247
    It's always the same story... Fanboys immediately attack the other side and exaggerate. Sensible people wouldn't really believe what they are saying, so it just feels like they need to encourage themselves.

  20. #20
    Xtreme Addict
    Join Date
    Mar 2005
    Location
    Rotterdam
    Posts
    1,553
    I hope this topic stirs up a lot of official debate and name calling from both companies, so that both put IQ at the top of their priorities and stop worrying so much about only FPS and releasing pointless technology like 3D.
    Gigabyte Z77X-UD5H
    G-Skill Ripjaws X 16Gb - 2133Mhz
    Thermalright Ultra-120 eXtreme
    i7 2600k @ 4.4Ghz
    Sapphire 7970 OC 1.2Ghz
    Mushkin Chronos Deluxe 128Gb

  21. #21
    Xtreme Addict
    Join Date
    Jul 2006
    Location
    Between Sky and Earth
    Posts
    2,035
    Quote Originally Posted by generics_user View Post
    I fail to see how they fail: they offer their users a higher quality setting and the possibility of higher performance if you don't notice any difference, yet people claim that AMD screws its customers?????

    What they should do is make the HQ setting the default and offer the high-performance setting as an option, but your comment is still a mountain of fail.
    Check his signature...

  22. #22
    Xtreme Addict
    Join Date
    Apr 2007
    Location
    canada
    Posts
    1,886
    Quote Originally Posted by Amorphous View Post
    Starting with Catalyst 10.10 (and also including 10.11), the IQ is significantly reduced compared to previous ATI driver releases. The IQ reduction only affects HD 5800 series and up GPUs, including the HD 6800 series. This reduction gives a significant performance increase to the affected AMD GPUs. For an apples-to-apples comparison against NVIDIA GPUs, either NVIDIA's IQ settings need to be dropped or, ideally, AMD's need to be raised. Even raised, AMD's IQ cannot seem to match NVIDIA's default IQ.

    Only video can properly illustrate the quality difference, but it's discernible: http://www.tweakpc.de/hardware/tests...d_6850/s09.php The videos are split-frame, with the left side depicting what a GPU produces at a specific setting and the right side showing what should be generated.

    While NVIDIA has posted about this on their blog (http://blogs.nvidia.com/ntersect/201...e-quality.html), it's not their work; it's the findings of several major European tech sites.


    Amorphous


    What's that in your sig??? Is it bias I see???
    WILL CUDDLE FOR FOOD

    Quote Originally Posted by JF-AMD View Post
    Dual proc client systems are like sex in high school. Everyone talks about it but nobody is really doing it.

  23. #23
    Xtreme Addict
    Join Date
    Mar 2005
    Location
    Rotterdam
    Posts
    1,553
    Quote Originally Posted by Sn0wm@n View Post
    What's that in your sig??? Is it bias I see???
    Well, for my part I don't expect people working for Nvidia to be unbiased. Even unrelated people have bias, so if you work for them I think that comes with the job, no?

    Anyway, if the new drivers lower the default IQ, then reviewers should be aware of it, and when Cayman is out they should perform all tests at a high-quality baseline.
    Gigabyte Z77X-UD5H
    G-Skill Ripjaws X 16Gb - 2133Mhz
    Thermalright Ultra-120 eXtreme
    i7 2600k @ 4.4Ghz
    Sapphire 7970 OC 1.2Ghz
    Mushkin Chronos Deluxe 128Gb

  24. #24
    Xtreme Member
    Join Date
    Jan 2007
    Posts
    211
    And the puppet masters start pulling the strings. The story is contradictory and configuration dependent; the point being that, as usual, ATI's image quality at default is better than it was on the 5870, just as the 5870's was better than the 4870's.

    Both sides have IQ issues depending on driver, OS, game, and API.

    Nvidia may be opening a can of worms for themselves here, if anything.

  25. #25
    Xtreme Addict
    Join Date
    May 2007
    Location
    Europe/Slovenia/Ljubljana
    Posts
    1,540
    When I installed Cat 10.10e, the first thing I did was move the texture quality slider to High Quality, but I kept the Optimize surface feature enabled, and I play all games with 16x AF and MLAA. I can't really complain about image quality because I don't see any reason to. I don't have anything against the NVIDIA settings I was familiar with in the past. Some optimizations exhibited a shimmering effect on textures, but other than that, if an optimization gives a significant boost and you can only notice the difference in side-by-side image comparisons, I think the optimization is well justified. But maybe both NVIDIA and AMD should release drivers with a big red button that says "TURN EVERYTHING MAXXXXXXXX!!!!!!!!11111" to make all the whiners happy.
    Intel Core i7 920 4 GHz | 18 GB DDR3 1600 MHz | ASUS Rampage II Gene | GIGABYTE HD7950 3GB WindForce 3X | WD Caviar Black 2TB | Creative Sound Blaster Z | Altec Lansing MX5021 | Corsair HX750 | Lian Li PC-V354
    Super silent cooling powered by (((Noiseblocker)))
