
Thread: NVIDIA Says AMD Reduced Image Quality Settings HD 6800 Series For Better Performance

  1. #151
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    Quote Originally Posted by zalbard View Post
    I think an article showing the performance and quality difference would be a nice start, and quite necessary to form an opinion in this case.

    Quote Originally Posted by god_43 View Post
    OMG, why does this matter so much? This thread has gone on for six pages... about quality settings?
    You can't make a judgment based on benchmarks alone.

    Quote Originally Posted by Chickenfeed View Post
    I can honestly say that I'm noticing less texture shimmering on a GTX 580 with texture quality set to high quality compared to my previous HD 5870 with Catalyst AI set to high quality (10.10e hotfix). I'd still like to see how the 5800s compare to the 6800s when they both have access to these newer Catalyst AI options.
    Did they check the texture sharpness in the comparison? When I follow the Rage3D test regarding the 6800 series and AF, I notice that the shimmering disappears (on my 5870) when I increase the LoD bias to a positive value.
    If you bump the LoD to +0.65, the shimmer disappears; on NV cards, nudge the LoD to -0.65 and it appears.
    I haven't tried this myself, but here is the link to this discussion.
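    For anyone wondering why nudging the LoD bias changes the shimmer, here is a minimal sketch of the standard mip-LOD rule (level = log2(texel footprint) + bias) in Python; the footprint and mip-count values are made-up numbers for illustration, not anything read from the drivers:

        import math

        def mip_level(texels_per_pixel, lod_bias, num_levels):
            # Approximate mip level the hardware selects for one pixel:
            # level = log2(texel footprint) + LoD bias, clamped to the mip chain.
            lod = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
            return min(max(lod, 0.0), num_levels - 1)

        # Hypothetical distant floor texture covering ~3 texels per pixel,
        # sampled from a texture with 12 mip levels.
        for bias in (-0.65, 0.0, +0.65):
            print("bias %+.2f -> mip %.2f" % (bias, mip_level(3.0, bias, 12)))

        # A positive bias pushes sampling to a coarser (blurrier) mip, which hides
        # under-filtering shimmer; a negative bias picks a sharper mip and brings it back.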

    I also saw this posted today.

    Last edited by BababooeyHTJ; 12-01-2010 at 02:05 PM.

  2. #152
    Xtreme Addict
    Join Date
    Jul 2007
    Posts
    1,488
    Quote Originally Posted by SKYMTL View Post
    But the question remains: should reviewers use the HQ setting in their articles?

    I won't venture my opinion just yet since I want to hear what you guys have to say.
    I have always enabled the best quality options on either brand. Any reviewer who tests high-quality performance, even just for a section of the review, and does a quality comparison would surely receive more traffic from me.

  3. #153
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    Quote Originally Posted by SKYMTL View Post
    But the question remains: should reviewers use the HQ setting in their articles?

    I won't venture my opinion just yet since I want to hear what you guys have to say.
    I say default, but with a disclaimer.
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  4. #154
    Xtreme Addict
    Join Date
    Jul 2007
    Location
    Canada
    Posts
    1,397
    Quote Originally Posted by god_43 View Post
    OMG, why does this matter so much? This thread has gone on for six pages... about quality settings?
    And what else is the point of looking for a better video card? If we didn't care about image quality, we'd still be on ancient hardware. But in the end, the whole reason we even care about benchmarks is because we want to know: "How smooth and how good-looking can I make my games?"
    i7 2600K | ASUS Maximus IV GENE-Z | GTX Titan | Corsair DDR3-2133

  5. #155
    Xtreme Mentor
    Join Date
    Jun 2008
    Location
    France - Bx
    Posts
    2,601
    Quote Originally Posted by SKYMTL View Post
    But the question remains: should reviewers use the HQ setting in their articles?

    I won't venture my opinion just yet since I want to hear what you guys have to say.
    Yes.

    Here in France, some are doing reviews in HQ mode, and I think it's a good move.

    AMD lowering the default quality settings is really a bad thing.

  6. #156
    Xtreme Member
    Join Date
    Nov 2006
    Location
    Brazil
    Posts
    257
    Quote Originally Posted by SKYMTL View Post
    But the question remains: should reviewers use the HQ setting in their articles?

    I won't venture my opinion just yet since I want to hear what you guys have to say.
    Yes, reviewers should use HQ settings, but with transparency anti-aliasing off.

  7. #157


    Quote Originally Posted by Olivon View Post
    Yes.

    Here in France, some are doing reviews in HQ mode, and I think it's a good move.

    AMD lowering the default quality settings is really a bad thing.
    Have you seen any visual difference between ATI and Nvidia in the games that you play?

    Can you control AF quality on Nvidia cards?

    Are you sure the AF quality is the same on Nvidia and ATI?

    Do you know that HQ AF on AMD disables ANY driver optimization (even the ones with no visible difference)?

    That is why reviewers should use comparable settings on both vendors and not crank up the settings on the one brand (Radeon) that allows control of the AF filters, while the other (GeForce) gets away with optimizations enabled.

  8. #158
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Quote Originally Posted by Shadov View Post
    Are you sure the AF quality is the same on Nvidia and ATI?

    Do you know that HQ AF on AMD disables ANY driver optimization (even the ones with no visible difference)?

    That is why reviewers should use comparable settings on both vendors and not crank up the settings on the one brand (Radeon) that allows control of the AF filters, while the other (GeForce) gets away with optimizations enabled.
    This is why we need some testing done, with performance numbers and *.bmp screenshots.
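    In that spirit, here is a minimal sketch of the kind of screenshot comparison being asked for, assuming Pillow and NumPy are available; the *.bmp file names are placeholders, not real captures:

        import numpy as np
        from PIL import Image, ImageChops

        def compare_screens(path_a, path_b, threshold=8):
            # Load two same-sized captures and take the per-pixel absolute difference.
            a = Image.open(path_a).convert("RGB")
            b = Image.open(path_b).convert("RGB")
            diff = np.asarray(ImageChops.difference(a, b))

            mean_err = diff.mean()                                 # average per-channel error
            changed = (diff.max(axis=2) > threshold).mean() * 100  # % of clearly different pixels
            print("mean abs diff: %.2f, pixels differing by >%d: %.1f%%"
                  % (mean_err, threshold, changed))

            # Save an amplified difference map so subtle filtering changes become visible.
            amplified = np.clip(diff.astype(np.int32) * 8, 0, 255).astype(np.uint8)
            Image.fromarray(amplified).save("diff.png")

        # compare_screens("radeon_quality.bmp", "radeon_hq.bmp")  # hypothetical file names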
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  9. #159
    Xtreme Mentor
    Join Date
    Jun 2008
    Location
    France - Bx
    Posts
    2,601
    Quote Originally Posted by Shadov View Post
    Have you seen any visual difference between ATI and Nvidia in the games that you play?

    Can you control AF quality on Nvidia cards?

    Are you sure the AF quality is the same on Nvidia and ATI?

    Do you know that HQ AF on AMD disables ANY driver optimization (even the ones with no visible difference)?

    That is why reviewers should use comparable settings on both vendors and not crank up the settings on the one brand (Radeon) that allows control of the AF filters, while the other (GeForce) gets away with optimizations enabled.
    Just read the German articles, Shadov... AMD made a mistake, nothing more to say.

  10. #160


    Quote Originally Posted by Olivon View Post
    Just read the German articles, Shadov... AMD made a mistake, nothing more to say.
    I wouldn't base my opinion on one article where the editor has skipped other recent games that would prove otherwise.

    Believe me, both AMD and Nvidia have optimizations in their drivers, and it mostly depends on which games and scenes you choose to show.

    But now we have a situation where AMD gives users control of the filter, while Nvidia can optimize and tweak their drivers until someone screams bloody murder again. Of course, that is if they (key word coming) *notice*.

    Edit: To make it clearer, reviewers should ask for a clear statement on which AMD vs. NV modes to use when comparing performance; otherwise, in the near future AMD could remove the AF quality filtering settings from its drivers to gain the same level of control over applications (and performance) as Nvidia, where that control is currently hidden away from the end user.
    Last edited by Shadov; 12-02-2010 at 05:16 AM.

  11. #161
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875

    Exploring ATI Image Quality Optimizations

    http://www.guru3d.com/article/explor...optimizations/
    By: Hilbert Hagedoorn | Edited by Editor | Published: December 3, 2010

  12. #162
    Xtreme Enthusiast
    Join Date
    Apr 2010
    Posts
    514
    Exploring ATI Image Quality Optimizations
    What is this? "Exploring," yet it shows the image quality of only one game? Sorry, Hilbert, but...

  13. #163
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Quote Originally Posted by Final8ty View Post
    http://www.guru3d.com/article/explor...optimizations/
    By: Hilbert Hagedoorn | Edited by Editor | Published: December 3, 2010
    OK, interesting article.
    I've been staring at these ME2 screenshots to no avail.
    My verdict is:
    This optimisation is not a big deal... 5% performance traded for an almost-impossible-to-see difference in picture quality (at least in actual gaming scenarios). So I may as well leave this option on for a tiny bit of extra FPS, since I will never notice the difference anyway (as long as I do not stay still and stare at the screen for ages instead of actually playing the game).
    On a side note, Nvidia's implementation still looks better... The gradient transition is very smooth, while for AMD it goes from sharp to blurry with quite a noticeable edge.
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  14. #164
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    No, no; the only example he showed is from ME2, where there's no difference in quality...

    Nvidia on the left, and AMD + Catalyst with the optimization on the right... (set to Quality, not HQ). (The AA is not applied the same, which can be distracting, but we are talking about the AF optimization, not AA.)

    Global scene...

    Then I zoomed into the background and onto specific areas (to see whether the AF optimization was applied far from the first scene)...

    So what is the proof in his article? The Trackmania screenshot from 3DCenter?
    Last edited by Lanek; 12-02-2010 at 06:59 AM.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  15. #165
    Xtreme Enthusiast
    Join Date
    Jan 2010
    Posts
    533
    It's more like 10% across the board, and the question isn't whether people should leave it on, but what the reviewers should do.

  16. #166
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Stick to defaults? That would be the Quality setting. A lot of people will use the default setting, plus it is nearly impossible to spot the difference in image quality...
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  17. #167
    Xtreme Enthusiast
    Join Date
    Oct 2008
    Posts
    678
    So the difference in quality isn't noticeable, for a performance gain that isn't noticeable. They get a few reviews showing better numbers, while they get some fuss about lower quality from others.

    In the end, it all seems so unnecessary. There is really no gain for AMD; it's all just bad publicity in exchange for better numbers that people don't trust.

  18. #168
    Xtreme Member
    Join Date
    Nov 2008
    Location
    London
    Posts
    300
    It would also be unfair to have AMD use the high quality setting in reviews, as this disables optimisations altogether, and as Shadov said, Nvidia would still be able to use their optimisations.

    I lose quite a bit of FPS in some games by disabling Catalyst AI on my 4890, but who would ever disable Catalyst AI? Even reviewers wouldn't. That is the equivalent of what is happening here. But I guess the one difference is that AMD has added extra optimisations to the 68xx compared to the 4xxx and 5xxx series, which is another unfair part.

    There should be an open standard of compulsory (undetectable but FPS-gaining) optimisations.

    Edit: Wait, how would I go about adjusting the LoD bias for my card? I want standard+ quality; I don't care about the pathetic % of FPS loss.
    Last edited by Oztopher; 12-02-2010 at 07:26 AM.
    -
    Core i7 860 @ 3.80GHz, 1.28v | GA-P55A-UD4 | G.Skill Ripjaw 4GB DDR3 @ 1900MHz 7-9-8-24 1N, 1.57v | HIS HD 6950 2GB, 1536sp @ 900/1400, 1.10v | Samsung F3 500GB | Thermaltake 750W | Windows 7 64bit | Air

    Crunching away...

  19. #169
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    Quote Originally Posted by -Boris- View Post
    So the difference in quality isn't noticeable, for a performance gain that isn't noticeable. They get a few reviews showing better numbers, while they get some fuss about lower quality from others.

    In the end, it all seems so unnecessary. There is really no gain for AMD; it's all just bad publicity in exchange for better numbers that people don't trust.
    It is in some games, but at the same time the 3DCenter article misleads a lot: using old games like Trackmania (a game they use to show that there was already a problem with ATI's algorithm three years ago), mixing in FP16 demotion, which only applies in some games (DoW, Oblivion...), and putting all of this together as proof that AMD has greatly decreased image quality in the latest driver is a bit strange... Why not use BC2, Dirt 2... well, the games that are actually tested in reviews now?

    The same goes for the FPS test: you need to use an old driver (10.9) at the standard setting, then the latest driver set to Quality, and see how big the difference is. If you just test by moving the slider on the latest driver, you can't claim AMD has changed anything...

    If you disable all optimizations in the Nvidia driver and bench a game, you will see the same performance loss.
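    Just to illustrate that point, here is a small sketch of the two different comparisons, with made-up FPS figures standing in for real benchmark runs: the honest test is old-driver default vs. new-driver Quality, not just moving the slider within the new driver.

        # Hypothetical FPS numbers, purely for illustration.
        runs = {
            ("10.9",  "default"): 60.0,
            ("10.10", "quality"): 63.0,
            ("10.10", "hq"):      57.0,
        }

        def delta(a, b):
            # Relative FPS change going from configuration a to configuration b.
            return (runs[b] - runs[a]) / runs[a] * 100.0

        # Cross-driver comparison (old default vs. new Quality):
        print("10.9 default -> 10.10 Quality: %+.1f%%" % delta(("10.9", "default"), ("10.10", "quality")))
        # Slider-only comparison on the new driver:
        print("10.10 HQ -> 10.10 Quality: %+.1f%%" % delta(("10.10", "hq"), ("10.10", "quality")))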
    Last edited by Lanek; 12-02-2010 at 07:40 AM.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  20. #170
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    Forum lag double post.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  21. #171
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by Final8ty View Post
    http://www.guru3d.com/article/explor...optimizations/
    By: Hilbert Hagedoorn | Edited by Editor | Published: December 3, 2010
    I don't understand.

    They showed images from Mass Effect 2 and the 3DCenter test, and then ran benchmarks in Dirt and Far Cry 2? What about all the other games they use?

    I mean, of COURSE going from the default setting to HQ will impact performance...

  22. #172
    Xtreme Member
    Join Date
    Dec 2006
    Location
    Edmonton,Alberta
    Posts
    182
    Quote Originally Posted by SKYMTL View Post
    But the question remains: should reviewers use the HQ setting in their articles?

    I won't venture my opinion just yet since I want to hear what you guys have to say.
    This has been going on for about a year now, and for the longest time I couldn't see much of a difference in quality.

    That was until I got the idea that, instead of using my main monitor (an Acer 22"), I would try my secondary monitor (a Samsung 22"); then I could start seeing some of the differences.

    For me, gaming on my Acer, I probably wouldn't notice unless something was totally FUBAR.

    What I'm saying is that whether the quality differences show up depends on the monitor: what you see on a 30" Dell I might not see on my old 22" Acer.

  23. #173
    Xtreme Addict
    Join Date
    Apr 2007
    Location
    canada
    Posts
    1,886
    Quote Originally Posted by Olivon View Post
    Just read the German articles, Shadov... AMD made a mistake, nothing more to say.

    And what is this mistake????
    WILL CUDDLE FOR FOOD

    Quote Originally Posted by JF-AMD View Post
    Dual proc client systems are like sex in high school. Everyone talks about it but nobody is really doing it.

  24. #174
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    Quote Originally Posted by AMDDeathstar View Post
    This has been going on for about a year now, and for the longest time I couldn't see much of a difference in quality.

    That was until I got the idea that, instead of using my main monitor (an Acer 22"), I would try my secondary monitor (a Samsung 22"); then I could start seeing some of the differences.

    For me, gaming on my Acer, I probably wouldn't notice unless something was totally FUBAR.

    What I'm saying is that whether the quality differences show up depends on the monitor: what you see on a 30" Dell I might not see on my old 22" Acer.

    Is one S-IPS and the other TN?

    All along the watchtower the watchmen watch the eternal return.

  25. #175
    Xtreme Addict
    Join Date
    Feb 2006
    Location
    Vienna, Austria
    Posts
    1,940
    @SKYMTL:

    Once this setting becomes available on the 5xxx series, you have to do a comparison between the old driver with Catalyst AI on and the new ones with defaults + HQ.

    I bet that HQ is going to perform worse than the old AI on, and default is going to perform better than AI on.
    Core i7 2600k|HD 6950|8GB RipJawsX|2x 128gb Samsung SSD 830 Raid0|Asus Sabertooth P67
    Seasonic X-560|Corsair 650D|2x WD Red 3TB Raid1|WD Green 3TB|Asus Xonar Essence STX


    Core i3 2100|HD 7770|8GB RipJawsX|128gb Samsung SSD 830|Asrock Z77 Pro4-M
    Bequiet! E9 400W|Fractal Design Arc Mini|3x Hitachi 7k1000.C|Asus Xonar DX


    Dell Latitude E6410|Core i7 620m|8gb DDR3|WXGA+ Screen|Nvidia Quadro NVS3100
    256gb Samsung PB22-J|Intel Wireless 6300|Sierra Aircard MC8781|WD Scorpio Blue 1TB


    Harman Kardon HK1200|Vienna Acoustics Brandnew|AKG K240 Monitor 600ohm|Sony CDP 228ESD
