Page 8 of 8
Results 176 to 188 of 188

Thread: NVIDIA Says AMD Reduced Image Quality Settings HD 6800 Series For Better Performance

  1. #176
    Registered User
    Join Date
    Feb 2010
    Location
    NVIDIA HQ
    Posts
    76
    Texture shimmering does not appear in screenshots. It requires movement to see.

    I don't really care who has better IQ, so long as there is enough quality. But I strongly believe that benchmarks should be run with similar IQ settings. Otherwise, what's the point of benching cards against each other?

    Quote Originally Posted by Lanek View Post
    No, no, the only example he showed is in ME, and there's no difference in quality.

    Nvidia on the left, AMD (Catalyst with the optimization enabled) on the right, set to Quality, not HQ. (The AA isn't applied the same way, which can be distracting, but we're talking about the AF optimization, not AA.)

    Global scene:

    [comparison screenshots not preserved]

    Then I zoomed in on the background and on specific areas (to check whether the AF optimization is applied far from the first scene):

    [zoomed comparison screenshots not preserved]

    So what's the proof in his article? The Trackmania screenshot from 3DCenter?
    NVIDIA Forums Administrator
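
On Amorphous's point above that shimmering never shows in screenshots: shimmering is temporal aliasing, so it only appears as frame-to-frame variation while the camera moves. Purely as an illustration of why a single capture can't show it (the capture step and any pass/fail threshold here are hypothetical, not from this thread or either vendor's tools), one rough way to quantify it is to difference two frames grabbed a tiny camera step apart; an aggressive filtering optimization tends to make this number spike on detailed ground textures even though each individual frame looks sharp:

```cpp
#include <cstddef>
#include <cstdint>
#include <cstdlib>
#include <vector>

// Mean absolute per-pixel difference between two same-sized RGB captures.
// A stable texture filter keeps this value small under sub-pixel camera
// motion; shimmering (temporal aliasing) makes it spike even though each
// single frame looks fine on its own.
double mean_frame_difference(const std::vector<std::uint8_t>& frame_a,
                             const std::vector<std::uint8_t>& frame_b)
{
    if (frame_a.size() != frame_b.size() || frame_a.empty())
        return 0.0;

    std::uint64_t total = 0;
    for (std::size_t i = 0; i < frame_a.size(); ++i)
        total += static_cast<std::uint64_t>(
            std::abs(static_cast<int>(frame_a[i]) - static_cast<int>(frame_b[i])));

    return static_cast<double>(total) / static_cast<double>(frame_a.size());
}
```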

  2. #177


    Quote Originally Posted by Amorphous View Post
    Texture shimmering does not appear in screenshots. It requires movement to see.

    I don't really care who has better IQ, so long as there is enough quality. But I strongly believe that benchmarks should be run with similar IQ settings. Otherwise, what's the point of benching cards against each other?
    NVIDIA Focus Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the Members.

    Tsk, tsk, only 75 posts so far, Amorphous. Does that mean no new, free card for Xmas?

    Seriously though, the whole topic is the pot calling the kettle black, and various optimizations can be demonstrated on both sides. As long as we don't need a magnifying glass to see the difference and objects don't disappear in-game (hello, 3DMark dragon), it's all fine. And guess what: those optimizations give us better performance in the scenes where the FPS counter is close to 30.

  3. #178
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    I find it funny how many of the people complaining probably don't even know or care whether they're using a TN panel.

    All along the watchtower the watchmen watch the eternal return.

  4. #179
    Quote Originally Posted by STEvil View Post
    I find it funny how many of the people complaining probably don't even know or care whether they're using a TN panel.
    True, and I just love NV users moaning about the need to "control AF slider settings" on AMD cards, while GeForce cards don't even have the same control option available.

    Anyway, that's all that can be said on this topic, I guess.

    STEvil, on a side note: I'm considering a new S-IPS monitor. Anything you would recommend?

  5. #180
    Xtreme Addict
    Join Date
    May 2007
    Location
    Europe/Slovenia/Ljubljana
    Posts
    1,540
    IPS or TN doesn't matter. It's not related to texture filtering, and I can get textures just as sharp on a four-year-old TN as on a brand-new IPS screen. Panel type is just about viewing angles and response times.

    As for image quality, I'm currently using High Quality mode with surface format optimization disabled, and 16x AF at all times. The textures are nice and sharp, with no shimmering from the optimizations, nothing. Just sexy textures.
    Intel Core i7 920 4 GHz | 18 GB DDR3 1600 MHz | ASUS Rampage II Gene | GIGABYTE HD7950 3GB WindForce 3X | WD Caviar Black 2TB | Creative Sound Blaster Z | Altec Lansing MX5021 | Corsair HX750 | Lian Li PC-V354
    Super silent cooling powered by (((Noiseblocker)))
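
For readers wondering what the 16x AF setting above corresponds to at the API level: applications can request anisotropic filtering per texture through the EXT_texture_filter_anisotropic extension, and the driver's Quality/High Quality slider then decides how faithfully that request is honoured. A minimal OpenGL sketch, assuming a current GL context that exposes the extension and a 2D texture already bound:

```cpp
#include <GL/gl.h>
#include <GL/glext.h>  // defines GL_TEXTURE_MAX_ANISOTROPY_EXT and friends

// Ask for up to 16x anisotropic filtering on the currently bound 2D texture.
// Assumes EXT_texture_filter_anisotropic is available in the current context.
void request_16x_af()
{
    GLfloat max_supported = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_supported);

    // Clamp the request to what the hardware reports (typically 16.0).
    const GLfloat requested = max_supported < 16.0f ? max_supported : 16.0f;
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, requested);
}
```

What the thread is really arguing about is what the driver does underneath this request at its default slider position, not the request itself.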

  6. #181
    Xtreme Member
    Join Date
    Dec 2006
    Location
    Edmonton,Alberta
    Posts
    182
    Quote Originally Posted by STEvil View Post
    Is one S-IPS and the other TN?
    One is an Acer AL2216W, the other a Samsung T220HD; I'm pretty sure the Samsung is TN.

  7. #182
    Xtreme Member
    Join Date
    May 2007
    Location
    Portland, OR.
    Posts
    326
    Sooo... all this hoopla over something you're only going to see if your face is literally an inch or two away from the screen. Wow... Nvidia really will whine about anything.
    - Autobot -
    Intel Q9400 (3.6Ghz) / Asus P5Q Deluxe / 8GB Corsair XMS2 DDR2-800
    MSI R6970 Afterburner 2GB / Asus Xonar DX
    Crucial M4 128GB SSD / Seagate 640GB SATA2 / Lite-On DVD/RW Litescribe SATA
    XFX XXX-Edition modular 650W PSU
    Enzotech / Swiftech / Koolance

  8. #183
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    Quote Originally Posted by Shadov View Post
    True, and I just love NV users moaning about the need to "control AF slider settings" on AMD cards, while GeForce cards don't even have the same control option available.

    Anyway, that's all that can be said on this topic, I guess.

    STEvil, on a side note: I'm considering a new S-IPS monitor. Anything you would recommend?
    Dunno, I've not looked into getting a new monitor for a while. I'm happy with my LG W3000H 30" and Toshiba XV648 46".

    Quote Originally Posted by RejZoR View Post
    IPS or TN doesn't matter. It's not related to texture filtering, and I can get textures just as sharp on a four-year-old TN as on a brand-new IPS screen. Panel type is just about viewing angles and response times.

    As for image quality, I'm currently using High Quality mode with surface format optimization disabled, and 16x AF at all times. The textures are nice and sharp, with no shimmering from the optimizations, nothing. Just sexy textures.
    That's not the point. A TN panel will show larger banding effects, which are far more annoying and noticeable than texture filtering differences, and that banding could make the filtering artifacts more or less visible.

    All along the watchtower the watchmen watch the eternal return.

  9. #184
    Xtreme Enthusiast
    Join Date
    Oct 2007
    Location
    London, UK
    Posts
    575
    In conclusion: Nvidia should just lower its default IQ to compensate for the performance disadvantage and be done with it.


    Quote Originally Posted by creidiki View Post
    We are a band of fearless modern-day alchemists who, for fun, run solutions through sophisticated, if overpriced, separator setups, and then complain when we succeed in separating said solution.

  10. #185
    Quote Originally Posted by eternal_fantasy View Post
    In conclusion: Nvidia should just lower its default IQ to compensate for the performance disadvantage and be done with it.
    Have you missed the memo that AMD did it to match Nvidia's default quality?

    Remember that AMD introduced a new angle-independent AF filtering algorithm with the HD 6000 series, but I'm not sure how it compares to Nvidia's (in quality and in the resources needed).
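
For context on what "angle-independent" means here: anisotropic filtering estimates how stretched a pixel's footprint is in texture space from the screen-space derivatives of the texture coordinates, then takes extra samples along the long axis. The sketch below is the common textbook approximation of that ratio, not AMD's or Nvidia's actual hardware logic (which is undisclosed); angle-dependent implementations approximate the footprint in ways that underestimate the ratio at certain surface orientations, which is what the "flower" patterns in AF tester tools visualize:

```cpp
#include <algorithm>
#include <cmath>

// Textbook-style anisotropy estimate from the screen-space derivatives of
// the texture coordinates, (du/dx, dv/dx) and (du/dy, dv/dy), in texels.
// Real GPUs use their own (undisclosed) approximations of this footprint.
float anisotropy_ratio(float dudx, float dvdx, float dudy, float dvdy,
                       float max_af = 16.0f)
{
    const float len_x = std::sqrt(dudx * dudx + dvdx * dvdx); // footprint along screen x
    const float len_y = std::sqrt(dudy * dudy + dvdy * dvdy); // footprint along screen y

    const float longer  = std::max(len_x, len_y);
    const float shorter = std::max(std::min(len_x, len_y), 1e-6f); // avoid divide-by-zero

    // Number of extra samples taken along the long axis, clamped to the AF cap.
    return std::min(longer / shorter, max_af);
}
```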

  11. #186
    Xtreme Enthusiast
    Join Date
    Oct 2007
    Location
    London, UK
    Posts
    575
    Quote Originally Posted by Shadov View Post
    Have you missed the memo that AMD did it to match Nvidia's default quality?
    Then I guess AMD overdid it... according to Nvidia's post-it.

    Which memo you get depends on which graphics card brand you buy. Fact.


    Quote Originally Posted by creidiki View Post
    We are a band of fearless modern-day alchemists who, for fun, run solutions through sophisticated, if overpriced, separator setups, and then complain when we succeed in separating said solution.

  12. #187
    Xtreme Member
    Join Date
    Sep 2009
    Location
    Portugal
    Posts
    233
    Quote Originally Posted by Shadov View Post
    Have you missed the memo that AMD did it to match Nvidia's default quality?

    Remember that AMD introduced a new angle-independent AF filtering algorithm with the HD 6000 series, but I'm not sure how it compares to Nvidia's (in quality and in the resources needed).
    From Guru3D's GTX 570 review:

    Speaking of AMD, the ATI graphics team at default driver settings applies an image quality optimization which can be seen, though very slightly and in certain conditions. It gives their cards ~8% extra performance. NVIDIA does not apply such a tweak and opts for better image quality. We hope to see that move from AMD/ATI soon as well.
    (http://www.guru3d.com/article/geforce-gtx-570-review/21)

  13. #188
    Xtreme Member
    Join Date
    Apr 2010
    Location
    Portugal
    Posts
    107
    Guru3D is one of the most biased websites ever made
    Don't take life too seriously.....no-one's getting out alive.
