
Thread: AMD's AF fix proven false

  1. #51
    Xtreme Addict
    Join Date
    Oct 2007
    Location
    Chicago, Illinois
    Posts
    1,182
    When buying high end hardware I want the best image quality, and I should not have to adjust settings for a better picture. These cards are not fast enough to deliver a good experience without sacrificing image quality.



  2. #52
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    Quote Originally Posted by Carfax View Post
    Nvidia has always had a way to disable the optimizations fully though. It's only now that AMD is allowing its customers the same option.
    The Cat AI slider has existed for what, 3-4+ years now (and before that the system was different). To use the specific high optimization you need to set Cat AI to "Advanced"; I don't know anyone who uses it, everyone leaves Cat AI on "Standard" or disables it completely...

    All reviews and ATI users disable it or use it on "Standard"...
    Last edited by Lanek; 11-01-2010 at 09:18 AM.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  3. #53
    Xtreme Member
    Join Date
    Oct 2010
    Location
    192.168.1.1
    Posts
    221
    Quote Originally Posted by Dimitriman View Post
    I thought all the reviews used custom image quality settings? So how do the driver's standard settings have any relevance to final performance?
    AFAIK most reviewers just use default IQ settings in their reviews.

  4. #54
    YouTube Addict
    Join Date
    Aug 2005
    Location
    Klaatu barada nikto
    Posts
    17,574
    Not a hardware problem, it is a driver problem.

    Ultimately, something that can be fixed quickly but at virtually no gain to the gamer.
    Fast computers breed slow, lazy programmers
    The price of reliability is the pursuit of the utmost simplicity. It is a price which the very rich find most hard to pay.
    http://www.lighterra.com/papers/modernmicroprocessors/
    Modern Ram, makes an old overclocker miss BH-5 and the fun it was

  5. #55
    Xtremely Retired OC'er
    Join Date
    Dec 2006
    Posts
    1,084
    Quote Originally Posted by Hell Hound View Post
    When buying high end hardware I want the best image quality, and I should not have to adjust settings for a better picture. These cards are not fast enough to deliver a good experience without sacrificing image quality.
    That's what people are paying for.

    The whole net says there is something wrong.
    People expect high end and get...

  6. #56
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by Rai View Post
    Image quality is highly subjective. The last time I noticed anything was Crysis looking better on my x1900 xtx vs my 8800GT. Speed went up going to the 8800GT, but image quality went way down. In switching from the 5870 to the 470 gtx to the gtx 460 I haven't noticed anything in terms of image quality, but I don't play Crysis anymore either.
    IQ is only subjective if you look at it without any methodology; aliasing and artifacts can be quantified fairly well. What looks "good" is definitely a different story though. For example, one of the reasons Crysis looks good is because it is set on a lush tropical island, which has more appeal than a desert.

    Oh, and Crysis is a poor game to judge IQ with. NV and ATI frequently optimize drivers for big titles like Crysis. That's why an 8000 series card could have inferior image quality in Crysis but still be superior in other games.
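
    As a rough illustration of what "quantified" can mean here, below is a minimal sketch (Python; the file names are hypothetical examples, not anyone's actual test setup) that scores the difference between two captures of the same frame, e.g. one per vendor or per driver setting, with a simple metric like PSNR. It won't say which image looks better, but it does put a number on how far apart they are.

    ```python
    # Minimal sketch: put a number on the difference between two screenshots of
    # the same frame (file names are hypothetical examples).
    import numpy as np
    from PIL import Image

    def psnr(path_a: str, path_b: str) -> float:
        """Peak signal-to-noise ratio between two same-sized RGB captures (higher = closer)."""
        a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float64)
        b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float64)
        mse = np.mean((a - b) ** 2)
        if mse == 0:
            return float("inf")  # bit-identical captures
        return 10 * np.log10(255.0 ** 2 / mse)

    if __name__ == "__main__":
        # e.g. the same scene captured with "Quality" vs "High Quality" AF
        print(f"PSNR: {psnr('af_quality.png', 'af_high_quality.png'):.2f} dB")
    ```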

  7. #57
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    If there is only one way to render something, we won't have any innovation in optimization.

    If edge detect AA looks 95% as good but has half the performance impact of box AA, would you use it or despise it?

    Each person has their own answer to that, and should be happy if they are given the option of both.
    2500k @ 4900mhz - Asus Maxiums IV Gene Z - Swiftech Apogee LP
    GTX 680 @ +170 (1267mhz) / +300 (3305mhz) - EK 680 FC EN/Acteal
    Swiftech MCR320 Drive @ 1300rpms - 3x GT 1850s @ 1150rpms
    XS Build Log for: My Latest Custom Case

  8. #58
    Xtreme Member
    Join Date
    Dec 2009
    Posts
    435
    Quote Originally Posted by Lanek View Post
    The Cat AI slider has existed for what, 3-4+ years now (and before that the system was different). To use the specific high optimization you need to set Cat AI to "Advanced"; I don't know anyone who uses it, everyone leaves Cat AI on "Standard" or disables it completely...

    All reviews and ATI users disable it or use it on "Standard"...
    You must have Cat AI enabled when using crossfire. In SLI, you can disable all the optimizations if you want.

    Quote Originally Posted by Rai View Post
    lol @ the drama

    1. Nvidia's default settings ARE optimized; they already represent a trade-off in speed vs image quality. That's why you can tweak them. The same is true for ATI.

    2. This is some site that tested some old obscure games, and it is their OPINION that the default Nvidia settings are like the ATI high quality setting.

    Image quality is highly subjective. The last time I noticed anything was Crysis looking better on my x1900 xtx vs my 8800GT. Speed went up going to the 8800GT, but image quality went way down. In switching from the 5870 to the 470 gtx to the gtx 460 I haven't noticed anything in terms of image quality, but I don't play Crysis anymore either.
    Texture banding isn't opinion or subjective, sorry.
    i7 920 D0 / Asus Rampage II Gene / PNY GTX480 / 3x 2GB Mushkin Redline DDR3 1600 / WD RE3 1TB / Corsair HX650 / Windows 7 64-bit

  9. #59
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by ElSel10 View Post
    You must have Cat AI enabled when using crossfire. In SLI, you can disable all the optimizations if you want.
    Cat AI just has to be on; it can be set to either Standard or Advanced, as long as it's on.
    2500k @ 4900mhz - Asus Maxiums IV Gene Z - Swiftech Apogee LP
    GTX 680 @ +170 (1267mhz) / +300 (3305mhz) - EK 680 FC EN/Acteal
    Swiftech MCR320 Drive @ 1300rpms - 3x GT 1850s @ 1150rpms
    XS Build Log for: My Latest Custom Case

  10. #60
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Quote Originally Posted by nn_step View Post
    Not a hardware problem, it is a driver problem.

    Ultimately, something that can be fixed quickly but at virtually no gain to the gamer.
    Well, no matter what the cause is, it is still a problem, and until it's fixed it's there to annoy users. The 5870 has been around for ages and virtually nothing's been done about the issue, AFAIK. So I hope AMD sorts this out by the Cayman release. Read: ASAP.
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  11. #61
    Banned
    Join Date
    Jan 2010
    Posts
    30
    Quote Originally Posted by ElSel10 View Post
    You must have Cat AI enabled when using crossfire. In SLI, you can disable all the optimizations if you want.


    Texture banding isn't opinion or subjective, sorry.
    Maybe I'm too busy actually playing the games to try to spot small details. As stated before, Nvidia's optimizations for Crysis made the game look terrible; that's not subjective either.

  12. #62
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by zalbard View Post
    Well, no matter what the cause is, it is still a problem, and until it's fixed it's there to annoy users. The 5870 has been around for ages and virtually nothing's been done about the issue, AFAIK. So I hope AMD sorts this out by the Cayman release. Read: ASAP.
    The issue has been popping up in varying degrees for the last 13 months...or 13 driver revisions in AMD's book. If it gets fixed now, it will be a bloody miracle.

  13. #63
    Xtreme Addict
    Join Date
    Mar 2005
    Location
    Rotterdam
    Posts
    1,553
    The thread title suggests AMD lied about a hardware improvement and that it is not present even though AMD claimed so. Very misleading and inaccurate, purposely geared to generate an anticlimax with the new releases. Mods should intervene on the thread title. If it's not 100% accurate it's not news, it's FUD.
    Gigabyte Z77X-UD5H
    G-Skill Ripjaws X 16Gb - 2133Mhz
    Thermalright Ultra-120 eXtreme
    i7 2600k @ 4.4Ghz
    Sapphire 7970 OC 1.2Ghz
    Mushkin Chronos Deluxe 128Gb

  14. #64
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    Honestly, how much of a problem is this? I've seen the videos and it's pretty bad, but how common is this texture banding? It's not like there are tons of threads about people dumping their ATI cards due to these AF issues.

    I have a 6870 on the way so I guess that I'll find out soon enough.

    Quote Originally Posted by Chumbucket843 View Post
    PCGH are the masters of IQ reviews and comparisons. A lot of reviewers screw this up. Remember when most reviewers said the 5870's AF was perfect?
    I remember a lot of people here in the news section arguing that too.

    Quote Originally Posted by Eastcoasthandle View Post
    That's really not relevant to the context of my replies to this thread as a whole. The gist of my post(s) is that we can't make any comparisons between games that go back as far as several years and currently released games. This is not a good example of how to do IQ comparisons, and I (as well as others) can't draw anything from it since the games used are so old.
    One of the main reasons that I sold my 4870x2 was its performance in older, modded, and less popular games. Morrowind really sticks out. I play Darkplaces pretty regularly, which happens to be another game where the x2 didn't perform as well as my 280.

    That's actually one thing that I can't stand about video card reviews. Every site uses the same handful of new games, most of which aren't even very good, which means they won't be getting played much if at all. It would be nice to see some sites use, say, a modded Morrowind or even Oblivion install in their reviews. Maybe Stalker SOC Complete, Eduke32, or Darkplaces. Even Serious Sam HD hammers my GTX280, but I've never seen it used in a review. Everyone just uses the same handful of top games that you know da** well both companies' drivers are optimized for.

    What do I care how Metro runs? I'll never play that again. AvP? I won't even consider buying that pos. BFBC2? How many UE3 games do you need reviewed?
    Last edited by BababooeyHTJ; 11-01-2010 at 12:11 PM.

  15. #65
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by Dimitriman View Post
    The thread title suggests AMD lied about a hardware improvement and that it is not present even though AMD claimed so. Very misleading and inaccurate, purposely geared to generate an anticlimax with the new releases. Mods should intervene on the thread title. If it's not 100% accurate it's not news, it's FUD.
    Actually, no.

    AMD claimed better AF quality. Meanwhile, they are still using FP16 render target demotion which can decrease the overall IQ in scenes but increase performance.
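
    To make the trade concrete, here is a purely illustrative sketch (Python/NumPy, not AMD's driver code, and the RGBA8 target is only one assumed demotion format): a 16-bit floating point HDR render target gets swapped for a cheaper 8-bit-per-channel one, which halves the bytes written per pixel but collapses a smooth gradient into far fewer distinct levels, which is where visible banding can come from.

    ```python
    # Illustrative only -- not AMD's actual driver logic. Shows the trade made when
    # an FP16 HDR render target is demoted to an 8-bit-per-channel format:
    # less bandwidth per pixel, but far fewer distinct shades (potential banding).
    import numpy as np

    # A smooth one-channel HDR gradient stored as 16-bit floats (values above 1.0 allowed).
    hdr = np.linspace(0.0, 4.0, 1920, dtype=np.float16)

    # "Demotion": clamp to [0, 1] and quantize to 8 bits, as an RGBA8 target would.
    demoted = np.round(np.clip(hdr.astype(np.float32), 0.0, 1.0) * 255.0) / 255.0

    print("distinct levels in FP16 target :", np.unique(hdr).size)       # roughly 1900
    print("distinct levels in 8-bit target:", np.unique(demoted).size)   # at most 256
    print("bytes per RGBA pixel: FP16 = 8, RGBA8 = 4")
    ```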

  16. #66
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    Quote Originally Posted by ElSel10 View Post
    You must have Cat AI enabled when using crossfire. In SLI, you can disable all the optimizations if you want.


    Texture banding isn't opinion or subjective, sorry.
    Standard doesn't include the incriminated optimizations; it just applies the "standard profile", exactly as the Nvidia driver does... It's needed to allow the ATI cards to use the game-specific CrossFire profiles and the type of renderer (AFR, etc.). Nothing more. To enable the "incriminated" optimization you need to set the Cat AI slider to "Advanced"...
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  17. #67
    Banned
    Join Date
    Jan 2010
    Posts
    30
    Quote Originally Posted by BababooeyHTJ View Post
    Honestly, how much of a problem is this? I've seen the videos and it's pretty bad, but how common is this texture banding? It's not like there are tons of threads about people dumping their ATI cards due to these AF issues.

    I have a 6870 on the way so I guess that I'll find out soon enough.



    I remember a lot of people here in the news section arguing that too.



    One of the main reasons that I sold my 4870x2 was its performance in older, modded, and less popular games. Morrowind really sticks out. I play Darkplaces pretty regularly, which happens to be another game where the x2 didn't perform as well as my 280.

    That's actually one thing that I can't stand about video card reviews. Every site uses the same handful of new games, most of which aren't even very good, which means they won't be getting played much if at all. It would be nice to see some sites use, say, a modded Morrowind or even Oblivion install in their reviews. Maybe Stalker SOC Complete, Eduke32, or Darkplaces. Even Serious Sam HD hammers my GTX280, but I've never seen it used in a review. Everyone just uses the same handful of top games that you know da** well both companies' drivers are optimized for.

    What do I care how Metro runs? I'll never play that again. AvP? I won't even consider buying that pos. BFBC2? How many UE3 games do you need reviewed?
    Really good point re which games to test. That's why I put the most stock in Techpowerup's reviews. They test a really vast array of games, and quite a few of them are older too. I don't see the point in testing a half dozen newer games, most of which are garbage.

  18. #68
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by BababooeyHTJ View Post
    What do I care how Metro runs? I'll never play that again. AvP? I won't even consider buying that pos. BFBC2? How many UE3 games do you need reviewed?
    Don't forget Far Cry 2.
    I played that for a few dozen hours then never looked at it again. Great graphics, and smooth to play, but nowhere near what I would call an awesome game.
    2500k @ 4900mhz - Asus Maxiums IV Gene Z - Swiftech Apogee LP
    GTX 680 @ +170 (1267mhz) / +300 (3305mhz) - EK 680 FC EN/Acteal
    Swiftech MCR320 Drive @ 1300rpms - 3x GT 1850s @ 1150rpms
    XS Build Log for: My Latest Custom Case

  19. #69
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    Quote Originally Posted by SKYMTL View Post
    Actually, no.

    AMD claimed better AF quality. Meanwhile, they are still using FP16 render target demotion which can decrease the overall IQ in scenes but increase performance.
    Hmm, I may be wrong, but isn't FP16 demotion the surface format optimization setting you can enable or disable just by checking or unchecking it under the Cat AI settings?

    I quote an article about this setting:

    "The Surface Format Optimization checkbox allows improved performance in selected games that use 16-bit floating point surfaces for HDR rendering. It is designed to have no discernible effect on image quality."
    Last edited by Lanek; 11-01-2010 at 12:38 PM.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  20. #70
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by Lanek View Post
    Hmm, I may be wrong, but isn't FP16 demotion the surface format optimization setting you can enable or disable just by checking or unchecking it under the Cat AI settings?

    I quote an article about this setting:

    "The Surface Format Optimization checkbox allows improved performance in selected games that use 16-bit floating point surfaces for HDR rendering. It is designed to have no discernible effect on image quality."
    Sounds like a reviewer quoting from the supplied AMD PR materials to me.

    From my experience, the check box has no effect on the way AMD's drivers handle FP16 at this point. I'll check again when the 10.11 drivers are released but 10.10 WHQL and 10.10c don't show any difference when it is toggled.

    I've got bits and pieces of testing done for an IQ article and FP16 is one of the first things I tested for when 10.10 came out.
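
    (For anyone who wants to repeat that kind of check, one simple approach is sketched below in Python; the file names are hypothetical and this is not the reviewer's actual test harness. It compares a capture taken with the checkbox ticked against one taken with it unticked and reports whether any pixel actually changed.)

    ```python
    # Rough sketch (hypothetical file names): does toggling a driver option change
    # the rendered output at all? Compare two captures of the same frame pixel by pixel.
    import numpy as np
    from PIL import Image

    def load(path: str) -> np.ndarray:
        return np.asarray(Image.open(path).convert("RGB"), dtype=np.int16)

    on = load("surface_opt_on.png")    # capture with the checkbox ticked
    off = load("surface_opt_off.png")  # capture with the checkbox unticked

    diff = np.abs(on - off)
    changed = int(np.count_nonzero(diff.any(axis=-1)))
    total = diff.shape[0] * diff.shape[1]
    print(f"pixels that differ: {changed} of {total}")
    print(f"max per-channel difference: {int(diff.max())}")
    ```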

  21. #71
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by BababooeyHTJ View Post
    One of the main reasons that I sold my 4870x2 was its performance in older, modded, and less popular games. Morrowind really sticks out. I play Darkplaces pretty regularly, which happens to be another game where the x2 didn't perform as well as my 280.

    That's actually one thing that I can't stand about video card reviews. Every site uses the same handful of new games, most of which aren't even very good, which means they won't be getting played much if at all. It would be nice to see some sites use, say, a modded Morrowind or even Oblivion install in their reviews. Maybe Stalker SOC Complete, Eduke32, or Darkplaces. Even Serious Sam HD hammers my GTX280, but I've never seen it used in a review. Everyone just uses the same handful of top games that you know da** well both companies' drivers are optimized for.

    What do I care how Metro runs? I'll never play that again. AvP? I won't even consider buying that pos. BFBC2? How many UE3 games do you need reviewed?
    We differ in opinion then. While you have a problem with reviewers using current games as the basis for a review, I do not. Perhaps you should email them asking them to use older games. Furthermore, as already pointed out, drivers may no longer be fully optimized to run those older games like they used to be. As for reviewers using a handful of current games in a review, that's been common practice for some time now.

    However, let's put the gist of your post into practice, shall we? ComputerBase used those older games to do the IQ comparisons yet used current games to do the actual benchmark reviews. So the question is, why did they do that? Obviously they had current games on hand to show IQ comparisons along with the rest of the review. So again, how can I make any connection between HL2 IQ and a Dirt 2 benchmark review? The answer is simple: I cannot. So in the end, the IQ part of the review loses merit as it's a complete disconnect from the benchmark review.
    Last edited by Eastcoasthandle; 11-01-2010 at 12:59 PM.
    [SIGPIC][/SIGPIC]

  22. #72
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by Eastcoasthandle View Post
    However, let's put the gist of your post into practice, shall we? ComputerBase used those older games to do the IQ comparisons yet used current games to do the actual benchmark reviews. So the question is, why did they do that? Obviously they had current games on hand to show IQ comparisons along with the rest of the review. So again, how can I make any connection between HL2 IQ and a Dirt 2 benchmark review?
    That is a good point. Especially considering that some of the games mentioned had known problems with AMD (at the time ATI) drivers and AF implementation.

    There is one thing to remember: NVIDIA called out AMD about FP16 demotion several months ago and specifically mentioned the following games:

    Empire Total War
    Far Cry
    Dawn of War 2
    Need for Speed: Shift
    Oblivion
    Serious Sam 2


    There may have been others as well but I can't remember them off the top of my head.

  23. #73
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    Quote Originally Posted by SKYMTL View Post
    Sounds like a reviewer quoting from the supplied AMD PR materials to me.

    From my experience, the check box has no effect on the way AMD's drivers handle FP16 at this point. I'll check again when the 10.11 drivers are released but 10.10 WHQL and 10.10c don't show any difference when it is toggled.

    I've got bits and pieces of testing done for an IQ article and FP16 is one of the first things I tested for when 10.10 came out.

    Nice point. I haven't really tested further with this setting, as I'm not even sure the modded CCC with this setting is working 100% correctly on the 5870 right now... (same goes for MLAA)
    Last edited by Lanek; 11-01-2010 at 01:11 PM.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  24. #74
    Xtreme Enthusiast
    Join Date
    Jan 2010
    Posts
    533
    One reason for using older games might have been that the problem is more apparent there. It also shows that either these "optimizations" are more general in nature but don't work well with some games, or they are game-specific, and if they miss something, or the game you want to play is not exactly an AAA title, you might be out of luck.

    Testing every game you benchmark with every driver version, and comparing it not only with the competition but with the older generation of cards, would be quite exhausting in either of these cases, and it still might not be enough.
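
    To put rough numbers on "quite exhausting", here is a back-of-the-envelope sketch (Python; every figure is made up for illustration, not taken from any review site's actual workload):

    ```python
    # Back-of-the-envelope sketch with made-up but plausible numbers: the size of
    # the test matrix if a reviewer really tried to cover everything.
    games = 20             # titles in a typical benchmark suite
    driver_versions = 12   # roughly one release per month over a year
    cards = 6              # current and previous-generation boards per vendor
    quality_settings = 3   # e.g. default, "Quality", "High Quality"

    runs = games * driver_versions * cards * quality_settings
    print(f"{runs} benchmark/IQ passes")  # 4320, before any re-runs for variance
    ```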

  25. #75
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    Quote Originally Posted by Eastcoasthandle View Post
    We differ in opinion then. While you have a problem with reviewers using current games as the basis for a review, I do not. Perhaps you should email them asking them to use older games. Furthermore, as already pointed out, drivers may no longer be fully optimized to run those older games like they used to be. As for reviewers using a handful of current games in a review, that's been common practice for some time now.

    However, let's put the gist of your post into practice, shall we? ComputerBase used those older games to do the IQ comparisons yet used current games to do the actual benchmark reviews. So the question is, why did they do that? Obviously they had current games on hand to show IQ comparisons along with the rest of the review. So again, how can I make any connection between HL2 IQ and a Dirt 2 benchmark review? The answer is simple: I cannot. So in the end, the IQ part of the review loses merit as it's a complete disconnect from the benchmark review.
    Why should drivers need to be "optimized" for older games? Yes, Nvidia has had issues recently with older games; Gothic and NWN2 come to mind, both of which have been fixed. I want to be able to play an older game or a source port (some of which can be very demanding) and have it look as it should.

    Why would a benchmark of a game like HL2 matter? We all know that it runs at a more than acceptable framerate. I applaud ComputerBase for pointing these issues out. The point stands no matter how much you try to downplay it. Benchmarks only tell half of the story.

    There is no excuse for sub-par image quality on a $200-$700 product. When I spend that much on a video card I want it to be able to run anything that I throw at it. Not only games made in the last three years.

    That said, I have yet to see a post about anyone switching to an Nvidia card because of AF issues, so it can't be that common a problem.

    Quote Originally Posted by Vardant View Post
    Testing every game you benchmark with every driver version, and comparing it not only with the competition but with the older generation of cards, would be quite exhausting in either of these cases, and it still might not be enough.
    I agree with that, but if you want more hits and you want your review to stand out, why not try something different? I clicked on quite a few links from OTH in the 6870/50 review thread and most of the reviews tested the same da** games. If I wasn't interested in picking up a 6870 I wouldn't have done that, and I probably would have only checked a couple of sites. There are quite a few pretty popular sites whose reviews I don't even bother to check anymore since it's the same crap done elsewhere. Use one surprise game every once in a while.
    Last edited by BababooeyHTJ; 11-01-2010 at 02:20 PM.
