
Thread: AMD's AF fix proven false

  1. #1
    Xtreme Enthusiast
    Join Date
    Jan 2010
    Posts
    533

    AMD's AF fix proven false

    I'm sure everybody remembers the slides about AF being improved on the HD 6x00 line, and almost every review mentioned it as well. The guys from ComputerBase.de decided to take a closer look.

    They were quite surprised when they found out that the situation got even worse. It went so far that, from here on out, they will be testing AMD cards with AF set to high quality.

    You can see their findings here - http://www.computerbase.de/artikel/g...adeon-hd-6800/

  2. #2
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Texas
    Posts
    1,663
    Your title seems a bit misleading. The Germans are basically just saying the AMD drivers suck. Big f'ing deal. They just bumped the quality settings and things got better. In the 5000 series, AF was actually broken; not so in the 6000 series. So what have we learned today? ATI drivers suck, so we will continue to get monthly updates to fix stuff.
    Core i7 2600K@4.6Ghz| 16GB G.Skill@2133Mhz 9-11-10-28-38 1.65v| ASUS P8Z77-V PRO | Corsair 750i PSU | ASUS GTX 980 OC | Xonar DSX | Samsung 840 Pro 128GB |A bunch of HDDs and terabytes | Oculus Rift w/ touch | ASUS 24" 144Hz G-sync monitor

    Quote Originally Posted by phelan1777 View Post
    Hail fellow warrior albeit a surat Mercenary. I Hail to you from the Clans, Ghost Bear that is (Yes freebirth we still do and shall always view mercenaries with great disdain!) I have long been an honorable warrior of the mighty Warden Clan Ghost Bear the honorable Bekker surname. I salute your tenacity to show your freebirth sibkin their ignorance!

  3. #3
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    PCGH are the masters of IQ reviews and comparisons. A lot of reviewers screw this up. Remember when most reviewers said the 5870's AF was perfect?
    "Quality" is the new default setting, with the render target replacements also active. At the start of testing we noticed that the anisotropic filter of the HD 6800 cards flickers more strongly than that of a Radeon HD 5870. After consulting with AMD, we were informed that the default "Quality" setting filters more aggressively than the previous driver default (A.I. Standard, where the texture filtering optimizations were already disabled for the HD 5800 series). Only "High Quality" brings the improvements and is similar to the previous A.I. Standard of the HD 5800 cards. At its default, the HD 6000 series therefore filters on the level of an HD 5000 card with A.I. Advanced. AMD gives its new cards an fps advantage at the expense of image quality. Once "High Quality" is activated, the unsightly AF banding disappears almost completely and the flickering is also reduced - as is, of course, the frame rate.
    http://www.pcgameshardware.de/aid,79...e/Test/?page=4
    Rough machine translation, but a good article.

  4. #4
    Xtreme Enthusiast
    Join Date
    Jan 2010
    Posts
    533
    Quote Originally Posted by Mechromancer View Post
    Your title seems a bit misleading. The Germans are basically just saying the AMD drivers suck. Big f'ing deal. They just bumped the quality settings and things got better. In the 5000 series, AF was actually broken; not so in the 6000 series. So what have we learned today? ATI drivers suck, so we will continue to get monthly updates to fix stuff.
    If I understood it correctly, the article says that with AF set to default, the picture quality is now worse on the HD 6x00 line than it was on the HD 5x00 line.
    Last edited by Vardant; 10-31-2010 at 10:15 AM.

  5. #5
    Registered User
    Join Date
    Nov 2008
    Posts
    72
    @Vardant
    Exactly. Quality is now like A.I. Advanced; High Quality is like the old standard without banding. PCGH and CB both now test with HQ AF to maintain the old quality settings, which are more comparable to NV's Q AF.

  6. #6
    Xtreme Enthusiast
    Join Date
    Jul 2004
    Posts
    535
    Looking at Trackmania, AF is obviously better than it was on the HD 5870. Where there was an abrupt change before, there is now some noise, but that is a good tradeoff. GF100 has the least noise in Trackmania, but if you look at the Half-Life 2 image, the HD 6870 retains more detail than the HD 5870, which in turn retains more than GF100. Seems like AMD is making a tradeoff: more sharpness in some textures at the cost of distortion in noisy ones.

    And GF100's filtering also isn't perfect. As rare as it is, sometimes its angle-dependent nature manifests itself. Look at the second comparison in Guru3D's review, ignore what the author is saying about the holster, and instead look at the side of the player's gun, where GF100 blurs out the vertical lines while Barts keeps them sharp.

    http://www.guru3d.com/article/radeon...-6870-review/9

  7. #7
    Xtreme Enthusiast
    Join Date
    Apr 2006
    Posts
    939
    Quote Originally Posted by Vardant View Post
    If I understood it correctly, the article says that with AF set to default, the picture quality is now worse on the HD 6x00 line than it was on the HD 5x00 line.
    No, the fix applies only in High Quality mode. Your title says the fix doesn't work; in default mode the optimisation is simply more aggressive. You should change your thread title so it sounds like you're sane.

  8. #8
    Registered User
    Join Date
    Nov 2008
    Posts
    72
    You have to look at the vids, not the pics. Sharper means worse AF and more flickering. The banding was just a bug in the 5xxx series; I don't know why they sell its removal as a feature.
    I don't have a real problem with AMD's AF quality, it's OK. But telling people "we made better AF" and then turning down the default driver quality to get 5% more speed is just bulls***.

  9. #9
    Xtreme Addict
    Join Date
    Aug 2005
    Location
    Germany
    Posts
    2,247
    Quote Originally Posted by Mechromancer View Post
    Your title seems a bit misleading. The Germans basically are just saying the AMD drivers suck. Big f'ing deal. They just bumped the quality settings and things got better. In the 5000 series, AF was actually broken; not so in the 6000 series. So what have we learned today? ATI driver suck so we will continue to get monthly updates to fix stuff.
    how did you come to that conclusion?

    so far, neither we nor computerbase know whether this problem is a driver issue or a hardware/filtering algorithm issue.

    the texture flickering on the 6800 series looks pretty bad in the videos computerbase recorded. almost as bad as on a 7800gt, which probably had the worst AF filtering of all time.
    if you watch the videos you'll see that the 6800 series texture flickering is way worse than that of the 5800. how come you say AF was broken in the 5800s while it still looks better in these vids than on the 6800s?

    however, i don't agree with computerbase's conclusion that from now on they'll benchmark ati cards with high quality settings instead of the default settings to make the fps comparable.
    imo cards should always be benched with default settings - and if these default settings offer less image quality than the competitor's, point that out in the conclusion and/or penalize it in the final score.
    1. Asus P5Q-E / Intel Core 2 Quad Q9550 @~3612 MHz (8,5x425) / 2x2GB OCZ Platinum XTC (PC2-8000U, CL5) / EVGA GeForce GTX 570 / Crucial M4 128GB, WD Caviar Blue 640GB, WD Caviar SE16 320GB, WD Caviar SE 160GB / be quiet! Dark Power Pro P7 550W / Thermaltake Tsunami VA3000BWA / LG L227WT / Teufel Concept E Magnum 5.1 // SysProfile


    2. Asus A8N-SLI / AMD Athlon 64 4000+ @~2640 MHz (12x220) / 1024 MB Corsair CMX TwinX 3200C2, 2.5-3-3-6 1T / Club3D GeForce 7800GT @463/1120 MHz / Crucial M4 64GB, Hitachi Deskstar 40GB / be quiet! Blackline P5 470W

  10. #10
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by hurleybird View Post
    Looking at Trackmania, AF is obviously better than it was on the HD 5870. Where there was an abrupt change before, there is now some noise, but that is a good tradeoff. GF100 has the least noise in Trackmania, but if you look at the Half-Life 2 image, the HD 6870 retains more detail than the HD 5870, which in turn retains more than GF100. Seems like AMD is making a tradeoff: more sharpness in some textures at the cost of distortion in noisy ones.

    And GF100's filtering also isn't perfect. As rare as it is, sometimes its angle-dependent nature manifests itself. Look at the second comparison in Guru3D's review, ignore what the author is saying about the holster, and instead look at the side of the player's gun, where GF100 blurs out the vertical lines while Barts keeps them sharp.

    http://www.guru3d.com/article/radeon...-6870-review/9
    all filtering algorithms are inherently imperfect. until pixels become infinitesimally small, an infinite number of samples (not just 4x, 8x, or 16x) would have to be taken to perfectly represent the information at each pixel, whether it be textures, polys, or lighting. this is assuming we can't design an infinitely fast processor, though.
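    To make that concrete, here's a minimal toy sketch (an illustration only, not any vendor's actual filtering hardware or driver code) of how a finite number of AF-style taps only approximates the ideal average over a pixel's footprint - the leftover error is what shows up on screen as shimmer and flicker:
    Code:
    # Toy 1-D example: approximate the ideal average of a high-frequency
    # "texture" over a pixel footprint with a finite number of taps.
    # Any finite tap count (1x, 2x, 4x, 8x, 16x...) leaves some error.
    import numpy as np

    def texture(u):
        # high-frequency stripe pattern standing in for a noisy game texture
        return 0.5 + 0.5 * np.sign(np.sin(61.0 * u))

    def filtered(center, footprint, taps):
        # average `taps` point samples spread evenly across the footprint
        offsets = (np.arange(taps) + 0.5) / taps - 0.5
        return texture(center + offsets * footprint).mean()

    reference = filtered(0.3, 0.2, 65536)  # near-exact value for comparison
    for taps in (1, 2, 4, 8, 16):
        value = filtered(0.3, 0.2, taps)
        print(f"{taps:2d} taps: {value:.3f} (error {abs(value - reference):.3f})")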

  11. #11
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    @Razz: I agree. I don't like it when reviewers take it upon themselves to decide which settings are comparable. Bench at default, then comment on any subjective differences. Granted, the simple-minded will only look at the graphs, but the rest of us can think for ourselves. I'm still amazed that AMD reduced the default quality and is so blasé about it though. I guess they also noticed that most fanboi hate is directed nVidia's way and they are relatively safe.

  12. #12
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    All the comparisons I have seen were using AF at "Quality" and not "High Quality", and they show an increase in AF quality between the 5800 and 6800...

    I also don't see what this has to do with the "fix" of the AF on extremely noisy textures... And why don't they just use the tunnel test to show us whether it's fixed or not?

    Cat AI enables or disables the driver optimisations; it doesn't affect only AF... It completely disables the game-specific driver optimisations... Why not disable them in the Nvidia driver too?
    Last edited by Lanek; 10-31-2010 at 11:12 AM.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  13. #13
    Xtreme Enthusiast
    Join Date
    Jan 2010
    Posts
    533
    The general feeling after the reviews hit the web was that AF is fixed now. That's what I meant.

  14. #14
    Xtreme Enthusiast
    Join Date
    Jul 2004
    Posts
    535
    Quote Originally Posted by AffenJack View Post
    You have to look at the vids, not the pics. Sharper means worse AF and more flickering.
    You're right, about the first part at least. I went through the hassle of registering on a German site just to DL the comparison, and Barts was by far the noisiest of the three. Damn, I hope this doesn't affect Cayman, which is what I'm looking to upgrade to.

  15. #15
    Xtreme Member
    Join Date
    Dec 2009
    Posts
    435
    Quote Originally Posted by AffenJack View Post
    @Vardant
    Exactly, Quality is now like AI Advanced. High Quality like the old standard without banding. PCGH and CB both now test with HQ AF to maintain the old Qualitysettings which are better comparable to NVs Q AF.
    So basically, the default settings on HD 6000 are even worse than they were on HD 5000, but manually setting it to High Quality improves quality beyond High Quality on HD 5000? Seems kind of underhanded. Most reviews will use the default settings, which give even worse quality than HD 5000 and thus more FPS.
    i7 920 D0 / Asus Rampage II Gene / PNY GTX480 / 3x 2GB Mushkin Redline DDR3 1600 / WD RE3 1TB / Corsair HX650 / Windows 7 64-bit

  16. #16
    Xtreme Member
    Join Date
    Jun 2005
    Posts
    442
    Quote Originally Posted by ElSel10 View Post
    So basically, the default settings on HD 6000 are even worse than they were on HD 5000, but manually setting it to High Quality improves quality beyond High Quality on HD 5000? Seems kind of underhanded. Most reviews will use the default settings, which give even worse quality than HD 5000 and thus more FPS.
    It's called optimization. The point is that the High Quality setting is better than the previous generation's. That's all that matters, and it's all any of us are going to use anyway.

    If you want to see some real crap in your games, set your settings to "bilinear filtering" and see if even low-quality AF doesn't look a lot better.
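    For anyone wondering what that bottom rung actually does, here's a rough sketch (purely illustrative, not any driver's real code path) of plain bilinear filtering: it blends only the four nearest texels of a single mip level and makes no attempt to cover an elongated, anisotropic pixel footprint, which is why distant angled surfaces turn to mush without AF:
    Code:
    # Rough illustration of bilinear texture filtering: blend the 4 nearest texels.
    # Anisotropic filtering improves on this by taking many such samples along
    # the direction the pixel footprint is stretched.
    import numpy as np

    def bilinear(tex, u, v):
        # tex: 2-D array of texel values; u, v: normalized coordinates in [0, 1]
        h, w = tex.shape
        x, y = u * (w - 1), v * (h - 1)
        x0, y0 = int(x), int(y)
        x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
        fx, fy = x - x0, y - y0
        top = tex[y0, x0] * (1 - fx) + tex[y0, x1] * fx
        bottom = tex[y1, x0] * (1 - fx) + tex[y1, x1] * fx
        return top * (1 - fy) + bottom * fy

    checker = np.indices((8, 8)).sum(axis=0) % 2  # simple checkerboard texture
    print(bilinear(checker, 0.37, 0.62))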
    PII 965BE @ 3.8Ghz /|\ TRUE 120 w/ Scythe Gentle Typhoon 120mm fan /|\ XFX HD 5870 /|\ 4GB G.Skill 1600mhz DDR3 /|\ Gigabyte 790GPT-UD3H /|\ Two lovely 24" monitors (1920x1200) /|\ and a nice leather chair.

  17. #17
    Xtreme Enthusiast
    Join Date
    Jan 2007
    Location
    QLD
    Posts
    942
    Quote Originally Posted by Mechromancer View Post
    Your title seems a bit misleading. The Germans are basically just saying the AMD drivers suck. Big f'ing deal. They just bumped the quality settings and things got better. In the 5000 series, AF was actually broken; not so in the 6000 series. So what have we learned today? ATI drivers suck, so we will continue to get monthly updates to fix stuff.
    It would be a big effing deal if it was an nvidia card; there would be many, many long-winded rambling manifestos about why one should never compromise on IQ.

  18. #18
    Xtreme Addict
    Join Date
    Aug 2005
    Location
    Germany
    Posts
    2,247
    Quote Originally Posted by Mad Pistol View Post
    It's called optimization. The point is that the High Quality setting is better than the previous generation's. That's all that matters, and it's all any of us are going to use anyway.

    If you want to see some real crap in your games, set your settings to "bilinear filtering" and see if even low-quality AF doesn't look a lot better.
    no, that's not all that matters. this way it's a bluff package. most people never touch these settings, so in fact they get worse quality than the previous generation.

    for me, it's a no-go that - out of the box - a newer generation is worse than its predecessor (no matter in what manner).
    unfortunately, it's always ati/amd that puts itself in such situations. over and over again.
    after the poor filters of the 7800 series nvidia improved the IQ with every new generation, whereas with ati/amd it has always been a rollercoaster ride - and i seriously don't know why.

    don't get me wrong, ati/amd has very good graphics cards, but these little things always make me think "WHY OH WHY?".
    is it too much to ask to get something right after the 3174632714th try?

    Quote Originally Posted by Dainas View Post
    It would be a big effing deal if it was an nvidia card; there would be many, many long-winded rambling manifestos about why one should never compromise on IQ.
    i completely agree, even though in other cases it's the other way around.

    for me it doesn't matter who the black sheep is. whether it's nvidia or amd - i'll criticize both for fails and stupidity
    Last edited by RaZz!; 10-31-2010 at 12:07 PM.
    1. Asus P5Q-E / Intel Core 2 Quad Q9550 @~3612 MHz (8,5x425) / 2x2GB OCZ Platinum XTC (PC2-8000U, CL5) / EVGA GeForce GTX 570 / Crucial M4 128GB, WD Caviar Blue 640GB, WD Caviar SE16 320GB, WD Caviar SE 160GB / be quiet! Dark Power Pro P7 550W / Thermaltake Tsunami VA3000BWA / LG L227WT / Teufel Concept E Magnum 5.1 // SysProfile


    2. Asus A8N-SLI / AMD Athlon 64 4000+ @~2640 MHz (12x220) / 1024 MB Corsair CMX TwinX 3200C2, 2.5-3-3-6 1T / Club3D GeForce 7800GT @463/1120 MHz / Crucial M4 64GB, Hitachi Deskstar 40GB / be quiet! Blackline P5 470W

  19. #19
    Xtreme Enthusiast
    Join Date
    Jul 2007
    Posts
    668
    I'm amazed that this thread doesn't have 300 pages criticizing AMD for doing this.
    Lol at the guy who says it's optimization; this is lowering IQ to get higher FPS, to make people believe that this new 6870 and the rest are more than just a rebranded 5870 with a few more magic powders in it.
    Proud owner of an iPhone3G 16G White




    SpiTweaker by Monteboy,try it

  20. #20
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    Nvidia have similar optimisations in their driver, so what? And ofc the 6870 is a 5870 with new stickers on it.

    It reminds me of the FP16 demotion "cheat" story, before we discovered Nvidia had been using it too since the GeForce 4xx...

    Anyway, I don't have a 6870 to test, so I can't judge.
    Last edited by Lanek; 10-31-2010 at 12:31 PM.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  21. #21
    Xtreme Enthusiast
    Join Date
    Aug 2008
    Posts
    889
    Quote Originally Posted by RaZz! View Post
    how did you come to that conclusion?

    so far, neither we nor computerbase know whether this problem is a driver issue or a hardware/filtering algorithm issue.

    the texture flickering on the 6800 series looks pretty bad in the videos computerbase recorded. almost as bad as on a 7800gt, which probably had the worst AF filtering of all time.
    if you watch the videos you'll see that the 6800 series texture flickering is way worse than that of the 5800. how come you say AF was broken in the 5800s while it still looks better in these vids than on the 6800s?

    however, i don't agree with computerbase's conclusion that from now on they'll benchmark ati cards with high quality settings instead of the default settings to make the fps comparable.
    imo cards should always be benched with default settings - and if these default settings offer less image quality than the competitor's, point that out in the conclusion and/or penalize it in the final score.
    Quote Originally Posted by trinibwoy View Post
    @Razz: I agree. I don't like it when reviewers take it upon themselves to decide which settings are comparable. Bench at default, then comment on any subjective differences. Granted, the simple-minded will only look at the graphs, but the rest of us can think for ourselves. I'm still amazed that AMD reduced the default quality and is so blasé about it though. I guess they also noticed that most fanboi hate is directed nVidia's way and they are relatively safe.
    The problem with this approach is that it allows ATI and NVIDIA to "cheat" their benchmarks by overly sacrificing quality for performance. By making sure their "performance" option still provides nice IQ, they are able to save face and boost performance.

    However, reviewers need to ensure they eliminate bias towards one vendor over another. I agree that Nvidia and ATI use very different approaches to IQ and it's very difficult to compare apples to oranges. However, if the reviewer does it right, I think it provides a more indicative conclusion. Maybe they can do all three (default, comparative, and high IQ settings).

    Now here's a question for you:

    What would you prefer: A GPU that provides higher FPS with low IQ/AA/AF, and lower FPS with high IQ/AA/AF, or a GPU that provides higher FPS with high IQ/AA/AF, and playable (but lower FPS) with lower IQ/AA/AF?
    Intel 8700k
    16GB
    Asus z370 Prime
    1080 Ti
    x2 Samsung 850Evo 500GB
    x 1 500 Samsung 860Evo NVME


    Swiftech Apogee XL2
    Swiftech MCP35X x2
    Full Cover GPU blocks
    360 x1, 280 x1, 240 x1, 120 x1 Radiators

  22. #22
    Xtreme Addict
    Join Date
    Jul 2007
    Posts
    1,488
    Well, if the High Quality setting removes the AF artifacts seen at high quality in previous generations, then I would disagree with the title.

    I can't say I agree with reducing the default image quality, though. It doesn't affect me because I always set HQ, but it is a bit deceiving.

  23. #23
    Xtreme Enthusiast
    Join Date
    Apr 2010
    Posts
    514
    A.I , AMD




    Since we will not support such practices in 2010, we have decided that in future we will test every Radeon HD 6000 card with the roughly five percent slower high-quality setting, so that the final result is roughly comparable with Nvidia's default setting.
    and

    To continue to ensure comparability with the Nvidia GeForce products, we have therefore also decided to test the Radeon HD 5800 cards with Catalyst A.I. switched off in Catalyst drivers from 10.10 onwards.

  24. #24
    Xtreme Enthusiast
    Join Date
    Jan 2010
    Posts
    533
    They can test at default settings, but as you can see, if it's not NVIDIA, then "people" don't care and AMD wouldn't be forced to change anything.

    But in this case, they will lose a few percent in every benchmark, and that might make them change their mind about the default AF quality and the stupid thing called Catalyst AI, which in various cases interferes more than it helps.

  25. #25
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    After all, contrary to what we thought back then, the "Quality" setting, which is the driver default on a Radeon HD 6800, is not equal to Catalyst A.I. Standard - but worse!
    The above, in part, simply states that "Quality" is not the same as A.I. Standard. But wait, there's more to it:

    We can give a small all-clear in the sense that, with the "High Quality" setting, the quality of the anisotropic filter returns to the level of Catalyst A.I. Standard.
    Shocker. So in part, what it's saying is that in order to get what you were getting before in AF, you have to set it to HQ. That makes the OP/subject of the OP misleading. The article does not discuss the merits of any fix, only the settings found in CCC with regard to AF, as there is no mention of any engineering or other technical specifications.

    Also, I take the article with a pinch of salt. Why? Because the article uses very old games to show AF quality, which raises the question of why they aren't using more current games to demonstrate AF. I see no reason why AF is only examined in games that go as far back as November 16, 2004 - or is that November 2003 (Trackmania)?

    By no means do I advocate for AMD. If there is a problem, let us examine it. However, per this article, I'm not finding anything noteworthy as posted in the OP and the subject of this thread.
    Last edited by Eastcoasthandle; 10-31-2010 at 01:04 PM.
