
Thread: AMD's AF fix proven false

  1. #26
    Xtreme Addict
    Join Date
    Jul 2004
    Location
    U.S of freakin' A
    Posts
    1,931
    Quote Originally Posted by Lanek View Post
    Nvidia has similar optimisations in their drivers, so what? And of course the 6870 is a 5870 with new stickers on it.

    It reminds me of the FP16 demotion "cheat" story, from before we discovered Nvidia had been using it too since the GeForce 4xx...
    Nvidia has always had a way to fully disable the optimizations, though. It's only now that AMD is allowing its customers the same option.
    Intel Core i7 6900K
    Noctua NH-D15
    Asus X99A II
    32 GB G.Skill TridentZ @ 3400 CL15 CR1
    NVidia Titan Xp
    Creative Sound BlasterX AE-5
    Sennheiser HD-598
    Samsung 960 Pro 1TB
    Western Digital Raptor 600GB
    Asus 12x Blu-Ray Burner
    Sony Optiarc 24x DVD Burner with NEC chipset
    Antec HCP-1200w Power Supply
    Viewsonic XG2703-GS
    Thermaltake Level 10 GT Snow Edition
    Logitech G502 gaming mouse w/Razer Exact Mat
    Logitech G910 mechanical gaming keyboard
    Windows 8 x64 Pro

  2. #27
    Xtremely Retired OC'er
    Join Date
    Dec 2006
    Posts
    1,084
    The global setting is not well done, okay, but that's no big deal, and nothing new.
    I've seen this many times; new graphics cards just need the latest driver.

    ....And then this happened. Again, drivers.
    It's all OK, but drivers need to be written in a good language, small and fast as a bullet. (AMD doesn't practice this.)
    Then the driver expands with per-game settings, more bugs get fixed each day, and huge drivers are hard to fix.

    Patches for games fix game graphics too.
    Guys, you've got nothing to worry about; just relax and wait for the new driver.

  3. #28
    Xtreme Enthusiast
    Join Date
    Jan 2010
    Posts
    533
    Quote Originally Posted by Eastcoasthandle View Post
    I see no reason why AF is only examined in games that go as far back as November 16, 2004. Or is that November 2003 (Trackmania)?
    It's either Trackmania United or Nations, and that's a 2006 game. So who's misleading who?

  4. #29
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by Eastcoasthandle View Post
    The above, in part, simply states that "quality" is not the same as AI Standard. But wait, there's more to it:


    Shocker. So, in part, what it's saying is that in order to get what you were getting before in AF, you have to set it to HQ. This makes the OP/subject of the OP misleading. The article does not discuss the merits of any fix, only the settings found in CCC in regard to AF, as there is no mention of any engineering or other technical specifications.

    Also, I take the article with a pinch of salt. Why? Because the article uses very old games to show AF quality, which raises the question of why they aren't using more current games to show AF. I see no reason why AF is only examined in games that go as far back as November 16, 2004. Or is that November 2003 (Trackmania)?
    Trackmania is the only game where I noticed AF not being what it should be, and maybe that's why they used old games. But the problem is that drivers may be optimized for today's games and not for the old ones.

  5. #30
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by Vardant View Post
    It's either Trackmania United or Nations, and that's a 2006 game. So who's misleading who?
    Yes, we carried out detailed examinations in Half-Life 2, Oblivion and Trackmania today and came to the conclusion that the Radeon HD 6800, at its driver defaults, filters worse than its predecessor... (next page) ...than with the Catalyst 10.9 or earlier versions, which we re-examined in Half-Life 2, Oblivion and Trackmania.
    This is what the translation says, so I will go with that. You should edit your subject to fall in line with the context of the article. I'm not seeing anything that goes into detail on your subject of "AMD's AF fix proven false", nor have you provided any proof of it.


    Quote Originally Posted by Final8ty View Post
    Trackmania is the only game where I noticed AF not being what it should be, and maybe that's why they used old games. But the problem is that drivers may be optimized for today's games and not for the old ones.
    The drivers may very well no longer be properly optimized for 6+ year old games, which is why I think the subject of this thread is misleading. Also, I think we should all ask why no current games are used for AF comparisons. If we are going to examine IQ, as a rule of thumb it should only be done with current games. We can't compare the AF quality of games dating back 6+ years to a recently released game.
    Last edited by Eastcoasthandle; 10-31-2010 at 01:18 PM.

  6. #31
    Xtreme Enthusiast
    Join Date
    Jan 2010
    Posts
    533
    This is what Trackmania from 2003 looks like...

  7. #32
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by Vardant View Post
    This is what Trackmania from 2003 looks like...
    http://img.brothersoft.com/screensho...l-67610-1.jpeg
    That's really not relevant to the context of my replies to this thread as a whole. The gist of my post(s) is that we can't make comparisons between games that go back several years and currently released games. This is not a good example of how to do IQ comparisons, and I (as well as others) can't draw anything from it since the games used are so old.

  8. #33
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    I kind of miss Trackmania 2003... at least it runs well on my GTX480, whereas the current version can't. My 4870X2 runs it at about three times the speed with all options at full.

    All along the watchtower the watchmen watch the eternal return.

  9. #34
    Xtreme Enthusiast
    Join Date
    Dec 2009
    Location
    Burbank, CA
    Posts
    563
    Things have turned around; Nvidia has had the better IQ for a while now. It was only in the 9700 Pro and GF4 days that ATI was thought to have better IQ. They both look the same to me, though; the experts seem to disagree.

  10. #35
    Xtreme Addict
    Join Date
    Mar 2005
    Location
    Rotterdam
    Posts
    1,553
    Anything to steal some buzz. I guess one site can be the mother of all truth.
    Gigabyte Z77X-UD5H
    G-Skill Ripjaws X 16Gb - 2133Mhz
    Thermalright Ultra-120 eXtreme
    i7 2600k @ 4.4Ghz
    Sapphire 7970 OC 1.2Ghz
    Mushkin Chronos Deluxe 128Gb

  11. #36
    Xtreme Addict
    Join Date
    Apr 2007
    Location
    canada
    Posts
    1,886
    Quote Originally Posted by dimitriman View Post
    Anything to steal some buzz. I guess one site can be the mother of all truth.

    qft
    WILL CUDDLE FOR FOOD

    Quote Originally Posted by JF-AMD View Post
    Dual proc client systems are like sex in high school. Everyone talks about it but nobody is really doing it.

  12. #37
    Xtreme Enthusiast
    Join Date
    Jan 2007
    Location
    QLD
    Posts
    942
    Quote Originally Posted by HelixPC View Post
    Things have turned around; Nvidia has had the better IQ for a while now. It was only in the 9700 Pro and GF4 days that ATI was thought to have better IQ. They both look the same to me, though; the experts seem to disagree.
    The Nvidia 6 and 7 series were a joke for IQ, but so was the X800... well, compared to today's standards.

    To me the greatest gap was between the 7900GTX and the X1900XT; the latter was just plain better by every conceivable measure.

  13. #38
    Registered User
    Join Date
    Jul 2010
    Posts
    20
    TBH, I think people need to get over these comparisons trying to find an "apples to apples" approach. IQ optimisation has been going on for years, and the arguments haven't contributed a great deal besides altering the results of said tests from driver to driver. Given the options, though, I can almost guarantee most cards will output a similar result; it's just that some of us want IQ and some of us want FPS, and it's hard to get both.
    Maybe we need to complain a bit more about what options we have to play with as 'advanced' users, because TBH the extra options we get over the 'basic' settings are pretty limited at best. Sure, I know they introduce problems, which AMD/nVIDIA would really rather not have to provide extra support for, but all they need to do is put up a disclaimer about the extra options not being supported (which they already do in part) and voilà, problem solved. From there we can choose whether to utilize these features or not.

    Here are some of the options I'm using in the Catalyst 10.10a drivers. Yes, you'll note I've manually put in some extra settings, and no, I'm not using MLAA. I'm still uncertain about needing that feature, but I welcome it for those who can't use 'normal' AA in-game.
    [Attached thumbnail: ATI-Options.jpg]

  14. #39
    Xtreme Enthusiast
    Join Date
    Jan 2010
    Posts
    533
    I think the whole thing started here - http://www.forum-3dcenter.org/vbulle...494193&page=11

    Too bad it looks like English sites don't care about IQ anymore; even the latest problem wasn't covered by them at all.

  15. #40
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    380
    Quote Originally Posted by Vardant View Post
    I think the whole thing started here - http://www.forum-3dcenter.org/vbulle...494193&page=11

    Too bad it looks like English sites don't care about IQ anymore; even the latest problem wasn't covered by them at all.
    but but benchmarks are everything

  16. #41
    Xtreme Enthusiast
    Join Date
    Jun 2005
    Posts
    960
    Quote Originally Posted by Carfax View Post
    Nvidia has always had a way to fully disable the optimizations, though. It's only now that AMD is allowing its customers the same option.
    Really, if they want to call for fair comparisons... they should disable optimizations in both camps; otherwise it's not fair at all.

  17. #42
    Xtreme X.I.P. Particle's Avatar
    Join Date
    Apr 2008
    Location
    Kansas
    Posts
    3,219
    So long as you're getting a decent visual experience, it's not cheating or underhanded to mess with your default IQ settings. Crank it up if you want it. Beyond that, who gives a crap? (I know many of you do, but you shouldn't.)

    It would only be lame if AMD and nVidia were in a slow race to see how much IQ they could sacrifice to get speed, notching back and forth as they each go a bit further until the "default" no longer serves as a decent level of quality but instead a downmixed 4x4 pixel matrix representing the entire screen. That just isn't the case here. They drop bits of IQ that people don't generally notice if it helps them achieve other goals, but other than that there are improvements going on. Occasionally there are mistakes (missing rocks and such in Crysis for instance) that have to be addressed. Missing geometry is one thing if used as an intentional default (bad practice). It's another if it's a bug (forgivable & normal). It's another as well if it's a non-default but available setting for users to choose for themselves (perfectly legit).
    Last edited by Particle; 11-01-2010 at 06:04 AM.
    Particle's First Rule of Online Technical Discussion:
    As a thread about any computer related subject has its length approach infinity, the likelihood and inevitability of a poorly constructed AMD vs. Intel fight also exponentially increases.

    Rule 1A:
    Likewise, the frequency of a car pseudoanalogy to explain a technical concept increases with thread length. This will make many people chuckle, as computer people are rarely knowledgeable about vehicular mechanics.

    Rule 2:
    When confronted with a post that is contrary to what a poster likes, believes, or most often wants to be correct, the poster will pick out only minor details that are largely irrelevant in an attempt to shut out the conflicting idea. The core of the post will be left alone since it isn't easy to contradict what the person is actually saying.

    Rule 2A:
    When a poster cannot properly refute a post they do not like (as described above), the poster will most likely invent fictitious counter-points and/or begin to attack the other's credibility in feeble ways that are dramatic but irrelevant. Do not underestimate this tactic, as in the online world this will sway many observers. Do not forget: Correctness is decided only by what is said last, the most loudly, or with greatest repetition.

    Rule 3:
    When it comes to computer news, 70% of Internet rumors are outright fabricated, 20% are inaccurate enough to simply be discarded, and about 10% are based in reality. Grains of salt--become familiar with them.

    Remember: When debating online, everyone else is ALWAYS wrong if they do not agree with you!

    Random Tip o' the Whatever
    You just can't win. If your product offers feature A instead of B, people will moan how A is stupid and it didn't offer B. If your product offers B instead of A, they'll likewise complain and rant about how anyone's retarded cousin could figure out A is what the market wants.

  18. #43
    Xtreme Cruncher
    Join Date
    Jan 2007
    Location
    VA, USA
    Posts
    932
    Quote Originally Posted by Vardant View Post
    I think the whole thing started here - http://www.forum-3dcenter.org/vbulle...494193&page=11

    Too bad, it looks, that English sites don't care about IQ anymore, even the last problem wasn't covered at all by them.
    You have to understand that yes, image quality is being sacrificed for speed. However, the settings are still there to improve the image quality.

    There was an improvement, though still not as good as it could have been.
    Honestly, we are dealing with very slight changes in image quality; unless you are actively looking for them, I doubt the majority of people would be able to notice them.
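    If anyone wants to put an actual number on "slight" instead of eyeballing screenshots, a per-pixel diff of two captures does it. A rough sketch in Python using Pillow and NumPy (the file names are placeholders for two screenshots taken at the same spot, one at driver-default AF and one at High Quality):

        # Quantify the difference between two same-sized screenshots.
        import numpy as np
        from PIL import Image

        a = np.asarray(Image.open("af_default.png").convert("RGB"), dtype=np.float64)
        b = np.asarray(Image.open("af_high_quality.png").convert("RGB"), dtype=np.float64)

        diff = np.abs(a - b)  # per-pixel, per-channel absolute difference
        print(f"mean abs diff: {diff.mean():.3f} / 255")
        print(f"max abs diff:  {diff.max():.0f} / 255")

        # Fraction of pixels that differ by more than an (arbitrary) threshold
        changed = (diff.max(axis=2) > 8).mean()
        print(f"pixels differing by >8 levels: {changed:.1%}")

    A mean difference of a fraction of a level out of 255 would back up the "you won't notice unless you're looking" point; a few percent of pixels over the threshold, clustered on distant textures, is roughly what the AF comparisons are picking up.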



  19. #44
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    It's pretty simple: either things look like crap in a specific newer game and you should submit a bug report to the company (either one), or it's so insignificant that you shouldn't notice it.
    2500k @ 4900mhz - Asus Maxiums IV Gene Z - Swiftech Apogee LP
    GTX 680 @ +170 (1267mhz) / +300 (3305mhz) - EK 680 FC EN/Acteal
    Swiftech MCR320 Drive @ 1300rpms - 3x GT 1850s @ 1150rpms
    XS Build Log for: My Latest Custom Case

  20. #45
    Xtreme Member
    Join Date
    Dec 2009
    Posts
    435
    Quote Originally Posted by Mad Pistol View Post
    It's called optimization. The point is that the high quality setting is better than the previous generation. That's all that matters. That's all any of us is going to use anyway.
    Whatever. If Nvidia "optimized" their default settings to have worse image quality and higher performance, people would be screaming bloody murder.
    i7 920 D0 / Asus Rampage II Gene / PNY GTX480 / 3x 2GB Mushkin Redline DDR3 1600 / WD RE3 1TB / Corsair HX650 / Windows 7 64-bit

  21. #46
    Banned
    Join Date
    Jan 2010
    Posts
    30
    OP, I find your thread title very misleading.

  22. #47
    Banned
    Join Date
    Jan 2010
    Posts
    30
    Quote Originally Posted by ElSel10 View Post
    Whatever. If Nvidia "optimized" their default settings to have worse image quality and higher performance, people would be screaming bloody murder.
    lol @ the drama

    1. Nvidia's default settings ARE optimized; they already represent a trade-off between speed and image quality. That's why you can tweak them, and the same is true for ATI.

    2. This is some site that tested some old, obscure games, and it is their OPINION that the default Nvidia settings are like ATI's high quality setting.

    Image quality is highly subjective. The last time I noticed anything was Crysis looking better on my X1900 XTX vs. my 8800GT. Speed went up going to the 8800GT, but image quality went way down. In switching from the 5870 to the GTX 470 to the GTX 460, I haven't noticed anything in terms of image quality, but I don't play Crysis anymore either.

  23. #48
    Xtreme Member
    Join Date
    Oct 2010
    Location
    192.168.1.1
    Posts
    221
    Quote Originally Posted by ElSel10 View Post
    Whatever. If Nvidia "optimized" their default settings to have worse image quality and higher performance, people would be screaming bloody murder.
    That's true, unfortunately. When there are no clearly defined rules for "playing fair" or "cheating", as is the case with image quality, the whole thing becomes a playground for fanboys.

    It is certain that AMD lowered the default AF quality on the 6800 series cards, and that this added about a 5% performance boost. I can't even imagine the responses if Nvidia had done that.

  24. #49
    Xtreme Addict
    Join Date
    Mar 2005
    Location
    Rotterdam
    Posts
    1,553
    Quote Originally Posted by hurrdurr View Post
    That's true, unfortunately. When there are no clearly defined rules for "playing fair" or "cheating", as is the case with image quality, the whole thing becomes a playground for fanboys.

    It is certain that AMD lowered the default AF quality on the 6800 series cards, and that this added about a 5% performance boost. I can't even imagine the responses if Nvidia had done that.
    I thought all the reviews used custom image quality settings? So how do the driver's default settings have any relevance to final performance?
    Gigabyte Z77X-UD5H
    G-Skill Ripjaws X 16Gb - 2133Mhz
    Thermalright Ultra-120 eXtreme
    i7 2600k @ 4.4Ghz
    Sapphire 7970 OC 1.2Ghz
    Mushkin Chronos Deluxe 128Gb

  25. #50
    Xtremely Retired OC'er
    Join Date
    Dec 2006
    Posts
    1,084
    I think those peeps didn't play Quake 1, Doom TNT, Hexen...
