Page 5 of 13
Results 101 to 125 of 317

Thread: ati cheating in crysis benchmark?

  1. #101
    Xtreme Addict
    Join Date
    Oct 2006
    Location
    new jersey
    Posts
    1,100
    Quote Originally Posted by Cybercat View Post
    Convenient? How is it convenient? A driver glitch that potentially lowers IQ without any evidence of enhancing performance? Sounds more like a PR disaster.

    And since when was timing irrefutable proof that something nefarious is going on?

    How about we look at this logically and objectively, for a change?
    Though you do have to question the launch date of the drivers, or you'd be burying your head in the sand.
    I'm sure it was a "make sure you use these drivers" type of thing for reviewers, but that's just me with my tinfoil hat on.
    It's more likely they pushed towards performance and have topped out the card driver-wise so far; what else could they do?
    _________________

  2. #102
    Xtreme Addict
    Join Date
    Jun 2002
    Location
    Ontario, Canada
    Posts
    1,782
    I'm getting old. My 40 year plus eyes can't tell the difference in the pictures on the first page of this thread.

    Good thing I don't stop to look around at scenery in my games. LOL....
    As quoted by LowRun......"So, we are one week past AMD's worst case scenario for BD's availability but they don't feel like communicating about the delay, I suppose AMD must be removed from the reliable sources list for AMD's products launch dates"

  3. #103
    Xtreme Addict
    Join Date
    Jun 2007
    Posts
    2,064
    Quote Originally Posted by freeloader View Post
    I'm getting old. My 40 year plus eyes can't tell the difference in the pictures on the first page of this thread.

    Good thing I don't stop to look around at scenery in my games. LOL....
    same here.

    To me... it's 2 different models from 2 different manufacturers, with 2 different specs... weren't they supposed to produce at least 2 different results?

  4. #104
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Quote Originally Posted by Starscream View Post
    I don't really understand your post. I said it's convenient that this bug appears right now; I never said that that is proof of ATI doing it on purpose.

    Further, as I said in my post, we still don't know how much extra performance this bug gives. Seeing that this bug forces less processing, it would only be logical to assume that it increases performance.
    And I'm saying that any convenience this may provide for ATI is negated with the problems it might cause, particularly in regard to public relations.

    And assuming it gives a performance boost is just that, an assumption. Crysis is probably not particularly texture performance bound.
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  5. #105
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by SKYMTL View Post
    The responses in this thread crack me up.
    Yep, hilarious

    Also, the substitution of FP16 render targets by ATI has been confirmed in quite a few games. DoW 2, NFS: Shift and a few others are the most obvious candidates.
    You're referring to substituting FP10 where the app requested FP16? I thought that little rumour was still under wraps.
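For readers wondering why an FP16-to-FP10 swap would even be tempting, here is a back-of-the-envelope sketch (my own illustration, not vendor data): an RGBA16F render target costs 64 bits per pixel, while a packed 10:10:10:2 target costs 32, halving the bandwidth and memory footprint of every full-screen pass.

```python
# Illustrative numbers only: compare per-target memory cost of an FP16
# (RGBA16F, 64 bpp) versus a packed FP10 (10:10:10:2, 32 bpp) render target.
def target_mib(width, height, bits_per_pixel):
    """Size of one render target in MiB."""
    return width * height * bits_per_pixel / 8 / 2**20

w, h = 1920, 1200
fp16 = target_mib(w, h, 64)  # RGBA16F: four 16-bit float channels
fp10 = target_mib(w, h, 32)  # 10:10:10:2 packed into 32 bits
print(f"FP16: {fp16:.2f} MiB, FP10: {fp10:.2f} MiB, saved: {fp16 - fp10:.2f} MiB")
```

Whether that actually translates into measurable FPS depends on how bandwidth-bound the game is, which is exactly the point of contention in this thread.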

  6. #106
    :)
    Join Date
    Feb 2010
    Location
    San Francisco
    Posts
    116
    Coldon's posts make the most sense to me, but regardless... if ATI did this, then shame, shame .

    I don't see why people want to blame someone here. It seems some ATI "fanboys" are blaming Nvidia for previous mishaps and some Nvidia "fanboys" (and perhaps they are right) are complaining that if this was Nvidia everyone would've gone crazy about it.


    Quote Originally Posted by Eastcoasthandle View Post
    I did the name change and found no difference, something asked for a few times in this thread.
    So until someone provides some real evidence I'm going to call this BS and will trust Eastcoasthandle's response.
    Main Rig
    Gigabyte GA-Z68XP-UD4 | i5 2500K @ 4.5GHz | 8GB G.Skill 2133MHz 9-11-10-28| ASUS DCII GTX 580 | Samsung 256GB 830 SSD | 2x 1TB Samsung Spinpoint F3, Hitachi 1TB | Noctua NH-D14 | Seasonic X750

  7. #107
    Xtreme Member
    Join Date
    Dec 2009
    Posts
    435
    Quote Originally Posted by insurgent View Post
    It's ATI, they get a free pass
    The sad part is, that is basically true.
    i7 920 D0 / Asus Rampage II Gene / PNY GTX480 / 3x 2GB Mushkin Redline DDR3 1600 / WD RE3 1TB / Corsair HX650 / Windows 7 64-bit

  8. #108
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    This is bad if true. Why? nVidia consequently gets a free pass on driver optimizations, and considering what they generated with the 480 (480 only) in terms of TDP/power, they can be very desperate.
    If ATI is playing with render targets and texture formats, what would nVidia try? Tinkering with water reflection rate again?

    But again, moral high ground over the last 2.5 generations didn't sell ATI cards. Image quality seems decent, nothing like the really big shoddy hacks of the GF FX/Quack.exe days. That's purely from another perspective.
    Quote Originally Posted by radaja View Post
    so are they launching BD soon or a comic book?

  9. #109
    Xtreme Addict
    Join Date
    May 2007
    Location
    Europe/Slovenia/Ljubljana
    Posts
    1,540
    I don't see how that "problem" with water in the link above is cheating. It looks horrible and you can't miss it. All of these are just bugs. And if we jump into the renaming war: both AMD and NVIDIA do per-application profiles, and in this case they just broke something related to Crysis. Renaming it to "iamnotcheating.exe" solves it because that name no longer links to the broken profile.

    Though I'd say NVIDIA actually cheating would be more probable; after all, it's not AMD that's late and desperate at the moment, it's NVIDIA. And as for the current gen of cards, NVIDIA's problems are high power draw and low availability, not performance. AMD, on the other hand, has lower prices, lower power usage and high availability. So a frame or two less, who really cares?
    Intel Core i7 920 4 GHz | 18 GB DDR3 1600 MHz | ASUS Rampage II Gene | GIGABYTE HD7950 3GB WindForce 3X | WD Caviar Black 2TB | Creative Sound Blaster Z | Altec Lansing MX5021 | Corsair HX750 | Lian Li PC-V354
    Super silent cooling powered by (((Noiseblocker)))
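The exe-renaming test mentioned above probes a simple mechanism: drivers key per-application profiles off the executable's file name. A hypothetical sketch (profile names and settings invented for illustration, not actual driver internals):

```python
# Hypothetical name-based per-application profile lookup; entries and
# settings are invented to illustrate the renaming test, nothing more.
PROFILES = {
    "crysis.exe":   {"water_detail": "reduced", "af": "optimized"},
    "crysis64.exe": {"water_detail": "reduced", "af": "optimized"},
}
DEFAULTS = {"water_detail": "full", "af": "application-controlled"}

def profile_for(exe_name):
    # Matching is by executable file name only, so a renamed binary
    # (e.g. "iamnotcheating.exe") misses the table and falls back to
    # the defaults -- which is why a rename makes the behavior vanish.
    return PROFILES.get(exe_name.lower(), DEFAULTS)

print(profile_for("Crysis.exe"))          # profile applies
print(profile_for("iamnotcheating.exe"))  # falls back to defaults
```

This is also why the rename test cuts both ways: it can't distinguish a deliberate benchmark optimization from an ordinary bug in that game's profile.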

  10. #110
    Banned
    Join Date
    Mar 2010
    Posts
    88
    Quote Originally Posted by Macadamia View Post
    This is bad if true. Why? nVidia consequently gets a free pass on driver optimizations, and considering what they generated with the 480 (480 only) in terms of TDP/power, they can be very desperate.
    If ATI is playing with render targets and texture formats, what would nVidia try? Tinkering with water reflection rate again?

    But again, moral high ground over the last 2.5 generations didn't sell ATI cards. Image quality seems decent, nothing like the really big shoddy hacks of the GF FX/Quack.exe days. That's purely from another perspective.
    That wasn't tinkering. It was a driver bug where an SLI path in the driver was triggering in single GPU mode. It was immediately fixed.

  11. #111
    Banned
    Join Date
    Mar 2010
    Posts
    88
    Quote Originally Posted by Zayras View Post
    Coldon's posts make the most sense to me, but regardless... if ATI did this, then shame, shame .

    I don't see why people want to blame someone here. It seems some ATI "fanboys" are blaming Nvidia for previous mishaps and some Nvidia "fanboys" (and perhaps they are right) are complaining that if this was Nvidia everyone would've gone crazy about it.



    So until someone provides some real evidence I'm going to call this BS and will trust Eastcoasthandle's response.
    PCGH is not claiming it's an issue with Cat AI, or that it's exe-specific; they simply say it shows up when you diff an image from a GeForce at 16x AF against one from an ATI at 16x AF. The banding happens on the ATI, despite its supposedly slightly better AF (as seen in the directed AF tests, the one with the circles).

    The question is: why is there this difference between the directed AF tests and the games?
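The kind of diff PCGH does can be sketched in a few lines: subtract two screenshots pixel by pixel and look for non-zero regions. A toy version on invented grayscale values (real comparisons need captures from the exact same camera position):

```python
# Minimal sketch of a per-pixel screenshot diff, using toy grayscale
# "screenshots" as nested lists; sample values are invented.
def image_diff(a, b):
    """Absolute per-pixel difference of two equally sized grayscale images."""
    return [[abs(pa - pb) for pa, pb in zip(ra, rb)]
            for ra, rb in zip(a, b)]

geforce = [[200, 180], [160, 140]]
radeon  = [[200, 175], [130, 140]]
print(image_diff(geforce, radeon))  # non-zero entries mark filtering differences
```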

  12. #112
    Xtreme Addict
    Join Date
    Oct 2006
    Location
    new jersey
    Posts
    1,100
    Quote Originally Posted by RejZoR View Post
    I don't see how that "problem" with water in the link above is cheating. It looks horrible and you can't miss it. All of these are just bugs. And if we jump into the renaming war: both AMD and NVIDIA do per-application profiles, and in this case they just broke something related to Crysis. Renaming it to "iamnotcheating.exe" solves it because that name no longer links to the broken profile. Though I'd say NVIDIA actually cheating would be more probable; after all, it's not AMD that's late and desperate at the moment, it's NVIDIA. And as for the current gen of cards, NVIDIA's problems are high power draw and low availability, not performance. AMD, on the other hand, has lower prices, lower power usage and high availability. So a frame or two less, who really cares?
    Oh come now, it's nVidia's fault and they're the ones that are cheating? OMFG, I've heard it all now.
    _________________

  13. #113
    Xtreme Enthusiast
    Join Date
    Jul 2007
    Location
    Phoenix, AZ
    Posts
    866
    LOL at you guys not believing it because it's ATI. How about looking up the history of both companies? Both have been caught cheating on many occasions, in all sorts of ways, definitely including ATI; in fact, they started a big cheat war way back.
    This post above was delayed 90 times by Nvidia. Cause that's their thing, thats what they do.
    This Announcement of the delayed post above has been brought to you by Nvidia Inc.

    RIGGY
    case:Antec 1200
    MB: XFX Nforce 750I SLI 72D9
    CPU:E8400 (1651/4x9) 3712.48
    MEM:4gb Gskill DDR21000 (5-5-5-15)
    GPU: NVIDIA GTX260 EVGA SSC (X2 in SLI) both 652/1403
    PS:Corsair 650TX
    OS: Windows 7 64-bit Ultimate
    --Cooling--
    5x120mm 1x200mm
    Zalman 9700LED
    Displays: Samsung LN32B650/Samsung 2243BWX/samsung P2350


  14. #114
    Xtreme Addict
    Join Date
    Mar 2007
    Posts
    1,377
    I don't have a brand preference, but I do notice a ton of ATI lovin in this xtreme oven. I think most fanboyism stems from the fact that a lot of people tend to love what they just bought or want, and ATI cards were standing on top of the mountain for the past six months.
    Last edited by mrcape; 04-02-2010 at 10:56 PM.

  15. #115
    Xtreme Addict
    Join Date
    Jul 2007
    Posts
    1,488
    Quote Originally Posted by Decami View Post
    LOL at you guys not believing it because it's ATI. How about looking up the history of both companies? Both have been caught cheating on many occasions, in all sorts of ways, definitely including ATI; in fact, they started a big cheat war way back.
    It's great when the shoe is on the other foot, isn't it?

    Quote Originally Posted by Decami View Post
    People are always trying to blame these two companies for cheating, when it may just have been a minor side effect of a driver. I remember not too long ago someone blaming Nvidia for cheating because their driver release had "ghost" images inserted into video content to enhance quality, when this is actually a natural use of picture enhancement.
    Quote Originally Posted by Decami View Post
    Both companies have done everything in their power to get ahead. So when it comes to this kind of stuff, I couldn't care less. They both cheated; they don't really need to anymore, both companies are pretty solid atm. Anything nowadays is pretty much an unexpected side effect, or something looked at the wrong way. ATI has some catching up to do, for sure, though.

    If Nvidia says it's a bug, I would believe them right now. They obviously have a good lead over ATI atm. Why would they purposely cheat? For fun? These guys are people like us; they make mistakes. Sometimes even big things get overlooked. That's what makes these people human, and that's also what betas are for...
    Quote Originally Posted by Decami View Post
    Quote Originally Posted by DilTech View Post
    It's a bug plain and simple...

    These kind of things commonly occur with both companies, it's just a matter of if a site takes the time to complain about it before it's fixed or not.

    As for "cheats", both companies were VERY guilty of it during the FX/6800 days. For the most part those days are gone, and now all that generally remains is driver bugs.

    As for image quality by both companies, on a technical level NVidia's is better, that much is a fact. Which you prefer is a matter of personal opinion, but going by paper facts nvidia has better IQ this gen.
    QFT LOL, something I was trying to say a page ago LOL. Can't stop the fanboys!
    Quote Originally Posted by Decami View Post
    Which is why I usually stick with Nvidia. See, us Nvidia buyers are pretty nitpicky. That way we keep Nvidia on their toes, which in turn forces them to work harder and release better. Maybe the ATI guys should go on strike and hold signs in front of an AMD building they haven't sold yet! LOL jkjk

    But anyway, I do agree with you, too many people are too quick to call cheats on things!
    I agree with you, the old you, that it is most likely a bug... in both cases.

  16. #116
    Xtreme Addict
    Join Date
    May 2007
    Location
    Europe/Slovenia/Ljubljana
    Posts
    1,540
    Quote Originally Posted by cowie View Post
    Oh come now, it's nVidia's fault and they're the ones that are cheating? OMFG, I've heard it all now.
    If you'd ACTUALLY READ it, you'd see I never said that... but then, people never read stuff properly...
    Intel Core i7 920 4 GHz | 18 GB DDR3 1600 MHz | ASUS Rampage II Gene | GIGABYTE HD7950 3GB WindForce 3X | WD Caviar Black 2TB | Creative Sound Blaster Z | Altec Lansing MX5021 | Corsair HX750 | Lian Li PC-V354
    Super silent cooling powered by (((Noiseblocker)))

  17. #117
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,970
    Quote Originally Posted by SKYMTL View Post
    The responses in this thread crack me up. If the roles were reversed, I am sure the responses would be FAR different. Just shows how long so many here have been smoking the good stuff.

    Also, the substitution of FP16 render targets by ATI has been confirmed in quite a few games. DoW 2, NFS: Shift and a few others are the most obvious candidates.
    Well said... I too love how because it's ATI no one cares, but if nVidia did this the SAME people would be grilling them and flaming nV to cinders.


    Quote Originally Posted by Luka_Aveiro View Post
    I think you may need eye glasses.
    Actually, I noticed poor AF with visible lines between texture filtering areas in some games I played as well... I chalked it up to just being an ATI/nV difference when I had my 5870; now I guess we know they were cheating to gain some extra performance, eh?

    Quote Originally Posted by mrcape View Post
    I don't have a brand preference, but I do notice a ton of ATI lovin in this xtreme oven. I think most fanboyism stems from the fact that a lot of people tend to love what they just bought or want, and ATI cards were standing on top of the mountain for the past six months.
    Definitely... normally people would clamor over the gains a GTX 480 brings over the 5870, but because the 5870 is the "cool kid" in town, everyone rushes to slam the other brand's solutions.
    Last edited by GoldenTiger; 04-02-2010 at 11:11 PM.

  18. #118
    Registered User
    Join Date
    Feb 2010
    Location
    NVIDIA HQ
    Posts
    76
    I was informed that ATI drivers were cutting out some IQ in a "minor way" to improve performance. I wasn't planning on sharing that tidbit here because I lack an ATI card to actually prove it to anyone with. As far as I know, it affects more than just Crysis. My understanding is that it defaults to some lower IQ settings that affect commonly benchmarked games.

    Anyone want to loan me a 5870? :P


    Amorphous
    NVIDIA Forums Administrator

  19. #119
    Xtreme Member
    Join Date
    Mar 2007
    Location
    Pilipinas
    Posts
    445
    Quote Originally Posted by Solus Corvus View Post
    I agree with you, the old you, that it is most likely a bug...in both cases.
    Hahaha! Hilarious! "Can't stop the fanboys!"

  20. #120
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    To all the people claiming you don't see it in the screenshots: use the mouseover script on PCGH... it's much better to compare it that way, and you'll see how it goes from sharp to blurry and from aniso to bilinear.

    And this is a still... if you move, the textures will flicker a lot...

    And if Crysis is no big deal and nobody plays it, then why did ATI cheat and risk ruining their PR karma?

  21. #121
    Xtreme Member
    Join Date
    Jul 2006
    Posts
    403
    This may also be why BFBC2 still flickers and flashes on ATI cards; even with the new drivers, BFBC2 still has graphical issues on the 5-series cards...

  22. #122
    Xtreme Addict
    Join Date
    Apr 2007
    Location
    canada
    Posts
    1,886
    this whole thread is one giant *ROFL*

  23. #123
    Xtreme Addict
    Join Date
    Nov 2007
    Posts
    1,195
    This thread is a giant fail IMHO; nobody tested with other drivers to see if it's really an optimization or not.
    Quote Originally Posted by LesGrossman View Post
    So for the last 3 months Nvidia talked about Uniengine and then Uniengine and more Uniengine and finally Uniengine. And then takes the best 5 seconds from all the benchmark run, makes a graph and then proudly shows it everywhere.

  24. #124
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by Svnth View Post
    The question is why is there this difference between the directed AF tests and the games?
    Because the directed test checks for angle independence, not filtering quality. Everyone looks at that little circle test and thinks everything is hunky-dory.
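The distinction above can be made concrete with a toy model (my own sketch, not any vendor's actual filtering): the circle tester varies surface angle at a fixed distance, while a "brilinear" shortcut varies with distance (mip LOD), not angle, so it sails through the tester but bands in games.

```python
import math

# Toy model of trilinear vs "brilinear" mip blending; all formulas here
# are illustrative, not real driver behavior.
def mip_lod(distance):
    """Toy LOD: mip level grows with log2 of view distance."""
    return max(0.0, math.log2(distance))

def trilinear_blend(lod):
    """Full trilinear: smooth blend fraction toward the next mip."""
    return lod - math.floor(lod)

def brilinear_blend(lod, window=0.25):
    """Shortcut: blend only near the mip transition; elsewhere snap to
    a single mip, which shows up in games as bands on the ground."""
    frac = lod - math.floor(lod)
    lo, hi = 0.5 - window / 2, 0.5 + window / 2
    if frac <= lo:
        return 0.0
    if frac >= hi:
        return 1.0
    return (frac - lo) / (hi - lo)

for d in (2.0, 2.5, 3.0, 3.5):
    lod = mip_lod(d)
    print(f"distance {d}: trilinear {trilinear_blend(lod):.2f}, "
          f"brilinear {brilinear_blend(lod):.2f}")
```

Notice the brilinear column sits pinned at 0.00 or 1.00 over most distances: those flat stretches are the bands PCGH's mouseover comparisons show.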

  25. #125
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
    Quote Originally Posted by eric66 View Post
    This thread is a giant fail IMHO; nobody tested with other drivers to see if it's really an optimization or not.
    Why does this have to have appeared with the "latest" drivers to be an issue? Maybe it was this way from the beginning. And I think it DOES bring a performance improvement: looking at the screenshots, one looks like bilinear filtering beyond a certain point, the other is AF all the way.

    Quote Originally Posted by Sn0wm@n
    this whole thread is one giant *ROFL*
    Why so? Because ATI can't do wrong?
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"

