
Thread: ati cheating in crysis benchmark?

  1. #201
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    yeah, quite a bit off the line..

    Stop all the ati/nv fanboy crap guys, the ban gun is being loaded.

    All along the watchtower the watchmen watch the eternal return.

  2. #202
    Registered User
    Join Date
    May 2009
    Location
    Amsterdam
    Posts
    45
    Quote Originally Posted by BenchZowner View Post
    When I have sex I don't look at her face... so I don't mind if her face looks like it has been run over by a truck

    [ sorry if this joke is kinda off the line folks ]
    Not a very good comparison and not a very tasteful joke if you ask me...

    I don't really understand why the word "cheating" is being used before enough testing has been done.
    Changing the exe name doesn't change anything, and it's unclear whether this bug even improves performance.

    At this point it is a bit early to compare this to the things Nvidia did in the past, in my opinion.

  3. #203
    Xtremely High Voltage Sparky's Avatar
    Join Date
    Mar 2006
    Location
    Ohio, USA
    Posts
    16,040
    Quote Originally Posted by STEvil View Post
    yeah, quite a bit off the line..

    Stop all the ati/nv fanboy crap guys, the ban gun is being loaded.
    Ban gun? Crap.

    Guys, the mods have upgraded to ranged weapons!
    The Cardboard Master
    Crunch with us, the XS WCG team
    Intel Core i7 2600k @ 4.5GHz, 16GB DDR3-1600, Radeon 7950 @ 1000/1250, Win 10 Pro x64

  4. #204
    Xtreme Member
    Join Date
    Sep 2009
    Location
    Ontario, Canada
    Posts
    231
    Wait, do we have any real confirmation that this IS increasing FPS? Did anyone do a follow-up? Screenshots are nice, but I see no FPS numbers... or old vs. new driver comparisons.

  5. #205
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
    Quote Originally Posted by Sparky View Post
    Ban gun? Crap.

    Guys, the mods have upgraded to ranged weapons!
    MM already had a range



    Two things on this: 1) it's a glitch, not a cheat, since it didn't make anything go faster. Why would they bother? I don't think anyone makes a buying decision on Crysis or even benches it anymore; if they were going to cheat they would have done it somewhere better (still bad that this is messed up in the game). And 2) who cares about Crysis? It's a poorly coded three-year-old game.

    I do realize that I'm more of an ATI fan, as I dislike NV, but I have never found Crysis to be relevant to which card maker is better.
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  6. #206
    Xtreme Member
    Join Date
    Apr 2010
    Location
    Portugal
    Posts
    107
    10.3 gave me a performance boost and I don't notice any IQ lowering; that's why I decided to take those screenshots. I can take more if that helps sort this mess.
    Don't take life too seriously.....no-one's getting out alive.

  7. #207
    Xtreme Addict
    Join Date
    Jun 2007
    Location
    above USA...below USSR
    Posts
    1,186
    IMO I don't think it's cheating; ATI have managed to get the drivers to mix bilinear and trilinear texture filtering on the fly for improved performance. In some console games the video might pop in and out of v-sync or AA for better performance, and it's not noticeable unless you really look for it.
    Case-Coolermaster Cosmos S
    MoBo- ASUS Crosshair IV
    Graphics Card-XFX R9 280X [out for RMA] using HD5870
    Hard Drive-Kingston 240Gig V300 master Seagate 160Gb slave Seagate 250Gb slave Seagate 500Gb slave Western Digital 500Gb
    CPU-AMD FX-8320 5Ghz
    RAM 8Gig Corshair c8
    Logitech 5.1 Z5500 BOOST22
    300Gb of MUSICA!!


    Steam ID: alphamonkeywoman
    http://www.techpowerup.com/gpuz/933ab/

  8. #208
    Xtreme Enthusiast
    Join Date
    Jul 2007
    Location
    Phoenix, AZ
    Posts
    866
    Quote Originally Posted by Solus Corvus View Post
    It's great when the shoe is on the other foot isn't it?

    I agree with you, the old you, that it is most likely a bug...in both cases.
    Yes, but all comments are true, even the most recent one.
    This post above was delayed 90 times by Nvidia. Because that's their thing, that's what they do.
    This announcement of the delayed post above has been brought to you by Nvidia Inc.

    RIGGY
    case:Antec 1200
    MB: XFX Nforce 750I SLI 72D9
    CPU:E8400 (1651/4x9) 3712.48
    MEM:4gb Gskill DDR21000 (5-5-5-15)
    GPU: NVIDIA GTX260 EVGA SSC (X2 in SLI) both 652/1403
    PS:Corsair 650TX
    OS: Windows 7 64-bit Ultimate
    --Cooling--
    5x120mm 1x200mm
    Zalman 9700LED
    Displays: Samsung LN32B650/Samsung 2243BWX/samsung P2350


  9. #209
    Xtreme Member
    Join Date
    Jul 2006
    Posts
    403
    Bi/trilinear filtering has a massive performance advantage over anisotropic; if ATI is dropping back to bi/trilinear, then there will be a performance increase. I'm still very curious about all this talk of angle-independent anisotropic filtering, since the view angle is what determines the line of anisotropy and therefore the sampling region; bi/trilinear is angle-independent because its sampling region is always uniform.

    Everywhere you look you see that one screenshot of the round rings, and everyone uses it as proof of how good the ATI filtering implementation is. Unfortunately the major problem with texture filtering, which a lot of the posters above me don't get, is that the artifacts occur during motion.

    Ever notice how a texture all of a sudden "pops" from low-res to a sharper version? That's bilinear filtering. Trilinear smooths this out by interpolating between the mip levels so the transition is gradual, and anisotropic is a special case of trilinear, created to reduce the aliasing of oblique textured surfaces.
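    A minimal sketch of that mip mechanic in Python (this is the textbook LOD formula, not any vendor's actual hardware path, and the footprint numbers are made up for illustration): bilinear with nearest-mip snaps to a single level, which is the visible "pop", while trilinear fades between the two adjacent levels.

    Code:
import math

def mip_lambda(texels_per_pixel, lod_bias=0.0):
    # Textbook LOD: lambda = log2(texel footprint per pixel) + bias, clamped at mip 0
    return max(0.0, math.log2(texels_per_pixel) + lod_bias)

def bilinear_mip(lam):
    # Bilinear + nearest mip: snap to a single level -> the visible "pop" at boundaries
    return round(lam)

def trilinear_mix(lam):
    # Trilinear: blend the two adjacent mip levels by the fractional part of lambda
    lo = math.floor(lam)
    return lo, lo + 1, lam - lo  # the fractional weight goes to the blurrier level

# Walking away from a textured wall, the texel footprint per pixel grows smoothly:
# bilinear jumps a whole mip level in one frame, trilinear fades between levels.
for footprint in (1.0, 1.5, 2.0, 3.0, 4.0):
    lam = mip_lambda(footprint)
    print(f"footprint {footprint}: bilinear mip {bilinear_mip(lam)}, "
          f"trilinear mix {trilinear_mix(lam)}")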

    Perhaps ATI figured that no one would notice their "feature" and enabled it to gain some extra performance, but just like everyone else in this thread I want some proof:

    * compare the 10.3s to the 9.12s and see if this "feature" is present
    * compare filtering across games and APIs, might be a DX10 only "feature"

    Someone should also post a performance comparison between bilinear / trilinear / 8x anisotropic and 16x anisotropic in this thread, just to quiet down the guys claiming it's not a big deal.

    I know for a fact that the graphics programmer has absolutely NO control over how filtering is performed; all he can do is enable/disable it. The actual implementation of the algorithm is vendor-specific and lives in the driver.
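    On the cost side of the bilinear / trilinear / AF comparison asked for above, a back-of-the-envelope sketch (worst-case sample counts only; texture caches and hardware optimizations shrink the real-world gap a lot) of why dropping from 16xAF toward trilinear or bilinear frees up so much texture bandwidth:

    Code:
# Worst-case texel fetches per sampled pixel -- rough upper bounds only, since
# texture caching makes real costs much closer together than these raw counts.
filters = {
    "bilinear":        4,       # 4 texels from a single mip level
    "trilinear":       2 * 4,   # 4 texels from each of two adjacent mip levels
    "8x anisotropic":  8 * 8,   # up to 8 trilinear probes along the line of anisotropy
    "16x anisotropic": 16 * 8,  # up to 16 probes, only reached on very oblique surfaces
}
for name, fetches in filters.items():
    print(f"{name:>15}: up to {fetches:3d} texel fetches")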

  10. #210
    Xtreme Addict
    Join Date
    May 2007
    Location
    Europe/Slovenia/Ljubljana
    Posts
    1,540
    I think it was on Anandtech where they were talking about the Catalysts applying a negative LOD bias automatically, I think in Cat 9.11 and later. That sharpens the textures but sometimes causes shimmering. It's not a performance optimization, because a negative LOD bias actually creates a small performance hit, but it can look like the kind of optimization that also causes shimmering on textures even with 16xAF.

    And just for info, as far as I know AF doesn't work on parallax-mapped surfaces (at least in Crysis), so...
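    For anyone wondering what a negative LOD bias actually does, a minimal sketch (same textbook LOD formula as above; the bias values are just examples, not what any Catalyst release actually applies): the bias shifts mip selection toward the sharper, higher-resolution levels, which is exactly where under-sampling and shimmering in motion come from.

    Code:
import math

def mip_level(texels_per_pixel, lod_bias=0.0):
    # Textbook LOD: lambda = log2(texel footprint per pixel) + bias, clamped at mip 0
    return max(0.0, math.log2(texels_per_pixel) + lod_bias)

footprint = 3.0  # example footprint for a moderately distant surface
print("no bias   ->", round(mip_level(footprint), 2))        # ~1.58
print("bias -0.5 ->", round(mip_level(footprint, -0.5), 2))  # ~1.08, a sharper mip
print("bias -1.0 ->", round(mip_level(footprint, -1.0), 2))  # ~0.58, sharper still
# Sampling a more detailed mip under the same pixel footprint under-filters the
# texture: crisper screenshots, but aliasing/shimmering as soon as things move.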
    Intel Core i7 920 4 GHz | 18 GB DDR3 1600 MHz | ASUS Rampage II Gene | GIGABYTE HD7950 3GB WindForce 3X | WD Caviar Black 2TB | Creative Sound Blaster Z | Altec Lansing MX5021 | Corsair HX750 | Lian Li PC-V354
    Super silent cooling powered by (((Noiseblocker)))

  11. #211
    Xtreme Enthusiast
    Join Date
    Jun 2008
    Posts
    619
    pfft. NVidia cheats by having an automatic built-in advantage in Crysis in the game's coding.
    ASRock 990FX Extreme4
    AMD FX 8350
    Kingston 16GB (4GBx4) DDR3 1333
    Gigabyte NVidia GTX 680 2GB
    Silverstone 1000W PSU

  12. #212
    Xtreme Member
    Join Date
    Mar 2007
    Location
    Pilipinas
    Posts
    445
    I'm waiting for a site like AnandTech or Tech Report to publish an article. I mean, if it's really a big deal, they will investigate this too.

  13. #213
    Xtreme Member
    Join Date
    Dec 2009
    Posts
    435
    Quote Originally Posted by Razrback16 View Post
    pfft. NVidia cheats by having an automatic built-in advantage in Crysis in the game's coding.
    Because we all know it just can't be due to a faster GPU or better architecture. I mean, what would that be suggesting? Nvidia GPUs are faster than ATI GPUs??
    i7 920 D0 / Asus Rampage II Gene / PNY GTX480 / 3x 2GB Mushkin Redline DDR3 1600 / WD RE3 1TB / Corsair HX650 / Windows 7 64-bit

  14. #214
    Xtreme Member
    Join Date
    Apr 2010
    Location
    Portugal
    Posts
    107
    Quote Originally Posted by ElSel10 View Post
    Because we all know it just can't be due to a faster GPU or better architecture. I mean, what would that be suggesting? Nvidia GPUs are faster than ATI GPUs??
    It's been known for a good while that Crysis is optimized for Nvidia GPUs; same thing with Far Cry 2.
    Don't take life too seriously.....no-one's getting out alive.

  15. #215
    Xtreme Addict
    Join Date
    Jan 2004
    Posts
    1,313
    The deaf man following the blind man?

    100 posts and 0 facts.

    But from my experience benchmarking 30+ cards:

    1. There is NO standard. Nada. None. WHQL ensures your Start menu popup is rendered correctly, not Crysis. Sharper = better!? Are you assuming the game developer intended maximum sharpness on everything?

    2. 10 years since Q3. Hundreds of games. Actual performance cheats (i.e. 3DMark) confirmed < 5. Driver rendering bugs people notice < 100. Actual game rendering bugs > 99999.

    3. You can't just do a per-pixel comparison between nVidia, Intel and AMD. Which one are you assuming is the golden reference? BIAS, hmm? They all use different AF algorithms. Likewise, pixel differences in a new driver don't mean cheating either; what if it was a fix for a graphics glitch you didn't notice? (A naive per-pixel diff only tells you that pixels changed, not why; see the sketch after this list.)

    4. For the most part, the 3DMark AF test and other artificial AF tests are meaningless. They are detected (and not just by exe name), and the behavior even with the same game engine is different.
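    For what it's worth, the naive comparison in question is easy to run yourself. A minimal sketch with Pillow (the filenames are hypothetical examples), which mostly demonstrates that a raw diff can't distinguish a cheat from a different AF algorithm or a bug fix:

    Code:
# Naive per-pixel screenshot diff (Pillow). Filenames are hypothetical examples;
# both screenshots must be the same resolution. The result says *that* pixels
# differ, not *why* -- a different AF algorithm, a bug fix, or an optimization
# all look identical here.
from PIL import Image, ImageChops

a = Image.open("crysis_ati_cat10_3.png").convert("RGB")
b = Image.open("crysis_nv_196_21.png").convert("RGB")

diff = ImageChops.difference(a, b)
changed = sum(1 for px in diff.getdata() if px != (0, 0, 0))
total = diff.width * diff.height
print(f"{changed}/{total} pixels differ ({100.0 * changed / total:.2f}%)")
diff.save("diff.png")  # brighter = bigger per-channel difference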

    Let's say a game developer tests code on their nVidia rig. Rendering looks OK. They ship it. But it was relying on a driver bug. If nVidia fixes it, rendering is no longer as intended. Should AMD "break" their driver to emulate the intended rendering? What do you do when a game patch comes out, do patch-version detection in the driver?

    Bottom line:
    3D rendering is the wild west. You think programming "hello world" is different enough across COBOL, Java and Python!? Video drivers are expected to work perfectly with 10000+ games running on DX9, DX10, DX11, OGL and dozens of OS versions, not to mention all the video and GPGPU extensions, and each hardware generation works differently!

    If a few pixels are darker than before, use the older driver, and cut them some slack.

    ================
    EDIT: Statements like "this game is optimized for X" are total BS. What does that even mean? That for 3 years nVidia/AMD did nothing to fix rendering or improve performance? Quite insulting. Sorry to burst your bubbles, but virtually ALL games, even the Start menu and the mouse cursor, are "optimized".
    Last edited by ***Deimos***; 04-05-2010 at 06:19 AM.

    24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
    1 GB OCZ Gold (='.'=) 240 2-2-2-5
    Giga-byte NF3 (")_(") K8NSC-939
    XFX 6800 16/6 NV5 @420/936, 1.33V

  16. #216
    Xtreme Member
    Join Date
    Apr 2010
    Location
    Portugal
    Posts
    107
    Quote Originally Posted by ***Deimos*** View Post
    snip
    this man is right
    Don't take life too seriously.....no-one's getting out alive.

  17. #217
    Xtreme Addict
    Join Date
    Mar 2008
    Location
    UK
    Posts
    1,083
    Quote Originally Posted by Scorpio[pt] View Post
    It's been known for a good while that Crysis is optimized for Nvidia GPUs; same thing with Far Cry 2.
    Probably ever since someone looked on the back of the game boxes
    TJ07 | Corsair HX1000W | Gigabyte EX58 Extreme | i7 930 @ 4ghz | Ek Supreme | Thermochill PA 120.3 | Laing DDC 12v w/ mod plexi top | 3x2gb Corsair 1600mhz | GTX 680 | Raid 0 300gb Velociraptor x 2 | Razer Lachesis & Lycosa | Win7 HP x64 | fluffy dice.

  18. #218
    Xtreme Addict
    Join Date
    Nov 2007
    Posts
    1,195
    The 32xAA mode is a big cheat by itself; according to the same article its IQ is clearly lower than ATI's 24x AA. So what does that mean? False advertising? Or a cheat?
    Quote Originally Posted by LesGrossman View Post
    So for the last 3 months Nvidia talked about Uniengine and then Uniengine and more Uniengine and finally Uniengine. And then takes the best 5 seconds from all the benchmark run, makes a graph and then proudly shows it everywhere.

  19. #219
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    One question: why is an optimization being called a "cheat"?

    As long as the effect is invisible to the end user and has a positive impact on performance, I personally don't give two hoots if it increases framerates by two percent or two hundred percent. I tried to allude to the same thing when talking about the substitution of FP16 render targets: while it has a positive impact upon performance, a person actually PLAYING a game (instead of staring at comparison screens) will likely NEVER see the difference. Me, I see a slight difference in some isolated cases, but that's just because I play some games like DoW 2 a LOT, so I can see the minor differences with ATI's implementation in that game.

    The same thing goes (IMO) for anything above 8xMSAA. Other than old Myst-style point-and-move games, 99% of today's apps involve paying attention to a moving picture, not glorified screenshots. So why would someone even stop to care about a few jagged lines on a fence 200 feet away? On the other hand, if higher IQ modes can be enabled through the use of higher end hardware, I'm all for that as well.

    Naturally, decreasing overall image quality for higher scores in reviews isn't "ethical" but I booted Crysis on both ATI and NVIDIA hardware over the weekend and couldn't see any difference. Yes, the ATI cards do have an odd issue where some edges shimmer a bit but between the 9.12 drivers and the newest 10.3a, I saw no differences in either performance or overall IQ.

  20. #220
    Xtreme Member
    Join Date
    Dec 2009
    Posts
    435
    Quote Originally Posted by eric66 View Post
    The 32xAA mode is a big cheat by itself; according to the same article its IQ is clearly lower than ATI's 24x AA. So what does that mean? False advertising? Or a cheat?
    No, because it is clearly documented how it works. I am speechless after reading the responses to this thread. If the roles were reversed here, and it was the GTX480 called out for this issue, there would be a storm greater than the one coming in 2012. Not only are people dismissing this issue, some people are actually trying to bash Nvidia some more...

    It seems AMD's social engineering program is very successful.
    Last edited by ElSel10; 04-05-2010 at 07:52 AM.
    i7 920 D0 / Asus Rampage II Gene / PNY GTX480 / 3x 2GB Mushkin Redline DDR3 1600 / WD RE3 1TB / Corsair HX650 / Windows 7 64-bit

  21. #221
    Xtreme Member
    Join Date
    Jul 2006
    Posts
    403
    Quote Originally Posted by ***Deimos*** View Post
    The deaf man following the blind man?
    1. There is NO standard. Nada. None. WHQL ensures your Start menu popup is rendered correctly, not Crysis. Sharper = better!? Are you assuming the game developer intended maximum sharpness on everything?

    Bottom line:
    3D rendering is the wild west. You think programming "hello world" is different enough across COBOL, Java and Python!? Video drivers are expected to work perfectly with 10000+ games running on DX9, DX10, DX11, OGL and dozens of OS versions, not to mention all the video and GPGPU extensions, and each hardware generation works differently!
    There is a standard: it's called the API spec, and drivers are supposed to follow it. Now let's say developer A creates custom mipmaps and writes the rendering engine assuming no LOD bias, as per the spec, and the guys at ATI, who obviously know better, introduce a driver-level LOD bias after the game has shipped. Whose responsibility is it to fix the problem? The developer that followed the API spec, or the driver team that knew better?

    3D rendering is not the wild west! The whole graphics pipeline is quite straightforward: the developer has control over most key stages in the pipeline except for the finer points of triangle setup/traversal, texture filtering, clipping and, to an extent, blending. The driver team just needs to ensure that the driver behaves according to the API spec.

    From your post, I'm assuming you don't really know what LOD bias does or where the supposed sharpness comes from.

    Quote Originally Posted by SKYMTL View Post
    One question: why is an optimization being called a "cheat"?

    As long as the effect is invisible to the end user and has a positive impact on performance, I personally don't give two hoots if it increases framerates by two percent or two hundred percent. I tried to allude to the same thing when talking about the substitution of FP16 render targets: while it has a positive impact upon performance, a person actually PLAYING a game (instead of staring at comparison screens) will likely NEVER see the difference. Me, I see a slight difference in some isolated cases, but that's just because I play some games like DoW 2 a LOT, so I can see the minor differences with ATI's implementation in that game.

    The same thing goes (IMO) for anything above 8xMSAA. Other than old Myst-style point-and-move games, 99% of today's apps involve paying attention to a moving picture, not glorified screenshots. So why would someone even stop to care about a few jagged lines on a fence 200 feet away? On the other hand, if higher IQ modes can be enabled through the use of higher end hardware, I'm all for that as well.

    Naturally, decreasing overall image quality for higher scores in reviews isn't "ethical" but I booted Crysis on both ATI and NVIDIA hardware over the weekend and couldn't see any difference. Yes, the ATI cards do have an odd issue where some edges shimmer a bit but between the 9.12 drivers and the newest 10.3a, I saw no differences in either performance or overall IQ.
    The problem with your logic is that the performance boost from their optimization will often mislead consumers into thinking the card is faster than it actually is. When one card is doing 16xAF and the other is doing trilinear, it really isn't fair to compare performance levels, and I think that is the gist of the problem. Yeah, the gamer will probably not notice, but then he will also go around spouting nonsense about how awesome his newfangled card is until he hits a game that isn't "optimized" for his card, and then he will obviously start complaining about how terrible the game engine is.

    If Nvidia disabled their AF in Crysis you could expect a large boost in FPS as well, but then the internet would run red with nerd rage!

    AF is a pretty standard thing and they should implement it correctly instead of attempting to fake it. They are trading quality for performance, and I can see how that is perfectly acceptable from their viewpoint, but I still think they should leave that choice to the developer and the end user. If I find that AF kills performance, let me turn it off; don't do it for me and then lie to me about it.

  22. #222
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by Coldon View Post
    When one card is doing 16xAF and the other is doing trilinear,
    If it's that black and white, everyone should notice it and complain about it. I don't care enough to look into whether it's a bug or intentional, or how to fix it, but if I was playing and I couldn't turn on 16xAF, I would be pissed. I always game at my screen's native res with 16xAF, then I max out textures until the framerate is too low, then I add AA if I feel it's needed or if I have more performance available. AF is one of the few things where it's a small performance hit for a whole truckload of extra IQ (seriously, when there are lines across the ground and textures magically look different, it's complete BS).

  23. #223
    Xtreme Member
    Join Date
    Jul 2006
    Posts
    403
    I know it's not that black and white; I just used that as an example. Though AF can be seen as a special case of trilinear, and as such the difference may often be nearly indistinguishable to the average user.

  24. #224
    Xtreme Member
    Join Date
    Mar 2007
    Location
    Pilipinas
    Posts
    445
    Quote Originally Posted by ElSel10 View Post
    No, because it is clearly documented how it works. I am speechless after reading the responses to this thread. If the roles were reversed here, and it was the GTX480 called out for this issue, there would be a storm greater than the one coming in 2012. Not only are people dismissing this issue, some people are actually trying to bash Nvidia some more...

    It seems AMD's social engineering program is very successful.
    Or maybe it's NV's fault that they take a lot of flak? Too much renaming; an overly arrogant CEO; an overpriced GT200 at launch; some cheats/bugs too; wooden screws (that was funneh); a late, overhyped, paper-launched Fermi. Hell, after all that I'm still not crossing them off my list; I still consider NV cards when buying.

    ATI: R600 was crap (very late too), R700 ran hot with the stock cooling but pushed prices down, and R800 was a great launch.

    I'm guessing others have the same train of thought.

  25. #225
    Xtreme Mentor
    Join Date
    Jul 2008
    Location
    Shimla , India
    Posts
    2,631
    The cheats that Nvidia used changed the image quality and thus the gameplay, but this is hardly a gameplay changer.

    Has anyone seen the AF results of GF100? Can you say that the implementation in practice and in theory complement each other? In theory the 5850 is much better than the GTX 480 at AF, but does it really translate into practically superior AF...


    [image: GTX 480 AF result]

    [image: 5870 AF result]

    http://www.xtremesystems.org/forums/...d.php?t=248897

    So AF looks much better on the 5870, and one could say Nvidia's implementation should be faster because it's not perfect, that Nvidia is cheating and gaining performance in AF-heavy situations, but in reality the IQ is not really affected that much at all.
    Coming Soon
