
Thread: Will GTX570 be caught in new app detection cheat?

  1. #1
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875

    Will GTX570 be caught in new app detection cheat?

    November 27th, 2010 at 3:46 pm - Author Jules

    Over the years, there have been many famous ‘app detection’ revelations. Probably the most famous was revealed by Beyond3D, when genius uber-geeks discovered that changing the name of the 3DMark 2003 executable file made a huge difference to the performance of nVidia cards. Looking at recent documents posted in the KitGuru forum, we have another cause for investigation on our hands. If it’s a lie, then it is a very clever one, and we will work hard to find out who perpetrated it. If it’s the truth, then it’s a very worrying development. KitGuru powers up a robotic Watson and takes it for a walk on the Image Quality moors to see if we can uncover any truth implicating a modern-day Moriarty (of either persuasion).
    Quick backgrounder on image quality, so we all know what we’re talking about.

    Journalists benchmark graphics cards so you, the public, can make a buying decision. So far, so simple.

    The first rule is to use the right tests. There is no absolute right and wrong, but if you remember how far Bob Beamon jumped in the Mexico Olympics in 1968, you will see that his record lasted for 23 years simply because he set the record at high altitude. That’s a case where someone ended up with an advantage, without any malice whatsoever. All jumpers were at the same altitude and the jump was amazing anyway.

    When Ben Johnson took gold for Canada at the 1988 Olympics in Seoul, he ran straight past favourite Carl Lewis and appeared to take the world record for 100 metres with a stunning 9.79 seconds. He had used steroids. No one else was on drugs. The gold medal and world record were stripped.

    So, two key things to look for with benchmarking are:-

    1. Is the test itself fair for everyone?
    For example, does the test use a lot of calls/functions etc. that only one card has – and which are not found in the majority of games that a consumer is likely to play? Every year, 3DMark is late because one GPU vendor or the other is late, and Futuremark don’t want to go live unless they have given both sides a fair shout to deliver next-gen hardware.

    2. Is everyone doing the same amount of work?
    Are short-cuts being taken which mean that the test will automatically score higher on one card than another? If an improvement works in all games/apps, then that is obviously a benefit. However, detecting an application and identifying a way to increase your scores by dropping image quality (in such a way that gamers can see it) is downright sneaky. You could buy a card thinking it’s faster than it really is, because its driver has been tweaked for a benchmark.

    The arguments around ‘1’ and ‘2’ focus on things like ‘If nVidia has more tessellation capability than AMD, then what level of tessellation should be tested?’ and ‘If you want the result for a light calculation, where one side stores all possible outcomes in a table and looks them up, while the other GPU calculates the value as it goes – then which is correct?’. The second was uncovered a few years back in Quake, where it became clear that nVidia’s look-up function was faster than its calculation ability – so both companies took different paths to the same result. The tessellation argument will probably be solved by Summer 2011, when a lot more DX11 games are out and using that function. We’ll then know what constitutes a ‘reasonable load’ in a game.
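    To make that look-up-versus-calculate example concrete, here is a trivial sketch of the two approaches; the falloff function is only an illustration, not the actual Quake lighting maths:

    Code:
    # Two routes to the same lighting value: store precomputed results in a
    # table and look them up, or do the arithmetic on every call. The falloff
    # function here is only an illustration, not Quake's real lighting maths.
    TABLE_SIZE = 256
    falloff_table = [1.0 / (1.0 + (i / TABLE_SIZE) ** 2) for i in range(TABLE_SIZE)]

    def falloff_lookup(d):              # fast path: one index, one memory read
        i = min(int(d * TABLE_SIZE), TABLE_SIZE - 1)
        return falloff_table[i]

    def falloff_compute(d):             # direct path: recompute on every call
        return 1.0 / (1.0 + d * d)

    # Both give (near enough) the same answer; which is faster depends on whether
    # the hardware is better at memory look-ups or at raw arithmetic.
    print(falloff_lookup(0.5), falloff_compute(0.5))   # 0.8 0.8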

    Anti-Aliasing (AA)
    This weird and wonderful technique uses calculated blurring to make images seem smoother. Totally counter-intuitive, but it works a treat. Why counter-intuitive? Because, in the real world of limitless resolution/detail, when we want to draw the best line possible, we normally try to avoid blurring at all.
    Proper AA can also use a ton of memory. At the simplest level, instead of looking at one dot and saying ‘is it black or is it white’, you sample dots in a region and set levels of grey. While it can seem messy at the lowest level, the overall effect is that the lines themselves appear smoother. Which is a good thing. There are different ways to do the sampling. There are also different levels. For most modern gaming benchmarks, people tend to set good quality AA to 4x.
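    As a rough software illustration of that ‘sample a region and set levels of grey’ idea, here is a minimal sketch of box-filter supersampling; the filenames are placeholders, and real MSAA/CSAA hardware is far more selective about where and how it takes its samples:

    Code:
    import numpy as np
    from PIL import Image

    # Load a render produced at 4x the target resolution (hypothetical file),
    # then average every 4x4 block of samples down to one output pixel.
    # The intermediate greys are what make the jagged edges look smooth.
    hi = np.asarray(Image.open("edge_4x.png").convert("L"), dtype=np.float32)
    h, w = hi.shape
    h, w = h - h % 4, w - w % 4                 # trim so 4x4 blocks tile exactly
    blocks = hi[:h, :w].reshape(h // 4, 4, w // 4, 4)
    aa = blocks.mean(axis=(1, 3))               # 16 samples -> 1 pixel
    Image.fromarray(aa.astype(np.uint8)).save("edge_aa.png")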

    Telling the difference
    It’s harder to see in a still image, but when a game is running, poorer quality AA will make it seem that the edges of things are ‘crawling’. It can have a negative effect on your ability to play. For example, if you’re running past trees, looking for enemies to shoot, then your eyes are really sensitive to small movements. When a branch twitches, is it an enemy preparing to shoot, or is your graphics card failing to deliver decent AA?
    In a still, it is easiest to tell when you create a simple animation of one image on top of the other and just swap between them (see below).
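    If you want to build that kind of flip comparison yourself from two stills, a couple of lines of Pillow will do it; the filenames are placeholders:

    Code:
    from PIL import Image

    # Two captures of the same frame (hypothetical filenames). Flip between them
    # every 800 ms, looping forever, so any AA difference shows up as 'crawling'
    # edges even though each still looks fine on its own.
    a = Image.open("hawx_shot.png")
    b = Image.open("hacks_shot.png")
    a.save("compare.gif", save_all=True, append_images=[b], duration=800, loop=0)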

    ‘Evidence’ posted on the KitGuru forum
    High end cards are often tested using 30″ panels with a 2560×1600 resolution. We have had a lot of ‘documents’ posted recently that appear to show a difference in AA quality on second generation nVidia cards like the GTX570 and GTX580. Lowering the image quality when running AA can give a card a boost. The stuff we’ve seen seems to show a boost of 8%. That’s significant. But is it real? We’re going to go deep into the validity of these claims, and we invite nVidia to confirm categorically that press drivers for the GTX570 (launching next week) will NOT include a lowering of AA quality to get a boost in games like HAWX.

    Following the launch of the awesome GTX460, nVidia’s Fermi series really seems to have come into its own and it would be a real shame if something strange was being done in the driver. We’re firmly camped in the ‘let us hope not’ area. Plus, we’re happy to test this exhaustively and report the truth.

    Is there a difference and is the difference real?
    OK, so enough preamble, what is it we’re really talking about? Below is a simple animated GIF that (according to the posts we have received) seems to show that the anti-aliasing image quality (IQ) drops off for the GTX570/580 if the driver detects that HAWX (commonly used for benchmarking) is running. Looking along the plane’s edge, one of the images definitely appears to show more detail. How was this achieved? By renaming the application, we’re being told. It could all be an elaborate ruse, but if benchmarks are being detected to allow image quality to be dropped and benchmark scores to be raised, then that’s pretty serious stuff. The animated GIF will take a few seconds to load. We recommend that you let it roll through a few times, and you’ll see that more detailed sampling seems to be done when the same program is called HACKS rather than HAWX.

    If these shots tell the whole truth and you had to write this ‘logic’ into a sentence (that anyone could understand), it would say “If you’re being asked to run a game that’s commonly used as a benchmark, then do less work”. Have a look and tell us if you can see less sampling when the drivers detect HAWX and not HACKS.
    http://www.kitguru.net/components/gr...tection-cheat/









    http://www.kitguru.net/forum/showthr...2541#post22541

    "For those with long memories, NVIDIA learned some hard lessons with some GeForce FX and 3DMark03 optimization gone bad, and vowed to never again perform any optimizations that could compromise image quality. During that time, the industry agreed that any optimization that improved performance, but did not alter IQ, was in fact a valid “optimization”, and any optimization that improved performance but lowered IQ, without letting the user know, was a “cheat”. Special-casing of testing tools should also be considered a “cheat”."
    http://blogs.nvidia.com/ntersect/201...e-quality.html

    Image Quality comparison AMD vs NVIDIA

    A bit of debate lately has been about AMD's image quality. It seems that AMD forfeited a tiny bit in certain settings, enabling more optimization than needed. As mentioned in the previous chapter, AMD addressed this issue and feels that the default settings for NVIDIA are comparable to ATI's default image quality settings. So, let's have a look.

    We took three games, each part of our benchmark suite, each set up with the AA and AF levels we use in the test suite. For each game we take a still screenshot of the exact same location at 2560x1600, then crop a 300-pixel-wide segment of the 3D scene, leaving the pixels intact (we do not resize them), and now we can start to compare. Let's have a peek:

    ** If you download the original stills, they are 2560x1600 24-bit PNG files; as such, file sizes are anywhere from 5MB to 10MB per screenshot.
    http://www.guru3d.com/article/radeon...-6870-review/9
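    The crop step Guru3D describes can be reproduced in a few lines; a sketch with a placeholder filename and coordinates:

    Code:
    from PIL import Image

    # Cut a 300-pixel-wide strip out of a 2560x1600 capture without any
    # resampling, so every pixel stays intact (filename and coordinates are
    # placeholders).
    shot = Image.open("radeon_2560x1600.png")
    left, top = 1100, 700                       # wherever the interesting edge is
    crop = shot.crop((left, top, left + 300, top + 400))
    crop.save("crop_300px.png")                 # lossless PNG, no resize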

    My two pennies' worth: at the end of the day, both ATI & NV should have an option to turn off all performance optimizations that deliberately degrade visual IQ.

  2. #2
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    Is ... the GTX570 even released yet?
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  3. #3
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Is it just me, or are the tail and the wing identical in both pictures (unless you have to zoom in by another 50x)? If that's the case, then is the AA really changing, and why would a change in AA only affect the back end of the plane?

    Either way, you can't tell a difference in the standard screen shot unless you zoom by 10-20x, which as far as I am concerned makes it a valid optimization. If this is something that is noticeable to anyone playing the game at any given time then it's a problem, but from what I can tell it's clearly not visible to anyone, making it a valid optimization.


    Quote Originally Posted by cegras View Post
    Is ... the GTX570 even released yet?
    No, it hasn't been, which is why I was confused too when I first saw the thread title. Sounds to me more like damage control than anything else.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  4. #4
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    I found this in the comment thread:

    Hi Everybody,

    What is being experienced is not an “Antialiasing cheat” but rather a HawX bug that is fixed by our driver using an application specific profile.

    In a nutshell, the HawX application requests the highest possible AA “sample quality” at a particular AA level from our driver. Without our driver fix, the game would be running 16xCSAA instead of standard 4xAA when you select 4xAA in-game. It runs the proper 4xAA with the driver fix. You defeat the fix by changing the .exe name, causing it to run at 16xCSAA.

    You may remember that Geforce 8800 introduced Coverage Sampling AA (CSAA) technology, which added higher quality AA using little extra storage. Prior to 8800 GTX and CSAA, there was only one “sample quality level” for each AA level, so if an application requested four AA samples, the hardware performed standard 4xAA. However, with 8800 GTX GPUs onwards, our drivers expose additional sample quality levels for various standard AA levels which correspond to our CSAA modes at a given standard AA level.

    The “sample quality level” feature was the outcome of discussions with Microsoft and game developers. It allowed CSAA to be exposed in the current DirectX framework without major changes. Game developers would be able to take advantage of CSAA with minor tweaks in their code.

    Unfortunately, HawX requests the highest quality level for 4xAA, but does not give the user the explicit ability to set CSAA levels in their control panel. Without the driver profile fix, 16xCSAA is applied instead of standard 4xAA. Recall that 16xCSAA uses 4 color/Z samples like 4xAA, but also adds 12 coverage samples. (You can read more about CSAA in our GeForce 8800 Tech Briefs on our Website).

    When you rename the HawX.exe to something else, the driver profile bits are ignored, and 16xCSAA is used. Thus the modest performance slowdown and higher quality AA as shown in the images.

    To use “standard” 4xAA in a renamed HawX executable, you should select any level of anti-aliasing in the game, then go into the NVIDIA Control Panel and set 4xAA for “Antialiasing Setting” and turn on “Enhance the application setting” for the “Antialiasing mode”.

    Nick Stam, NVIDIA
    TL;DR Nvidia doesn't cheat.
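    To put the quoted explanation in concrete terms, here's a conceptual sketch of how a driver might map a requested (sample count, quality level) pair onto an AA mode. The sample/coverage counts are NVIDIA's documented G80-era CSAA modes, but the quality-level numbering here is purely illustrative; the real mapping lives inside the proprietary driver:

    Code:
    # Conceptual sketch only: the sample/coverage counts below are NVIDIA's
    # documented G80-era CSAA modes, but the quality-level numbering is purely
    # illustrative - the real mapping is internal to the proprietary driver.
    CSAA_MODES = {
        # (colour/Z samples, quality level) -> (mode name, extra coverage samples)
        (4, 0): ("4x MSAA",   0),
        (4, 1): ("8x CSAA",   4),
        (4, 2): ("16x CSAA", 12),   # what HAWX reportedly ends up with
        (8, 0): ("8xQ MSAA",  0),
        (8, 1): ("16xQ CSAA", 8),
    }

    def resolve_aa(samples, requested_quality, max_quality):
        """A game that blindly asks for the highest available quality level at
        4 samples lands on the heaviest CSAA mode, not plain 4xAA."""
        quality = min(requested_quality, max_quality)
        return CSAA_MODES.get((samples, quality), ("plain MSAA", 0))

    # 'Highest possible sample quality' at 4 samples -> 16x CSAA, as described.
    print(resolve_aa(4, requested_quality=99, max_quality=2))   # ('16x CSAA', 12)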

    Anyways, Kitguru loves to stir the pot while issuing a TON of equivocating statements. "If this is true it's terrible; if not, Kitguru will tar and feather the perpetrator!"
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  5. #5
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Had a feeling this one was someone grasping at straws.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  6. #6
    Banned Movieman...
    Join Date
    May 2009
    Location
    illinois
    Posts
    1,809
    You need your eyes checked. The 2nd one is smoother; the first one looks really crappy from my laptop.

  7. #7
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    Quote Originally Posted by cegras View Post
    Is ... the GTX570 even released yet?
    The card will be released next week (the 30th of November or the 1st of December, at the same time as 3DMark)... From what I understand, this appeared in the driver Nvidia sent to testers with the GTX570...
    What you can easily understand is that they will surely correct it in the "standard" official driver (well, now)...
    Last edited by Lanek; 11-27-2010 at 08:54 PM.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  8. #8
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by stangracin3 View Post
    You need your eyes checked. The 2nd one is smoother; the first one looks really crappy from my laptop.
    In the animated gif the images aren't 100% lined up. Looking at the tail and wing area in the still images, I really don't see a difference, only on the back of the plane.

    NVidia's answer does make sense. Anyone with HAWX want to test it by simply running it named HAWX.exe with 16xCSAA turned on in the driver using "enhance the application setting", and then with the exe renamed and the driver on the standard "let the application decide"?


    Quote Originally Posted by Lanek View Post
    The card will be released next week ... From what I understand, this appeared in the driver Nvidia sent to testers with the GTX570...
    What you can easily understand is that they will surely correct it in the "standard" official driver (well, now)...
    If what NVidia said is true, then there's nothing to correct. It's a case of the application setting 16xCSAA, which is an enhancement to the standard AA setting. If someone has the time, they can do as I stated to test it. I'd do it myself, but I don't have HAWX.
    Last edited by DilTech; 11-27-2010 at 08:54 PM.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  9. #9
    Banned Movieman...
    Join Date
    May 2009
    Location
    illinois
    Posts
    1,809
    Quote Originally Posted by DilTech View Post
    In the animated gif the images aren't 100% lined up. Looking at the tail and wing area in the still images, I really don't see a difference, only on the back of the plane.
    I am, and that's where I see a lot of extra jaggy spots, right before where the tail starts going up.

  10. #10
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    Quote Originally Posted by DilTech View Post
    In the animated gif the images aren't 100% lined up. Looking at the tail and wing area in the still images, I really don't see a difference, only on the back of the plane.

    NVidia's answer does make sense. Anyone with HAWX want to test it by simply running it named HAWX.exe with 16xCSAA turned on in the driver using "enhance the application setting", and then with the exe renamed and the driver on the standard "let the application decide"?

    If what NVidia said is true, then there's nothing to correct. It's a case of the application setting 16xCSAA, which is an enhancement to the standard AA setting. If someone has the time, they can do as I stated to test it. I'd do it myself, but I don't have HAWX.
    For each 'ledge' in the zoomed-in part in the ring, the "HACKS" screenshot has a couple of extra gradient colour blocks vs. the 4xAA one, where each edge ends a little more abruptly.
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  11. #11
    Xtreme Rack Freak
    Join Date
    Jun 2006
    Location
    Belle River, Canada
    Posts
    1,806
    lol
    Is this serious?

    Main Rigs...
    Silver : i7-2600k / Asus P8H67-I Deluxe / 8GB RAM / 460 GTX SSC+ / SSD + HDD / Lian Li PC-Q11s
    WCG rig(s)... for team XS Full time
    1. i7 860 (Pure Cruncher)
    2. i7-870 (Acts as NAS with 5 HDDs)
    3. 1065T (Inactive currently)

  12. #12
    Xtreme Enthusiast
    Join Date
    Aug 2005
    Location
    Melbourne, Australia
    Posts
    942
    Umm, app-specific 'optimisations' have been going on for ages... ATI, nVidia and even Intel.

    I'm guessing they did this with HAWX because it was benchmarked so much.

    ATI did that with FurMark after it killed their cards, so renaming furmark.exe to something else would give a performance boost.
    Q9550 || DFI P45 Jr || 4x 2G generic ram || 4870X2 || Aerocool M40 case || 3TB storage


  13. #13
    Xtreme Enthusiast
    Join Date
    Dec 2009
    Location
    Burbank, CA
    Posts
    563
    LOL, c'mon, this can't be for real. Get some sunlight.

  14. #14
    Xtreme Enthusiast
    Join Date
    Jan 2010
    Posts
    533
    I wouldn't be surprised if they or the tester (who could it possibly be? someone who is cheating in benchmarks, maybe?) knew what was really going on, but released the article anyway, hoping that NV wouldn't respond so soon, to get some hits and stir up some controversy.

  15. #15
    Xtreme Cruncher
    Join Date
    Oct 2008
    Location
    Chicago, IL
    Posts
    840
    Wow. This is going to be a fun thread to watch. The truth of the matter is that the complexity of the relationship between the video driver (and its version), the video card hardware (and its version), the operating system (and its version), and the game (and its version) will leave so many "possible" causes, deliberate or unintended, that nobody in this forum will actually be able to tell what the cause was for a few pixels that might be out of place. Here are the steps:

    1. Fanboys jump on board and start arguing over who is honest or lying.
    2. nVidia issues a statement of what they think is going on.
    3. Fanboys jump on the new statement with arguments on nVidia being honest/lying.
    4. Thread FINALLY gets locked because the thread is just full of BS and no facts are likely to present themselves since none of us have access to the proprietary drivers/hardware/operating system calls.

    Can we, for once, NOT do 1 to 4 and just keep the discussion about observations, and not try to fight over who is being honest or lying? Please, forum? I'm much more interested in facts that people find that are important to this situation and less interested in 'zomg nvidia must be lying'.

  16. #16
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    Lol Josh, totally right with you... and especially, it's just impossible to check what's going on, as we don't have the 570 and this driver yet...

    Anyway, just to add:


    BC2 AA mode performance report (Nvidia forum, GTX580). Their response looks logical.
    Last edited by Lanek; 11-27-2010 at 10:18 PM.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  17. #17
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    I'm confused.

    If you set 4xMSAA in the game then why would it run in reality 16xCSAA if you renamed the exe?

    All along the watchtower the watchmen watch the eternal return.

  18. #18
    Xtreme Enthusiast
    Join Date
    Jan 2010
    Posts
    533
    I thought the response from NV was very clear.

    The game calls for the highest possible quality. NVIDIA added CSAA modes to their drivers and, with help from MS, these are now exposed directly through DirectX.

    The problem is that 16xCSAA uses 4 colour samples plus 12 coverage samples, so HAWX thinks that's the 4xAA it should apply.

  19. #19
    Xtreme Member
    Join Date
    Dec 2009
    Posts
    435
    Quote Originally Posted by STEvil View Post
    I'm confused.

    If you set 4xMSAA in the game then why would it run in reality 16xCSAA if you renamed the exe?
    Because the driver fix is based on the name of the executable. If you change the name, the fix is no longer applied.
    i7 920 D0 / Asus Rampage II Gene / PNY GTX480 / 3x 2GB Mushkin Redline DDR3 1600 / WD RE3 1TB / Corsair HX650 / Windows 7 64-bit

  20. #20
    Xtremely Retired OC'er
    Join Date
    Dec 2006
    Posts
    1,084
    Quote Originally Posted by ElSel10 View Post
    Because the driver fix is based on the name of the executable. If you change the name, the fix is no longer applied.
    Agreed, I'm renaming files too, and then later on you get all kinds of behaviour.

    I say ignore this thread, and set up your own quality for hawx.exe in your NVIDIA control panel.
    Last edited by Nikolasz; 11-28-2010 at 12:20 AM.

  21. #21

  22. #22
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    Quote Originally Posted by ElSel10 View Post
    Because the driver fix is based on the name of the executable. If you change the name, the fix is no longer applied.
    so.....

    With the EXE renamed you get better quality, but "worse" AA... but with it named correctly you get worse quality and "better" AA... um..

    Anyone got other screenshots?

    All along the watchtower the watchmen watch the eternal return.

  23. #23
    Xtreme Addict
    Join Date
    Jul 2007
    Posts
    1,488
    It should be easy to test whether what Nvidia says is true. Take a screenshot and do a bench at 4xAA, then take a screenshot and do a bench with the exe renamed and 4xAA forced in the drivers; or do it the other way around, with in-game 4xAA and a renamed exe, then force 16xCSAA with a normal exe. Either way, if the screenshots/benches are the same, then it would verify NV's explanation.
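    For the 'are the screenshots the same' half of that check, a pixel-exact comparison is only a few lines of Pillow; the filenames are placeholders, and the two captures need to be of the exact same frame:

    Code:
    from PIL import Image, ImageChops

    # Same frame captured twice: once with the original exe at in-game 4xAA,
    # once with the exe renamed and 4xAA forced in the driver (placeholders).
    a = Image.open("hawx_ingame_4xaa.png").convert("RGB")
    b = Image.open("renamed_forced_4xaa.png").convert("RGB")

    diff = ImageChops.difference(a, b)
    bbox = diff.getbbox()              # None means every pixel matches exactly
    if bbox is None:
        print("Pixel-identical - consistent with NVIDIA's explanation.")
    else:
        print(f"Shots differ inside region {bbox}")
        diff.save("difference.png")    # inspect where the AA patterns diverge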

  24. #24
    Xtreme Enthusiast
    Join Date
    Jan 2010
    Posts
    533
    Quote Originally Posted by STEvil View Post
    so.....

    With the EXE renamed you get better quality, but "worse" AA... but with it named correctly you get worse quality and "better" AA... um..

    Anyone got other screenshots?
    With HAWX.exe you get 4xAA as you should: faster, but lower quality compared to 16xCSAA.

    With renamed exe, the profile workaround isn't working and you get slower, higher quality 16xCSAA.

  25. #25
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    Yeah... I misread something, so what vardant says.
    Last edited by STEvil; 11-28-2010 at 12:46 AM.

    All along the watchtower the watchmen watch the eternal return.

