
Thread: ATI cheating in Crysis benchmark?

  1. #51
    Xtreme Member
    Join Date
    Sep 2009
    Location
    Ontario, Canada
    Posts
    231
    Wow, I would swear everyone is joking ("ATI wouldn't do this", "Nvidia did this") but it sounds like everyone is serious.... wow. Just wow.

  2. #52
    Xtreme Member
    Join Date
    Feb 2008
    Location
    Jakarta, Indonesia
    Posts
    244
There are so many funny reactions here.

  3. #53
    Xtreme Member
    Join Date
    Feb 2007
    Posts
    337
I don't see any reason to raise a stink until we can compare the performance once this "bug" is fixed. If, at that point, frame rates drop substantially, then I would say ATI deserves a good slapping.

    Perhaps someone could try renaming the EXE?
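    For anyone who wants to try that, here's a rough Python sketch of the renaming test; the path and executable names are just placeholders for whatever your install uses, and the point is only that any per-application driver profile keys off the file name.
    Code:
    # Rough sketch of the "rename the EXE" test: copy the game binary under a
    # neutral name so any per-application driver profile no longer matches it,
    # then benchmark both copies with identical settings. Paths are placeholders.
    import shutil
    from pathlib import Path

    game_dir = Path(r"C:\Games\Crysis\Bin64")   # example install path
    original = game_dir / "Crysis64.exe"        # example executable name
    renamed = game_dir / "NotCrysis.exe"        # arbitrary neutral name

    shutil.copy2(original, renamed)             # leaves the original untouched
    print(f"Benchmark both {original.name} and {renamed.name} at the same settings.")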
    System: Core I7 920 @ 4200MHz 1.45vCORE 1.35VTT 1.2vIOH // EVGA x58 Classified E760 // 6GB Dominator GT 1866 @ 1688 6-7-6-18 1T 1.65V // Intel X25 80GB // PCP&C 750W Silencer
    Cooling: Heatkiller 3.0 LT CPU block // 655 Pump // GTX360 Radiator
    Sound: X-FI Titanium HD --> Marantz 2265 --> JBL 4311WXA's
    Display: GTX480 // Sony GDM-FW900

  4. #54
    Xtreme Member
    Join Date
    Nov 2006
    Posts
    324
    Complete bullbanana...

    The pictures have different contrast. Open ATI's one in Photoshop and add contrast to equalize it, and uh-oh... the Nvidia picture has less detail but hides it behind increased contrast!?

    So, who is the cheater?
    And also, who prepared the pictures?
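    If someone wants to check that without eyeballing it in Photoshop, here is a small Pillow sketch of the same idea: stretch both screenshots to the same contrast range before comparing detail, so a flat contrast offset isn't mistaken for a filtering difference. The file names are placeholders.
    Code:
    # Equalize contrast of both screenshots before comparing detail, so a plain
    # contrast/brightness offset between captures doesn't masquerade as an IQ gap.
    # File names are placeholders for the actual screenshots.
    from PIL import Image, ImageOps

    ati = Image.open("hd5870_shot.png").convert("L")   # grayscale for a fair compare
    nv = Image.open("gtx480_shot.png").convert("L")

    # Stretch each histogram to the full 0-255 range, clipping 1% outliers,
    # which removes a uniform contrast difference between the two captures.
    ImageOps.autocontrast(ati, cutoff=1).save("hd5870_equalized.png")
    ImageOps.autocontrast(nv, cutoff=1).save("gtx480_equalized.png")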
    Windows 8.1
    Asus M4A87TD EVO + Phenom II X6 1055T @ 3900MHz + HD3850
    APUs

  5. #55
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
    The "brilinear" optimization isn't a new thing at all.
    It has been used by both AMD & nVIDIA since the Radeon 8xxx series and the RivaTNT series
    Only difference between the camps at the moment is that nVIDIA let's you turn it off.
    On the other end AMD doesn't allow you to turn it off alone, you can disable Catalyst AI to disable the "brilinear" optimization but that disables other stuff too ( some of them are good, they don't harm the IQ noticeably yet they raise the performance )
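    For anyone unfamiliar with the term, here is a minimal sketch of the general idea behind "brilinear" (not how any particular driver implements it, and the width of the transition band is just an illustrative guess): full trilinear blends two mip levels for every fractional LOD, while the optimization falls back to plain bilinear except near the mip transition.
    Code:
    # Minimal illustration of "brilinear" filtering (not actual driver behavior):
    # trilinear blends two mip levels for any fractional LOD, whereas brilinear
    # uses a single mip level (pure bilinear) outside a narrow transition band.

    def trilinear_weight(lod_frac: float) -> float:
        """Full trilinear: blend weight between mips equals the fractional LOD."""
        return lod_frac

    def brilinear_weight(lod_frac: float, band: float = 0.25) -> float:
        """'Brilinear': weight snaps to 0 or 1 (bilinear) outside a band of
        width `band` centred on the mip boundary; blends only inside it."""
        lo, hi = 0.5 - band / 2.0, 0.5 + band / 2.0
        if lod_frac <= lo:
            return 0.0                        # finer mip only
        if lod_frac >= hi:
            return 1.0                        # coarser mip only
        return (lod_frac - lo) / (hi - lo)    # short blend across the transition

    if __name__ == "__main__":
        for f in (0.1, 0.4, 0.5, 0.6, 0.9):
            print(f"lod_frac={f}: trilinear={trilinear_weight(f):.2f}, "
                  f"brilinear={brilinear_weight(f):.2f}")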
    Coding 24/7... Limited forums/PMs time.

    -Justice isn't blind, Justice is ashamed.

    Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero

  6. #56
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Quote Originally Posted by Starscream View Post
    @Cybercat
    Like Saaya says.
    It's very convenient that this bug appears in a game used by many reviewers just at the time when Nvidia releases their new high end for reviews.
    Even if this bug gets fixed in the next driver version, the reviews using the bugged drivers will still be out there.
    So in a few months, when someone who doesn't know about all of this wants to buy a new high-end card and googles some reviews, he will mostly find these old ones using the bugged drivers.

    Then again, we still need to know how much extra performance this bug gives.

    But even if most people can't see the difference, in a review all settings should be the same for a fair result.
    Convenient? How is it convenient? A driver glitch that potentially lowers IQ without any evidence of enhancing performance? Sounds more like a PR disaster.

    And since when was timing irrefutable proof that something nefarious is going on?

    How about we look at this logically and objectively, for a change?
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  7. #57
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    What happened to that perfect IQ we saw in the charts from August? The ones with the colored circles that would be square or star shaped if the AF was weaker. Has IQ really changed at all between the 4xxx and 5xxx series?

  8. #58
    Xtreme Enthusiast TheBlueChanell
    Join Date
    Dec 2006
    Posts
    565
    I think this is being blown out of proportion, and I also seem to remember Nvidia doing something exactly like this in the original Crysis.

    I did notice last night in BC2, though, that my AF looked strange. If it's a bug, hopefully it gets fixed and the increased performance stays.
    Main: 900D - Prime 1000T - Asus Crosshair VI Extreme - R7 1700X @ 4.0ghz - RX Vega 64? - 32GB DDR4 3466 - 1TB 960 Pro -
    --- XSPC AX360 x3 - HK IV Pro - HK RX480 - HK 200 D5 - BP Compression ---
    HTPC: 250D - Prime 850T - Gigabyte G1 ITX - i7 6700K @ 4.5ghz - GTX 1080 Ti - 16GB 3200 - 1TB 960 Pro -
    --- ST30 x UT60 - Kyros HF - KryoGraphics 1080 - HK100 DDC - Monsoon Compression ---
    HV01: Define XL R2 - Prime 1200P - Asus Zenith Extreme - TR 1950X - RX580CF - 128GB DDR4 ECC - 512GB 960P - 4x 2TB RE
    HV02: Node 804 - Prime 850T - SuperMicro X1SSH - E3-1230 v6 - Vega FE - 64GB ECC - 512GB 960 Pro - 4x 6TB Gold -

  9. #59
    Registered User
    Join Date
    Feb 2008
    Posts
    82
    The pictures are tagged with:
    HD 5870, A.I. Standard
    GTX 480, Quality + Tri-Opt.
    GTX 285, Quality + Tri-Opt.

    I think a key point in this is what settings were set in Crysis. I don't know about the Nvidia cards + Tri-Opt, but the A.I. Standard setting is in CCC.
    He mentions this (Google-translated): "According to our benchmarks, a Radeon HD 5870 loses only about five percent of its fps when '16:1' AF is selected in place of linear filtering." So I assume filtering is set to max in Crysis Warhead?

    Is it the same in all modes of Catalyst AI?
    cpu: AMD Phenom II 1090T
    motherboard: MSI 890FXA-GD70
    ram: A-DATA 4GB(2 x 2GB) DDR3 1333
    videocard: Sapphire Radeon HD6870
    powersupply: Silverstone DA1000 1000W

  10. #60
    Xtreme Addict
    Join Date
    May 2007
    Location
    Europe/Slovenia/Ljubljana
    Posts
    1,540
    It's a bug. I don't know why AMD would resort to cheating when their hardware is clearly capable enough. Besides, the days of old-fashioned cheating are long over, imo...
    Intel Core i7 920 4 GHz | 18 GB DDR3 1600 MHz | ASUS Rampage II Gene | GIGABYTE HD7950 3GB WindForce 3X | WD Caviar Black 2TB | Creative Sound Blaster Z | Altec Lansing MX5021 | Corsair HX750 | Lian Li PC-V354
    Super silent cooling powered by (((Noiseblocker)))

  11. #61
    Xtreme Member
    Join Date
    Dec 2009
    Posts
    435
    Quote Originally Posted by Cybercat View Post
    Is there any actual proof this results in some sort of performance boost?
    It requires less processing?

    http://www.xtremesystems.org/forums/...1&postcount=42
    i7 920 D0 / Asus Rampage II Gene / PNY GTX480 / 3x 2GB Mushkin Redline DDR3 1600 / WD RE3 1TB / Corsair HX650 / Windows 7 64-bit

  12. #62
    Registered User
    Join Date
    Dec 2008
    Posts
    21
    From the same article:
    During preparations for the launch of the GeForce GTX 480/470, Nvidia informed us that they had discovered irregularities in AMD's Catalyst drivers in the following games: Dawn of War 2, Empire: Total War, Far Cry, Need for Speed: Shift, TES IV: Oblivion and Serious Sam 2. More specifically, AMD's drivers were said to use the "Catalyst A.I." function to replace FP16 render targets with ones in R11G11B10 format in the games mentioned, potentially increasing performance at the expense of image quality. According to Nvidia, this amounts to gaining an unfair competitive advantage. We can well understand that assessment and have therefore followed up on the topic.
    In other words, ATI swaps out FP16 render targets for a lower-precision format when Catalyst A.I. is enabled.
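    To put rough numbers on what that substitution trades away, here is an illustrative back-of-the-envelope sketch (my own, not the driver's logic): RGBA16F stores 64 bits per pixel with 10 mantissa bits per channel plus sign and alpha, while R11G11B10_FLOAT packs into 32 bits with only 6/6/5 mantissa bits, no sign and no alpha, so it roughly halves bandwidth at the cost of precision.
    Code:
    # Back-of-the-envelope comparison of FP16 (RGBA16F) vs R11G11B10_FLOAT render
    # targets: half the bandwidth, but fewer mantissa bits, no sign and no alpha.
    # The quantization model below is a simplification, not the exact encodings.
    import numpy as np

    BITS_RGBA16F = 4 * 16          # 64 bits per pixel
    BITS_R11G11B10 = 11 + 11 + 10  # 32 bits per pixel, no alpha channel

    def quantize_mantissa(x: np.ndarray, mantissa_bits: int) -> np.ndarray:
        """Keep only `mantissa_bits` of mantissa precision (crude small-float model)."""
        m, e = np.frexp(x)                     # x = m * 2**e, with m in [0.5, 1)
        step = 2.0 ** -(mantissa_bits + 1)     # spacing of representable mantissas
        return np.ldexp(np.round(m / step) * step, e)

    if __name__ == "__main__":
        hdr = np.random.uniform(0.0, 16.0, 100_000)                # fake HDR channel values
        err_fp16 = np.abs(quantize_mantissa(hdr, 10) - hdr).max()  # FP16: 10 mantissa bits
        err_r11 = np.abs(quantize_mantissa(hdr, 6) - hdr).max()    # R11F:  6 mantissa bits
        print(f"bits per pixel: {BITS_RGBA16F} vs {BITS_R11G11B10}")
        print(f"max quantization error: {err_fp16:.5f} (FP16) vs {err_r11:.5f} (R11F)")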

  13. #63
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Quote Originally Posted by ElSel10 View Post
    It requires less processing?
    That's not proof. That's an explanation of why it might improve performance, but without actual data, we don't know if this is actually the case.

    This might simply be an issue with Catalyst AI, as cirons pointed out, in which case the optimization supposedly gains them 5%, but it can be disabled. It might not even be a bug, then. If that's all there is to this, that makes this thread all the more stupid.
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  14. #64
    Xtreme Member
    Join Date
    Dec 2009
    Posts
    435
    Quote Originally Posted by Cybercat View Post
    That's not proof. That's an explanation of why it might improve performance, but without actual data, we don't know if this is actually the case.

    This might simply be an issue with Catalyst AI, as cirons pointed out, in which case the optimization supposedly gains them 5%, but it can be disabled. It might not even be a bug, then. If that's all there is to this, that makes this thread all the more stupid.
    What a load of garbage. If this was Nvidia we were talking about, everybody and their brother would be castrating them.
    i7 920 D0 / Asus Rampage II Gene / PNY GTX480 / 3x 2GB Mushkin Redline DDR3 1600 / WD RE3 1TB / Corsair HX650 / Windows 7 64-bit

  15. #65
    Why are we making so much fuss over a driver bug in one game? Even setting the bug part aside, do I need a magnifying glass to notice a difference?

    Besides that, I remember Nvidia asking in their "reviewer's guide" to turn off Catalyst AI, and it seems PCGH is happy to follow. Weak and biased...

  16. #66
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Quote Originally Posted by ElSel10 View Post
    What a load of garbage. If this was Nvidia we were talking about, everybody and their brother would be castrating them.
    And that's a problem.

    Believe what you want to believe. The intelligent among us will choose to believe the facts.
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  17. #67
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
    Quote Originally Posted by Shadov View Post
    Why are we making so much fuss over a driver bug in one game? Even setting the bug part aside, do I need a magnifying glass to notice a difference?
    No, you don't, actually. The article says the distortion is very clear when the player is moving; it's harder to tell from a static screenshot.
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"

  18. #68
    Quote Originally Posted by annihilat0r View Post
    No, you don't, actually. The article says the distortion is very clear when the player is moving; it's harder to tell from a static screenshot.
    OMG, a driver bug in a few-years-old game - shocker! Let's keep this thread alive so everyone can see it and people start buying the Nvidia Grillx480!

  19. #69
    Xtreme Addict
    Join Date
    Nov 2007
    Posts
    1,195
    Not cool if it's not a bug. BTW, 24x AA is kicking 32x AA's ass badly according to the same article.
    Quote Originally Posted by LesGrossman View Post
    So for the last 3 months Nvidia talked about Uniengine, and then Uniengine, and more Uniengine, and finally Uniengine. And then they take the best 5 seconds from the whole benchmark run, make a graph, and then proudly show it everywhere.

  20. #70
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
    Quote Originally Posted by Shadov View Post
    OMG, a driver bug in a few-years-old game - shocker! Let's keep this thread alive so everyone can see it and people start buying the Nvidia Grillx480!
    wow that's so smart of you to detect that I receive a payment of $0.99 for each GF100 sold
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"

  21. #71
    XS_THE_MACHINE
    Join Date
    Jun 2005
    Location
    Denver
    Posts
    932
    Quote Originally Posted by ElSel10 View Post
    What a load of garbage. If this was Nvidia we were talking about, everybody and their brother would be castrating them.
    Did you realize you are talking to someone with an NVIDIA 8800GTS in their sig? You are no better than the people you are whining about. Besides, nvidia deserves every dig they get right now.

    Anyways... It looks like an unintentional bug to me, but needs to be fixed soon if it is.


    xtremespeakfreely.com

    Semper Fi

  22. #72
    Quote Originally Posted by annihilat0r View Post
    wow that's so smart of you to detect that I receive a payment of $0.99 for each GF100 sold
    No, but I'm rather sure you don't fully understand the term "unconfirmed driver bug" in 10.3a, which doesn't even have WHQL status, or in 10.3.

    Do I need to remind you of the last Nvidia drivers that grilled cards, or the ones before that disabled OC? I'm not even going into the various game bugs present on both sides. So again: why is there so much fuss about this?

  23. #73
    Xtreme Addict
    Join Date
    Jul 2007
    Posts
    1,488
    The timing of the driver release is suspicious, but it's not enough to prove anything. And proving intent isn't easy either.

    Either way, whether it was intentional or just an accident, it is a bug that needs to be fixed in future drivers. Lowering IQ is not acceptable.

  24. #74
    Xtreme Mentor
    Join Date
    Sep 2005
    Location
    Netherlands
    Posts
    2,693
    Quote Originally Posted by Cybercat View Post
    Convenient? How is it convenient? A driver glitch that potentially lowers IQ without any evidence of enhancing performance? Sounds more like a PR disaster.

    And since when was timing irrefutable proof that something nefarious is going on?

    How about we look at this logically and objectively, for a change?
    I don't really understand your post. I said it's convenient that this bug appears right now; I never said that that is proof of ATI doing it on purpose.

    Further, as I said in my post, we still don't know how much extra performance this bug gives.
    Seeing as this bug forces less processing, it would only be logical to assume that it increases performance.
    Time flies like an arrow. Fruit flies like a banana.
    Groucho Marx



    i know my grammar sux so stop hitting me

  25. #75
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    ATI/AMD did promise us that 10.3 would have the performance gains, and the fact that they let us have 10.3a and we had to wait an extra week for the official ones shows they were working hard to get everything in. It's obviously to be expected that there would be a bug or two.
