
Thread: Nvidia responds to Batman:AA

  1. #1
    Xtreme Owner Charles Wirth
    Join Date
    Jun 2002
    Location
    Las Vegas
    Posts
    11,653

    Nvidia responds to Batman:AA

    NVIDIA statement on Batman AA
    A representative of AMD recently claimed that NVIDIA interfered with anti-aliasing (AA) support for Batman: Arkham Asylum on AMD cards. They also claimed that NVIDIA’s The Way It’s Meant to be Played Program prevents AMD from working with developers for those games.
    Both of these claims are NOT true. Batman is based on Unreal Engine 3, which does not natively support anti-aliasing. We worked closely with Eidos to add AA and QA the feature on GeForce. Nothing prevented AMD from doing the same thing.
    Games in The Way It’s Meant to be Played are not exclusive to NVIDIA. AMD can also contact developers and work with them.
    We are proud of the work we do in The Way It’s Meant to be Played. We work hard to deliver kickass, game-changing features in PC games like PhysX, AA, and 3D Vision for games like Batman. If AMD wants to deliver innovation for PC games then we encourage them to roll up their sleeves and do the same.

    NVIDIA Developer Relations
    Intel 9990XE @ 5.1Ghz
    ASUS Rampage VI Extreme Omega
    GTX 2080 ti Galax Hall of Fame
    64GB Galax Hall of Fame
    Intel Optane
    Platimax 1245W

    Intel 3175X
    Asus Dominus Extreme
    GTX 1080 Ti Galax Hall of Fame
    96GB Patriot Steel
    Intel Optane 900P RAID

  2. #2
    And this explains why the Batman demo could run AA fine on an ATI Radeon after changing the device_id. NOT!

    I'm sorry, but this is a typical PR response they have given you, Fugger.
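
    For context on what "changing the device_id" works around: a Direct3D 9 title can read the adapter's PCI vendor/device ID at startup and only expose a feature to a whitelist of GPUs. The C++ sketch below is purely illustrative and is not code from Batman: AA or UE3; the IsInGameAAAllowed helper is a made-up name, and 0x10DE and 0x1002 are simply the PCI vendor IDs of NVIDIA and ATI/AMD.

        // Illustrative sketch only: how a D3D9 title could gate a feature (such as an
        // in-game MSAA option) on the reported GPU vendor, which is exactly what
        // spoofing the device/vendor ID works around. Link with d3d9.lib.
        #include <d3d9.h>
        #include <cstdio>

        bool IsInGameAAAllowed(IDirect3D9* d3d)
        {
            D3DADAPTER_IDENTIFIER9 id = {};
            if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
                return false;                   // cannot identify the adapter: hide the option

            const DWORD VENDOR_NVIDIA = 0x10DE; // PCI vendor IDs
            const DWORD VENDOR_ATI    = 0x1002;

            if (id.VendorId == VENDOR_NVIDIA)
                return true;                    // validated path: expose the option
            if (id.VendorId == VENDOR_ATI)
                return false;                   // option hidden even if the hardware could run it
            return false;
        }

        int main()
        {
            IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
            if (!d3d) return 1;
            std::printf("In-game AA option shown: %s\n",
                        IsInGameAAAllowed(d3d) ? "yes" : "no");
            d3d->Release();
            return 0;
        }

    A tool or registry tweak that reports an NVIDIA VendorId/DeviceId from a Radeon defeats a check like this without touching the AA code path itself, which is why the option could appear and work after the ID change.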

  3. #3
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,463
    Quote Originally Posted by Shadov View Post
    And this explains why the Batman demo could run AA fine on an ATI Radeon after changing the device_id. NOT!

    I'm sorry, but this is a typical PR response they have given you, Fugger.
    I agree.
    Bring... bring the amber lamps.

  4. #4
    Registered User
    Join Date
    Sep 2009
    Posts
    17
    Quote Originally Posted by jaredpace View Post
    I agree.
    So you are saying the engine does natively support AA, and nV paid the developer to remove it?

    I'm going to have to disagree on that one.

  5. #5
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,463
    Quote Originally Posted by OCguy View Post
    So you are saying the engine does natively support AA, and nV paid the developer to remove it?

    I'm going to have to disagree on that one.
    Yes, the engine supports AA. It's the type of scene-specific MSAA that is locked out. Check out this thread to learn more about it:
    http://forum.beyond3d.com/showthread.php?t=54786

    Then you can agree/disagree when you're done!
    Bring... bring the amber lamps.

  6. #6
    Xtreme Mentor
    Join Date
    Oct 2005
    Posts
    2,788
    Quote Originally Posted by Shadov View Post
    And this explains why the Batman demo could run AA fine on an ATI Radeon after changing the device_id. NOT!

    I'm sorry, but this is a typical PR response they have given you, Fugger.
    Regardless, it probably doesn't work exactly as it would with an ATI card. It's no secret that Unreal Engine 3 does not natively support AA (it uses deferred lighting) and the claim that being TWIMTBP certified locks out ATI from working with the developers is completely untrue.
    Asus Rampage II Gene | Core i7 920 | 6*2GB Mushkin 998729 | BFG GTX280 OCX | Auzentech X-Fi Forte | Corsair VX550
    —Life is too short to be bound by the moral, ethical and legal constraints imposed on us by modern day society.
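
    To illustrate the deferred-lighting point above: in Direct3D 9 a render target surface can be multisampled, but a render-target texture cannot, and only textures can be sampled by a deferred lighting pass. Resolving the multisampled surface averages the samples away before lighting runs, so ordinary MSAA buys nothing, which is the usual explanation for UE3's DX9 path shipping without it. The C++ sketch below is a minimal illustration of that API limitation under the assumption of a D3D9 deferred renderer; it is not UE3 or Batman: AA code, and CreateGBuffer is a made-up helper that expects an already-created IDirect3DDevice9.

        // Illustrative sketch of the D3D9 limitation behind "deferred lighting has no MSAA".
        // Link with d3d9.lib; 'dev' is assumed to be a valid IDirect3DDevice9.
        #include <d3d9.h>

        void CreateGBuffer(IDirect3DDevice9* dev, UINT w, UINT h)
        {
            // A multisampled render target CAN be created in D3D9...
            IDirect3DSurface9* msaaTarget = nullptr;
            dev->CreateRenderTarget(w, h, D3DFMT_A8R8G8B8,
                                    D3DMULTISAMPLE_4_SAMPLES, 0, FALSE,
                                    &msaaTarget, nullptr);

            // ...but a render-target texture CANNOT be multisampled, and only a texture
            // can be bound and sampled by the later lighting pass.
            IDirect3DTexture9* gbufTexture = nullptr;
            dev->CreateTexture(w, h, 1, D3DUSAGE_RENDERTARGET, D3DFMT_A8R8G8B8,
                               D3DPOOL_DEFAULT, &gbufTexture, nullptr);

            // The only bridge between the two is a resolve, which averages the samples
            // away before the lighting pass ever sees them, so per-sample G-buffer data
            // (and with it ordinary MSAA) is lost.
            IDirect3DSurface9* resolved = nullptr;
            gbufTexture->GetSurfaceLevel(0, &resolved);
            dev->StretchRect(msaaTarget, nullptr, resolved, nullptr, D3DTEXF_NONE);

            resolved->Release();
            gbufTexture->Release();
            msaaTarget->Release();
        }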

  7. #7
    Xtreme Member
    Join Date
    Mar 2008
    Posts
    473
    Quote Originally Posted by Shadov View Post
    And this explains why the Batman demo could run AA fine on an ATI Radeon after changing the device_id. NOT!

    I'm sorry, but this is a typical PR response they have given you, Fugger.
    Good first post. Respectful and I agree.
    Quote Originally Posted by Bobbylite View Post
    with great MHZ comes great responsibility
    CPU:Q6600 G0 @ 3.825
    Motherboard:Asus P5E X38
    Memory:2x2GB OCZ Reapers DDR2 1066
    Graphics Card:Asus 4850
    Hard Drive:2xSegate 500gb 32MB Cache raid0
    Power Supply:Xion 800W
    Case:3DAurora
    CPU cooling: D-tek Fuzion V2 (Quad insert removed)
    GPU cooling: mcw60
    Monitor:24" LG

  8. #8
    Xtreme Member
    Join Date
    Jan 2007
    Location
    Lancaster, UK
    Posts
    473
    I'm feeling for NVIDIA here. They put in the effort, and AMD did not. AMD is calling it unfair; I don't see how it is.

  9. #9
    Banned
    Join Date
    Sep 2009
    Location
    Face Clinic, Harley Street
    Posts
    282
    In a way this seems knee-jerk, and I would question how much access AMD has to these titles that NVIDIA is already 'working on'. I suspect every hurdle possible has been put in AMD's way, similar to the Intel fiasco in the EU.

  10. #10
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,656
    It's something called initiative; it's a part of everyday life. You can sit back and wait for people to cater to you, or you can go out and "make" things happen by taking the initiative.

    Don't blame the people putting forth effort and taking the initiative to make something happen just because others don't think they need to do the same, as if everyone is entitled.
    Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
    3x2048 GSkill pi Black DDR3 1600, Quadro 600
    PCPower & Cooling Silencer 750, CM Stacker 810

    Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
    3x4096 GSkill DDR3 1600, PNY 660ti
    PCPower & Cooling Silencer 750, CM Stacker 830

    AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
    2x2gb Patriot DDR2 800, PowerColor 4850
    Corsair VX450

  11. #11
    Xtreme Member
    Join Date
    Nov 2006
    Posts
    331
    Quote Originally Posted by highoctane View Post
    It's something called initiative; it's a part of everyday life. You can sit back and wait for people to cater to you, or you can go out and "make" things happen by taking the initiative.

    Don't blame the people putting forth effort and taking the initiative to make something happen just because others don't think they need to do the same, as if everyone is entitled.
    Yes, but I recall that AMD DID put in the effort to make it right, yet got no results. So AMD is not the one to blame.

  12. #12
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    As the discussion about NVIDIA's response has been moved into a new thread from the initial thread about this matter, I'm going to quote myself rather than rewrite what I said in the other one:

    Quote Originally Posted by Farinorco View Post
    So, what we knew:

    >UE3 does not have any AA method implemented, and Batman:AA is built on UE3.

    >NVIDIA has helped the developers of B:AA implement a custom AA filter.

    >That custom AA filter, developed in collaboration with Eidos, is not exclusive to NVIDIA, since it's been proven to work on other D3D cards.

    >Since NVIDIA helped to implement (or completely implemented) that AA filter, they feel entitled to not allow people with hardware from other IHVs to run their code, and to "encourage" other vendors to implement their own code if they want that feature running on their hardware, even when that code is perfectly compatible with any standard hardware.

    That brings us back to the main point of the discussion so far: is that right? Where do those practices lead us, the consumers?

    It would be the same thing if DiRT 2's DX11 features (or some of them) didn't work on NVIDIA DX11 hardware when they release any, since AMD has supported the development of those features.

    It would be the same thing if OpenCL acceleration of Havok only worked on AMD and Intel hardware for the same reason.

    The same goes for OpenCL acceleration in Bullet Physics.

    Great for all consumers. There was a time when software coded against standard interfaces could run on any hardware compliant with those standards. That was the whole point of those standards. There was a time... thanks, NVIDIA.

    And I would like to add something to the discussion. I see lots of the people defending both NVIDIA and Eidos basing their arguments on the fact that UE3 does not have AA:

    A game engine is nothing more than a library (code implementing functions) that packages up some of the work of making a game. It covers work that generalizes across games, and then you (the developer) write the rest of the game's code on top of (or under, or inside) the engine. That's one of the things about programming: you have some code, and you can use it as part of other programs, which add code over, under or inside it.

    Saying that UE3 doesn't support AA may be true (it was when it launched; I don't know about now, but I assume it's the same). But UE3 doesn't come with the Batman 3D model either, or probably the other shaders used specifically in that game. I'm sure the developers have written plenty of other code besides UE3. Heck, I'm sure that if you try to run UE3 directly without any additional code, you won't be playing Batman: AA.

    What I'm trying to say is that there's no difference between the custom AA, the models used, any custom shader, or any game-specific logic. It's all additional code that didn't come with the engine.

    If you think that helping a software company develop some code, and then not allowing other vendors' compatible hardware to run that code, is fine, then that's your opinion and it's a respectable one. But saying "this is a special case because it didn't come with the engine" says nothing.

    I simply don't like a world where some software is special to NVIDIA, some is special to ATI, some is special to Intel, some is special to AMD, and depending on the brands you choose you can run some things but not others.

    Supposedly, standardization was introduced to avoid exactly this, and it was a great benefit for all consumers. We are going back to a time we had left behind long ago. It's only my opinion, though.
    Last edited by Farinorco; 09-29-2009 at 01:58 PM.

  13. #13
    Xtreme Enthusiast
    Join Date
    Mar 2009
    Location
    Toronto ON
    Posts
    566
    Quote Originally Posted by Farinorco View Post
    So, what we knew:

    >UE3 does not have any AA method implemented, and Batman:AA is built on UE3.

    >NVIDIA has helped the developers of B:AA implement a custom AA filter.

    >That custom AA filter, developed in collaboration with Eidos, is not exclusive to NVIDIA, since it's been proven to work on other D3D cards.

    >Since NVIDIA helped to implement (or completely implemented) that AA filter, they feel entitled to not allow people with hardware from other IHVs to run their code, and to "encourage" other vendors to implement their own code if they want that feature running on their hardware, even when that code is perfectly compatible with any standard hardware.

    That brings us back to the main point of the discussion so far: is that right? Where do those practices lead us, the consumers?

    It would be the same thing if DiRT 2's DX11 features (or some of them) didn't work on NVIDIA DX11 hardware when they release any, since AMD has supported the development of those features.

    It would be the same thing if OpenCL acceleration of Havok only worked on AMD and Intel hardware for the same reason.

    The same goes for OpenCL acceleration in Bullet Physics.

    Great for all consumers. There was a time when software coded against standard interfaces could run on any hardware compliant with those standards. That was the whole point of those standards. There was a time... thanks, NVIDIA.

    And I would like to add something to the discussion. I see lots of the people defending both NVIDIA and Eidos basing their arguments on the fact that UE3 does not have AA:

    A game engine is nothing more than a library (code implementing functions) that packages up some of the work of making a game. It covers work that generalizes across games, and then you (the developer) write the rest of the game's code on top of (or under, or inside) the engine. That's one of the things about programming: you have some code, and you can use it as part of other programs, which add code over, under or inside it.

    Saying that UE3 doesn't support AA may be true (it was when it launched; I don't know about now, but I assume it's the same). But UE3 doesn't come with the Batman 3D model either, or probably the other shaders used specifically in that game. I'm sure the developers have written plenty of other code besides UE3. Heck, I'm sure that if you try to run UE3 directly without any additional code, you won't be playing Batman: AA.

    What I'm trying to say is that there's no difference between the custom AA, the models used, any custom shader, or any game-specific logic. It's all additional code that didn't come with the engine.

    If you think that helping a software company develop some code, and then not allowing other vendors' compatible hardware to run that code, is fine, then that's your opinion and it's a respectable one. But saying "this is a special case because it didn't come with the engine" says nothing.

    I simply don't like a world where some software is special to NVIDIA, some is special to ATI, some is special to Intel, some is special to AMD, and depending on the brands you choose you can run some things but not others.

    Supposedly, standardization was introduced to avoid exactly this, and it was a great benefit for all consumers. We are going back to a time we had left behind long ago. It's only my opinion, though.
    Some people are ONLY NVIDIA consumers, so I guess they don't care. I believe that if AMD started using the same tactics, the same people would be screaming bloody murder.
    Core i7-4930K LGA 2011 Six-Core - Cooler Master Seidon 120XL ? Push-Pull Liquid Water
    ASUS Rampage IV Black Edition LGA2011 - G.SKILL Trident X Series 32GB (4 x 8GB) DDR3 1866
    Sapphire R9 290X 4GB TRI-X OC in CrossFire - ATI TV Wonder 650 PCIe
    Intel X25-M 160GB G2 SSD - WD Black 2TB 7200 RPM 64MB Cache SATA 6
    Corsair HX1000W PSU - Pioner Blu-ray Burner 6X BD-R
    Westinghouse LVM-37w3, 37inch 1080p - Windows 7 64-bit Pro
    Sennheiser RS 180 - Cooler Master Cosmos S Case

  14. #14
    Xtreme Member
    Join Date
    Jan 2007
    Location
    Lancaster, UK
    Posts
    473
    About the hardware ID thing: NVIDIA made the code work on their GPUs, and therefore it was enabled. It was not checked on ATI hardware, and therefore it was not enabled. I don't think this is a case of artificially blocking anyone, but rather of Rocksteady not wanting to enable features on untested cards.

  15. #15
    Quote Originally Posted by Bodkin View Post
    About the hardware ID thing: NVIDIA made the code work on their GPUs, and therefore it was enabled. It was not checked on ATI hardware, and therefore it was not enabled. I don't think this is a case of artificially blocking anyone, but rather of Rocksteady not wanting to enable features on untested cards.
    Hmm, AA is such a new feature in gaming that everyone needs to hold the software engineers' hands to make it work... Right.

    Furthermore, can you imagine ATI telling the developers to go to hell when they requested hardware?

    Anyway, this is kind of getting silly. I understand new DirectX 11 effects that need evaluation and testing, but AA?
    Last edited by Shadov; 09-29-2009 at 11:53 AM.

  16. #16
    Xtreme Member
    Join Date
    Jan 2007
    Location
    Lancaster, UK
    Posts
    473
    Quote Originally Posted by Shadov View Post
    Hmm, AA is such a new feature in gaming that everyone needs to hold the programming team's hands to make it work...

    I'm sorry, but this is kind of getting silly. I understand new DirectX 11 effects, but AA?
    I'm playing devil's advocate here; I'm not any kind of fanboi. But I suppose the argument against this is that Unreal Engine 3 has always had AA problems, and if NVIDIA had not added their AA support, maybe Rocksteady would not have bothered at all.

  17. #17
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Link to response? Or was it an email/pm?
    Originally Posted by motown_steve
    Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.

    Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.

  18. #18
    Xtreme Addict
    Join Date
    Jun 2005
    Posts
    1,095
    Quote Originally Posted by LordEC911 View Post
    Link to response? Or was it an email/pm?
    Again, what are you implying? That Fugger lies to everyone and fabricates a response from NVidia because he's an NVidia fan?

    So far people have implied:

    1- NVidia paid money to Eidos, the makers of Batman:AA, to deliberately cripple ATI graphics cards' performance by not enabling the in-game AA option for ATI cards.
    2- Eidos, who depend on gamers with graphics cards of any brand, said: "Wow! What a great idea! Let's abandon half of our customers and risk being exposed, boycotted and possibly sued" and took the deal. Because, as we all know, only xtreme people like the ones in this forum could uncover such a dastardly plot and notice that a whole option is missing for one brand of graphics card. Eidos was sure that no one would ever notice.
    3- There is absolutely no other technical (or at least sensible) explanation for why the developers disabled in-game AA for ATI cards than to damage their performance. So no driver issues, no AA malfunction on ATI cards that would make leaving AA to CCC the better option, nor any other explanation. The only reason is to be pure evil and kill ATI.
    4- AMD knew this, but being the poor sissy boys constantly bullied by the evil giant NVidia, could not do anything about it. People who sue each other over the color of their pants just bent over and took it. No contacting Eidos and threatening to expose them, no filing complaints, nothing. Just a mere mention in a blog. So there is a new game being developed and ATI has no idea (never gets to view the code or test the game on their cards) that the AA option is disabled for their cards until that game hits the stores.

    Did you guys really, really think that NVidia could possibly get away with something like this?

  19. #19
    Xtreme Enthusiast
    Join Date
    Mar 2005
    Location
    North USA
    Posts
    670
    There is the DX Spec way to implement AA, and then there is this. This is one of those very rare moments in life where there is a right way to do something and a wrong way. This was done the wrong way. Story over.
    Asus P6T-DLX V2 1104 & i7 920 @ 4116 1.32v(Windows Reported) 1.3375v (BIOS Set) 196x20(1) HT OFF
    6GB OCZ Platinum DDR3 1600 3x2GB@ 7-7-7-24, 1.66v, 1568Mhz
    Sapphire 5870 @ 985/1245 1.2v
    X-Fi "Fatal1ty" & Klipsch ProMedia Ultra 5.1 Speaks/Beyerdynamic DT-880 Pro (2005 Model) and a mini3 amp
    WD 150GB Raptor (Games) & 2x WD 640GB (System)
    PC Power & Cooling 750w
    Homebrew watercooling on CPU and GPU
    and the best monitor ever made + a Samsung 226CW + Dell P2210 for eyefinity
    Windows 7 Utimate x64

  20. #20
    Xtreme Member
    Join Date
    Dec 2008
    Location
    Sweden
    Posts
    450
    Quote Originally Posted by Bodkin View Post
    About the hardware ID thing: NVIDIA made the code work on their GPUs, and therefore it was enabled. It was not checked on ATI hardware, and therefore it was not enabled. I don't think this is a case of artificially blocking anyone, but rather of Rocksteady not wanting to enable features on untested cards.
    Since it's not a DX11 title, it's quite strange they didn't test it on the Radeons. I mean, the 4000 series is only what, 15-18 months old now? Surely they didn't have time to test it!

    I'm getting the same vibe as with the reasons for disabling PhysX when an ATI card is present; there were three reasons (not in order of importance):

    - Unable to validate
    - Business reasons
    - Development expense

    This stinks, IMO. I can understand the first one, but the other two?!

    Source

  21. #21
    Xtreme Member
    Join Date
    Jan 2007
    Location
    Lancaster, UK
    Posts
    473
    Quote Originally Posted by marten_larsson View Post
    Since it's not a DX11 title, it's quite strange they didn't test it on the Radeons. I mean, the 4000 series is only what, 15-18 months old now? Surely they didn't have time to test it!

    I'm getting the same vibe as with the reasons for disabling PhysX when an ATI card is present; there were three reasons (not in order of importance):

    - Unable to validate
    - Business reasons
    - Development expense

    This stinks, IMO. I can understand the first one, but the other two?!

    Source
    I agree it's weird they did not test on ATI hardware, but with TWIMTBP, NVIDIA tests your game on hundreds of configurations for you with little effort from the developer. ATI, on the other hand, don't seem to do much at all.

  22. #22
    Xtreme Owner Charles Wirth
    Join Date
    Jun 2002
    Location
    Las Vegas
    Posts
    11,653
    The response was sent via email. I see that it was sent to several sites.
    Intel 9990XE @ 5.1Ghz
    ASUS Rampage VI Extreme Omega
    GTX 2080 ti Galax Hall of Fame
    64GB Galax Hall of Fame
    Intel Optane
    Platimax 1245W

    Intel 3175X
    Asus Dominus Extreme
    GTX 1080 Ti Galax Hall of Fame
    96GB Patriot Steel
    Intel Optane 900P RAID

  23. #23
    Xtreme Member
    Join Date
    Jan 2005
    Location
    No(r)way
    Posts
    452
    So what's next? ATI have to start paying developers to add colour to games?
    Obsolescence be thy name

  24. #24
    Xtreme Cruncher
    Join Date
    May 2007
    Posts
    339
    Quote Originally Posted by Frodin View Post
    So what's next? ATI have to start paying developers to add colour to games?
    This is exactly the point.

    How do they distinguish between graphical features that are assumed to work on all platforms (e.g. colour) and those they will only enable on tested, validated hardware (e.g. this implementation of AA)?

  25. #25
    Xtreme Addict
    Join Date
    Jul 2007
    Posts
    1,488
    Quote Originally Posted by Frodin View Post
    So what's next? ATI have to start paying developers to add colour to games?
    Nvidia: pioneering kickass, game-changing features like color and AA.

