
Thread: Nvidia responds to Batman:AA

  1. #26
    Xtreme Addict
    Join Date
    May 2008
    Location
    Ontario, Canada
    Posts
    1,051
    Let's not forget that Nvidia purposely forces CPU PhysX to run on only one core, so as to demonstrate how pathetically slow it is, how you NEED a GPU to do PhysX, and thus how you need to buy Nvidia cards.

    However, there is a hack out there that properly distributes PhysX across additional cores, the ones that sit 95% unused in most games, and it flows beautifully regardless of whether you have ATI or Nvidia.

    Get the hack, enjoy PhysX, and give Nvidia a big finger.
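
    For a sense of what that hack exploits, here is a minimal C++ sketch of the underlying idea: per-body integration parallelizes naturally, so the work can be spread across every hardware thread instead of being pinned to one core. PhysX's real scheduler is closed source, so every name and structure below is an illustrative assumption, not actual PhysX code.

    ```cpp
    // Illustrative only: not PhysX code. Shows why spreading per-body
    // physics work across cores scales, which is what the hack exploits.
    #include <algorithm>
    #include <thread>
    #include <vector>

    struct Body { float pos[3]; float vel[3]; };

    // Integrate one contiguous slice of bodies for a single timestep.
    void integrate_slice(std::vector<Body>& bodies, size_t begin, size_t end, float dt) {
        for (size_t i = begin; i < end; ++i)
            for (int axis = 0; axis < 3; ++axis)
                bodies[i].pos[axis] += bodies[i].vel[axis] * dt;
    }

    // Split the body list across all available hardware threads instead of
    // running the whole simulation on a single core.
    void step_parallel(std::vector<Body>& bodies, float dt) {
        unsigned n = std::max(1u, std::thread::hardware_concurrency());
        size_t chunk = (bodies.size() + n - 1) / n;
        std::vector<std::thread> workers;
        for (unsigned t = 0; t < n; ++t) {
            size_t begin = t * chunk;
            size_t end = std::min(bodies.size(), begin + chunk);
            if (begin >= end) break;
            workers.emplace_back(integrate_slice, std::ref(bodies), begin, end, dt);
        }
        for (auto& w : workers) w.join();
    }
    ```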
    Computer: i7 2600k @ 4.7Ghz | Asus P8P67 Evo | Corsair LP 16gb 1600CL9 | Silver Arrow | Essence STX | Crucial m4 128gb | Silverstone Raven 3|

    Video: 2x Sapphire 6950 Toxic 2gb @ 6970 Switch @ 880 / 1350 | Asus VG248QE |

    Audio: ODAC+O2 => JH|13 Pro | STX => ATH-AD700X / Audioengine A5

  2. #27
    Registered User
    Join Date
    Mar 2009
    Posts
    72
    Quote Originally Posted by OCguy View Post
    Looks like the outrage was all for nothing.

    The response was standard PR: deflect the argument and claim the moral high ground while promoting their own products. I didn't see anything explaining why ATI cards run the Batman demo with AA just fine when you change the device_id.
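
    That device_id observation implies a vendor gate somewhere in the game's startup path. The actual check in Batman: AA has never been published, so the C++/Direct3D 9 fragment below is only a hypothetical sketch of what such a gate could look like; the function name AllowIngameAA is invented, while 0x10DE and 0x1002 are the real PCI vendor IDs of NVIDIA and ATI/AMD.

    ```cpp
    // Hypothetical sketch, not the game's actual code: key a feature off
    // the adapter's PCI vendor ID as reported by Direct3D 9.
    #include <d3d9.h>

    bool AllowIngameAA(IDirect3D9* d3d) {
        D3DADAPTER_IDENTIFIER9 id = {};
        if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
            return false;
        // 0x10DE = NVIDIA, 0x1002 = ATI/AMD. Spoofing the reported ID
        // defeats exactly this kind of check, which is why the demo
        // reportedly enables AA on ATI cards after the change.
        return id.VendorId == 0x10DE;
    }
    ```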

  3. #28
    Xtreme Member
    Join Date
    Jan 2007
    Location
    Lancaster, UK
    Posts
    473
    Quote Originally Posted by Bojamijams View Post
    Let's not forget that Nvidia purposely forces CPU PhysX to run on only one core, so as to demonstrate how pathetically slow it is, how you NEED a GPU to do PhysX, and thus how you need to buy Nvidia cards.

    However, there is a hack out there that properly distributes PhysX across additional cores, the ones that sit 95% unused in most games, and it flows beautifully regardless of whether you have ATI or Nvidia.

    Get the hack, enjoy PhysX, and give Nvidia a big finger.
    Could not agree more. Nvidia are onto a winner with PhysX, as it is a big part of Batman, but if they keep it locked, it's screwed.
    CPU: Intel 2500k (4.8ghz)
    Mobo: Asus P8P67 PRO
    GPU: HIS 6950 flashed to Asus 6970 (1000/1400) under water
    Sound: Corsair SP2500 with X-Fi
    Storage: Intel X-25M g2 160GB + 1x1TB f1
    Case: Silverstone Raven RV02
    PSU: Corsair HX850
    Cooling: Custom loop: EK Supreme HF, EK 6970
    Screens: BenQ XL2410T 120hz


    Help for Heroes

  4. #29
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Ottawa, Canada
    Posts
    2,443
    It's like I've said many times: vote with your wallet. You as the consumer basically train how businesses act. If they get out of hand and the product doesn't get sold, then they shape up. If you bit$h but buy it anyway, you are really not doing anything but helping support their bad behaviour. I for one will not buy this game, as it is an obvious stepping stone toward hurting us as consumers and will help cause a rift in gaming. The line was drawn in the sand for me.

  5. #30
    Registered User
    Join Date
    Sep 2009
    Posts
    17
    Quote Originally Posted by lockee View Post
    The response was standard PR to deflect the argument, claim the moral high ground while promoting their own products. I didn't see anything explaining why ATI cards run the Batman demo with AA just fine when you change the device_id.
    Because nV put up the money to make sure AA works on nV cards for that particular game, in an engine that does not normally support it.

    Anyone who doesn't believe they have a right to reap the benefits of that investment should not ever attempt to operate their own business.

  6. #31
    Xtreme Member
    Join Date
    Dec 2008
    Location
    Sweden
    Posts
    450
    Quote Originally Posted by OCguy View Post
    Because nV put up the money to make sure AA works on nV cards for that particular game, in an engine that does not normally support it.

    Anyone who doesn't believe they have a right to reap the benefits of that investment should not ever attempt to operate their own business.
    Well, then ATI should just tell Codemasters to disable DX11 on Nvidia hardware? And DX10.1 support in HAWX? This is in no way beneficial to consumers and therefore NOT something we should accept!

  7. #32
    PI in the face
    Join Date
    Nov 2008
    Posts
    3,083
    Quote Originally Posted by OCguy View Post
    Because nV put up the money to make sure AA works on nV cards for that particular game, in an engine that does not normally support it.

    Anyone who doesn't believe they have a right to reap the benefits of that investment should not ever attempt to operate their own business.
    ^speaks the truth

    It's like complaining about someone who just made 100k on the stock market while you sat on your hands. You could have invested and taken the time, but you didn't, so you don't reap the rewards.
    Quote Originally Posted by L0ud View Post
    So many opinions and so few screenshots

  8. #33
    no sleep, always tired
    Join Date
    Oct 2006
    Location
    Iowa, USA
    Posts
    1,832
    Quote Originally Posted by FUGGER View Post
    I downloaded the demo and I must admit the PhysX really makes the game's visual experience complete. Without it, it would be rather bland.
    I have that same feeling with UT3 - the visual experience is just not the same w/o PhysX... I would certainly not be as happy with the game had UT3 not implemented PhysX at all...

  9. #34
    Registered User
    Join Date
    Sep 2009
    Posts
    17
    Quote Originally Posted by marten_larsson View Post
    Well, then ATI should just tell Codemasters to disable DX11 on Nvidia hardware? And DX10.1 support in HAWX? This is in no way beneficial to consumers and therefore NOT something we should accept!

    If there were a game that was going to be DX10 only, and ATi put up the $$ and worked with the developer to make sure that DX11 ran on their cards, I would not see a problem with it.

    Of course there is a benefit to the consumer. There wasn't going to be in-game AA at all, and now there is.

  10. #35
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    So this is what we know:

    - the UE3 engine -> problems with regular AA (everyone knows that);

    - nVidia worked with the devs so their customers could use AA in B:AA;

    - their AA method is compatible with ATI cards (tricking the game into thinking you are using an nVidia card while actually using an ATI one makes AA work; I don't know if it's flawless, however...);

    - apparently, AMD was expecting to have AA enabled on their hardware, but since AA was brought in by nVidia+devs, it was only available on nVidia hardware.

    If you ask me, I think AMD should publicly apologize to nVidia+devs and immediately start working with the devs to make AA available on their hardware, in their own interest: if this situation gave them the will to complain, then it should also give them the will to fix the problem.
    I mean, in the first place, AMD should have contacted the devs to find out why AA was disabled in the game instead of going public with these assumptions...

    If it were some other company, heads would roll...
    Are we there yet?

  11. #36
    Xtreme Enthusiast
    Join Date
    Feb 2009
    Location
    Hawaii
    Posts
    611
    Isn't this game on the 360? Doesn't it have AA on the 360? Who made the GPU for the 360?

  12. #37
    Quote Originally Posted by TheGoat Eater View Post
    I have that same feeling with UT3 - the visual experience is just not the same w/o PhysX... I would certainly not be as happy with the game had UT3 not implemented PhysX at all...
    Personally, I would be happy if Nvidia had never touched the game; then we could have the same effects running optimized on a CPU, without special workarounds:

    http://www.youtube.com/watch?v=AUOr4cFWY-s (Batman with physics on a CPU)

  13. #38
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    Quote Originally Posted by OCguy View Post
    Of course there is a benefit to the consumer. There wasn't going to be in-game AA at all, and now there is.
    This.
    Are we there yet?

  14. #39

    ...

    Quote Originally Posted by Luka_Aveiro View Post
    This.
    Ok, I vote that the Battleforge DirectX 11 code path should be ATI exclusive, since Nvidia didn't even have hardware to support the devs.

    Like it as a consumer?

  15. #40
    Xtreme Member
    Join Date
    Dec 2008
    Location
    Sweden
    Posts
    450
    Quote Originally Posted by OCguy View Post
    If there were a game that was going to be DX10 only, and ATi put up the $$ and worked with the developer to make sure that DX11 ran on their cards, I would not see a problem with it.

    Of course there is a benefit to the consumer. There wasn't going to be in-game AA at all, and now there is.
    You mean like Dirt 2? I accept that PhysX doesn't run on ATI hardware; that's not something to complain about (though disabling it because an ATI card is present is). But locking out something that obviously works(?) is another matter. If Nvidia want to keep PC gaming alive, this is not the way to do it...

  16. #41
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by Darakian View Post
    Isn't this game on the 360? Doesn't it have AA on the 360? Who made the GPU for the 360?
    No AA on the 360 version...Trust me, I know.

    Quote Originally Posted by Shadov View Post
    Ok, I vote that the Battleforge DirectX 11 code path should be ATI exclusive, since Nvidia didn't even have hardware to support the devs.

    Like it as a consumer?
    I doubt anyone would care... Battleforge isn't a good game to begin with.

    Although, saying NVidia hasn't put out DX11 developer cards is a grave mistake....
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  17. #42
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,463
    Quote Originally Posted by OCguy View Post
    So you are saying the engine does natively support AA, and nV paid the developer to remove it?

    I'm going to have to disagree on that one.
    Yes, the engine supports AA; it's the type of scene-specific MSAA that is locked out (see the sketch below). Check out this thread to learn more about it:
    http://forum.beyond3d.com/showthread.php?t=54786

    Then you can agree/disagree when you're done!
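
    For background, here is a minimal C++/Direct3D 9 sketch of how a title ordinarily requests hardware MSAA on its back buffer. This is generic D3D9 usage with an invented helper name (RequestMsaa), not Batman: AA's actual code path; the point is that the API side of MSAA is routine, so the dispute is about the game-side toggle, not the API.

    ```cpp
    // Generic D3D9 MSAA setup, not the game's code: verify 4x MSAA support
    // for the chosen back-buffer format, then request it in the present
    // parameters (assumes pp.BackBufferFormat and pp.Windowed are set).
    #include <d3d9.h>

    void RequestMsaa(IDirect3D9* d3d, D3DPRESENT_PARAMETERS& pp) {
        DWORD quality_levels = 0;
        if (SUCCEEDED(d3d->CheckDeviceMultiSampleType(
                D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, pp.BackBufferFormat,
                pp.Windowed, D3DMULTISAMPLE_4_SAMPLES, &quality_levels))) {
            pp.MultiSampleType    = D3DMULTISAMPLE_4_SAMPLES;
            pp.MultiSampleQuality = quality_levels - 1; // must stay below the reported count
            pp.SwapEffect         = D3DSWAPEFFECT_DISCARD; // required with multisampling
        }
    }
    ```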
    Bring... bring the amber lamps.

  18. #43
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by OCguy View Post
    Because nV put up the money to make sure AA works on nV cards for that particular game, in an engine that does not normally support it.

    Anyone who doesn't believe they have a right to reap the benefits of that investment should not ever attempt to operate their own business.
    NV are reaping the benefits of that investment by getting AA, whether it runs on ATI cards or not.

    But what you're really saying is that only NV should benefit from it.

    Then you will be OK with ATI ripping the guts out of Dirt 2 when it runs on NV cards.

  19. #44
    Xtreme Addict
    Join Date
    May 2008
    Location
    Ontario, Canada
    Posts
    1,051
    Most of us are going to get a crack that lets Batman:AA have AA with ATI... it's been done before and it'll be done again.
    Computer: i7 2600k @ 4.7Ghz | Asus P8P67 Evo | Corsair LP 16gb 1600CL9 | Silver Arrow | Essence STX | Crucial m4 128gb | Silverstone Raven 3|

    Video: 2x Sapphire 6950 Toxic 2gb @ 6970 Switch @ 880 / 1350 | Asus VG248QE |

    Audio: ODAC+O2 => JH|13 Pro | STX => ATH-AD700X / Audioengine A5

  20. #45
    Xtreme Addict
    Join Date
    Jan 2007
    Location
    Michigan
    Posts
    1,785
    Who knows if Batman is even a good comparison for what AMD is talking about... I mean, titles like AA3 had an AA checkbox that doesn't work, again based on the Unreal 3 engine.

    So I believe AMD is telling the truth, because it does seem that if a developer is working with GPU company A *and being paid for it*, they're not exactly going to be forthcoming about working out terms with GPU company B. Even though Nvidia is putting in the time and effort, they are supposedly paying the development costs to the game developers... But I seem to remember ATi did the same thing back before AMD bought them, with Half-Life 2...
    Current: AMD Threadripper 1950X @ 4.2GHz / EK Supremacy/ 360 EK Rad, EK-DBAY D5 PWM, 32GB G.Skill 3000MHz DDR4, AMD Vega 64 Wave, Samsung nVME SSDs
    Prior Build: Core i7 7700K @ 4.9GHz / Apogee XT/120.2 Magicool rad, 16GB G.Skill 3000MHz DDR4, AMD Saphire rx580 8GB, Samsung 850 Pro SSD

    Intel 4.5GHz LinX Stable Club

    Crunch with us, the XS WCG team

  21. #46
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    Quote Originally Posted by Shadov View Post
    Ok, I vote that the Battleforge DirectX 11 code path should be ATI exclusive, since Nvidia didn't even have hardware to support the devs.

    Like it as a consumer?
    DirectX is a bit different from AA, AF, vSync, triple buffering, Edge Detect AA, tessellation, etc...

    DirectX is an API; the rest are features that each GPU vendor may or may not support, and each game engine may or may not support (a rough sketch of the distinction follows below).

    Able to see the difference here?
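
    To make that API-vs-feature split concrete, here is a small generic C++/Direct3D 9 illustration, not tied to any particular game: every card answers the same GetDeviceCaps call through the common API, but the capabilities reported differ per vendor and GPU.

    ```cpp
    // Generic illustration: one API, vendor-specific feature answers.
    #include <d3d9.h>
    #include <cstdio>
    #pragma comment(lib, "d3d9.lib")

    int main() {
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;

        D3DCAPS9 caps = {};
        d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

        // Same API call on every card; the answers differ per vendor/GPU.
        std::printf("Max anisotropy: %lu\n", caps.MaxAnisotropy);
        std::printf("Pixel shader version: %#lx\n", caps.PixelShaderVersion);

        d3d->Release();
        return 0;
    }
    ```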


    Edit: I was watching this video http://www.youtube.com/watch?v=6GyKCM-Bpuw and the PhysX effects are really what makes the game shine.

    I wish these effects were available from any hardware vendor; someone has to do something about it...
    Last edited by Luka_Aveiro; 09-29-2009 at 12:43 PM.
    Are we there yet?

  22. #47
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,463
    Rocksteady (Batman devhouse) already made a statement about the demo:

    The form of anti-aliasing implemented in Batman: Arkham Asylum uses features specific to NVIDIA cards. However, we are aware some users have managed to hack AA into the demo on ATI cards. We are speaking with ATI/AMD now to make sure it’s easier to enable some form of AA in the final game.
    http://forum.beyond3d.com/showthread.php?t=54786&page=2
    Bring... bring the amber lamps.

  23. #48
    Xtreme Member
    Join Date
    Dec 2008
    Location
    Bayamon,PR
    Posts
    257
    Quote Originally Posted by Luka_Aveiro View Post
    So this is what we know:

    - the UE3 engine -> problems with regular AA (everyone knows that);

    - nVidia worked with the devs so their customers could use AA in B:AA;

    - their AA method is compatible with ATI cards (tricking the game into thinking you are using an nVidia card while actually using an ATI one makes AA work; I don't know if it's flawless, however...);

    - apparently, AMD was expecting to have AA enabled on their hardware, but since AA was brought in by nVidia+devs, it was only available on nVidia hardware.

    If you ask me, I think AMD should publicly apologize to nVidia+devs and immediately start working with the devs to make AA available on their hardware, in their own interest: if this situation gave them the will to complain, then it should also give them the will to fix the problem.
    I mean, in the first place, AMD should have contacted the devs to find out why AA was disabled in the game instead of going public with these assumptions...

    If it were some other company, heads would roll...
    Why should AMD apologize? If anything, Nvidia should, for depriving consumers and monopolizing the market so that it becomes "if you want all the eye candy and PhysX, get Nvidia". It's like a kick in the face for us consumers. They should make it compatible with all hardware. I don't care about PhysX; what I do care about is a game not running at half of its potential just because I'm not using Nvidia, and the same thing goes if ATI did it. Consumer-wise it's not good, and it's not fair on our pockets.

  24. #49
    Xtreme Member
    Join Date
    Dec 2008
    Location
    Bayamon,PR
    Posts
    257
    Quote Originally Posted by jaredpace View Post
    Rocksteady (Batman devhouse) already made a statement about the demo:



    http://forum.beyond3d.com/showthread.php?t=54786&page=2
    Some form of AA? rofl, they should not bother at all if it's not 100%.

  25. #50
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Quote Originally Posted by Bojamijams View Post
    Most of us are going to get a crack that lets Batman:AA have AA with ATI... it's been done before and it'll be done again.
    That might be a bit difficult, for the same reason that the PhysX hack only worked on the demo. They'd likely have to circumvent SecuROM, which is far from an impossible task, but it would no doubt break the game's EULA or something.
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.
