Page 16 of 18
Results 376 to 400 of 444

Thread: Nvidia responds to Batman:AA

  1. #376
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by annihilat0r View Post
    You fail to make any sense in regards to your previous post: "they shouldn't involve 'me' in their tricks"

    Sorry, but that sounds like there might have been tricks in the past that don't involve 'you', which sounds funny.
    I'm not saying there haven't been, but that doesn't mean I have to like it or lump it.
    But I have yet to see anything so obviously in people's faces before, in stuff that interests me.

    You're too busy analysing the exact words used instead of getting the message.

    I don't have the Batman game, so it doesn't affect "me"; it can really only mean others, and others being the consumers, because this is all about what's better for the consumers.
    Last edited by Final8ty; 10-04-2009 at 09:15 AM.

  2. #377
    Xtreme Addict
    Join Date
    Jul 2007
    Posts
    1,488
    Quote Originally Posted by trinibwoy View Post
    Question for all the bleeding hearts. Nvidia spends far more money on developer support. You think ATI consumers should benefit equally. Why? What business model supports that strategy? Is ATI a poor orphan that needs to suckle from Nvidia's tits?
    Simple. ATI and Nvidia owners pay the same amount of money for the game. But ATI customers don't get all the features. And that's not because of an ATI limitation (like no PhysX), but simply because of a choice by the developers not to put forth the effort.

    Quote Originally Posted by trinibwoy View Post
    In what way? The only people pissed off are current owners of non-Nvidia cards. How does that hurt them?
    I don't have any ATI cards at the moment. But I'm not going to buy a game that I couldn't get the same enjoyment out of regardless of my future hardware choices.

  3. #378
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    The way it's meant to be played is nothing more than canned, non-triggered events that use special hooks to make ancillary things look more realistic. PhysX inside games is a hoax. It's a marketing tool, nothing more. PhysX isn't about the game or advancing gameplay with real in-game physics. So even those who consider themselves fanbois of this technology don't understand Nvidia's business model, or what they are arguing about with TWIMTBP.

    The Batman AA thing is just another form of Nvidia trying to sell their products based on marketing PERCEPTION, or to make those who bought an Nvidia card feel better about themselves... which is why they bought the cards in the first place. Nvidia's slogan means nothing other than Nvidia's own belief that you should play all the games on their cards. So they convince a game maker to run specialized AA instead of just AA... because Nvidia tossed them some extra $$ to slap their logo on Batman's BOX and marketing rights of TWIMTBP for their title.
    Last edited by Xoulz; 10-04-2009 at 11:02 AM.

  4. #379
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Ottawa, Canada
    Posts
    2,443
    Quote Originally Posted by trinibwoy View Post
    Let the world run you? Heh, are you on the wrong forum? This is a thread about a graphics card and a video game so you can save the grandiose proclamations for something else.
    That was grandiose? Here, let me put it this way then... You gladly accept what others give and tell you without a fight. It was just a nicer way of saying you're a pushover, where they can give you a perceptual reason in their favor and it looks good to you.

    Quote Originally Posted by trinibwoy View Post
    Your rights as a consumer are no way impinged by what Nvidia is doing. Do you have a right to AA in Batman? You right is to spend your money on whatever products you want and for those products to work as advertised. You have no right to dictate what those products should be.
    Rights as a consumer? Naw, I was talking about a right to protest what people are doing. That's a right too! You don't have to keep taking things with a smile, you know. Sometimes the spine can be picked up from your socks and things can happen. Like a company changing their attitude to better the consumer pool instead of dividing it into smaller pieces.

    Quote Originally Posted by trinibwoy View Post
    The only thing I've seen Nvidia do so far that may be illegal is to disable Ageia PPUs when AMD cards are installed.
    Well at least you see something.

  5. #380
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by Xoulz View Post
    The way it's meant to be played is nothing more than canned, non-triggered events that use special hooks to make ancillary things look more realistic. PhysX inside games is a hoax. It's a marketing tool, nothing more. PhysX isn't about the game or advancing gameplay with real in-game physics. So even those who consider themselves fanbois of this technology don't understand Nvidia's business model, or what they are arguing about with TWIMTBP.

    The Batman AA thing is just another form of Nvidia trying to sell their products based on marketing PERCEPTION, or to make those who bought an Nvidia card feel better about themselves... which is why they bought the cards in the first place. Nvidia's slogan means nothing other than Nvidia's own belief that you should play all the games on their cards. So they convince a game maker to run specialized AA instead of just AA... because Nvidia tossed them some extra $$ to slap their logo on Batman's BOX and marketing rights of TWIMTBP for their title.
    How can you judge the quality of an API based on games? With that perspective you probably think DX10, DX10.1 and DX11 suck. You have never developed any software with PhysX, so I don't think you have any room to say it sucks, and it is currently the only way to get physics on a GPU. OpenCL and DirectCompute are young and will gain steam later, or at least when ATI can get a compiler working. Contrary to your opinion, GPU PhysX does improve the quality, although not worth the cost to some. Current GPUs are so powerful games can't really use them to the fullest extent. As for your view of marketing, you might want to change that too. Do you know who decided to market their CPUs in the early '90s? Intel. Look at them now: 80% market share and Nehalem. You can accuse Nvidia of overbranding, but isn't that true of almost everything else in this world? You just rant about AA but you really don't understand it at all. ATI and Nvidia have their own texture filtering algorithms. There is no such thing as specialized AA.
    Rege said "we had absolutely no time to go 'oh yeah, how can we screw ATI by the way?' Seriously, nobody ever has time to think about those kinds of things. In this situation, had we enabled something that was not tested on ATI GPUs [and broke the game as a result], there were a number of things that could have happened. The worst thing from my perspective is that the developers won't want our help in the future because we broke their game."

    btw this is pretty cool.

  6. #381
    Xtreme Enthusiast
    Join Date
    Mar 2007
    Location
    Portsmouth, UK
    Posts
    963
    Quote Originally Posted by Final8ty View Post
    NV has been more about marketing than anything else.
    The AA issue is marketing on NV's part, as if somehow AA were an emerging technology.
    You see review sites all over the web saying that PhysX is being used more as a marketing tool than making any real physics changes to the games it is used in.

    The DX11 titles in development are being made with AMD's involvement & sometimes AMD's money as well.

    It looks like you have fallen more for the marketing than the real facts.
    If marketing can help a developer get a custom-made AA mode into a game that otherwise wouldn't have had it, then I'd say there's actual work going on. I never believe marketing; it's always a case of "lies, damn lies & statistics". The fact is AMD haven't done the right thing, which is to not just complain but work to improve their situation. Havok hasn't exactly set the world on fire and Bullet is too new to have had any impact yet.

    Where are those DX11 titles then? Right now they are more "coming soon" than "on sale now", which is what matters in business.

  7. #382
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by Chumbucket843 View Post
    How can you judge the quality of an API based on games? With that perspective you probably think DX10, DX10.1 and DX11 suck. You have never developed any software with PhysX, so I don't think you have any room to say it sucks, and it is currently the only way to get physics on a GPU. OpenCL and DirectCompute are young and will gain steam later, or at least when ATI can get a compiler working. Contrary to your opinion, GPU PhysX does improve the quality, although not worth the cost to some. Current GPUs are so powerful games can't really use them to the fullest extent. As for your view of marketing, you might want to change that too. Do you know who decided to market their CPUs in the early '90s? Intel. Look at them now: 80% market share and Nehalem. You can accuse Nvidia of overbranding, but isn't that true of almost everything else in this world? You just rant about AA but you really don't understand it at all. ATI and Nvidia have their own texture filtering algorithms. There is no such thing as specialized AA.


    btw this is pretty cool.


    Sorry, physical objects within games were around before Ageia or Nvidia's overhyped PhysX.

    Battlefield 2142 has physics; it adds to the game and anyone who plays it has use of it. Nothing new or even worth noting... as many games have physics.

    You don't have to have PhysX.



    Secondly, you make my point about anti-aliasing. Cards should smooth the jaggies of whatever scene they are displaying. Why does the game itself have to work specifically with a certain vendor to make it work FASTER?

    Because it's an alliance between the two, to sell more games and create a fake need for a certain product. AA isn't some dark secret. There is no such thing as specialized AA? Then why are Batman and Nvidia trying to market themselves as being special?




    Quote Originally Posted by Chumbucket843 View Post

    btw this is pretty cool.
    That's all "fluff", not real physics... please. That paper is superficial to gameplay!
    Last edited by Xoulz; 10-04-2009 at 01:28 PM.

  8. #383
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Quote Originally Posted by trinibwoy View Post
    Perhaps, but I can guarantee that when I use it, it's only in response to actual naivete.
    Not in this case, since you have said I'm naive for saying something that I have not said. So...

    Quote Originally Posted by Solus Corvus View Post
    I don't have any ATI cards at the moment. But I'm not going to buy a game that I couldn't get the same enjoyment out of regardless of my future hardware choices.
    +1. But then you have to think about all those people who are "NVIDIA users" (their future hardware choices don't include the possibility of buying a non-NVIDIA card).

    I suppose what the rest of the world should do is not buy that game, and make clear to everybody that, unless they are absolutely sure about having only NVIDIA hw from now on, they shouldn't either.

  9. #384
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by DeathReborn View Post
    If marketing can help a developer get a custom-made AA mode into a game that otherwise wouldn't have had it, then I'd say there's actual work going on. I never believe marketing; it's always a case of "lies, damn lies & statistics". The fact is AMD haven't done the right thing, which is to not just complain but work to improve their situation. Havok hasn't exactly set the world on fire and Bullet is too new to have had any impact yet.

    Where are those DX11 titles then? Right now they are more "coming soon" than "on sale now", which is what matters in business.
    The DX11 titles not being on sale right now does not change the point that is being made, nor does it change the fact of ATI/AMD's money & involvement in them with DX11, & that ATI/AMD will not have any features turned off on NV DX11 hardware, even though NV paid no money or put in any help getting DX11 supported in them.

    Talk about double standards.


    Might as well say: why is anyone even talking about NVx300? Right now it's more "coming soon" than "on sale now", which is what matters in business.
    Last edited by Final8ty; 10-04-2009 at 01:41 PM.

  10. #385
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by Xoulz View Post
    Sorry, physical objects within games were around before Ageia or Nvidia's overhyped PhysX.
    Yes, and GPU-accelerated physics brings a new level of realism to games. Not saying that PhysX is the one that will do it, but it's a start.
    Battlefield 2142 has physics; it adds to the game and anyone who plays it has use of it. Nothing new or even worth noting... as many games have physics.

    You don't have to have PhysX.
    Of course you don't have to have it.
    Secondly, you make my point about anti-aliasing. Cards should smooth the jaggies of whatever scene they are displaying. Why does the game itself have to work specifically with a certain vendor to make it work FASTER?
    You clearly didn't read the quote. If Nvidia allowed it to run on ATI, they would have no control over the driver, which could end up making it not work if ATI released newer drivers. Besides, that statement you made is not true. If you knew a thing or two about AA, then you would know a mipmap at its native resolution has no artifacts; when you scale it up or down, they appear.
    Because it's an alliance between the two, to sell more games and create a fake need for a certain product. AA isn't some dark secret. There is no such thing as specialized AA? Then why are Batman and Nvidia trying to market themselves as being special?
    There is no need for PhysX; it's just an addition to owning a GeForce card. If there were a need for PhysX, then no one would buy ATI cards. Unreal Engine 3 does not have in-game AA abilities. Nvidia put it in to allow their users to achieve better quality graphics.


    That's all "fluff", not real physics... it's superficial to gameplay.
    How is that not real physics? Making a whole destructible world like a sandbox takes a lot of effort and time. Just because it's cloth physics doesn't mean it's not real.
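The exchange above keeps circling one technical fact: UE3's deferred shading rules out ordinary MSAA, so any AA has to be added as a custom post-process over the rendered image. As a purely illustrative sketch of that general idea (the actual filter shipped in Batman:AA is not public, and real implementations run as GPU shaders), a minimal edge-blur pass might look like this: find pixels where luminance jumps sharply, then blend them with their neighbours.

```python
# Minimal sketch of a post-process edge-blur AA filter: the general kind of
# custom filter a deferred renderer (like UE3's) needs, since ordinary MSAA
# doesn't combine with its G-buffer passes. Purely illustrative.

def edge_aa(image, threshold=0.2):
    """Blend each interior pixel with its neighbours wherever luminance jumps."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]          # copy; border pixels stay untouched
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighbours = [image[y - 1][x], image[y + 1][x],
                          image[y][x - 1], image[y][x + 1]]
            contrast = max(neighbours) - min(neighbours)
            if contrast > threshold:         # an aliased edge runs through here
                out[y][x] = (image[y][x] + sum(neighbours)) / 5.0
    return out

# A hard vertical edge (two dark columns, two bright) gets softened into a ramp:
softened = edge_aa([[0.0, 0.0, 1.0, 1.0] for _ in range(4)])
```

The point of the sketch is only that such a filter is plain image-space code over standard APIs: nothing in it depends on which vendor's GPU produced the pixels.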

  11. #386
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
    Play HL2. Play Batman:AA.
    Physics in HL2 brings something to the user experience. PhysX in Batman:AA is just cosmetics.
    You can't play HL2 without it. You can play Batman:AA without it.

  12. #387
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by Xoulz View Post
    That's all "fluff", not real physics... please. That paper is superficial to gameplay!
    Please define "real" physics. And then define what it is that graphics cards do. If you answer both questions you'll realize how silly that statement is.

    Quote Originally Posted by Sadasius View Post
    You gladly accept what others give and tell you without a fight. It was just a nicer way of saying you're a pushover, where they can give you a perceptual reason in their favor and it looks good to you.

    Rights as a consumer? Naw I was talking about a right to protest what people are doing.
    Barking up the wrong tree there buddy. None of the above applies to me as I do own Nvidia hardware. So what exactly is there for me to cry and moan about? I should join my ATi owning brethren in the fight for graphics equality due to some false sense of self-righteousness?

  13. #388
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    PhysX is a nice thing after all; if it were given to you for free, I don't think you'd say no.
    But the way Nvidia introduced it is wrong IMO. It is legal ofc, but they tricked developers into using Nvidia tools instead of their own, so only half of the game owners can enjoy it to the fullest. So they didn't just offer something new; no, they obviously screwed over people using the competition's products. Like 75% of the stuff that can be done WITHOUT PhysX gets done WITH it, to hurt the competition.
    But the latest actions regarding disabling PhysX for gamers who PAID for it are outrageous and there is NO excuse for such crap.
    Last edited by zalbard; 10-04-2009 at 03:39 PM.
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  14. #389
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by Chumbucket843 View Post
    Yes, and GPU-accelerated physics brings a new level of realism to games. Not saying that PhysX is the one that will do it, but it's a start.

    Of course you don't have to have it.

    You clearly didn't read the quote. If Nvidia allowed it to run on ATI, they would have no control over the driver, which could end up making it not work if ATI released newer drivers. Besides, that statement you made is not true. If you knew a thing or two about AA, then you would know a mipmap at its native resolution has no artifacts; when you scale it up or down, they appear.

    There is no need for PhysX; it's just an addition to owning a GeForce card. If there were a need for PhysX, then no one would buy ATI cards. Unreal Engine 3 does not have in-game AA abilities. Nvidia put it in to allow their users to achieve better quality graphics.




    BTW, Nvidia didn't put the anti-aliasing into Batman; the developers did. Nvidia is not a game developer or publisher.



    How is that not real physics? Making a whole destructible world like a sandbox takes a lot of effort and time. Just because it's cloth physics doesn't mean it's not real.


    Bro, you are mistaken!


    PhysX needs EXTRA video cards to equal what the CPU can already do. We don't need superficial paper floating around or ancillary breakable tiles in our games.

    We want real physical objects, such as what the idle cores of modern CPUs can deliver. PhysX is so far behind in that respect, it's ridiculous.


    Here, please witness what no single Nvidia card can do. Superficial fluff such as animated glass breaking, or paper on the ground, is moot, trivial and not worth mentioning. It's there as a novelty, to sell video cards to those who buy into the pseudo marketing hype!
    Last edited by Xoulz; 10-04-2009 at 04:19 PM.

  15. #390
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    Quote Originally Posted by Chumbucket843 View Post
    How can you judge the quality of an API based on games? With that perspective you probably think DX10, DX10.1 and DX11 suck. You have never developed any software with PhysX, so I don't think you have any room to say it sucks, and it is currently the only way to get physics on a GPU. OpenCL and DirectCompute are young and will gain steam later, or at least when ATI can get a compiler working. Contrary to your opinion, GPU PhysX does improve the quality, although not worth the cost to some. Current GPUs are so powerful games can't really use them to the fullest extent. As for your view of marketing, you might want to change that too. Do you know who decided to market their CPUs in the early '90s? Intel. Look at them now: 80% market share and Nehalem. You can accuse Nvidia of overbranding, but isn't that true of almost everything else in this world? You just rant about AA but you really don't understand it at all. ATI and Nvidia have their own texture filtering algorithms. There is no such thing as specialized AA.


    btw this is pretty cool.
    Really? I have to disable AA to run Oblivion at a crappy framerate with a bunch of mods. There are a ton of mods for it that I can't even run with what I'm using now as well. Same deal with Morrowind. Darkplaces can bring my GTX280 to its knees. These are just some examples of older games that can stress my 280, and it clocks as well as most GTX285s from what I hear. I could use more GPU power.

    I understand your rant about different texture filtering algorithms, but why is it that when you change the vendor ID on an ATI card, AA works in this game?
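The vendor-ID observation above is the crux of the whole thread, and it can be illustrated with a toy model. This is hypothetical logic — the game's actual source is not public — but the PCI vendor IDs are real: 0x10DE is NVIDIA's, 0x1002 is ATI/AMD's. If the option is gated purely on the reported ID, spoofing that ID is enough to unlock it.

```python
# Toy model of the kind of vendor check the thread describes. The gating logic
# here is an assumption for illustration; only the PCI vendor IDs are factual.

NVIDIA_VENDOR_ID = 0x10DE
ATI_VENDOR_ID = 0x1002

def aa_option_enabled(adapter_vendor_id: int) -> bool:
    """Offer the in-game AA option only when an NVIDIA adapter is reported."""
    return adapter_vendor_id == NVIDIA_VENDOR_ID

# An ATI card reporting its real ID finds the option greyed out...
locked = aa_option_enabled(ATI_VENDOR_ID)        # False
# ...but spoof the reported ID (the "rename the card" trick) and it unlocks.
unlocked = aa_option_enabled(NVIDIA_VENDOR_ID)   # True
```

That the trick works at all is the evidence people keep citing: the AA code path itself runs fine on ATI hardware, so the check is a policy decision, not a compatibility requirement.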

  16. #391
    Xtreme Member
    Join Date
    Dec 2007
    Location
    Brooklyn, NYC USA
    Posts
    278
    Quote Originally Posted by Farinorco View Post
    But you were under wrong assumptions. NVIDIA came out and reworded things to look a bit better, but the thing is as follows:

    >>UE3 doesn't allow using default MSAA because of the deferred shading.

    >>If you want to use any AA with deferred shading, you must implement your own custom filter (as if you were writing an ambient occlusion shader, a motion blur shader, or any other thing).

    >>Batman: Arkham Asylum uses UE3, so no default AA.

    >>Eidos, in cooperation with their sponsor NVIDIA, have coded a custom AA filter.

    >>They have taken the decision of not allowing that code to run on ATI hw (even when perfectly compatible), via a check that disables it if ATI hw is detected.

    What NVIDIA has said (read the communication in the first post of this thread) is that "they are not disabling it for ATI but enabling it only for NVIDIA, because UE3 doesn't have default AA and they have had to program one", and that "since they have programmed their own AA code, if ATI wants their cards running with AA, they should program their own code".

    The whole point of this discussion is: is that right? OK, NVIDIA has written (or helped to write) the AA code. So, is it right that they don't allow ATI users to run it? ATI is helping Codemasters to write code for new features implemented over DX11. Is it right if they ask Codemasters to include a check to not allow this code to run on NVIDIA hw, since they have written it (or collaborated on it)?



    Then, in your opinion, that thread has no basis. There's no AA issue further than the lock to grey out the option for ATI users. The trick that allows ATI cards to bypass that lock (cheating the game into thinking they are not ATI cards, by changing the names so the game is not able to recognize them as ATI cards) actually works.

    The basis of this thread is that there are people here who think that, as NVIDIA has written (or helped to write) that code, they are within their rights not to allow ATI users to run it.

    Some others say that this kind of practice is harmful to the consumer, because it limits the options he has as a consumer, and introducing exclusivity in PC software is the last thing we need now, so we shouldn't stay smiling and give them a thumbs up.

    That's the point of the discussion.



    NVIDIA is not lying in their communication. They are rewording things to give that impression. They are saying "we are not disabling anything, because the engine doesn't support it by default. We are introducing new code that wasn't there before, so what we are doing is not disabling anything, but enabling it only for us. It's our work, so if ATI wants AA, they should code their own".

    They are playing with the concept of "if I had not coded it, neither you nor I would have it. So if I code it and stop you from using it, you are no worse off and I have the code; I am not disabling it for you, but enabling it for me".

    The reality is that it's code written over a standard API that works on all standard hw, ATI included, and it's not working because of a lock. ATI could say exactly the same about the code they are helping Codemasters write for GRID 2, for example.

    Note that NVIDIA haven't said at any moment that there is no lock (they can't, because that point has already been demonstrated), but that since they, and not ATI, helped to develop that code, they are within their rights. Basically.
    Sry. Been away from the thread for a while.

    Oy vey. I swear it seems like u are not trying to see my viewpoint on this lol.

    Ok number 1

    If u actually do read the post started by this thread, the statement made by Nvidia is stating they worked with the devs to make sure AA was supported in game with their hw. There is nothing in that statement that states Nvidia made it so AA is ONLY enabled when using their hardware. U see what i'm saying?

    If u go word for word in that statement u will see no such wording. U can ASSUME and it would be a justified assumption that Nvidia made sure AA only worked on their hw and their hw alone. Again if we are going word for word Nvidia never made that claim in the statement posted.

    number 2

    By your own statement u are justifying that there is basis for the thread per my opinion by indicating that there is an issue with AA with Ati cards.

    There's no AA issue further than the lock to grey out the option for ATI users..

    Your above statement shows u admit there is an issue. The lock. Hence in my opinion there is basis for the thread.

    number 3

    The question of whether or not Nvidia was wrong in only making sure AA works with their cards is just one of the points to this discussion. Other points are: did Ati just slip up and not make an effort to ensure the game would run with all the features it can with their cards? Did Nvidia purposely allow AA in game with their cards only and lock out AA support for Ati cards? Were the devs playing favorites with the gpu companies? Does Nvidia make it a standard practice to gimp any users of their TWIMTBP games if using Ati cards? Will Laura ever tell Luke he is not the father of their son? Sry. Couldn't help that last one lol

    I hear u @ rights. IMO Nvidia does not have to implement anything they don't want to in their sponsored games as long as it is within the scope of the law. Same goes for Ati. Is it right? That's per individual opinions. Honestly if Ati had features that Nvidia didn't I'd go with Ati, and vice versa.

    I owned Ati cards and I liked them. I own not one ounce of stock in either Nv or Ati, I don't work for either, and have no standing exclusive contracts with either, so I owe them no loyalty. Hell, if Intel drops a sick video card I'd switch to them (hurry up, Intel). I played Grid and COD4 and thought both games looked better on Ati hardware than Nvidia. I played Batman and think PhysX does make the game look better, so Nvidia for me when playing Batman.
    Xeon w3520 oc'd to 4.2ghz w/ htt enabled
    Foxconn Bloodrage (G32 Bios)
    2 x Nvidia 285 gtx's (overclock varies. Most recent core: 700/ shaders: 1550/ ram: 1230)
    6gbs G-skill Trident ddr3 2000mhz ram @ 1604mhz (timings: 6 7 6 20 T1)
    Silverstone ST1000 1000W PSU
    WD RaptorX 150gb, WD 1tb, Seagate Perpendicular 320gb (non raid config)
    2 x LG Sata 22x DVD Burner
    Swiftech Apogee XT Extreme Water-block, Feser X-Changer 360mm Xtreme Performance Radiator, Alphacool Cape Coolplex Pro 10 External Reservior, Swiftech MCP655™ 12 VDC Pump, 3x Scythe 120mm fans, 1/2in tubing
    Creative X-Fi Titanium Sound Card
    Lian Li X-500 Case
    And a partridge in a pear tree

  17. #392
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Quote Originally Posted by Trembledust View Post
    Sry. Been away from the thread for a while.

    Oy vey. I swear it seems like u are not trying to see my viewpoint on this lol.

    Ok number 1

    If u actually do read the post started by this thread, the statement made by Nvidia is stating they worked with the devs to make sure AA was supported in game with their hw. There is nothing in that statement that states Nvidia made it so AA is ONLY enabled when using their hardware. U see what i'm saying?
    So, if NV didn't say it, it didn't happen? Geez, that lock has been proven to exist. And there's nothing in what NV says that admits that proven fact, but there's also nothing in what NV says that denies it.

    Do you see what I'm saying?

    If u go word for word in that statement u will see no such wording. U can ASSUME and it would be a justified assumption that Nvidia made sure AA only worked on their hw and their hw alone. Again if we are going word for word Nvidia never made that claim in the statement posted.
    Again, no assumption, since it has been proven true. It's a fact, and when it's matched with NV's PR reply, it gives you a pretty clear story.

    number 2

    By your own statement u are justifying that there is basis for the thread per my opinion by indicating that there is an issue with AA with Ati cards.

    There's no AA issue further than the lock to grey out the option for ATI users..

    Your above statement shows u admit there is an issue. The lock. Hence in my opinion there is basis for the thread.
    I was assuming that what you meant by "issue" was a problem/bug/malfunction that AMD could work with the developers to repair, since you said that AMD should work with the developers to repair it...

    Of course, if your definition of "issue" includes a lock put there by the developers because they're being supported by the competitor brand, then there is an issue with AA on AMD hw.

    But note the case is completely different, since here AMD can hardly do anything to make it run. It's not a compatibility fault, or a bug. It's a premeditated decision taken by the developer.


    number 3

    The question of whether or not Nvidia was wrong in only making sure AA works with their cards is just one of the points to this discussion.
    Interesting rewording of the situation: "in only making sure AA works with their cards" instead of "in making sure AA works only with their cards". Could you (people defending NV) stop twisting words to make things appear more innocuous than they are?

    Other points are: did Ati just slip up and not make an effort to ensure the game would run with all the features it can with their cards?
    If they are saying that if ATI doesn't support them their game won't run on ATI, I don't know why there's not a lock that prevents the game from running on ATI at all... Moreover, I'll insist that the IHVs are not the ones who develop the sw at any of its stages. Accusing an IHV of being the culprit of something not working in a certain piece of sw, when the problem is in the sw itself and not the hw or its drivers, is laughable.

    Did Nvidia purposely allow AA in game with their cards only and lock out AA support for Ati cards?
    Yeah. You're getting into it.

    Does Nvidia make it a standard practice to gimp any users of their TWIMTBP games if using Ati cards?
    Not until now, that I know of. That's the reason why this specific case annoys me, and I think that if they see people aren't rejecting the idea, they will feel free to do it again on other titles.

    I hear u @ rights. IMO Nvidia does not have to implement anything they don't want to in their sponsored games as long as it is within the scope of the law. Same goes for Ati. Is it right? That's per individual opinions. Honestly if Ati had features that Nvidia didn't I'd go with Ati, and vice versa.
    But features should be in hw, and used or not used by sw. Making sw exclusivity based on "If it is competitors hw, then exit to windows" it's not the way to go.

    For example: NVIDIA has CUDA, ATI has Stream. NVIDIA has managed to get CUDA used, thanks to PhysX, in a couple of titles; good for them. Nothing for ATI Stream; bad for them.

    For example: ATI has had a hw tessellator since the HD2900 and the XBox 360, and the only game where it has been used is "Viva Pinata" for XBox 360. Bad for them.

    Those are hw features that may be in use or not, and that can make you decide between one piece of hw and another.

    If they start now to make standard code non-standard by inserting "if" statements in the code, we are not walking down a good path. We will be far more limited than ever before by our hw choices. NV doesn't care about that: if they think they are going to do well out of the deal against their competitors, they will go for it, even if the people who lose out are us, the consumers. But we should care about it.
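    The "if" lock being described can be illustrated with a minimal sketch. Everything below is hypothetical except the PCI vendor IDs (0x10DE for NVIDIA, 0x1002 for ATI/AMD), which are the real values a driver reports; the function name and logic are my own illustration, not the actual Batman: AA code.

    ```python
    # Hypothetical sketch of a vendor-ID lock. The AA code path itself is
    # written against a standard API and would run on any conformant GPU;
    # this single check alone hides the option from other vendors' cards.

    NVIDIA_VENDOR_ID = 0x10DE  # real PCI vendor ID for NVIDIA
    ATI_VENDOR_ID = 0x1002     # real PCI vendor ID for ATI/AMD

    def aa_option_enabled(adapter_vendor_id: int) -> bool:
        """Return True if the in-game AA option is exposed to the user."""
        return adapter_vendor_id == NVIDIA_VENDOR_ID

    print(aa_option_enabled(NVIDIA_VENDOR_ID))  # True  - NVIDIA card
    print(aa_option_enabled(ATI_VENDOR_ID))     # False - ATI card, same API support
    ```

    The complaint in the thread is exactly the second call: identical, capable hardware gets a different answer purely because of who made it.
    
    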
    Last edited by Farinorco; 10-05-2009 at 12:49 AM.

  18. #393
    Xtreme Member
    Join Date
    Dec 2007
    Location
    Brooklyn, NYC USA
    Posts
    278
    Quote Originally Posted by Farinorco View Post
    So, if NV didn't say it, it didn't happen? Geez, that lock has been proven to exist. And there's nothing in what NV says that admits that proven fact, but there's also nothing in what NV says that denies it.

    Do you see what I'm saying?



    Again, no assumption, since it has been proven true. It's a fact, and when it's matched with NV's PR reply, it gives you a pretty clear story.



    I was assuming that what you meant by "issue" was a problem/bug/malfunction that AMD could work with the developers to repair, since you said that AMD should work with the developers to repair it...

    Of course, if your definition of "issue" includes a lock put there by developers because they're being supported by the competing brand, then there is an issue with AA on AMD hw.

    But note the case is completely different, since here AMD can hardly do anything to make it run. It's not a compatibility fault or a bug. It's a premeditated decision taken by the developer.




    Interesting rewording of the situation: "in only making sure AA works with their cards" instead of "in making sure AA works only with their cards". Could you (people defending NV) stop twisting words to make things appear more innocuous than they are?



    If they are saying that if ATI don't support them, their game won't run on ATI, I don't know why there isn't a lock that prevents the game from running on an ATI card at all... moreover, I'll insist that the IHVs are not the ones that develop the sw at any of its stages. Accusing an IHV of being the culprit of something not working in a certain sw, when the problem is in the sw itself and not the hw or its drivers, is laughable.



    Yeah. Now you're getting it.



    Not until now, that I know of. That's the reason why this specific case annoys me, and I think that if they think people don't reject the idea, they will feel free to do it again on other titles.



    But features should be in hw, and used or not used by sw. Making sw exclusivity based on "if it is the competitor's hw, then exit to Windows" is not the way to go.

    For example: NVIDIA has CUDA, ATI has Stream. NVIDIA has managed to get CUDA used, thanks to PhysX, in a couple of titles; good for them. Nothing for ATI Stream; bad for them.

    For example: ATI has had a hw tessellator since the HD2900 and the XBox 360, and the only game where it has been used is "Viva Pinata" for XBox 360. Bad for them.

    Those are hw features that may be in use or not, and that can make you decide between one piece of hw and another.

    If they start now to make standard code non-standard by inserting "if" statements in the code, we are not walking down a good path. We will be far more limited than ever before by our hw choices. NV doesn't care about that: if they think they are going to do well out of the deal against their competitors, they will go for it, even if the people who lose out are us, the consumers. But we should care about it.

    So, if NV didn't say it, it didn't happen? Geez, that lock has been proven to exist. And there's nothing in what NV says that admits that proven fact, but there's also nothing in what NV says that denies it.

    Do you see what I'm saying?


    Are u putting words in my mouth after u reprimanded someone in an earlier post for putting words in your mouth? Kinda hypocritical of u, don't u think? Shame on u.


    Ok, honestly I don't see the point in continuing this debate, as it seems u have already made up your mind that Nvidia is guilty of intentionally gimping their competition by placing a line of code which prevents in-game AA support on Ati hardware.

    I came across this article in researching the issue further.



    http://www.bit-tech.net/news/hardwar...-accusations/1



    With the statements put out so far, I am not ready to accuse Nvidia of intentionally gimping Ati hardware with Batman. I'm not saying Nvidia didn't intentionally do it either.

    Maybe the code was put in place to prevent additional issues that may have occurred on Ati hw if the line had not been put in the game. Nvidia stated they didn't have time to test the game on Ati cards. It's possible they are saying that if Ati had dedicated a team to work on the game with the devs prior to the PC release, there wouldn't have been an AA issue with Ati's hardware.

    I need more evidence to outright say Nvidia purposely gimped AA in the game on Ati hw. From all the statements made to date on this matter, IMO it seems Ati just didn't put forth enough effort to make sure the game ran without issue on their hardware. Whether that is true or not... who knows?

    And the devs are working with Ati now to solve the issue. That alone gives me pause on Ati's claims that Nvidia purposely blocked their attempts to work with the devs on the game. But again, I'm just going by what has been put out to date on the issue.

    And, not for nothing, to kinda back up my Ati theory a little: Ati does not have the resources and manpower that Nvidia has. They may have had their people working on other projects and couldn't really focus on Batman at that time.

    Ok i'm done. Have a good one.
    Last edited by Trembledust; 10-05-2009 at 10:37 PM.
    Xeon w3520 oc'd to 4.2ghz w/ htt enabled
    Foxconn Bloodrage (G32 Bios)
    2 x Nvidia 285 gtx's (overclock varies. Most recent core: 700/ shaders: 1550/ ram: 1230)
    6gbs G-skill Trident ddr3 2000mhz ram @ 1604mhz (timings: 6 7 6 20 T1)
    Silverstone ST1000 1000W PSU
    WD RaptorX 150gb, WD 1tb, Seagate Perpendicular 320gb (non raid config)
    2 x LG Sata 22x DVD Burner
    Swiftech Apogee XT Extreme Water-block, Feser X-Changer 360mm Xtreme Performance Radiator, Alphacool Cape Coolplex Pro 10 External Reservoir, Swiftech MCP655™ 12 VDC Pump, 3x Scythe 120mm fans, 1/2in tubing
    Creative X-Fi Titanium Sound Card
    Lian Li X-500 Case
    And a partridge in a peared tree

  19. #394
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Quote Originally Posted by Trembledust View Post
    I'm sorry. I was under the assumption the thread topic was the fact Nvidia came out and said they did not purposely prevent AA from working on Ati hardware in Batman: Arkham Asylum
    Quote Originally Posted by Trembledust View Post
    If u actually do read the post that started this thread, the statement made by Nvidia says they worked with the devs to make sure AA was supported in game with their hw. There is nothing in that statement that says Nvidia made it so AA is ONLY enabled when using their hardware. U see what i'm saying?

    If u go word for word in that statement u will see no such wording. U can ASSUME, and it would be a justified assumption, that Nvidia made sure AA only worked on their hw and their hw alone. Again, if we are going word for word, Nvidia never made that claim in the statement posted.
    Quote Originally Posted by Farinorco View Post
    So, if NV didn't say it, it didn't happen? Geez, that lock has been proven to exist. And there's nothing in what NV says that admits that proven fact, but there's also nothing in what NV says that denies it.
    Quote Originally Posted by Trembledust View Post
    Are u putting words in my mouth after u reprimanded someone in an earlier post for putting words in your mouth? Kinda hypocritical of u, don't u think? Shame on u.
    I'm sorry, I thought you were implying that we were only making assumptions about the presence of a lock preventing the game from running the AA filter when ATI hw is detected, even though there is proof that demonstrates it, just because NVIDIA themselves haven't said it's true. My fault, then...

  20. #395
    Xtreme Owner Charles Wirth's Avatar
    Join Date
    Jun 2002
    Location
    Las Vegas
    Posts
    11,656
    “In the case of Batman's AA support, NVIDIA essentially built the AA engine explicitly for Eidos - AA didn't exist in the game engine before that. NVIDIA knew that this title was going to be a big seller on the PC and spent the money/time to get it working on their hardware. Eidos told us in an email conversation that the offer was made to AMD for them to send engineers to their studios and do the same work NVIDIA did for its own hardware, but AMD declined.”
    http://www.pcper.com/article.php?aid=791
    Intel 9990XE @ 5.1Ghz
    ASUS Rampage VI Extreme Omega
    GTX 2080 ti Galax Hall of Fame
    64GB Galax Hall of Fame
    Intel Optane
    Platimax 1245W

    Intel 3175X
    Asus Dominus Extreme
    GRX 1080ti Galax Hall of Fame
    96GB Patriot Steel
    Intel Optane 900P RAID

  21. #396
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    funny how they declined, then sent hate mail asking why they didn't get AA support

  22. #397
    no sleep, always tired TheGoat Eater's Avatar
    Join Date
    Oct 2006
    Location
    Iowa, USA
    Posts
    1,832
    Quote Originally Posted by FUGGER View Post
    “In the case of Batman's AA support, NVIDIA essentially built the AA engine explicitly for Eidos - AA didn't exist in the game engine before that. NVIDIA knew that this title was going to be a big seller on the PC and spent the money/time to get it working on their hardware. Eidos told us in an email conversation that the offer was made to AMD for them to send engineers to their studios and do the same work NVIDIA did for its own hardware, but AMD declined.”
    http://www.pcper.com/article.php?aid=791
    then amd really dropped the ball - and a big ball at that!

  23. #398
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Wow, I'd love to have all the weight that Eidos seems to have, to be able to go to an IHV and say "hey, I'm offering you the chance to program a couple of features for my engine for free; otherwise, they won't work on your hardware".

    It's incredible. I can't say more. What a disrespectful attitude towards their customers.

    NOTE: I say that because it's fairly obvious that a solution specific to each architecture wasn't needed, since one programmed over a standard API, compatible with all standard hardware, was found and implemented. So what they were asking was for the IHVs to do their work for them?
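    The point here, that exposing a feature should follow what the standard API reports the hardware can do rather than who made the card, can be sketched as follows. The capability structure and function names are invented for illustration; only the PCI vendor IDs are real values.

    ```python
    # Hypothetical sketch contrasting the two gating approaches argued about
    # in this thread: capability-based (vendor-neutral, the standard way)
    # versus vendor-locked (same capable hardware, different answer).
    from dataclasses import dataclass

    @dataclass
    class AdapterCaps:
        vendor_id: int          # PCI vendor ID reported by the driver
        supports_msaa_4x: bool  # capability a standard API could query

    def aa_available_capability_based(caps: AdapterCaps) -> bool:
        # Vendor-neutral: any conformant card reporting MSAA support gets AA.
        return caps.supports_msaa_4x

    def aa_available_vendor_locked(caps: AdapterCaps) -> bool:
        # The "if" lock: capability is identical, but the vendor ID alone
        # (0x10DE = NVIDIA) decides whether the option appears.
        return caps.supports_msaa_4x and caps.vendor_id == 0x10DE

    ati = AdapterCaps(vendor_id=0x1002, supports_msaa_4x=True)  # ATI/AMD
    print(aa_available_capability_based(ati))  # True  - capability says yes
    print(aa_available_vendor_locked(ati))     # False - locked out by vendor check
    ```

    With capability-based gating there is nothing architecture-specific for an IHV to write: the check is the same for everyone, which is the substance of the "standard API" argument above.
    
    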
    Last edited by Farinorco; 10-06-2009 at 07:54 AM.

  24. #399
    Xtreme Member
    Join Date
    Nov 2005
    Location
    Cape Town - South Africa
    Posts
    261
    Thanks for the info FUGGER.

  25. #400
    Xtreme Member
    Join Date
    Dec 2007
    Location
    Brooklyn, NYC USA
    Posts
    278
    Good stuff Fugger. Thanks. Dude...Let it go lmao @ Farinorco
    Last edited by Trembledust; 10-06-2009 at 09:06 AM.
    Xeon w3520 oc'd to 4.2ghz w/ htt enabled
    Foxconn Bloodrage (G32 Bios)
    2 x Nvidia 285 gtx's (overclock varies. Most recent core: 700/ shaders: 1550/ ram: 1230)
    6gbs G-skill Trident ddr3 2000mhz ram @ 1604mhz (timings: 6 7 6 20 T1)
    Silverstone ST1000 1000W PSU
    WD RaptorX 150gb, WD 1tb, Seagate Perpendicular 320gb (non raid config)
    2 x LG Sata 22x DVD Burner
    Swiftech Apogee XT Extreme Water-block, Feser X-Changer 360mm Xtreme Performance Radiator, Alphacool Cape Coolplex Pro 10 External Reservoir, Swiftech MCP655™ 12 VDC Pump, 3x Scythe 120mm fans, 1/2in tubing
    Creative X-Fi Titanium Sound Card
    Lian Li X-500 Case
    And a partridge in a peared tree
