It would seem the people at Nvidia and Intel both went to the same business school: the school of "screw your competitors and consumers at any cost". :down:
nV is pathetic, nuff said.
I know first-hand that Nvidia gives developers the support they need; this has been in place for quite a while.
Saying Nvidia is "buying" developers is false to the point of stupidity. Nvidia may pay for the ad space for the TWIMTBP logo (I am not sure), much like the Gigabyte logo at the top of this site. I am not paid to screw the other manufacturers; I use ASUS in my gaming rig.
DX10.1 wasn't shady at all? And did it change your gaming experience any?
We will always see the top rivals battle it out; it keeps us happy, with the best products brought to market faster and at a fair price.
Additionally, the in-game AA option was removed when an ATI card is detected. We were able to confirm this by changing the IDs of ATI graphics cards in the Batman demo. By tricking the application, we were able to get the in-game AA option, where our performance was significantly enhanced. This option is not available in the retail game, as it is protected by SecuROM.
Have you read this at all?! They paid devs to detect competitors' cards and gimp them. How the fsck is that OK?
SimBy, nope, but you did miss "Nvidia working closely with game developers validating in-game features".
Key word: validating.
I don't understand how some people are defending this.
Some of you are saying that NVIDIA paying to improve the performance of some games on their hardware is fair, that it's business. I agree. But this is not what is happening.
Purposely damaging, in the software, the performance or features of that software on competitors' hardware is not fair, and it's not something that can benefit the consumers in any way. That is: us.
I don't understand how anybody who is not an NVIDIA employee, doesn't have a huge investment in NVIDIA shares, or isn't in some kind of romantic relationship with an NVIDIA employee, can accept that, let alone defend it.
This is not the first place where I've read about Batman: AA and its purposely disabling AA when an ATi card is installed in the system.
This is an absolutely embarrassing thing for both NVIDIA and the developers of the game. I don't know about the other games that are named here (I doubt an ATi representative is exactly unbiased), but if it was done in one title, who knows about the others...
That is little short of including a 10 ms wait in each iteration of the game loop when a competitor's product is detected, just to damage its performance.
I'm not by any means an ATi defender or an NVIDIA hater, but as a consumer, things like this turn my stomach.
People are getting mad because a hardware company is offering time and money to help games get smooth results. What's so bad? Disabling the in-game AA was done since they cannot verify with ATI that it will not cause any issues (this is my interpretation of it, since I do not know if anyone who helped build B:AA could have done this by simply putting it on an ATI rig and testing on-site).
When in the last 5 years has any game run more than 20% better on one company's hardware? (Considering both cards are about equal on average.)
Isn't the issue 'where do you draw the line'?
What if they disabled anything other than 'low' texture quality on ATi hardware, saying they hadn't validated it for ATi hardware?
Surely the point of a platform, API, etc. is that you can pretty much abstract the hardware and get on with the coding without worrying about the bare metal? Or am I just being naive?
I am not the game developer; I am sure this issue was something they went over, and maybe they found a problem. They must have taken the backlash this would cause into account too, so an answer from the game developers should be forthcoming, given that this is a hot topic.
Ummm, all the complainers talking about "ruining the game experience" need to get a clue. ATi card owners get the exact same excellent experience that console owners do. What Nvidia did was get more features added for their customers. How exactly are Nvidia's customers getting screwed here?
Well, I'm sure Nvidia tosses money at game developers rather than works with them. I can't imagine Nvidia devoting that much time and effort. If Nvidia really wanted to play fair, then ATI owners would be able to use their old Nvidia cards for PhysX.
Glow9,
"I can't imagine Nvidia devoting that much time and effort."
They do.
Are you sure about that? I'm tempted to put a lot more faith in Glow9's imagination and his feelings about the whole thing, facts be damned!
As far as they can go without making obvious asses out of themselves. I think they are close to that limit, but have not hit it yet.
To other people:
I'm all for Nvidia paying to help get a game developed; Nvidia is really big on trying to get features that keep a fanbase, so they always have their dedicated customers.
If PhysX became important (which is possible, but doubtful), I would buy an Nvidia card. If I needed CUDA just a few times, then I would buy Nvidia. But I'm just a simple gamer who looks at $ per frame, is very familiar with the OCing tools for ATI cards, and has not found a reason to learn the new tricks that switching to an Nvidia card would require.
Who here honestly expects ATI and Nvidia tech demos to work on their opponent's cards?
The EU will be VERY interested in studying this case.
:p:
Nvidia can go around tainting crops, causing wildfires, slashing tires, etc., for all I care. I refuse to support ATi after having an X800XT P.E. on *order* for 17 months.
No, people are getting annoyed because sabotage was detected.
I find it logical that NVIDIA invests money to help developers optimize their code, and I find it logical that NVIDIA doesn't invest a single minute in optimizations for its competitors. That's all good. Sabotage is not.
If hardware companies are allowed to pay software developers to damage the performance (or features) of software that should work perfectly on a competitor's hardware, we will end up with "games for NVIDIA cards" and "games for ATi cards", as if they were "games for Xbox 360" and "games for PS3".
That's no good.
Quote:
Disabling the in-game AA was done since they cannot verify with ATI that it will not cause any issues (this is my interpretation of it, since I do not know if anyone who helped build B:AA could have done this by simply putting it on an ATI rig and testing on-site)
That would be a very, very bad excuse, and any game developer would refute it in a heartbeat. As a hobbyist game programmer myself, I can assure you there's no need to contact any hardware maker to verify whether some code works on their hardware. That's the exact reason why intermediate APIs such as Direct3D, OpenGL, or the high-level shader languages of both exist (see the sketch at the end of this post).
Yes, it can happen that some code that should work on certain hardware doesn't, because of bugs on either side (your code, their hardware). In that case, disabling that code would be a shoddy but quick solution (sometimes deadlines are like prisons), at least as a provisional one.
But disabling it just in case? That makes me laugh. And why don't they disable the rest of the shaders too? You know, that AA filtering has been proven to work well with ATi hw by making use of some tricks (including the crack to bypass the SecuROM in the game, of course)...
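To illustrate (a minimal sketch of my own, not anything from B:AA): with Direct3D 9 you can ask the API itself whether a multisampling mode works on the installed card, with no phone call to ATI or NVIDIA needed.
Code:
// Minimal sketch: query 4x MSAA support through the D3D9 API itself,
// on whatever adapter happens to be installed. No vendor checks anywhere.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    DWORD quality = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        FALSE /* full-screen */, D3DMULTISAMPLE_4_SAMPLES, &quality);

    if (SUCCEEDED(hr))
        printf("4x MSAA supported (%lu quality levels)\n", quality);
    else
        printf("4x MSAA not supported\n");

    d3d->Release();
    return 0;
}
If the driver says yes and the game still breaks, that's a bug to report, not a reason to hard-code a vendor check.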
I don't see sabotage; I just see a company helping out, nothing more. Unless we get the exact reason from a B:AA developer for why AA was disabled, it's all speculation, and I'm speculating in favor of the idea that this is trying to make a mountain out of a molehill.
AA was not disabled. It was simply enabled only for Nvidia hardware. Big difference there. Otherwise nobody would have gotten AA. Now Nvidia users get it. ATi users don't lose anything in the deal.
No. That AA code is shader code (AFAIK).
I think you know enough (more than me, I think) about these things, so:
If you have a piece of code, written with a standard API, that should work on any hardware compatible with that API, and you include an absolutely unneeded line such as (either way it's the same; a concrete sketch follows the pseudocode):
If myHW is detected, then papapapa...
or
If not competitorsHW is detected, then papapapa...
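In real code, a gate like that can be as small as this (a hypothetical sketch using the public D3D9 adapter identifier; the game's actual check is not public):
Code:
#include <d3d9.h>

// PCI vendor IDs; these values are standard and public.
const DWORD VENDOR_NVIDIA = 0x10DE;

// Hypothetical sketch of the kind of gate being discussed;
// only the developer knows what the real check looks like.
bool AllowInGameAA(IDirect3D9* d3d)
{
    D3DADAPTER_IDENTIFIER9 id = {};
    if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
        return false;

    // "If not myHW is detected, then no AA option" -- a line the API
    // never required anyone to write.
    return id.VendorId == VENDOR_NVIDIA;
}
Which is exactly why changing the ATi card's device ID, as in the article quoted above, makes the option reappear.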
Do you really think that nobody is losing anything with "the deal"? Come on, man...
Including extra code to ensure that certain features that should work on standard hw only do so with some brand of hw, is that not disabling it?
You're playing with words, Trinibwoy!
How does "you can't enable AA with ATI cards but you can enable AA with Nvidia cards" imply that nobody would have gotten AA? Which games come out now without an AA option?
It's regular hardware-accelerated AA.
Quote:
If you have a piece of code, written with a standard API, that should work on any hardware compatible with that API
An API just asks the hardware and driver to do something. There's no guarantee at all that it does it properly. If what you said were true, there would never be bugs in games that affect only one vendor's hardware.
Quote:
Including extra code to ensure that certain features that should work on standard hw only do so with some brand of hw, is that not disabling it?
It's not disabling it if you wouldn't have gotten it in the first place. What you're saying is that adding it for Nvidia but not adding it for ATi is the same as disabling it. That's obviously incorrect.