Originally Posted by Farinorco
So, here's what we know:
>UE3 does not have any AA method implemented, and Batman: AA is built on UE3.
>NVIDIA helped the developers of B:AA implement a custom AA filter.
>That custom AA filter, developed in collaboration with Eidos, is not exclusive to NVIDIA, since it has been proven to work on other D3D cards.
>Since NVIDIA helped implement (or completely implemented) that AA filter, they feel entitled to block people with hardware from other IHVs from running their code, and to "encourage" other vendors to implement their own code if they want that feature running on their hardware, even though the code is perfectly compatible with any standards-compliant hardware.
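The lockout described in that last point is, mechanically, just a brand whitelist. A minimal sketch of how such a gate works (the constants are real PCI vendor IDs; the function name and structure are invented for illustration, not Eidos' actual code):

```python
# Hypothetical sketch of a PCI vendor-ID gate of the kind described above.
# 0x10DE is NVIDIA's PCI vendor ID; 0x1002 is ATI/AMD's.
# Note that nothing here tests what the GPU can actually do: the AA
# option is shown or hidden based purely on who made the card, even
# though the AA code itself runs on any standard D3D hardware.

VENDOR_NVIDIA = 0x10DE
VENDOR_ATI = 0x1002

def aa_option_available(pci_vendor_id: int) -> bool:
    """Return True if the in-game AA option is exposed for this GPU."""
    # A brand whitelist, not a capability check.
    return pci_vendor_id == VENDOR_NVIDIA
```

Which is exactly why the filter was "proven to work with other D3D cards": the check sits in front of perfectly standard code, so the only thing stopping other hardware is the ID comparison itself.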
That brings us back to the main point of the discussion so far: is that right? Where do those practices lead us, the consumers?
It would be the same thing if the DIRT2 DX11 features (or part of them) didn't work on NVIDIA DX11 hardware once NVIDIA releases any, since AMD supported the development of those features.
It would be the same thing if OpenCL acceleration of Havok only worked on AMD and Intel hardware for the same reason.
The same goes for OpenCL acceleration in Bullet Physics.
Great for all consumers. There was a time when software coded against standard interfaces could run on any hardware compliant with those standards. That was the whole point of those standards. There was a time... thanks, NVIDIA.
And I would like to add something to the discussion. I see many of the people defending both NVIDIA and Eidos basing their arguments on the fact that UE3 doesn't have AA:
A game engine is nothing more than a library (code implementing functions) that packages up some of the work of making a game. It includes work that generalizes across games, and then you (the developer) write the rest of the game's code on top of (or under, or inside) the engine. That's one of the things about programming: you have some code, and you can use that code as part of other programs, which add code on top of it, under it, or inside it.
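The point above can be sketched in a few lines. All names here are invented toy examples, not real UE3 or Batman: AA code; the sketch only illustrates that "engine" code and "game" code end up in the same program, with no technical boundary between them:

```python
# Toy illustration: generic "engine" code vs. game-specific code.

def engine_render(scene, post_effects):
    """Generic engine routine: draws whatever the game hands it."""
    frame = [f"draw:{obj}" for obj in scene]
    for effect in post_effects:  # the engine doesn't care who wrote these
        frame.append(f"post:{effect}")
    return frame

# Game-specific code layered on top of the engine. The custom AA filter
# is just one more item of game code, exactly like the models.
batman_scene = ["batman_model", "asylum_level"]
custom_effects = ["custom_aa"]
frame = engine_render(batman_scene, custom_effects)
```

From the program's point of view, the custom AA filter, the Batman model, and any game-specific shader are all the same kind of thing: code the developers added that didn't ship with the engine.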
Saying that UE3 doesn't support AA may be true (it was when it launched; I don't know about now, but I assume it's the same). But UE3 doesn't include the Batman 3D model either. Or, probably, other shaders used specifically in that game. I'm sure the developers have written plenty of code besides UE3. Heck, I'm so sure of it that if you try to run UE3 directly, without any additional code, you won't be playing Batman: AA.
What I'm trying to say is that there's no difference between the custom AA, the models used, any custom shader, or any game-specific logic. It's all additional code that didn't come with the engine.
If you think that helping a software company develop some code and then not allowing compatible hardware from other vendors to run that code is fine, then that's your respectable opinion. But saying "this is a special case because it didn't come with the engine" is saying nothing.
I simply don't like a world where some software is special to NVIDIA, some is special to ATI, some is special to Intel, some is special to AMD, and depending on the brands you choose you can run some things but not others.
Supposedly, standardization was introduced to avoid exactly this, and it was a great benefit for all consumers. We are going back to a time we left behind long ago. It's only my opinion, though.