Posted this:
http://forums.eidosgames.com/showpos...58&postcount=1
I also have an email into Eidos and Nvidia.
It's always a one-way street with issues around here. Maybe if ATI would simply pony up, the developer would have included support for them as well.
From a different perspective, maybe Nvidia paid for and helped the developer to include that support, and if ATI wants the same then they can pay and work with the developer to get the same options.
What makes this solely Nvidia's fault? It's not Nvidia's game; they don't get to make executive decisions about what to support and what not to support unless they bankrolled the whole project themselves.
Why not just disable the entire game on ATI hardware? After all, they can't possibly "validate" every other GPU feature that is being used.
Pure nonsense. Nvidia is essentially saying: hey, we are not confident enough in our hardware to let it stand on its own merits, so we'll cripple features on our competitor's hardware instead. The absolute LAST thing PC gaming needs is "developer relations" coaxing game devs to favour certain hardware AND detect and disable features on competitors' cards. Where does the line get drawn?
And to the people defending Nvidia's actions: there is a major difference between Nvidia working with game devs to squeeze the most out of their hardware, and outright turning off STANDARD features if a competitor's hardware is detected. I guess there are certain Nvidia fanboys who have actually convinced themselves that Nvidia's actions are entirely innocuous.
There is no need to defend Nvidia or ATI; this is the developer's choice in the end regardless.
Until some form of "real" confirmation regarding the issue comes from people who are actually in the know, this is simply fuel for another hatefest from the Nvidia haters, pretty much the same group posting hate in practically every Nvidia-related thread.
If I had a dime for every time I felt like the developers of a game purposely shackled it in one way or another, I would be rich. Stop blaming Nvidia for something they had nothing to do with; it was obviously the game developers using PhysX as a marketing tool for their own gain.
I remember when ATI did this with Valve games; Half-Life 2 in particular ran slowly on Nvidia hardware, but when the game was tricked into running the ATI shader path on Nvidia cards, performance shot up considerably.
People tend to forget that things like this have always happened in the past and will continue to happen in the future. I won't be surprised if DiRT 2 has some ATI bias in it.
Ian McNaughton would not blog about the issue if the detection was not taking place. He would probably be fired for making baseless claims.
But Nvidia will never own up to this; they will just claim the detection is there to "improve the user experience on Nvidia hardware, and to ensure a decent playable experience on other hardware".
Proof? You're just speculating!
I will give you two facts:
1. A.Creed was a TWIMTBP game http://fr.nzone.com/object/nzone_ass...d_home_fr.html
2. HAWX was not a TWIMTBP game http://hawxgame.fr.ubi.com/
Both games were from Ubisoft, both initially had DX10.1, but only one had its DX10.1 support dropped ;)
Surely because ATI did not invest in the game (DilTech and you seem to think this), or maybe because Nvidia invested more (as I think...).
Another guy trying to rewrite history... :shakes:
The Nvidia FX series ran the game in DX8.1 mode, and you could force them into DX9.0 mode by using the ATI code path. That's true.
But the Nvidia 6 series ran the game in DX9.0 mode without this trick, and they were behind the ATI X8xx series :yepp:
ATI just has not paid enough to get a line of code like:
{If Nvidia_DX10.1 card, run DX9.0 code
end}
What a pity :rofl:
The developers should only concentrate on the API provided by the OS makers, and the hardware makers' job is to ensure their hardware supports the latest API. That is called abstraction and isolation; that is how the division of labor works in IT. That is the design principle in everything from hardware to software systems.
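To make the abstraction point concrete, here is a minimal sketch, assuming a D3D9 renderer (the kind Batman: Arkham Asylum uses); the function name and the 4x sample count are my own illustration, not anything from a shipping game. The idea is that the game asks the API what the card can do instead of asking who made it:

```cpp
// Minimal sketch: expose the in-game AA option based on what the API
// reports, not on which vendor made the card.
#include <d3d9.h>

bool MsaaAvailable(IDirect3D9* d3d, D3DFORMAT backBufferFormat, BOOL windowed)
{
    DWORD qualityLevels = 0;
    // Ask Direct3D whether the installed adapter can do 4x MSAA with this
    // back-buffer format. Any vendor's card that reports support qualifies.
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, backBufferFormat, windowed,
        D3DMULTISAMPLE_4_SAMPLES, &qualityLevels);
    return SUCCEEDED(hr) && qualityLevels > 0;
}
```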
Now the problem is that Nvidia is not competitive enough to ensure their cards conform to the newest tech, so they have to pay dirty money to make their crappy cards look less crappy; Nvidia is the rule breaker here.
EDIT: Removed insults, removed dragonso from this section.
Keep it clean and civil.
No warnings; you will be removed from this section permanently.
But I think the point AMD is making is: why should you have to work with developers to get something like DX10.1 or any other STANDARD feature working?
Think of it this way: AMD likes the approach where there is an open standard that can be adhered to. For example, with DirectX, you simply make your card work with DirectX, and a developer doesn't need your specific hardware. All they need to do is program for the standard, and if the cards adhere to the standard, it works.
No need for payouts or special relationships for code paths that add some new dust particles, run 0.1% faster, or are optimized for your card. If something is good, then we should raise the bar, right?
I understand there is a point where you introduce new features, and that would require working with developers to bring a new feature to a game.
But there is a difference in philosophy here between Nvidia and AMD: Nvidia wants to be the influence pushing proprietary features or ways of doing things, while AMD prefers to work with others to make better standards and then do better within those standards.
In case I am not making my point very well: on one hand, you have AMD with a great feature (tessellation) for their cards. They didn't go to developers and pay them money to get tessellation working in their games, or try to make tessellation the new standard through marketing alone. They went to Microsoft and said, look at this feature, we want to bring it to you so you can make it a standard feature (of DX11), so developers only need to work with DX11 to get the feature working on any DX11-capable card.
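As a rough sketch of what "program for the standard" looks like in practice (the function name here is hypothetical, not from any game): a developer just asks Direct3D 11 for feature level 11_0, and tessellation comes with it on any vendor's DX11 card, no per-vendor deal required.

```cpp
// Sketch: check for DX11 tessellation by querying the feature level the
// hardware can reach, regardless of which vendor made the card.
#include <d3d11.h>

bool SupportsDx11Tessellation()
{
    const D3D_FEATURE_LEVEL wanted = D3D_FEATURE_LEVEL_11_0;
    D3D_FEATURE_LEVEL achieved = D3D_FEATURE_LEVEL_9_1;
    // Passing null device/context pointers just queries the achievable level.
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        &wanted, 1, D3D11_SDK_VERSION, nullptr, &achieved, nullptr);
    // Feature level 11_0 guarantees hull/domain shader (tessellation) support.
    return SUCCEEDED(hr) && achieved >= D3D_FEATURE_LEVEL_11_0;
}
```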
Then you have Nvidia on the other side, who has a great feature (PhysX), who go out to developers and pay them money to get the game working with PhysX, then go out marketing how great PhysX is and how it works in this game. They don't try to improve the standard for consumers; they simply try to get more people to buy their cards and do it their way. They even pay developers to deactivate standard DX features so they don't work on competitors' cards.
Actually they are, which is kind of sad, given that certain people keep posting the contrary even though they KNOW better.
Sure, maybe their relations aren't on the same level as Nvidia's, but they do have a specific department set up for this type of thing.
The problem is the stuff that goes on behind the scenes, and for some reason certain devs not wanting AMD/ATI's take on situations or their help/support.
Yes, they are in constant communication with the vast majority of game devs.
What is the link between D3D and the HL2 code path?
Give me something that shows ATI does it.
The last three ATI-sponsored games (BattleForge, Left 4 Dead and HAWX) work perfectly on Nvidia hardware.
This link http://www.pcgameshardware.com/aid,6...iewed/Reviews/ proves that fact even more.
I'm really surprised some people approve of the above. I wonder if the same people would say it's OK if the situation was reversed.
Quote:
Additionally, the in-game AA option is removed when ATI cards are detected.
We were able to confirm this by changing the IDs of ATI graphics cards in the Batman demo.
By tricking the application, we were able to get the in-game AA option, and our performance was significantly enhanced.
This option is not available for the retail game as it is protected by SecuROM.
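Purely for illustration, the behaviour described above amounts to something like the following; the function name and structure are a guess at the logic, not actual game code. The option is keyed to the adapter's PCI vendor ID rather than to what the hardware can actually do, which is exactly why spoofing the ID brings it back:

```cpp
// Illustrative sketch of a vendor-ID gate on the in-game AA option.
#include <d3d9.h>

bool ShowInGameAAOption(IDirect3D9* d3d)
{
    D3DADAPTER_IDENTIFIER9 ident = {};
    d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &ident);
    const DWORD kNvidiaPciVendorId = 0x10DE;  // PCI vendor ID for NVIDIA
    // Gating on VendorId hides the option on ATI cards even when the
    // hardware is capable; change the reported ID and the option reappears.
    return ident.VendorId == kNvidiaPciVendorId;
}
```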
The developers should be coding for the API; that's the whole point of having DX and OGL, they are standards. I think Nvidia should look into resurrecting Glide ;)
Rocksteady statement here: (it was made between the time of the demo & retail release)
http://forum.beyond3d.com/showpost.p...3&postcount=38
Quote:
The form of anti-aliasing implemented in Batman: Arkham Asylum uses features specific to NVIDIA cards. However, we are aware some users have managed to hack AA into the demo on ATI cards. We are speaking with ATI/AMD now to make sure it’s easier to enable some form of AA in the final game.
Perhaps Microsoft needs to start regulating the game industry; they make DX after all. How come no one has mentioned them... :p:
Too many things are optional in DX9; hopefully from DX11 and up it will be strict enough to not allow this kind of PissX. ;)
Wow, you really remember your history wrong.
The nVidia cards (FX 5xxx series) were automatically put on the DX8.1 path instead of the "proper" "ati" shader path as you put it, or rather the DX9.0 shader path, because performance on the DX9.0 shader path was HORRIBLE on the nVidia cards.
So, yes, a bit of image quality (DX8.1 vs DX9.0) was sacrificed on the FX 5xxx series, but for good reason.
Absolutely, it's called a vendor supporting their own product. Why would you as a software developer want to work with advanced functionality (only available from one of the two vendors) when they won't give you the time of day with regards to support?
You can't really excuse AMD/ATI for "not being big enough" or not being in a financial position to provide partner support for their products. This is a basic part of being a technology vendor and they have failed miserably in both incarnations of the company.