I know very little in the software department, but this kind of thing really doesn't make sense to me.

If I get this right, we have Nvidia with a development team that helps improve the performance of their hardware in games, but from the sounds of this they also spend money on blocking features or suppressing the technology the competition uses.

- Why not spend that money on bringing out a good competing card at the same time?

With RE5, can someone explain to me how ATI had an issue with that to begin with? Yes, I know it was a driver-related issue, but ATI's R500 runs in the Xbox 360, where the game runs fine, and that chip was pretty much an X1900 with some eDRAM bolted on.

Also, am I the only one thinking that since most games are built on Direct3D or OpenGL, Microsoft should have a set development model the companies must agree to? (I'm thinking of something like the OSI model in networking.) Then we wouldn't get these silly issues.

It's like we have to buy a motherboard with dual 16x PCI-e slots just so we can run one card from each company.

Sorry about the rant, but I guess what I'm getting at is: why can't we have a set standard that has to be stuck to? - Paul