That's a hilarious statement.
Yes, Eyefinity is great for pushing the limits of current graphics cards by asking them to render a huge number of pixels, and it really ups the immersive experience when gaming. However, much like NVIDIA's 3D Vision, it relies on driver support, which means it may or may not work with every game. As such, you can't determine a product's success or failure based on its support for a niche market.
The main issue we now have is that current game engines just can't push the current generation of DX11 GPUs far enough. On the plus side, DX11 isn't turning out to be the amazingly efficient API many promised, so the extra rendering power will come in handy in certain situations. If you thought DiRT took a huge hit when rendering the DX11 code path, wait till you see what happens in AvP. This also casts doubt on the real selling point of lower-end cards that support a next-gen API, but that happens in every generation.
All this means is that consumers won't have to spend the kind of money they used to when searching for an optimal gaming experience. I remember spending over $600 for a 7800 GTX, which could play most games at high resolutions with IQ settings near max. However, stepping down to even a 7800 GT at $450 meant sacrificing that optimal gaming experience. Now you can buy a $299 HD 5850, or whatever NVIDIA comes out with to compete with it, and play every single current game at the highest detail settings while still sitting pretty when upcoming games are released. What's not to like about that?
Basically, what I am saying is that having massively powerful high-end cards makes their offshoot mid-range products all the more appealing from a price/performance perspective. So I can't see why anyone would call the possibility of ultra high-end Fermi performance a "failure".
This doesn't make Fermi-based cards a failure; it just means that both Fermi and ATI's current generation of cards may have a longevity that far surpasses that of the G80.