Quote Originally Posted by RPGWiZaRD
That would hurt GTX260 sales a bit too much, dropping it to something like say 150~170 EUR / 220~240 USD.
I seriously doubt it; like the INQ said, it's more likely that they'll have to drop the clocks a bit in order to keep power draw realistic.


NV is not in a good situation right now. Customers won't see any huge problems here, but the company isn't doing so well.
The GTX-260's are already selling at that price range (as low as $200 after rebates).

http://www.newegg.com/Product/Produc...82E16814127361

It all depends on how much Nvidia gets out of the 55nm process. It certainly looks like Nvidia did a re-spin of the GT200b in order to optimize clock speeds. If the core ends up at, say, 700MHz and the shaders at 1600MHz, Nvidia will have won most of us back over from ATI.


Quote Originally Posted by Stevethegreat
If nVidia manages to put out a GX2 at competitive prices, then I don't see where INQ's bashing comes from. I mean, they could very well be paid by ATI to say nonsense, but there is no greater folly than mocking the engineering feat of a GX2 GT200 card (which BTW will obliterate the 4870X2 and make INQ cry). Otherwise nVidia puts out those cards and they're kings again, which is rather sad, but things won't get better for ATI by mocking nVidia. They'd have to get their act together once again, as their share of the market will start dwindling shortly after.
Nah, although the INQ does bash Nvidia *hard*, it was a great article by the INQ nonetheless. It certainly does sound very plausible.

Let's look at reality for a minute: do you really think 55nm is going to bring more than the "15%" power savings the INQ claimed? It was generous of the INQ to say even that, for Nvidia's sake. Do you think it would actually bring more than a 50MHz boost in clock speed without increasing power usage? Apparently, 55nm only allowed the 9800GTX+ a 68MHz clock increase within the same power envelope, and the 9800GTX+ still failed to show a clear lead over the single-slot HD 4850, which was a great disappointment for Nvidia.