Hehe, so G92b became 9800GTX+, but when GT200 shrinks it becomes a new model, GTX 270/290? *hint hint* ;)
//Andreas
Guys... the next person who so much as drops the fan-word gets dropped like the dumb blonde in just about any horror movie. I mean, the only reason everyone gets bent out of shape here is because there are too many squares in the circle.
Anyway, a lot of it is INQ bs (funny how we all knew it was the INQ WAY before we even saw the source), but I'd be shocked if NVidia only got another 50MHz or so out of this thing. Look how high the 280s and 260s clock on 65nm with pretty much no effort at all. Also, I agree with Delph: plain-jane shrinks from NVidia have been getting the "b" suffix, while shrinks with optimizations have gotten totally new codenames (G80 -> G92, which DID bring some changes, although nothing big).
Either way, I'm just going to put it like this... Haven't ANY of you learned from the past generation of graphics cards not to waste your time fighting over predictions? First the G80 utterly blindsided the vast majority of the people here, while you guys argued that it would only be 48 pipes (and not even unified). Then everyone thought the R600 would be the best card since sliced bread (we all saw how that turned out), then that the GTX 280 would crush all (it did actually come in at what was expected, ~2x the G80), and then the RV770 showed us the definition of bang for buck (and the R700 reminded us that ATi WILL keep pricing the high-end high for as long as they can get away with it, just like NVidia). It's one of the reasons I've mostly stopped bothering to discuss them, but I still watch them, because even when you have the greatest sources possible there's always info that changes between pre-release and release.