Someone is in denial.
The 9800 GTX (not the one with a + at the end) and the 8800 GTS 512MB are pretty much the same card, with the 9800 GTX, I guess, being higher binned.
Take a look at the first few 8800 GTS 512s that came out. The cores could be clocked to 900MHz in some cases; they were beasts.
Interface
Interface: PCI Express 2.0 x16
Chipset
Chipset Manufacturer: NVIDIA
GPU: GeForce 9800 GTX+
Core Clock: 738MHz
Stream Processors: 128
Memory
Memory Clock: 2200MHz
Memory Size: 512MB
Memory Interface: 256-bit
Memory Type: GDDR3
3D API
DirectX: DirectX 10
OpenGL: OpenGL 2.1
Ports
HDMI: 1 via Adapter
DVI: 2
TV-Out: HDTV / S-Video Out
General
RAMDAC: 400 MHz
Max Resolution: 2560 x 1600
SLI Supported: Yes
Cooler: With Fan
Power Connector: 2 x 6-Pin
Dual-Link DVI Supported: Yes
HDCP Ready: Yes
-------------------------------------
Interface
Interface: PCI Express 2.0 x16
Chipset
Chipset Manufacturer: NVIDIA
GPU: GeForce 8800 GTS (G92)
Core Clock: 678MHz
Stream Processors: 128
Memory
Memory Clock: 1944MHz
Memory Size: 512MB
Memory Interface: 256-bit
Memory Type: GDDR3
3D API
DirectX: DirectX 10
OpenGL: OpenGL 2.0
Ports
DVI: 2
TV-Out: HDTV / S-Video Out
VIVO: No
General
Tuner: None
RAMDAC: 400 MHz
Max Resolution: 2560 x 1600
SLI Supported: Yes
Cooler: With Fan
Dual-Link DVI Supported: Yes
HDCP Ready: Yes
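Just to put numbers on the gap between those two spec sheets, here's a quick back-of-the-envelope sketch in Python (using only the listed clocks and bus widths; real-world performance obviously depends on more than that):

[code]
# Rough comparison from the spec sheets above.
# Memory bandwidth (GB/s) = effective memory clock (MHz) * bus width (bits) / 8 / 1000
cards = {
    "9800 GTX+":          {"core_mhz": 738, "mem_mhz": 2200, "bus_bits": 256},
    "8800 GTS 512 (G92)": {"core_mhz": 678, "mem_mhz": 1944, "bus_bits": 256},
}

for name, c in cards.items():
    bandwidth = c["mem_mhz"] * c["bus_bits"] / 8 / 1000  # GB/s
    print(f"{name}: {c['core_mhz']} MHz core, {bandwidth:.1f} GB/s memory bandwidth")

gtx, gts = cards["9800 GTX+"], cards["8800 GTS 512 (G92)"]
print(f"Core clock delta:   {gtx['core_mhz'] / gts['core_mhz'] - 1:+.1%}")
print(f"Memory clock delta: {gtx['mem_mhz'] / gts['mem_mhz'] - 1:+.1%}")
[/code]

That works out to roughly +9% core clock and +13% memory clock (70.4 vs 62.2 GB/s of bandwidth) for the 9800 GTX+ over the 8800 GTS 512 as listed, with the same 128 SPs and the same 256-bit bus.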
Yeah, I know those are the 9800 GTX+ specs, but it doesn't matter; that's just a die shrink. If they had taken the 8800 GTS 512, given it a die shrink, and called it the 9800 GTX, I don't think I could complain about that...
The whole 8800 GTS 512 vs. 9800 GTX thing wasn't a big deal. I am a little pissed off about the 8800 GT die shrink, though, because you can't tell whether you're getting the die-shrunk 8800 GT or not; they didn't put a + after it -_-
Clock speeds haven't changed... which means the process hasn't changed... which means we'll be getting failed 280s with one non-functional TPC = the new GTX 260.
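A minimal sketch of the shader math behind that, assuming GT200's widely reported layout of 10 TPCs with 24 stream processors each (an assumption on my part, matching the 240-SP GTX 280):

[code]
# GT200 TPC binning, assuming a full die is 10 TPCs x 24 SPs = 240 stream processors.
SPS_PER_TPC = 24
FULL_TPCS = 10

for disabled in range(3):
    tpcs = FULL_TPCS - disabled
    print(f"{tpcs} TPCs enabled -> {tpcs * SPS_PER_TPC} SPs")

# 10 TPCs -> 240 SPs (GTX 280)
#  9 TPCs -> 216 SPs (a 280 with one dead TPC, i.e. the rumoured new GTX 260)
#  8 TPCs -> 192 SPs (the original GTX 260)
[/code]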
The reason they didn't do this from the beginning is that it would have cut deeply into sales of the 280 had they kept prices the same. But with the current threat, they had no choice but to take this route.
This will answer the 4870, but in no way would it compete with the 4850x2...
My guess is performance would be similar to a slightly overclocked 260.
If I'm not sure, I don't mention dates, just "timeframes", which should hold... but sometimes delays or other things come up.
When I have 100% concrete info, I post dates, or state that it's 100% sure.
I just hope nVIDIA keeps their "promise" and gets that thing out before mid-September so we can have some more fun :D (yes, I'm a hardware whore :p:)
Mid-September? Hmmm... I'd put late September / first week of October as the safer bet.
It taped out at the beginning of June, and I think Nvidia will be careful about bringing to market a card that will not only be powerful, but will continue to be powerful.
Remember that making the GT200b faster than the 4870 X2 will involve making it faster than GTX 280 SLI in some games at 2560 res.
Perkam
Okay, back to the truth in this news: they aren't really upgrading anything; the GTX 260 is just built from GPUs not deemed good enough to be a GTX 280, and binned as such.
I'm thinking that nvidia was caught somewhat unawares by the 4800 series and is now scrambling a bit to compete better. They sat on the G80 too long. It reminds me of the GeForce 4 series and then the Radeon 9700 coming out. At least the GTX 260/280 doesn't suck like the FX did, though :p:
Sorry, I totally blanked on the meaning of clusters when I read your post...
Because the GTX 280 has 32 ROPs, a 512-bit bus, 1GB GDDR3, and IIRC the TMU count is different. Missing hardware like that in the VGA BIOS is quite a motive, I think.
About laser locking, I don't know, man; that would be a nice subject for MythBusters to dig into. :p: I personally think it doesn't make much sense; that would make a badly binned GPU even more expensive...
They're physically laser locked; it's been stated before that Nvidia has a machine for it.
Is the laser locking thing a proven fact?
IIRC the arch is highly modular, meaning they could produce different GPUs outright, instead of just making fully fledged cores and then cutting down the partially defective ones for the lower models.
Yup, the $ame ca$e with the G80 variant$; I've never $een any 8800 GT$ 640 MB modded into a full-fledged 8800 GTX. (ye$, the "s" button i$ broken on my KB, poor me)
Ever heard of Opera'$ $tart-from-the-previou$ly-opened-page and auto login? ;) Thi$ A4Tech KB-28G ha$ been a great companion for the pa$t 3 yr$; it finally de$erve$ a retirement. :D Happy incoming Ramadhan, brother; hopefully Hi$ ble$$ing will be be$towed upon you and your loved one$ in that holy month. :)
Thx for the info, appreciated. :)
You don't need laser cutting to blank out parts.
eFuses, and overvolting to burn out the parts of the GPU being accessed, are the new black.
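Purely as an illustration of the eFuse idea (hypothetical register layout and bit meanings on my part, not NVIDIA's actual fuse map): a blown fuse bit permanently marks a unit as disabled, and the BIOS/driver simply never exposes it.

[code]
# Hypothetical example: a 10-bit fuse mask, one bit per TPC; a set bit = fused off.
FUSE_MASK = 0b0000000011  # e.g. two TPCs burned out -> an original-GTX-260-style part

enabled_tpcs = [i for i in range(10) if not (FUSE_MASK >> i) & 1]
print(f"Enabled TPCs: {enabled_tpcs} ({len(enabled_tpcs)} of 10)")
# No laser needed: the fuse state alone determines what gets exposed, and it can't be
# flashed back because the fuse is physically blown.
[/code]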
The laser locking is a proven fact; I've tried.
This is the same deal as what happened with the 8800 GTX and the 8800 Ultra. nvidia noticed that some chips could actually perform higher than the 8800 GTX standard, and then charged us more for the same chips... At the beginning of the 8800 GTX's run there were quite a few lucky guys who essentially got an "8800 Ultra"; I remember because I was one of the lucky few who got one, though my second 8800 GTX wouldn't clock like it.