212 SP @ 2000 MHZ ----> 9900GTX
160 SP @ 1800 MHZ ----> 9900GT (I want this )
If this, or something close to it, happens, my dream will come true.
Ummm maybe because chipset and motherboard manufacturers and developers need long lead times in order to support the chips when they come out? x86 is also an industry standard so Intel can't just work in the shadows and come out at the last minute with what they were doing.
A GPU is vastly different. You don't need to know anything about its inner workings in order to support it. You have a standardized instruction set (DirectX/OpenGL) and standardized interfaces (PCIe). It has nothing to do with respect for customers.
Sweet Jesus... 1 Billion!
don't be turning your head like that
GDDR3 is far cheaper and much more widely available than GDDR5. Also, Nvidia is concerned about the cost of their chips not the cost of the cards. If they use GDDR5 they don't get any benefit from using more expensive memory. Expensive chip + cheap memory for the same net performance and price to the consumer is an overall win for the chip maker. It's the AIB's that get squeezed.
cool news. Wonder if it will need pcie 2.0 out the door for full performance.
And where are the games to take advantage of a GPU like this? Lots of cards, but no new games lately... slow.
Whether we NEED it or not, bet your ass we'll be told it's the only way to get full functionality.
Who remembers nVidia saying the G80 would only work in PCI-E x16 slots? <- interesting wording!
I also saw an e-note from a GPU company's tech support to an 8800GT owner stating: "Hi, thanks for your message, yes these cards are not compatible with PCI-E Gen 1.0 or 1.1. They will both only run on PCI-E Gen 2.0 mainboards."
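For context on the PCIe question, the per-direction link bandwidth is easy to sketch. This is a minimal illustration (the function name is mine); both PCIe 1.x and 2.0 use 8b/10b encoding, so 2.0 simply doubles the signalling rate:

```python
def pcie_gbs(lanes, gt_per_s):
    """Per-direction throughput in GB/s: lanes * raw rate * 8b/10b overhead."""
    return lanes * gt_per_s * 8 / 10 / 8  # 8 payload bits per 10 line bits, 8 bits/byte

x16_gen1 = pcie_gbs(16, 2.5)  # PCIe 1.x x16: 4.0 GB/s per direction
x16_gen2 = pcie_gbs(16, 5.0)  # PCIe 2.0 x16: 8.0 GB/s per direction
print(x16_gen1, x16_gen2)
```

So a Gen 2.0 card dropped into a Gen 1.x x16 slot still gets 4 GB/s each way, which is why the "only runs on Gen 2.0 mainboards" claim above looks dubious.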
Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)
Maybe not, but 512-bit GDDR3 is a far safer bet from Nvidia's perspective as they have more control. If AMD could do it with R600 I don't see why Nvidia couldn't do it now. Going with GDDR5 puts your fate in the hands of the memory manufacturers. Personally, I would prefer GDDR5 on 256-bit for the obvious reasons.
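The 512-bit GDDR3 vs 256-bit GDDR5 tradeoff comes down to simple bandwidth arithmetic. A quick sketch, with the effective data rates (2.2 Gbps GDDR3, 4.0 Gbps GDDR5) as illustrative assumptions rather than confirmed specs:

```python
def bandwidth_gbs(bus_width_bits, effective_mbps_per_pin):
    """Peak memory bandwidth in GB/s: bytes per transfer * effective rate."""
    return bus_width_bits / 8 * effective_mbps_per_pin * 1e6 / 1e9

wide_gddr3  = bandwidth_gbs(512, 2200)  # 512-bit GDDR3 -> 140.8 GB/s
narrow_gddr5 = bandwidth_gbs(256, 4000) # 256-bit GDDR5 -> 128.0 GB/s
print(wide_gddr3, narrow_gddr5)
```

At those rates the two configurations land in the same ballpark, which is why the choice is driven by supply and PCB cost rather than raw bandwidth.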
1 billion sounds great, but it's only ~250 million more than the G92 has.
AMD Phenom II X6 1055T@3.5GHz@Scythe Mugen 2 <-> ASRock 970 Extreme4 <-> 8GB DDR3-1333 <-> Sapphire HD7870@1100/1300 <-> Samsung F3 <-> Win8.1 x64 <-> Acer Slim Line S243HL <-> BQT E9-CM 480W
So when will this monster be out?
70 to 80% faster than an 8800GTX in demanding games, retailing for $500?
So a single card that is 30% faster than a 9800GX2?
Exactly what I need
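The two rumored figures above can at least be checked against each other. If the new chip is ~75% faster than an 8800GTX (midpoint of the 70-80% claim) and ~30% faster than a 9800GX2, the implied GX2 lead over a single GTX falls out directly:

```python
new_vs_gtx = 1.75  # midpoint of the rumored 70-80% uplift over 8800GTX
new_vs_gx2 = 1.30  # rumored 30% uplift over 9800GX2

# Implied 9800GX2 advantage over a single 8800GTX, given both rumors
implied_gx2_vs_gtx = new_vs_gtx / new_vs_gx2
print(round(implied_gx2_vs_gtx, 2))  # ~1.35, i.e. GX2 about 35% ahead of a GTX
```

A ~35% GX2 lead over a single GTX is plausible for SLI-scaled titles, so the two rumors are at least self-consistent.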
Last edited by EternityZX9; 04-12-2008 at 07:03 AM.
Intel Core i7 7700K | MSI Z270 XPOWER G.T. | EVGA 1080Ti SC2 | 16GB DDR4 G.Skill Trident Z 3200 | Samsung S27A950D | 3 x Samsung 850 EVO (250GB, 2 x 2TB) | EVGA Supernova P2 1200w | Coolermaster Cosmos II
Last edited by EternityZX9; 04-12-2008 at 07:18 AM.
I'd like to go ahead and confirm that I've been hearing similar from my sources as bench. Big surprise, huh bench? We always seem to have the same info on nvidia parts.
Anyway, I'm hearing a late may-early june release, just thought I'd add that.
BTW, for those who forget... one of the number-one reasons for high-end shortages in the past has been companies going for high-end RAM near its initial launch and getting stuck unable to source it. Remember the 6800 Ultra and X800XT/PE? Both had countless supply problems because they went for what was the best RAM at the time.
Sticking with GDDR3 means ram will be in high supply, meaning NVidia can pump out as many cards as they can make gpus.
Really? I mean, GeForce 9 is pretty screwed up right now, but Victor Wang has gotten hold of a 9800GTS (whatever the hell that is), and a 9800GT is supposed to be on the way. I think the big reason GeForce 9 is so screwed up is that the 8800GT was too powerful, eliminating the old performance gap between the GTX and GTS cards.
1. Asus P5Q-E / Intel Core 2 Quad Q9550 @~3612 MHz (8,5x425) / 2x2GB OCZ Platinum XTC (PC2-8000U, CL5) / EVGA GeForce GTX 570 / Crucial M4 128GB, WD Caviar Blue 640GB, WD Caviar SE16 320GB, WD Caviar SE 160GB / be quiet! Dark Power Pro P7 550W / Thermaltake Tsunami VA3000BWA / LG L227WT / Teufel Concept E Magnum 5.1 // SysProfile
2. Asus A8N-SLI / AMD Athlon 64 4000+ @~2640 MHz (12x220) / 1024 MB Corsair CMX TwinX 3200C2, 2.5-3-3-6 1T / Club3D GeForce 7800GT @463/1120 MHz / Crucial M4 64GB, Hitachi Deskstar 40GB / be quiet! Blackline P5 470W
umm...
erm...
hmm...
how to put this...
You made a mistake.
G92b isn't a new design; it's just a 55nm version of G92 and has nothing to do with GT200. G92b will be high-end mobile and performance desktop, with G96b and G98 covering performance/mainstream mobile and mainstream/low-end desktop. GT200 will be high-end desktop and will probably get its own naming scheme, which is exactly what benchzowner said, so I don't see what the problem is.
Big coincidence
Wanna try our "luck" on the other side? (ATi)
They're talking about 8800GTS going EOL.
Not the naming scheme generally
That's what I said. And as Ali_G clarified
The G92b will be a follow up to the G92 cards that we have now, but built on TSMC's 55nm Low-K process.
The GT200 will be the "real new" thing.
Sources are indicating a transistor count near 1.3bil
Even if it ends up being 1bil... still it'd be:
319m over the G80 [ I'm only counting the G80 core, not with the nV-IO ]
246m over the G92
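Those deltas line up with the widely reported counts (roughly 681M transistors for the G80 core alone, 754M for G92). A quick sketch of the arithmetic:

```python
# Approximate, widely reported transistor counts, in millions
g80_core = 681  # G80 core only, excluding the separate NVIO chip
g92 = 754
rumored_low = 1000  # the conservative ~1 billion figure

print(rumored_low - g80_core)  # 319M over the G80 core
print(rumored_low - g92)       # 246M over G92
```

And if the ~1.3 billion figure holds instead, the gap over G92 grows to roughly 550M, i.e. most of another G92's worth of transistors.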
Wait... wait.
The GT200 cards come in June... right???
And their names are 9900GT and 9900GTX???
Source: fudzilla.com
If this is wrong, I think I'll have a heart attack.
Fudzilla... what more can I say than that?