Nvidia is going to release G92-B on 55nm in June. I don't understand how they can launch G92-B and GT200 (65nm???) at the same time.
There is FUD somewhere.
Well, both will be more powerful than the 9800 GTX (assuming a stock clock increase in the 9800 GT), and releasing two very powerful new cards at once shows that the GT200 must have a large performance increase over the 9800 GT to justify the price difference.
It's shaping up that the GT200 will be the dominant card for a while; the 8800 GTX has held its own for a long time, and this seems to be a true successor to it, unlike the 9800 GTX.
Under 600mm2 could mean 580mm2 too. Have you heard whether the die size is bigger than G80 or about the same (G80 has 484mm2 without NVIO)?
And I ask once again: have you heard anything about performance? About 60-70% faster than G80/G92? 2x faster than G80/G92? I'm talking about real-world situations, of course.
Come on, you can tell us.
Hmm, near 600mm2? That's a huge die. I was thinking more like 500mm2. I hope it won't have problems with stability and temps.
When I asked you about performance I was thinking about real-world numbers this GPU can actually do, not "only" estimates. If it turns out to be 2x the GF9800GTX or more, that would be great. That's exactly what we need.
I don't have any real numbers, like 3D Mark scores, etc.
Even if I "gather" some, I still won't be able to post them (for various reasons).
OK, I know you couldn't tell us any real numbers at present even if you knew them.
BTW.
I've just read something about GT200 on the PCInlife forum. It's not specs, but that guy is saying GT200-300 (there will be other GT200 variants like G92-270, G92-400?) is the GF9900GTX. So it seems there will be no GF10 series with GT200? That's strange, because if GT200 is likely to bring a very big performance jump, it should get a new "series" name.
Another bit of info is that GT200 is a 55nm GPU!! So if you're saying GT200 is a very big GPU (about 550mm2), it must (I think) have at least about 1.2-1.3 billion transistors.
The question is whether he's right. Well, he was right 1.5 years ago with G80 and a few months ago with G92 (he also said the mysterious G90 might never be released), so maybe he's right about GT200 too.
Here is the link
http://209.85.135.104/translate_c?hl...language_tools
GT200 is 65nm.
This is a fact, wanna doubt it? Do. But don't expect it to turn out being 55nm.
At least not at launch time (a 55nm refresh in the near future is possible, and kind of expected).
GT200 will be used at least for the flagship (top-end) product and a high-end product (like G80 in the past with the GTX and the GTS).
As for the naming scheme... Nobody knows for sure.
Could be another NVIDIA mixed bag.
Just like they renamed some G92 parts to GeForce 9, they could be using GF9 for both G92b (55nm G92) and GT200.
Nobody knows...
Here is my guess based on what we know now...
Probably still on 65nm, as that is a "safe bet".
240 shader processors at 24 SPs per cluster is interesting.
120 texture filtering and texture addressing units is normal, still the same ratio as before.
512-bit is possible if the die size is at least 400mm2, and with 1 billion to maybe 1.3 billion transistors, that die size is possible on the 65nm node.
GDDR3 is a bit disappointing; maybe that is just the initial release and the memory controller supports GDDR4 and GDDR5 for a later refresh, if those memories become available at better pricing.
This is very impressive; it's basically a 9800 GX2 on a single die. With its monster die size, it will likely debut in the same 500-600 USD ballpark the 8800 GTX did.
If it comes out in early summer, like June or July, it will probably be 65nm, as NVIDIA hasn't tested 55nm at this point. But I presume it could be as late as November too; optimally it would be late Q3 / early Q4, in the September-October timeframe. If it's Oct/Nov, it could possibly be 55nm, as G92b would have been out for a while by then.
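For a rough sanity check on that die-size vs. transistor-count guess, here is a minimal sketch scaled from G92's publicly reported figures (roughly 754M transistors on about 324mm2 at 65nm); the assumption that GT200 would hit the same transistor density is mine, not a confirmed spec:

```python
# Rough die-area estimate for a hypothetical 65nm GT200, assuming
# roughly the same transistor density as G92 (65nm).
# Reference figures (approximate, publicly reported): G92 ~754M transistors, ~324 mm^2.

G92_TRANSISTORS = 754e6      # approximate
G92_DIE_AREA_MM2 = 324.0     # approximate

density = G92_TRANSISTORS / G92_DIE_AREA_MM2   # transistors per mm^2

for transistors in (1.0e9, 1.2e9, 1.3e9):
    area = transistors / density
    print(f"{transistors / 1e9:.1f}B transistors -> ~{area:.0f} mm^2 at G92-like density")

# Prints roughly 430, 515 and 560 mm^2 respectively.
```

So 1.0-1.3 billion transistors at G92-like density lands in the ~430-560mm2 range, which lines up with the "at least 400mm2, maybe near 600mm2" figures being thrown around in this thread.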
Haha, I won't bet 1000 USD, but NVIDIA hasn't introduced something that adds major changes on a completely new and untested process.
The 6800 Ultra was on the mature 130nm process tested by the GeForce 5800/5900 series.
The 7800 GTX was on the mature 110nm process tested on the 6600 series.
The 7900 series was not critical on 90nm, as it was only a shrink of an existing core with minor tweaks.
The 8800 GTX was on the mature 90nm process pioneered by the 7900 series.
G92 was a shrink of an existing core with few changes, which had already been tested on other products.
If GT200 is late enough, then it could indeed be 55nm.
The bandwidth provided by GDDR3-2000 to GDDR3-2200 on a 512-bit bus is more than enough (rough numbers after this post).
GDDR4 + nVIDIA = never.
GDDR5 = yes, but it would only make the card more expensive.
Maybe when they move to 55nm.
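To put rough numbers on "more than enough", here is a quick back-of-the-envelope bandwidth calculation (a minimal sketch; the GDDR3-2000/2200 data rates and 512-bit bus are the figures quoted above, not confirmed GT200 specs):

```python
# Peak memory bandwidth = bus width (bytes) * effective data rate (transfers/s).
BUS_WIDTH_BITS = 512

for effective_mhz in (2000, 2200):          # GDDR3-2000 and GDDR3-2200
    gbytes_per_s = (BUS_WIDTH_BITS / 8) * effective_mhz * 1e6 / 1e9
    print(f"GDDR3-{effective_mhz} on a {BUS_WIDTH_BITS}-bit bus: ~{gbytes_per_s:.0f} GB/s")

# ~128 GB/s and ~141 GB/s respectively, versus roughly 86 GB/s on the
# 8800 GTX (384-bit, GDDR3-1800), so a big jump without needing GDDR4/GDDR5.
```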
Why won't they use GDDR4?
Apart from the higher price, there's something involving ATi in this [can't disclose any info, sorry].
GDDR3 is more than adequate in this case.
The only parts out in the wild are the developer cards. The odds of a developer leaking real bench numbers are about the same as the RIAA suddenly gaining common sense... No developer in their right (or wrong, for that matter) mind would kill their relationship with a company that supplies them with hardware to program their games on just to be the first to post numbers on a part.
I'd be outright shocked though if it wasn't double the performance of a single 8800 GTX.
65nm is more logical to me; it's what NVIDIA has been doing in the past when launching a new series: first the new generation on the same process as the previous generation, then a refresh some half year or more later. The 65nm process is going smoothly now, and there's no need to take the risk of jumping straight to 55nm, which could cause problems with getting high enough volumes. The last thing NVIDIA wants with GT200, after all this stagnation with G92 etc., is low supply of the next-gen series when people are eager to upgrade to something faster. They need to have enough cards shipped this time around, or retailers and customers will be very disappointed and a lot of them will simply choose the HD 4xxx series instead.
Myself, I will probably wait for the refresh, like a 55nm 9900GT or whatever it's going to be, for a hopefully much more affordable price and a nice performance/price ratio, provided I CAN RESIST GT200 for that long.
Well, as the launch date gets closer and closer, real benchmark numbers leak quite often. As you remember, about a month before the G80 launch some 3DMark06 and game numbers leaked. Only a few, not many, but still. I hope in the next two or three weeks we can get some GT200 performance info confirmed with some real-world numbers. I believe we will be as surprised as we were 1.5 years ago with G80's performance.
gt200 pic!!
it's a vespa!
twister hammerhead gt200!
go-kart!
oo, electric scooter - very environmentally friendly - also a gt200
yes, there is a theme
a stainless gate valve "gt200"
8800gt
You should send those pics to the NVIDIA marketing department.
I couldn't find any pics of an NVIDIA GT200.