I hope it has a cooler that exhausts air out the back of the case like previous designs. Oh yeah... hopefully 40%+ more performance in games compared to the 9800 GTX.
Would be nice if nV finally put some sense into their PCB layouts - compared to the Radeons their boards are horrible: parallel PWMs scattered around the board, unified GDDR vDD & vDDQ, and old, obsolete or undocumented "top secret" phase controllers - that's what I'm talking about. It would be nice to see them reach the level of the masterpieces ATi has pulled off recently, such as the OEM R600 board. This 210mm-long beauty is over an inch shorter than the 8800GTS but sports a far superior 6-phase vGPU PWM with everything in one clean row, dedicated vDD and vDDQ, a double-width video RAM bus, no huge arrays of caps and no dead space.
Attachment 76515 Attachment 76516
There are only 3 things I would change on this board to make it perfect:
- R600 GPU
- R600 GPU
- R600 GPU
It's beautiful, eh? :D
No, one RV770 core won't come anywhere near 3 R600 cores.
Speaking of the OEM R600, remember the 2900XTX cooler? That was a work of art: something like 4 heat pipes and a blower at the end. I wonder if nvidia will have to implement something like that for GT200, as it's definitely going to be hot (literally).
i'm gonna wait till i see some numbers
OK, what am I missing here - what about GDDR4?
1.3B? Well, with 1.3B the specs you have posted are very possible, but a few days ago Jeff Brown said (according to some sites) that GT200 is an "about 1B" GPU. On the other hand, what does "about 1B" mean? It could be 900M or 1.1B too, or at best 1.2B, but 1.3B is too far away from 1B because it's 1/3 more than the rumours said.
I'm not saying you and your source are wrong - Kirk said a few months before the G80 launch that they weren't going to a unified shader architecture :p:
Back to your GT200 specs --> hmm, you don't believe your own source? ;) :D (you did say "if the specs are true" :rolleyes: )
How do you know that? :)
About 1B could be 0.9B, 1.1B... and let's say as they progress 1.2B
By the way, as far as I remember, those statements were about the newest Quadro.
Quote:
Originally Posted by Barys
Back to your GT200 specs --> hmm you don`t believe to your source?;) :D (you have said "if the specs are true:rolleyes: )
Sometimes not everything is correct, so before saying it's 100% correct & sure, better make sure it's 100% correct.
Since it's not easy (or sometimes not possible at all) to verify those things, and to confirm that it's in fact the next high end and not some future project, etc., you say "if".
P.S. For the radeon... I just happen to know... from another source.
mind telling us what the actual number is then (for the rv770)?
:up:
Quote:
Nvidia Describes Next-Generation Graphics Processor.
Nvidia’s GT200 Set to Be Unveiled in Summer
Nvidia Corp., the world's largest provider of discrete graphics processors, said at a press conference that it plans to launch its next-generation graphics processor this summer. The new product, code-named GT200, is projected to dethrone the current flagship GeForce 9800 GX2, but it is not clear whether it will also bring new features.
The new graphics processing unit (GPU) for the high-end market from Nvidia is projected to consist of about one billion transistors and feature "around" 200 unified shader processors (more likely to be 192 or 240), reports the Golem.de web-site citing Jeff Brown, general manager of the professional solutions group at Nvidia. It is unclear whether Nvidia GT200 sports the DirectX 10.1 feature-set or is compatible with DirectX 10 only.
Nvidia's latest high-end GPUs, which are code-named G92 and G80, consist of 754 and 681 million transistors, respectively. The previous-generation high-end DirectX 9 graphics chips - G70 and G71 - featured 304 and 278 million transistors, which means that the current-generation high-end chips are more than two times larger than their predecessors in terms of transistor count. Considering these facts, one billion transistors inside GT200 is not a surprise.
The new-generation Nvidia GT200 is projected to be released in Summer and Nvidia will start to brief the media about the product starting from May, the web-site reports without providing any additional details.
It is also not known whether Nvidia plans to continue using the GeForce brand-name for GT200, or intends to introduce a new trademark.
http://www.xbitlabs.com/news/video/d...Processor.html
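As a quick sanity check on the article's transistor-count argument, the generation-over-generation ratios can be worked out in a few lines (transistor counts in millions, exactly as quoted in the article; nothing here is official):

```python
# Transistor counts in millions, as quoted in the x-bit labs article above.
chips = {"G70": 304, "G71": 278, "G80": 681, "G92": 754}

# Current high-end vs. previous-gen high-end:
g80_vs_g70 = chips["G80"] / chips["G70"]
g92_vs_g71 = chips["G92"] / chips["G71"]
print(round(g80_vs_g70, 2), round(g92_vs_g71, 2))  # 2.24 2.71

# By that trend, "about 1B" for GT200 would actually be a modest jump over G92:
print(round(1000 / chips["G92"], 2))  # 1.33
```

So the article's "more than two times larger" checks out for G80/G92 vs. G70/G71, and a flat 1B GT200 would only be ~1.33x G92 - which is why the 1.3B rumour earlier in the thread isn't crazy.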
Briefing in May... explains why my sources told me May-June.
errr man this is confusing. What's happening??
G92b in May-June and then in 2 months we have GT200??
Or GT200 still in Q4'08??
OOPSIE-DAISY!!! My bad! I do not know why I said that - I must have been thinking that one R700 board with two RV770 cores would be equal to 3 R600 cores at the very least (since one RV770 is 50% faster than one R600), and then did the math completely wrong!
Oh well, nobody's perfect, right? We ain't as reliable as calculators (yeah, I need a calculator to add 2+2).
I don't think it will be released in June, but even if it is, it'll be just a paper launch. Just look at the G92 8800GT: they were in low supply even months after launch. IMO, and I'm just guessing here, a July-Aug launch with Aug-Sept mass availability - coinciding with the back-to-school season.
I want more than 512MB and more than 256-bit, cuz I'll be going SLI on a 30-inch screen this winter, or maybe even Crossfire if the new ATI cards are finally worth a damn.
I really want all this to be true...
NVidia have had "next gen" samples out for quite some time - I heard of the earliest sample all the way back last summer, which is why I thought we'd have already seen one at the consumer level. Thing is, they rarely tell developers what the test samples are; they just give them prototype cards. As such, one can never be sure that what they see is what will be released (as was the case with the early prototype G80s), nor whether it is the next card to be released or if something else will come first (leading to claims of a new next-gen when it turns out to be a refresh).
Anyway, I've been hearing roughly the same as B.Z. stated earlier in the thread, only difference is the release time-line. Generally speaking, his sources and my sources agree most of the way down the line. :up:
The GT-200 should be 512-bit with 1GB of RAM. ;)
Just had a short conversation with my source.
He's now claiming a mid-June release for the GT200 high end part!
I'll try to "validate" the info from another contact of mine in the graphics industry.
Of course, anybody can come up with spec speculation and release date speculation, but this is not my game.
As you can read in the previous pages, however, what I've posted (which some might think is just my speculation) or have been told has also been given and posted by 2 more members (not 2 out-of-the-game members ;) ).
That's all I have to say ;)
I hope you're right, I'm waiting for them.
And thanks for the heads up.
nvidia smacks rv770 around the head with something i presume. :|
ati does something good... -> smacked down by nvidia :slapass:
amd does something good....-> smacked down by intel.
smack, smack ,smack. you've been naughty children and deserve to be punished (said in school marm voice)
Great news BenchZowner, let's hope it's true. Personally, I always expected a summer release, as Q4 wouldn't make sense to me: customers would get tired of being spammed with the same old G92 cards, and a lot would switch to ATI, especially once the HD 4000 cards were released with no competition. They need something new to fight the HD 4000 series.
The earlier rumors could also very well be true then, where it was said the 9800GX2 cards would be short-lived, with a new high-end card in a few months.
June eh? :D
Matches what I've been hearing, like I stated earlier before you said you were hearing Q4. ;)
The part is ready, prototypes have been out for a while; NVidia have just been playing sit-and-wait and perfecting die shrinks until AMD finally got close to ready to launch the 4870. :yepp:
So they plan to run with the GT200 name for model numbers.
Fresh name system and fresh architecture is what we need.
FINALLY.
well the first post says why...
"Jeff Brown stated openly, without being asked, that the next architecture is indeed called "GT-200".
The GPU will consist of approximately one billion transistors, and it is "pure logic, no memory like CPUs have," said the Nvidia manager.
This enormous amount of circuitry matches earlier rumors, according to which the GT-200 has about 200 shader units.
The previous G80 and G92 GPUs carry a maximum of 128 of these computing engines."
thank goodness that new architectures are finally coming soon after this era of die shrinks
I say GT-200 will be based off G92/G92b/G94. The major differences: it will finally be 55nm, with a 1GHz core, 48-56 ROPs, 256+ shader units, 6ns 2400MHz+ memory (maybe GDDR4/5), a 512-bit memory bus, 1GB & 2GB VRAM, increased TMUs, etc. The 9800GT provided the drop to 55nm, while the 9-series in general made a head start on the revamped PCB layout and redesigned PWM circuitry. The bricks have been laid, and as said before, the prototypes have been out for a while now. The logical progression from the timeless G80 GTX.
Think 8800 GTS [G92] in all its possible glory.
1. If Nvidia was all "ready" and perfecting die shrinks, this beast wouldn't be coming in at 65nm. 65nm tells me they were rushed because the 7xx series moved ahead of their predicted timeline.
2. It's DX10 only, not 10.1. Why? Because the rumour mill would be singing if it were 10.1, and it's not. The preview of 10.1 providing a 20% boost (probably less than that in real life, but there IS an improvement) shows that you may want 10.1. Nvidia is counting on the raw power of the chip to overshadow the lack of 10.1 support. That, and they want to be sure you'll want a GT210 when they finally add 10.1 support.
I like the chip, don't get me wrong. But it highlights some things that Nvidia never does well, die shrinks being among them. Why do I say that? Because the original 8800 series took way too long to get die-shrunk - over a year. That's not expertise, it's bumbling.
It's a good chip but it's the only serious one they'll have till 2010...
Your speculation about GT200 is Far Off.
P.S. There will be no nVIDIA card with GDDR4 ever.
And yes, there's a reason behind that ( not that GDDR4 is bad, but something else ) [ however, GDDR5 is better :p: ]
The 55nm process isn't mature enough yet for such complicated architectures.
And if they can make it work fine @ 65nm, then there's no problem for them (and as always, it leaves an open window for a 55nm refresh later on ;) )
Who said so?
Quote:
Originally Posted by Anemone
DX10 or DX10.1 ( chances are 10.1 ) it'll be a great GPU.
Like you said... rumour mill ;)
Quote:
Originally Posted by Anemone
does anyone else enjoy the speculation and rumour banter of graphics cards more than the releases themselves?
(I admit I think this way)
I mean, when the 9900GTX (or GT200) is released, I'll just get my excitement jollies from the next big card instead of the newly released tech.
sure I'll buy it... but will it play *insert latest scenester game here*?
:lol:
(i hope I wasn't too offtopic in this post)
Depends who starts the speculation. The last page in this thread has me really excited. I pay close attention when BenchZowner comes out of the woodwork with information. This coupled with the fact that DilTech is hearing the same thing, has caught my attention more than FUD could ever hope to.
Let's hope this is all true and that AMD releases their new card ASAP, so we can see this GT200 monster.
Because so large a chip @ 65nm is going to be a barn burner. Now, to be fair, around here that's not such a big worry because we know how to cool things. But it will still be hot and suck power @ 65nm...
I'll honestly be surprised if it's 10.1. Don't really care if ddr3/4/5 as long as it's got the bandwidth to drive 2560 with ease :)
Even more interesting will be if they can manage a slightly cut down mobile version that they can then use in a 9900GX2 fashion. I can dream anyway :)
By the way, the die size is ( from what I've been told ) under 600mm2
Nvidia is going to release G92b at 55nm in June. I don't understand how they can launch G92b and GT200 (65nm???) at the same time.
There is FUD somewhere.
well, both will be more powerful than the 9800GTX (assuming a stock clock increase in the 9800GT), and releasing two new very powerful cards like this at once shows that the GT-200 must have a large performance increase over the 9800GT to justify the price difference. it's shaping up that the GT-200 will be the dominant card, here to stay for a while. the 8800GTX has held its own for a long time, and this seems to be a true successor to it, unlike the 9800GTX.
wow!! Monstrous!
BTW
Here is a similar monster (596mm2):
http://images.dailytech.com/nimage/3...tecito_die.jpg
Itanium 2 (Montecito)
Under 600mm2 could mean 580mm2 too :) Have you heard whether the die size is bigger than G80 or about the same (G80 has 484mm2 without NVIO)?
And I ask once again - have you heard anything about performance? :rolleyes: About 60-70% faster than G80/G92? 2x faster than G80/G92? I'm talking about real-world situations, of course :) Come on, you can tell us :D
Hmm, near 600mm2? Very HUGE die. :( I'd thought more like 500mm2. I hope it won't have problems with stability and temps.
When I asked you about performance, I was thinking about real-world numbers this GPU can do, not "only" estimations :) If it's 2x GF9800GTX or more, that would be great. That's exactly what we need :D
I don't have any real numbers, like 3D Mark scores, etc.
Even if I "gather" some, still I won't be able to post them ( for various reasons )
OK, I know you couldn't tell us any real numbers at present, even if you knew them ;)
BTW.
I've just read something about GT200 on the PCInlife forum. It's not specs, but that guy is saying GT200-300 (there will be other GT200 variations, like G92-270, G92-400?) is the GF9900GTX :confused: So it seems there will be no GF10 series with GT200? That's strange, because if GT200 is likely to bring a very big performance jump, it should get a new "series" name.
Another bit of info is that GT200 is a 55nm GPU!! So if you are saying GT200 is a very big GPU (about 550mm2), it must (I think) have about 1.2B-1.3B trannies at least.
The question is whether he is right. Well, he was right 1.5 years ago with G80 and a few months ago with G92 (he also said the mystery G90 might never be released), so maybe he is right about GT200 too :)
Here is the link
http://209.85.135.104/translate_c?hl...language_tools
;)
GT200 is 65nm.
This is a fact. Wanna doubt it? Do. But don't expect it to turn out being 55nm.
At least not at launch time (a 55nm refresh in the near future is possible, and kinda expected).
GT200 will be used at least for the flagship ( top end ) product and a high end product ( like G80 in the past with the GTX & the GTS ).
As for the naming scheme... Nobody knows for sure.
Could be a nVIDIA mix bag again.
Just like they renamed some G92 parts to GeForce 9..., they could be using GF9 for both G92b (55nm G92) and the GT200.
Nobody knows...
Here is my guess based on what we know now...
Probably still on 65nm as that is a "safe bet"
240 Shader Processors for 24 SP per cluster is interesting.
120 texture filtering and texture addressing units is normal - still the same ratio as before.
512Bit is possible if die size is at least 400mm2, and with 1 Billion to maybe 1.3 Billion trannies, this die size is possible on the 65nm node.
GDDR3 is a bit disappointing; maybe that is just the initial release, and the memory controller supports GDDR4 and GDDR5 for a later refresh if those memories become available at better pricing.
This is very impressive - it's basically a 9800 GX2 on a single die. It will likely be in the same pricing ballpark, 500-600 USD, debuting at what the 8800 GTX did, given its monster die size.
If it comes out early summer like June or July, it probably will be 65nm, as Nvidia hasn't tested 55nm at this point yet. But I presume this could be as late as November too; optimally it would be late Q3 / early Q4, in the September-October timeframe. If it is Oct/Nov, it could possibly be 55nm, as G92b would have been out for a while by then.
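A rough die-size sanity check for those numbers, using G92's 65nm density as a yardstick (the ~324mm2 figure for G92 is the commonly quoted one, not an official number, so treat this as a ballpark estimate only):

```python
# Rough die-size estimate at 65nm, scaling from G92's transistor density.
# Assumed reference: G92 ~754M transistors in ~324 mm^2 (commonly quoted figure).
g92_density = 754 / 324  # ~2.33 M transistors per mm^2

for transistors_m in (1000, 1200, 1300):
    area = transistors_m / g92_density
    print(transistors_m, "M ->", round(area), "mm^2")
```

1.3B transistors at roughly G92 density lands around 559 mm^2 - consistent with the "under 600mm2" figure mentioned earlier in the thread, and another reason a 65nm GT200 would be such a huge chip.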
Haha, I won't bet 1000 USD, but Nvidia has never introduced something with major changes on a completely new and untested process.
The 6800 Ultra was on the mature 130nm process, tested by the GeForce 5800/5900 series.
The 7800 GTX was on the mature 110nm process, tested on the 6600 series.
The 7900 series was not critical on 90nm, as it was only a shrink of an existing core with minor tweaks.
The 8800 GTX was on the mature 90nm process, pioneered by the 7900 series.
G92 was a shrink of an existing core with few changes, that were already tested before on other products.
If GT200 is late enough, then it could indeed be 55nm.
The bandwidth provided by GDDR3-2000 to GDDR3-2200 and 512BIT bus is more than enough.
GDDR4 + nVIDIA = never.
GDDR5 = yes, but it would only make the card more expensive.
Maybe when they move to 55nm.
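For anyone wanting to check that bandwidth claim: peak memory bandwidth is just the bus width (in bytes) times the effective data rate, so the rumored 512-bit bus works out like this (the 8800 GTX line is only for comparison):

```python
def bandwidth_gbs(bus_bits: int, data_rate_gtps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) * (effective GT/s)."""
    return bus_bits / 8 * data_rate_gtps

print(bandwidth_gbs(512, 2.0))  # GDDR3-2000 @ 512-bit -> 128.0 GB/s
print(bandwidth_gbs(512, 2.2))  # GDDR3-2200 @ 512-bit -> 140.8 GB/s
print(bandwidth_gbs(384, 1.8))  # 8800 GTX (384-bit, 1.8 GT/s) -> 86.4 GB/s
```

So even plain GDDR3 on a 512-bit bus gives roughly 1.5x the 8800 GTX's bandwidth, which is why GDDR4/GDDR5 isn't strictly needed at launch.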
Why won't they use GDDR4?
Apart from the higher price, there's something involving ATi on this [ can't disclose any info sorry ]
GDDR3 is more than adequate in this case.
The only parts out in the wild are the developer cards. The odds of a developer leaking real bench numbers are about the same as the RIAA suddenly gaining common sense... No developer in their right (or wrong, for that matter) mind would kill their relationship with a company that supplies them with hardware to program their games on, just to be the first to give numbers on a part. ;)
I'd be outright shocked though if it wasn't double the performance of a single 8800 GTX.
A 65nm die is more logical to me; it's what NVIDIA has done in the past when launching a new series: first the new gen on the same process as the previous gen, then a refresh some half year+ later. The 65nm process is going smoothly now, and there's no need to risk jumping directly to 55nm, which could cause problems with getting high enough volumes. The last thing NVIDIA wants with GT-200, after all this stagnation with G92 etc., is low supply for the next-gen series when people are eager to upgrade to something faster. They need to have enough cards shipped this time around, or retailers & customers will get very disappointed and a lot would simply choose the HD 4xxx series instead.
Myself, I will probably wait for the refresh, like a 9900GT 55nm or whatever there's gonna be, for hopefully a much more affordable price and a nice performance/price ratio - provided I CAN RESIST GT-200 for that long. :p:
Well, as the launch date gets closer and closer, real benchmark numbers often leak :) As you remember, about a month before the G80 launch, some 3DMark 06 and game numbers leaked. Only a few, not many, but still. I hope in the next two or three weeks we could get some performance info on GT200 confirmed with real-world numbers. I believe we will be as surprised as we were 1.5 years ago with G80's performance :) ;)
http://www.ilovebryanmurphy.com/wp-c...spaGT200_1.jpg
gt200 pic!!
it's a vespa!
http://www.menintools.com/images/gt200carbond.JPG
twister hammerhead gt200 !
gocart!
http://www.electrikmotion.com/04gt2002reva.jpg
ooh, an electric scooter - very environmentally friendly - also a gt200
yes there is a theme :)
http://tw.commerce.com.tw/datas/a/7/...1098173724.jpg
a stainless gate valve :confused: "gt200"
http://www.expreview.com/img/topic/8...2mb_core01.jpg
8800gt:)
You should send those pics to NVIDIA marketing department. :rofl:
i couldn't find any pics of an NVIDIA gt-200 :(
Isn't 9800GT going to be 55nm? Then it would make sense for GT-200 to be 55nm :confused:
Those usually occur when partners get finalized(or close to finalized) hardware, as it's harder to trace those leaks. That phase will be sometime in may, but presently it's still in the prototype phase. ;)
As for the G80, the time from prototype to leaked benches was around a year, this time will be about half that.
Why is nVidia rushing all these new products when most games in 2008 will be console ports, and current hardware should run them just fine? :confused: Milking people for money?
no console will match those games at full HD!
Quote:
what about
Alone in the Dark
Stalker Clear Sky
Far Cry 2
Alan Wake
pc will become again the most realistic platform :D
"Earlier, we revealed that Nvidia is planning to release GeForce 9800 GT based on 55nm G92 core in July and now we learned a little more details on this new card. The new 9800GT card is codenamed G92-280 and it is using a new P393-C00 PCB that supports Hybrid Power. There will be 256MB and 512MB GDDR3 versions that clocked at 900MHz and 975MHz respectively. The core clock has not been determined yet but it should clock higher than the current GeForce 8800 GT which is at 600MHz default. Outputs are 2 x DVI-I and HDTV."
Source; Vr-Zone
why would nvidia still be playing with the G9x? are they trying to perfect something with them that's going to be present in the GT-200?
Nice? I would have used acceptable.
Now G92b will have a "nice" margin. A ~320mm2 65nm chip that had yield problems early on doesn't make it sound too "nice."
It sounds like instead of Nvidia cutting GT200 down to get low-end/midrange cards out of it, they are simply shrinking current GPUs to 55nm and renaming them, again.
obviously. and this is to 'compete' with the ever-lagging ati. will nvidia even need to bring out gt200 to counter the 4870? :shrug:
Quote:
they are simply shrinking current GPUs down to 55nm and renaming them, again
Considering the question people have been asking for the past month or two (GT200 vs 4870), I think it's perfectly reasonable that, if he's been given info, he shares it.
I think most of us know that AMD isn't going to be able to compete in the high-end gpu wise this year, but there's always those who don't seem to realize it asking what people think. :yepp:
He has already shared/shown he knows little to nothing about what AMD/ATi is working on, so I don't understand what you are trying to say.
HD4870x2 should be the question.
No one should be asking about a sub-$300 card vs a $600+ card...
The real question is what magic AMD/ATi was able to work with R700, MCM, shared frame buffer, CF or no CF on a card, etc.
@LordEC911
ATi 96SPs ( aka "marketing" 480 )
Even if they do something totally unexpected (like 24 ROPs / 32 TMUs / 32 TFUs), even then the 4870X2 won't be able to compete with the mid-high-end GT200 part... and depending on the scenario, not even with the 9800GX2.
Like DilTech said, it's highly likely (and clear to most people "in the know") that once again, AMD won't have a real high-end part to compete with nVIDIA in that segment.
I'm not so sure I agree here. RV670 is not a bad product, it's a mid-range chip, but not a bad one. Price/performance is what they are competing with, and considering the market shares are stable or even moving in AMD's favor there, I would have to say that it has delivered what AMD wanted it to.
Not to mention 780G, which is a damn good IGP.
//Andreas
http://www.legitreviews.com/images/r...3850_naked.jpg
this is the rv770 pic, posted elsewhere. what is it? is it fake? a 4850 heatsink???
haha, it's a 3850 - i typed "4870 picture" into google and that's what came up.
where are these gt200s and 48xxs? hidden away? they are nowhere.
LoL, and this is where I call BS on you.
You may know what Nvidia is releasing, but you have no idea what AMD will compete with. You make it sound like AMD NEEDS more ROPs.
HD4870x2 may or may not be able to compete with GT200 but it should be a very interesting product and I have a feeling it might surprise some people.
yep; i'll be surprised to see either one of these :hehe:
Quote:
HD4870x2 may or may not be able to compete with GT200 but it should be a very interesting product and I have a feeling it might surprise some people.