64 ROPs ?
Nope, don't think so.
Looks like somebody gathered various info available on the net and "compiled" a GPU-z screenshot.
Yes, 64 ROPs @ 695MHz would be overkill even for 160GB/s, but maybe this is Nvidia's "fix" for the slow 8xAA resolve ;)
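A quick back-of-the-envelope check of why 64 ROPs at 695 MHz would outrun 160 GB/s (a sketch assuming one 4-byte colour write per ROP per clock and no framebuffer compression, both simplifications; the clock and bandwidth figures are the rumoured ones):

```python
# Can 160 GB/s feed 64 ROPs at 695 MHz?
# Assumption: one 32-bit (4-byte) colour pixel written per ROP per clock,
# no compression and no blending reads - a deliberately naive upper bound.

rops = 64
core_clock_hz = 695e6
bytes_per_pixel = 4
mem_bandwidth_gbs = 160

fillrate_gpix = rops * core_clock_hz / 1e9            # peak GPixels/s
write_traffic_gbs = fillrate_gpix * bytes_per_pixel   # GB/s for colour writes alone

print(f"peak fillrate: {fillrate_gpix:.2f} GPix/s")
print(f"colour write traffic: {write_traffic_gbs:.2f} GB/s")
print("ROP-bound" if write_traffic_gbs <= mem_bandwidth_gbs else "bandwidth-bound")
```

Even colour writes alone (~178 GB/s) would exceed the rumoured 160 GB/s, so under these assumptions the ROPs would sit idle waiting on memory most of the time.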
I may have missed some posts, but wasn't this gonna be 55nm? :shrug:
Nope.
G92b is 55nm.
GT200 is 65nm.
Hmm, but maybe this is just a GPU-Z bug? Do you remember the GF8600 aka G84 screenshots? GPU-Z showed it had 64SP even though it was a real G84, and a few weeks later (in a new version of GPU-Z) it showed the correct value - 32SP. Maybe it's the same problem here, and a newer version of GPU-Z will show 32 ROPs? :)
GT200 is under 600mm2 ( yes, it's big )
It'll be a decent heater for those cold winter nights ( more than likely ).
But it'll be a monster performance-wise.
And a 55nm refresh shall be out in 3 to 6 months ( as usual )
By that time ATI could be playing at 45nm, and that's a whole new story because it's a full new node and not just a half-node of 65nm.
GT200 could be a performance monster, but don't forget it will have to fight the R700, which is 2x RV770. One GPU versus two GPUs.
But yes, it could be a performance monster as a single GPU. No doubt about that.
Man... If these turn out to be quite a bit better than the G92 then I hope eVGA extends the step up to 4 months again or that it comes out before July 6th!
Is what you're saying there 100% confirmed? :) Is there any info about a refresh too? I mean, has your source told you anything about GT200@55nm? Even if GT200 is a performance monster, NVIDIA will have to push this GPU to 55nm as fast as they can to reduce production costs.
BTW, I've read on the Chiphell forum that GT200 is about 2x RV770 performance according to reliable sources :) I hope it's true.
Nothing can be confirmed 100% right now.
And even if it was possible, it would be very hard ( for the person ) to let it go in public.
He didn't mention anything about a 55nm GT200, but logic says that at most 6 months after the initial release, the 55nm refresh should arrive.
at 600mm2 it will use more power and be hotter than a 2900xt
i have the feeling that the single cored 9900gtx will be competitive with the 4870x2
Whether or not that matters depends on pricing. It can go both ways.
If the 9900GTX is $499, it will need to be faster than a price-cut 9800GX2 to justify its price. If it is around $399, it has the ability to catch the 4870 off-guard, but presents a dilemma where it would be slower than the 9800GX2 but significantly faster than a single 9800GTX. At $299, it will try to decisively outclass the 9800GTX in an effort for Nvidia to usher in a new 9900GX2 for the performance crown.
With pricing and final specs up in the air, Nvidia has past performance on its side, helping it to scare engineers at AMD into thinking the GT-200 is an insane chip, which is great for us consumers who will get the best performance at the lowest price :p:
As for ATI pricing, and for a glimpse of what kind of performance they are aiming for, the 4870 should end up faster than the 9800GTX and the 4870 X2 should end up faster than the 9800GX2. That may be wishful thinking, but if ATI wants to survive the next round, that is the bare minimum.
Perkam
The 9900 gtx will be 1gb 512 bit?
Exciting thread!
More tough decisions to make if building a new rig soon...
Do I get a 9600GT/8800GT to last until the 9900GTX/GTS are out, or do I just buy an 8800GTS/9800GTX that will last me a year or more?!
Damn Technology!
the following is an approximation of my thoughts in july 2008:
holy crap, 9900gtx is a beast. it will cost $599, the 4870x2 and 9800gx2 won't touch it, it embodies the change you would expect from a new generation of cards... but it isn't one lol, still a geforce 9. expect nvidia to have the best enthusiast card for another year or so. also, on the features front, hardware physics is nice.
No way, get real. 600mm^2 is not going to happen. The 8800GTX was about as big as it gets (about 480mm^2, I think).
Imagine the power consumption of a 600mm^2 die made at 65nm?!? I do not know if even 2x 8-pin PCI-E power connectors would be enough to power up this card (heck, Nvidia might have to put 3x 8-pin connectors)!
That is why I do not think it would be any more than 500mm^2, even if Nvidia cannot get it made at 55nm yet. Also, it would not make sense for Nvidia to do it at 65nm if 55nm is now already available and useful for saving costs, heat, and power consumption, and more...
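For reference on the connector talk above, the PCI Express spec caps the slot itself at 75 W, a 6-pin auxiliary connector at 75 W, and an 8-pin at 150 W, so the loadouts being floated budget out like this (a trivial sketch, nothing card-specific):

```python
# PCIe power-delivery limits per the PCI Express CEM spec:
# slot = 75 W, 6-pin connector = 75 W, 8-pin connector = 150 W.

SLOT, PIN6, PIN8 = 75, 75, 150

def board_budget(n6=0, n8=0):
    """Maximum board power in watts for a given connector loadout."""
    return SLOT + n6 * PIN6 + n8 * PIN8

print(board_budget(n6=1, n8=1))  # 6-pin + 8-pin: 300 W
print(board_budget(n8=2))        # dual 8-pin:    375 W
print(board_budget(n8=3))        # triple 8-pin:  525 W
```

So even the rumoured 240 W board fits comfortably in a 6+8 pin budget; dual or triple 8-pin would only be needed well past 300 W.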
It's under 600mm2 but over 530mm2 :p:
Well, considering how fast ATI is coming out with these cards (2900 to 3800 to 4800), and how quickly it is trying to take away the lead the 9800GX2 has established (ATI cards are set to come out in May), it would be foolhardy to assume ATI will sit back and watch the GT-200 keep the performance crown JUST because it represents a newer generation of cards.
I have no doubt that the GT-200 WILL take the crown, but expect ATI to respond quickly, even if it isn't the most convincing response, i.e. they will release a card that gives close performance at a lower price, but might not beat it. I am not talking about the 4870 X2 here, which may be short-lived thanks to the GT-200. The 480/32/16 config is cheap and easy for ATI, and will most likely give it a sufficient boost to battle the 9800 series, as ATI likes to play catch-up with Nvidia in an annoying tit-for-tat way.
However, the RV670 and RV770 represent architectures that have a lot in common with each other, and if Nvidia were to make the GT-200 a vastly powerful card (a single card beating the 9800GX2), ATI, while still on the same type of architecture, can probably go 640/32/16 or 480/32/32 or any similar configuration with little difficulty, considering how fast they are ramping up to smaller processes (45nm later this year). The worst-case scenario for ATI would be that the GT-200 is far too powerful far too soon, forcing it to play a pricing game with Nvidia (which it can do, given the similar architecture to RV670), but without the ability to bring out a faster card for at least 5-6 months after release (being stuck with a possibly slower 48x0 until the end of 2008). The worst-case scenario for Nvidia is that it doesn't blow the 4870 at $299 or the 4870 X2 at $499 out of the water (whichever price range it ends up in), which is unlikely, knowing Nvidia.
I would hate to see this being decided on a drivers war X(
Perkam
and if ati keeps making financial losses, they are at a considerable tax advantage... i mean, do they even have to pay tax with losses like they have... seriously? i think it's a tax dodge :para:
& this is not an ati bash/troll/rant, but seriously, how does ati manage to make such vast losses year after year without burning through some sort of capital? they must have heaps of capital tied up somewhere... must be lots of ati dudes being paid, however - good for them, but i reckon declaring large losses is a Machiavellian way of evading bucketloads of tax.
knock knock it's the IRS
I find it strange when people say something like "Upcoming X is definitely going to be faster than Y from the other manufacturer", when no one really knows how X and Y perform. According to the last few rounds, X has been on par with or way better than Y, but looking at the improvement Y has made over the past few months, it might well be that Y performs better than X. And if we look back a few years, X whipped Y quite badly a few times.
Time will tell. No one really knows what X and Y are really like in terms of price/performance/TDP.
I don't get it why would nV stay on 65nm with "GT200" when 55nm seems to work fine for ATi.
Btw, what are these giga tixels in the fake screenshot?
:P
Maybe they get more working chips per wafer using the 65nm process right now, or maybe they are very confident in GT200 and want to take your money first, then shrink to 55nm and take your money again? There are people who are going to buy both the 65nm and 55nm versions anyway. :p Or maybe the chip is so damn complex that it isn't compatible with the 55nm process without some major changes (= huge delays).
I don't know really, someone shall enlighten us!
...someone sucks at using MS Paint it seems. (looking at the screenshot, especially the flaw after the GTixels/s. :D)
Another GPU-Z screenshot of NVIDIA's next-gen GPU.
http://forum.beyond3d.com/showpost.p...&postcount=662
This time there is no GT200 name but D10E, and it has fewer SPs (only 192) with a much slower clock.
I'm still waiting for the "TRUE" next-gen video card......... but every other game now is a damn console port :mad: and the extra power is going to be needed only for Crysis, which I have beaten 3 times already and do not plan on playing anymore.
I'm waiting for space aliens to come in the middle of the night and enlighten me on the meaning of life, but so far it hasn't happened.
http://www.vr-zone.com/articles/GeFo...Slot/5740.html
Another rumour. They say the GT200 architecture might be dual-chip. I hope they don't mean something like the GF9800GX2.
If you need space aliens to tell you the meaning of human life, I imagine you have no idea we have an entire field dedicated to philosophy :p:
I am surprised at the amount of disinformation on the chip. It is the reason I pulled the last GT-200 thread (while I was still a mod :p ).
Hopefully concrete info comes out (as it always does XD) right before ATI's offerings hit the shelves.
Perkam
If you read further in that thread, the VR-zone administrator says he has the schematics for the 9900GTX (or GT200) cooler, and is offering to put them up if ppl ask.
I'm no expert but would that give any clues as to what the pcb design might look like? I'm pretty sure that a dual-pcb card would be obvious straight away. A dual-chip (single pcb) however, might be a little more difficult to make out.
BenchZowner seems to know his stuff, maybe he can shed some light?;)
edit: here's the link to the "news" thread: http://forums.vr-zone.com/showthread...=268638&page=2
Yeah, the "TRUE" one being something like the G70-to-G80 jump? Not gonna happen. They are not going to redesign a whole new architecture every time there is a new series. That takes years. They have plans to release new stuff within a year or so. So they are working on a new architecture, which will be out in a few years. Until then they keep modifying and improving the current one as much as possible, with costs as low as possible.
But yeah, not that anyone really cares whether it is "true" or not, as long as it performs well. ...and once they get that 280 min FPS @ Crysis @ max @ 2560x1600 @ 24x MSAA + 24x AF, they will STILL cry for new toys which need to be AT LEAST 270% faster than the old ones, and HAVE to OC +70%, or otherwise it is "recycled garbage" (I am not referring to anyone here!). :p
..I dont even start talking about consoles and consoleports, other than that they all suck indeed. :p
If that pic is true, NVIDIA will be in serious trouble
http://www.bilgiustam.com/512mb-ati-...2006-testleri/
http://www.bilgiustam.com/ati-hd2900...rk06-testleri/
Fake site, fake benches -.-
But these were leaked too early :) .
SM2.0 Score : 8847
HDR/SM3.0 Score : 10612
CPU Score : 5074
Total : 21381
http://www.guru3d.com/imageview.php?image=11344
They forgot to add the X2 after the 4870 :p:
100% fake.
Do you think the 9900GTX will outperform the 9800GX2 at not-so-high resolutions like 1680x1050 or 1920x1200?
I need a 10900 card.
how many more weeeeeeeeeeeks with no proof?
some dribbles in June and some retail cards in July.
The whole point is to be as confusing as possible. Keep in mind they released a G92 8800GTS. So in that case it went from G80 to G92 but kept the exact same name. Not a stretch at all that 9900GTX would be G200. The fact that it confuses you is only an argument in its favor.
Nvidia has 1.5 million 8600GTs for sale, so there's no sense in announcing a new generation until AMD starts beating Nvidia hard with the 4000 series
very nice. July sure seems kind of quick to be putting out a refresh, or a new card altogether. Let's hope they aren't priced out of orbit; somewhere around 400 bucks would be fine. I almost bought the 9800GTX but decided to wait; all I have been playing lately is Project Reality, CoD4 and rFactor racing. So no need for a new card
The point I am making is the exact opposite...that the name has nothing to do with the chipset. G80 and G92 both were 8x00 series with some cards overlapping in the exact same name (8800GTS), G92 cards are also 9x00 series, thus, there is no correlation between chipset and name anymore....and so it is no leap at all that g200 could be a 9900 series card. It's still going to have the highest number out there so it doesn't have anything to do with downplaying and you cannot make any inference about performance.
I'm just thinking that if these were really gonna hit in June (a month away now) we'd probably have more info... Too bad
Yes, but the G92 was basically a rehash of the G80, not a whole new core like the GT200 is supposed to be. I would expect Nvidia to introduce a new series for the next big thing after the 8800GTX. The 8 series and 9 series are basically the same thing right now, and associating a completely new core with older ones by sticking with the 9xxx name isn't exactly the best marketing strategy IMO.
Anyway, as long as there is a good enough increase in performance over the 8800 Ultra, I couldn't care less about the name. My 7900GT is struggling nowadays...
that's highly unlikely, gt200 won't be an entirely new architecture; that takes years to happen, not just one generation. I mean, ati and nvidia up 'til g80 and the r600 were just souping up their old designs until the pipeline design simply maxed out, and I expect them to do the same this time around. But I think the r600 was designed to be a bit more modular, giving ati a slight advantage, while g80 was more efficient.
Who knows; all I know is that as long as the r700 and gt200 gtx or whatever it will be called perform well, I'll be happy. And whoever is expecting another 100+% increase like that of g70 to g80 should be prepared to wait, as my guess is that won't happen for another 2 years or so, once the r600 and g80 designs are completely replaced.
Some promising rumours from Asia :) Some guy on the PCInlife forum has said that GT200 is the GF9900 series and is a single-core GPU. It has a 512-bit memory bus and 32 ROPs, but an unknown number of SPs as yet. A 550-600W PSU is recommended to run this monster :( It probably won't have DX10.1 support, only DX10.
But the most important thing is its performance, which is supposed to be muuuuuch greater than the GF8800GTX/GF9800GTX and even the GF9800GX2. He has said that GF9900GTX in SLI runs Crysis smoothly at 2560x1600, 4xAA and Very High details!!
A single card should be able to play Crysis at 1920x1200, Very High details and 4xAA!!!!
All I can say is, if it's true, this is a hugely powerful GPU, maybe an even bigger jump than G80 was when it was released 1.5 years ago.
PS. I hope this info is true. :)
PS2. This info is from "Phk", who is a reliable source (at least he was right about the G80 and G9x specs).
I think Nvidia would be stupid if they really named the G100 as the GF9900 series.
if so, the 4870 ought to be significantly more powerful than the third Radeon generation, and the results are not incredible
And I don't trust PHK either.
He's blatantly known for twisting G80 vs R600 results, sucking up to mods, and more BS.
Aka don't trust fanboys. ;)
what is the 4870 anyway? no changes in arch apart from the bridge that would share mem?
I wouldn't count on shared RAM; it seems quite probable though that RV770 has an internal PCIe "bridge" that bifurcates lanes, and has the connector onboard.
I'm counting on a lot more SPs since that's the way the original Radeon R600 architecture should have gone. The shaders are extremely small and you can pack quite a number without increasing much size.
Still, this is the GT200 thread so I wouldn't really talk too much. :cool:
What are you talking about? Did you read his posts about 1.5 years ago, about a month before G80 was released? He gave some 3DMark results for G80 which were unbelievable to many at first, but they were true.
The same goes for the rumours about G92. He said first that G92 WON'T be a high-end GPU but "only" performance-mainstream, and that was true too.
He said a few months ago that the GF9800GTX would be based on G92, and he was right. The same goes for G94 aka GF9600GT.
So I don't understand why you think what he is saying is BS.
I don't say he is the most reliable person in the world and that what he says is always 100% true, but based on what he said in the past, which turned out true, I don't have any reason not to believe him at all. Maybe some details will be different when GT200 is released, I don't know, but there is a possibility that what he has said is true.
PS. Once again about GT200 performance. When I wrote that GF9900GTX SLI runs Crysis at 2560x1600, VH details and 4xAA smoothly, I thought there would be about 40-50 fps at least, but it seems PHK was talking about 25-30fps, which means it will only be playable at this resolution with those cards. Too bad, but it still seems to be very powerful.
Everyone can repeat truths. G92 and G94 were squarely predictable (the duh kind). G80 was available for developers before so it's fairly easy to get an approximation.
When everyone is aiming at that ballpark in performance, everyone's basically gotta be right. :ROTF: If we go by classic nVidia, GT200 should not be 9900. 9900 should be G92b (55nm) "refreshes" to bump up clocks and all. G92b is a very expandable asset, if RV770 comes too fast then they'll release GT200 and probably steer away the hype. :)
It does kinda pester me that Crysis 2560 4x needs 2 cards for just 30fps, given that I was personally giving 1 GT200 the expectation of 1920 4x. (It's one GT200 or 2 RV770, no hotter than that for me)
Well, any proof of that?
You say everyone can repeat the truth. So maybe that is what he is doing now with GT200?
You say G80 was available to developers. OK, but GT200 could be available too. It's about 2 months until launch, so don't you think most developers have it already? :)
Barys... you're taking this too seriously man.
Relax, knowing or not means nothing, until you can lay your hands on the product yourself.
OK, I know what you are talking about, but I just want to say you can't tell whether this or that info is 100% true or not. :) I don't say PHK's info is 100%, but he is a reliable source (as I said, see the G80 & G92 threads on his forum). You have given some interesting info too, and some of it matches PHK's.
It is all still just the rumour mill anyway; as you've said, we won't see any official benchmarks. :)
GeForce 9900 GTX (GT200) specs leaked :
http://forums.vr-zone.com/showthread.php?t=271801
Well, if we take a look at the specs, we see they aren't impressive at all :( Moreover, they are much different from BenchZowner's specs.
If these are true, I wonder how this card in SLI could run Crysis smoothly at 2560x1600 with VH details and 4xAA enabled? Do specs about 50% better give a 100% or more performance increase? It's not possible IMHO if GT200 is nothing more than a G80/G92 with a 512-bit MC, 32 ROPs and more SPs. To get such a big performance jump, GT200 would have to have some major architectural changes over its predecessors.
How are the specs only 50% better?
Double the ROPs, double the memory bus width, and +50% SPs... Also, considering the negligible performance delta between the 112SP and 128SP G92s, it's safe to assume the SPs weren't the source of the G92's bottleneck. So perhaps a near doubling of performance isn't too far-fetched?
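As a quick sanity check of the "only 50% better" claim, here is the ratio math the post above implies. The G92 figures are the shipping 9800 GTX specs; the GT200 column uses the rumoured numbers under discussion (32 ROPs, 512-bit bus, 192 SPs), which are unconfirmed:

```python
# Compare rumoured GT200 specs against a known G92 (9800 GTX) baseline.
g92   = {"ROPs": 16, "bus_bits": 256, "SPs": 128}  # real G92 figures
gt200 = {"ROPs": 32, "bus_bits": 512, "SPs": 192}  # rumoured, unconfirmed

for key in g92:
    ratio = gt200[key] / g92[key]
    print(f"{key}: {ratio:.2f}x")
```

That is 2.00x on ROPs and bus width but only 1.50x on SPs, which is why the two posts can both be right: "50% better" describes the shader count, while the parts of the chip that were actually the bottleneck double.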
OK, but you're comparing G92 to GT200. If you put G80's specs against these rumoured GT200 specs, there is no big difference. Then again, NVIDIA could make some major architectural improvements (a shader performance increase like NV40 --> G7x etc.), and that could bring such big performance improvements. :)
240W !??!?!
Think I will just buy a ATI card (if they perform) or wait for the die shrink.
And wouldn't GDDR3 be kind of surprising?
I don't really trust this...
the card is literally faster than we can imagine if that Crysis bench is correct.
I'm guessing it's fake though, unfortunately.
So it would be the same size as G92? Doesn't seem likely when we look at the figures for ROPs and SPs.
Quote: Originally Posted on VR-Zone
330-350mm2 die size
Frank M,
VR-Zone doesn't claim it's MCM; in fact, they say the exact opposite.
regards
Quote:
Updated specs
65nm process
512-bit MC
1GB GDDR3
240 SP
32 ROP
6+8 pin
Seems like GT200 got two versions. One is 65nm and the other is 55nm.
Probably 65nm is for first wave of GT200 cards and 55nm is for later batches.
http://we.pcinlife.com/thread-929091-1-1.html
http://forums.vr-zone.com/showpost.p...3&postcount=33