Source - Fudzilla
Quote:
Originally Posted by Fudzilla
PS. Guys, don't shoot the messenger.
Nvidia is going dual-crazy, I tell you...
:shakes: How many times will they release the same bloody thing? It seems we get a new G92 card every month now.
I'm guessing it will be called something like 9850GX2
AMD might not have the performance crown, but stunts like this from Nvidia make AMD's cards much more attractive.
Before Q3? So that's like a few weeks?
I think it would be REALLY stupid to buy this card instead of waiting for GT200. This is just another G92-like card; GT200 is going to be genuinely different.
Just to clarify, the chip FUD calls D10U-30 here is the same chip VR-Zone and Expreview are calling GT200.
There is some ambiguity among industry sources at the moment.
//Andreas
Looks like Nvidia's tactic of releasing so many cards and code names that they confuse people enough to hide the real next-gen card seems to be working (though I wonder by now if that even exists).
Smoke and mirrors...
Whats this, the 9801GX2?
Is this the same naming scheme that gave us 4 different 8800GTS cards, then a 9800GTX that was an overclocked version of the above? :-p
What Nvidia is doing is called market saturation. With so many cards in so many price-points they give the consumers no choice other than to consider an Nvidia product when looking for a new GPU. While you guys may not like it, from a business perspective they are in a position that most other companies only dream of.
How so?
I'm not gonna to take notice until proper info is out instead of relying on FUD.
But if this is true and this is a dual card... it is essentially going to be 2x 9800GTX, something the GX2 should really be already. This does smell a lot like the 7 series, so they'd better bloody have a good card in development all this time for the 10,000 series or equivalent.
There is a market gap for them to fit a dual 9800GTX card into, at least in UK markets. The 8800GTX/Ultra sat in the top £400 to low/mid £500 range, and with the 9800GX2 at £390ish, it's quite probable this article will prove true.
If you don't know already, the 9800GX2 is 2x 9800GTX; look up the revisions on the chips. The 9800GX2 actually has higher-revision GPUs on it.
So I bet there will be a 9850GTX and a 9850GX2, then the GT200 will be the 9900GTX and 9900GX2, and I bet they will make a revision of the GT200 too, so a 9950GTX and 9950GX2.
OMG 999 = 666, we'll gonna die :rofl: :ROTF: :rofl:
Let's count:
AMD has the 3850, 3870 and 3870X2; that's 4 in total (the 3850 comes with 256 or 512 MB).
That's 4 cards from 1 chip.
Nvidia has the 8800 GTS 320, 512, 640, 8800 GS (256, 512, 1024), 8800 GT (256, 512, 1024), 8800 GTX, 8800 Ultra, 9600, 9800 GTX, 9800GX2.
That's 14 cards from 3 chips (G80, G92, G94? (whatever the 9600 is)).
I've excluded the 2400-3600 and 7100-8600 to keep the list reasonable.
It might be 3 chips from Nvidia, but the revisions of the chips really make the card. G80 is mainly the 8800GTX/8800Ultra nowadays; G92 is the 8800GTS 512MB, 8800GT, 9800GX2, 9800GTX and 9800GT. The 8800GS and 9600GT I'm not sure about.
But I believe the revision scale for Nvidia goes like this:
8800GT -> 8800GTS -> 9800GTX -> 9800GX2 -> 9800GT
8800GTX -> 8800Ultra
8800GS -> 9600GT or whatever; like I said, I'm not sure what the 88GS or the 96GT is.
The 9600 GT is based on the G94 chip, but I don't know what the 8800 GS is based on.
I wouldn't really mind all these cards coming out if Nvidia were more like Intel and told us what was coming out and when, so at least we could plan. All this "maybe it's coming out... maybe it's not" is just crap.
Huh?
8800GTS G80 320MB & 640MB <- both are discontinued
8800GTS 512MB
8800GS 384MB
8800GT 512MB
8800GTX 768MB
8800Ultra 768MB <- discontinued
9600GT 512MB
9800GTX 512MB
9800GX2 2×512MB
There's only 1 type of reference card for the 8800GS and GT, 384MB and 512MB respectively. So there are really only 7 different cards, not 14.
G92-150
2900GT, 2900 Pro, 2900XT 512MB & 2900XT 1024MB are also still available for sale but are all discontinued.
Right now they continue to manufacture the following (though some may not be released yet), excluding the 8400, 8500, 8600, 9500, 2400, 2600, 34xx & 3650.
Nvidia
8800GS (renaming it to 9600GSO apparently)
9600GT
8800GT
8800GTS (512MB)
9800GTS
9800GTX
9800GX2
Lowest Bus Width: 192bit
AMD
3690
3830
3850
3870
3850X2
3870X2
Lowest Bus Width: 128bit
AMD also has the RV670 A12 revision chips, which may bring a PE card. So it's 7 for nVidia & 6 for AMD. If you want to cover VRAM then you have to consider GDDR4 as a separate card from the GDDR3 versions.
Yeah, but the 8800GTX and 8800GTS 512MB are also essentially unnecessary now that the 9800GTX is here, and you can say the same of the 8800GS because of the 9600GT.
The only relevant Nvidia cards in today's market are:
8800gt 512MB
9600GT 512MB
9800GTX 512MB
9800GX2 2×512MB
So? Is the GT200 the next big step (like the 8800GTX was) and what we've all been waiting for?
Sorry for being lazy and not having researched this yet, but now I'm a little bit confused by all these code names. :confused2
Or... it could be that FUD is totally and utterly confused, not understanding that the D10U-30 is actually the GT200, and the memory speculation is just that, speculation. I really don't see Nvidia releasing another array of cards on top of the 9800s and then releasing the new GT200 toward the end of the year. That's 3 generations in 1 year, or reworked gens I should say: G92 -> G92 rework -> G92 rework -> GT200 in 1 year. I highly doubt this FUD speculation. Who knows, it may happen, but it just doesn't seem likely at all.
Maybe the GDDR3 cards they are talking about are a new mid-to-low range part; I don't know, we will see I guess.
nvidia = fail
Before the 9800 GX2 launch, Nvidia announced a better, improved version of that dual monster... it will use the same construction but with small 55nm G92 chips and only one power connector... maybe this is what Fudzilla is talking about.
Source [H]ardOCP
Our NVIDIA GPU EOL Rumors post got some people talking yesterday and asking questions, so we wanted to help out with some more specific answers.
We mentioned that the GeForce 8800 GT will be renamed the “9800 GT” gaining some Hybrid SLI features. Along with that, you will see the GeForce 9800 GT represent a die shrink to 55nm as well. It seems as though the 55nm changes will be running and will possibly stretch across both model numbers. Yes, smaller, cooler GT cards which is certainly a good thing. It is also being rumored that the 9800 GTS will be a 55nm part as well.
9900 GTX and 9900 GTS are rumored to be brand new parts, not re-spins of current GPUs, but this is far from being confirmed through multiple sources. I am expecting to see these parts return to $450+ levels depending on what we see AMD do with R700.
It is still just a bit too early for solid details but we will of course pass along what we can when we can.
OK, so there's a possibility we see the usual die-shrunk midrange cards just before the new series is launched. So the 9900 series would be GT200, I take it. Hopefully they'll be at least as good as the 7900 series was compared to the 7800. But I hope NVIDIA moves along after this series and brings something new instead of depending on the G80 arch. The GPU market is as stagnant right now as it has ever been.
OMFG!! So fail, this is completely retarded. Do they expect us to run out and buy a new card (that is ironically called the same thing) every 2 months for their stupid die shrinks? Now, if we could send our "original" 8800GTs back to Nvidia for a new core, this would sound slightly comprehensible, but we can't; I can't even fathom that they are actually doing this. The messed-up thing is that every time, I want to buy one too, but I'm not going to. I'm not buying another card until it's a properly new architecture.
nVidia is in a unique position... not many companies are even capable of doing this because of stiff competition. nV doesn't really have much coming from ATi at the high end. From a business point of view, I'm not upset with nV. But the XSer in me definitely wants noticeably more powerful cards available rather than just slight upgrades.
If you're gonna count all the discontinued and G80 cards, you may as well count ATI's 2900 series.
Since no one else corrected you, I guess I will...
Those numbers are NOT different revisions/respins but are simply different bins.
While they might signify a "better" core it doesn't mean it is the overall "best" core. It could be a specific bin that has the lowest temps with the stock volts and stock clocks, or it could be something completely different.
I think the 3dfx virus is finally taking over nVidia.
Well, it's natural for us to expect enormous improvements after 16 months, and it's natural too for nv to squeeze the current architecture to its max.
Don’t forget that this is business and every new core design implies big investments – investments that every company wishes to avoid if possible.
Haha, NVIDIA is so gunning for the twelve-year-olds.
"But mom!! Jimmy has a 9800 GT and I've just got an 8800 GT. His plays so much faster than mine!"
Honestly, I hope and pray that Larrabee comes out and kicks major NV butt. The last product I bought from them was a 6800 GT. A few R700s will probably make their way to my home too.
http://farm2.static.flickr.com/1143/...cc0a5f61_o.gif
Assuming that this rumor is true..
I'd like to place a bet on my own guess for the fun of it:
1. a) A 384-bit memory bus with 12 memory chips will be used once again. It took Nvidia 4 years to design this architecture, so Nvidia probably would want to use it again. Like a current Quadro card, it could come in 768MB and 1.5GB "flavors". That would make it sensible to stick with cheaper GDDR3 memory rated at up to 2.4GHz.
OR: b) less likely, but it could be that Nvidia has already designed a 512-bit memory bus. Such a massive and complicated PCB as the 9800GTX's could be Nvidia's "attempt" at designing one that actually enables a full 512-bit memory interface. This would be even better (and more honorable) of Nvidia, even though cheap GDDR3 memory is still being used (although up to 2GB)!
2. Remember the G90? No, the G90 never came out--instead, the more "mainstream" G92 came out, followed by the G94. Methinks a G90 is being re-done on 55nm instead of 65nm, with a few improvements (hopefully full DX10.1 and SM 4.1 support). Heck, there's a weird new name for it: D10U-30, instead of D9E on 65nm. Nvidia wants a cheaper way to make $600 video cards ASAP to replace those expensive GX2 boards. SPECS: no fewer than 24 ROPs, like the 8800GTX. At least a 725MHz core thanks to 55nm. All other specs increased by 50%--shaders, TMUs, etc.--so that there are now 96 bilinear texels per clock, 48 bilinear FP16 texels per clock, and 192 shaders (stream processors). That should be enough to break the 1 TERAFLOPS record for a single chip. Of course, it should be overclockable to at least 800-850MHz on average. The jump from the 9800GTX to this D10U-30 should be comparable to the jump from the 6800 Ultra to the 7800GTX. And the die size will be over 400mm^2--slightly larger than a 9800GTX, but still a tiny bit smaller than an 8800GTX. Power consumption is somewhat higher than the 8800GTX's, though--but factoring in improved idle consumption, it is no worse than an 8800 Ultra overall. I would be surprised if Nvidia *EVER* made a chip bigger than the 8800GTX.
If ALL of the above is correct (of course either a or b for the first part), would I win something? Let's pool in a bet, like $1 each?
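For fun, the "1 TFLOPS with 192 shaders" part of my bet can be sanity-checked with the usual peak-throughput formula for G80-class parts (SPs x shader clock x FLOPs per clock, assuming the commonly cited 3 FLOPs per SP per clock for MADD + MUL). This is just my own back-of-envelope sketch, not anything from the article:

```python
# Hedged sanity check of the speculated "1 TFLOPS" figure above.
# Assumption: G80-class shaders retire 3 FLOPs per clock (MADD + MUL).

def shader_gflops(sp_count, shader_clock_ghz, flops_per_clock=3):
    """Peak shader throughput in GFLOPS."""
    return sp_count * shader_clock_ghz * flops_per_clock

# Known reference point: 8800 GTX = 128 SPs @ 1.35 GHz shader clock
g80 = shader_gflops(128, 1.35)   # ~518 GFLOPS

# Speculated 192-SP part: what shader clock is needed to break 1 TFLOPS?
needed_clock_ghz = 1000 / (192 * 3)   # ~1.74 GHz

print(f"8800 GTX peak: {g80:.0f} GFLOPS")
print(f"192 SPs need ~{needed_clock_ghz:.2f} GHz shader clock for 1 TFLOPS")
```

So 192 SPs would need roughly a 1.74 GHz shader domain to hit 1 TFLOPS, which is above the G92's stock shader clocks but not wildly so.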
"D10U-20 also in the works
Q2 release
Can it be that Nvidia will bring a real next generation product just a quarter after it released its 9800 series? Well, we don’t have the answer to that particular question, but as we reported here, Nvidia is working on a new GPU codenamed D10U-30. The D10U-30 will feature 1,024MB of GDDR3 memory and we learned that there will be one more SKU below it.
The second one is codenamed D10U-20 and it will have 896MB of memory, again of the GDDR3 flavor. This new card indicates that Nvidia can play with the memory configuration and that the new chip might support more than the regular 256-bit memory interface.
This one might support 384-bit or some other memory configurations, but we still don’t have enough details about it. It looks like Nvidia doesn’t feel that going for GDDR4 is necessary and it looks like the company will rather jump directly from GDDR3 to GDDR5."
http://www.fudzilla.com/index.php?op...=6702&Itemid=1
It sounds like a repeat of the 8800GTS/GTX G80 launches, but with 16 memory chips this time around. I am thinking that 16 memory chips will be used in a full 512-bit configuration, while 14 chips will allow for 448-bit (because 14 chips at 64MB each add up to 896MB). It certainly looks like a PCB design similar to the 9800GTX's current one will be used for the D10U chips--otherwise, I do not see why Nvidia went to all the trouble of re-designing such a complex PCB after a quite successful G92 GTS that could already overclock to 800MHz (aside from adding Tri-SLI capability, the PCB looks far more complicated than an 8800 Ultra's!)... Methinks this 9800GTX PCB will carry 16 memory chips (still only 64MB each) on the front and back, plus a slightly bigger chip. It took Nvidia 4 years to develop the G80 architecture, which scales memory bandwidth with the number of chips used, and it makes sense that Nvidia wants to keep taking advantage of that design.
Hence, no need for more expensive GDDR4 memory or denser GDDR3 chips. Nvidia saves money in this area, at least. I can see why Nvidia wanted to dumb down the 9800GTX in nearly every aspect possible: to save all the thunder for Nvidia's next surprise, due out in a very short time. Nvidia knows that the RV770 will not be a half-baked chip this time around.
9900GTS and 9900GTX, perhaps?
What? More than 256-bit? Surely that's a pipedream :p:
But it will have to compete with RV770s I suppose, as you say.
The only other way this 896MB of memory could make sense is if Nvidia used denser 128MB chips, but populated only 7 out of 8. It could still use 448-bit bandwidth. And the 1GB version would use 512-bit bandwidth with only 8 memory chips, like the Radeon HD2900XT (which would explain the 9800GTX's incredibly complex PCB). Remember, there was a 256-bit version of the HD2900 that used the same PCB as the HD2900XT, so the 9800GTX could likewise be using the same PCB design that will be used for the D10U chips. If the codename is D10, perhaps Nvidia will actually call it the Geforce 10000 Ultra or something like that.
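The memory-configuration guesses above are easy to check arithmetically. A minimal sketch, under the common assumption that each GDDR3 chip presents a 32-bit interface (so bus width scales with the number of chips driven); note that under that assumption the "7 dense chips" interpretation only reaches 224-bit, not 448-bit, unless the chips had 64-bit interfaces:

```python
# Bus width and capacity for the rumored D10U memory configurations.
# Assumption: one 32-bit channel per GDDR3 chip (typical for x32 parts).
CHIP_BUS_BITS = 32

def config(chips, mb_per_chip):
    """Total capacity and aggregate bus width for a given chip count."""
    return {"capacity_mb": chips * mb_per_chip,
            "bus_bits": chips * CHIP_BUS_BITS}

# Interpretation 1: 64MB chips, 16 vs 14 populated
print(config(16, 64))   # {'capacity_mb': 1024, 'bus_bits': 512}
print(config(14, 64))   # {'capacity_mb': 896, 'bus_bits': 448}

# Interpretation 2: denser 128MB chips, 8 vs 7 populated
print(config(8, 128))   # {'capacity_mb': 1024, 'bus_bits': 256}
print(config(7, 128))   # {'capacity_mb': 896, 'bus_bits': 224}
```

Only interpretation 1 gives both 896MB and 448-bit at the same time, which makes the 14-of-16-chip reading the more self-consistent one.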
I wouldn't be surprised if Nvidia did that just to up its stock value (that fell over 40% within the past few months---OUCH).
Hmm, sounds like it's time to get rid of the 9800GX2s.
If it has a huge amount of RAM, it's gonna suck for people on 32-bit OSes; less and less of that 4 gigs of system RAM will be usable.
Uhhhhh... I do not think so. Not even Crysis uses more than 2GB of system ram. In DX10 mode at 1600x1200 w/ 4xFSAA, Crysis uses only 1GB, not even 1.5GB. Having 4GB of ram just helps with map loading times and the swap buffers so that when exiting the game back into windows it can quickly resume normal operation. I do not think a 32-bit OS would be a problem for video cards with up to 2GB of video RAM.
True, the game itself normally tops out at 1.3GB recorded, but you've got to remember how much memory Vista uses; it's normally around 30%-40% on any setup. Then there are the background apps the user is running. So I wouldn't say 4GB is useless at all; it's very much a benefit.
Unless I'm mistaken (off-topic entirely), the video RAM is only addressed by the GPU's memory controller anyway, most of which are 256 bits wide, so 2GB of addressing should be irrelevant to the OS you are using.
Or did I miss a critical step somewhere?
right... Sorrow13 said it nicely!
@LordEC911, I could be wrong, but I do not see how the GT200 could be any bigger than 500mm^2 (or less than 450mm^2 if made on 55nm), and it is quite possible for Nvidia to fit that chip onto whatever PCB design.
A rumor was quoted: "Nvidia will have a hard time fitting that chip onto some PCB." Now that I think about it, you are probably right that it will be a different PCB design.
Sorry, while that may sound logical, you are wrong.
Windows does take the VRAM into account.
Unfortunately I only know of a test in my language, but it showed that with cards with bigger buffers, less memory is available under 32-bit Windows.
I'll tell you my experience with 4 gigs of ram installed on a 32 bit OS...
After installing my 2 9800 GX2s, I had 1.7 gigs of available system memory
1.7!!!
Needless to say, I am now running x64
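The 1.7GB figure above is roughly what the address-space arithmetic predicts. A rough sketch (the aperture and MMIO sizes below are my own illustrative assumptions, not measured values; real reservations vary by BIOS and driver): 32-bit Windows has a 4GB physical address space, and every card's VRAM aperture plus other device MMIO must be mapped into it, crowding out RAM.

```python
# Why big-VRAM cards shrink usable RAM on 32-bit Windows (illustrative).
# Assumption: each GPU's VRAM is mapped as an aperture in the same 4GB
# physical address space that system RAM competes for.
ADDRESS_SPACE_MB = 4096
installed_ram_mb = 4096

# Hypothetical reservations: two 9800 GX2s = 4 GPUs x 512MB apertures,
# plus ~350MB for chipset/PCI devices (both figures are assumptions).
vram_apertures_mb = 4 * 512
other_mmio_mb = 350

usable_mb = min(installed_ram_mb,
                ADDRESS_SPACE_MB - vram_apertures_mb - other_mmio_mb)
print(f"Usable RAM: ~{usable_mb} MB")   # ~1698 MB
```

With those assumed numbers you land at about 1.7GB usable, matching the quad-GPU anecdote above; a 64-bit OS sidesteps the whole problem because the apertures no longer collide with RAM below 4GB.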
Nvidia First 55nm Desktop Graphics; GeForce 9800 GT
http://www.vr-zone.com/articles/Nvid...0_GT/5714.html
Quote:
VR-Zone first revealed Nvidia's plan to shift from 65nm to 55nm to lower costs and we have also told you a series of 55nm based mobile graphics GPUs based on G94b and G96b. Now we learned that the first desktop graphics card to be based on 55nm G92b core will be GeForce 9800 GT and it will be launched in July along with GeForce 9900 series. Apparently, GeForce 9800 GTS will be OEM only, not for channel.
More interesting GPUs.
Well, my SLI setup with 1GB of video RAM total is doing just fine with 2GB of memory in WinXP 32-bit; I did not notice any difference when I added a second card for SLI. The only game that came dangerously close to the 2GB limit was The Witcher, but that was fixed with patch 1.2, which cut approximately 500MB of memory usage. Guys, Crysis only uses up to 1.3GB of system memory at the very most. Setting the virtual memory to a fixed 3GB buffer eliminated all my stability problems with such resource hogs (The Witcher, Oblivion with mods, etc.) when gaming in 32-bit.
(Of course, Vista needs a 4GB sucker, whether 32-bit or 64-bit)
More rumors
Quote:
GT200-30 is taped out
And works
Looks that GT200 is also going to be ready for late Q2 launch. We only have some limited data but the chip that we call GT200 is taped out and the prototypes of the card are already up and running.
The device is known as the Nvidia GT200-300, and this naming scheme might imply that it is what we've seen listed as D10U-30.
We can only speculate whether this is 55nm or 65nm, or whether this card has one or two chips, but we can confirm that this configuration has 1024MB of memory.
The chip however is called GT200-300. The shocking part is that both R700 and new GT200 might be launched at roughly the same time, or very close to each other.
http://www.fudzilla.com/index.php?op...=6786&Itemid=1
Quote:
Nvidia confirms GT-200 with 1 billion transistors
Originally, at a press meeting in Munich on 10 April 2008, Nvidia only wanted to talk about new initiatives around its Quadro graphics cards. Rather casually, however, it also confirmed some data about the next GPU generation, which is to follow the G92 chips.
Jeff Brown stated openly, without being asked, that the next architecture is actually called "GT-200". The GPU will consist of approximately one billion transistors, and is "pure logic, no memory as in CPUs," said the Nvidia manager. This enormous amount of circuitry matches previous rumors, according to which the GT-200 will carry about 200 shader units. Previous G80 and G92 GPUs offer a maximum of 128 of these computing engines.
Furthermore, Brown said that Nvidia will go into more detail on the GT-200 in May 2008. Whether that means an official architecture presentation or briefings under confidentiality agreements with selected press representatives, including an embargo period, Brown left open.
http://translate.google.com/translat...hl=en&ie=UTF-8
regards
Quote:
GT200, also known as G100 or GeForce Next, is NVIDIA's next-generation flagship graphics core. The main specifications are as follows:
Process: 65nm
Transistor count: about 1.5 billion
Core area: about 600mm2
Core frequency: 550-650 MHz
Stream processors: 240
Stream processor frequency: 1.5 GHz
Texture units: 80
Raster operation processors (ROPs): 32
Memory interface: 512-bit
Memory type: GDDR3
Memory clock: 1.0-1.1 GHz
Maximum thermal design power (TDP): above 200W
Performance: 100% increase over G80
http://bbs.chiphell.com/viewthread.p...extra=page%3D1
http://www.google.com/translate?u=ht...=pt-PT&ie=UTF8
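The leaked table above is internally consistent, which you can verify with the same peak-throughput arithmetic used for G80-class parts. A sketch, assuming 3 FLOPs per SP per clock (MADD + MUL, as commonly cited for G80) and double-data-rate GDDR3:

```python
# Sanity check of the leaked GT200 spec table.
# Assumptions: 3 FLOPs/SP/clock (MADD + MUL), GDDR3 is double data rate.
sps, shader_ghz = 240, 1.5
gt200_gflops = sps * shader_ghz * 3           # 1080 GFLOPS

bus_bits, mem_ghz = 512, 1.1
bandwidth_gbs = bus_bits / 8 * mem_ghz * 2    # 140.8 GB/s

# Reference point: 8800 GTX = 128 SPs @ 1.35 GHz shader clock
g80_gflops = 128 * 1.35 * 3                   # ~518 GFLOPS

print(f"GT200: {gt200_gflops:.0f} GFLOPS, {bandwidth_gbs:.1f} GB/s")
print(f"Shader gain over G80: {gt200_gflops / g80_gflops - 1:.0%}")
```

The shader-throughput gain works out to roughly +108% over the 8800 GTX, which lines up with the table's "100% increase over G80" claim (at least on paper).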
Umm yes please! Please let it be true! xD
240 SPs sounds a bit much IMO,
as well as a 200W+ TDP,
and a 100% perf boost over G80 (well, perhaps at 1920x1200+ or with AA & AF it would have such an advantage, or close anyway).
But the rest, I dunno; it sounds highly plausible and what many people seem to be expecting. At least it would be able to run Crysis properly at last, lol. :rofl: