Page 3 of 3 FirstFirst 123
Results 51 to 73 of 73

Thread: Next Nvidia card is D10U-30 comes before GT200

  1. #51
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
    Assuming that this rumor is true..

    I'd like to place a bet on my own guess for the fun of it:

    1. a) A 384-bit bus with 12 memory chips will be used once again. It took Nvidia 4 years to design this architecture, so Nvidia would probably want to use it again. Like a current Quadro card, it could come in 768MB and 1.5GB "flavors". That would make it sensible to stick with cheaper GDDR3 memory rated at up to 2.4GHz.
    OR: b) Less likely, but it is possible that Nvidia has already designed a 512-bit bus. Such a massive and complicated PCB as the 9800GTX's could be Nvidia's "attempt" at a design that actually enables a full 512-bit bus. This would be even better (and more honorable) of Nvidia, even though cheap GDDR3 memory is still being used (although up to 2GB)!

    2. Remember the G90? No, the G90 never came out--instead, a more "mainstream" G92 came out, followed by the G94. Methinks a G90 is being re-done on 55nm instead of 65nm, with a few improvements (hopefully full DX10.1 and SM 4.1 support). Heck, there's a weird new name for it: D10U-30, instead of D9E on 65nm. Nvidia wants a cheaper way to make $600 video cards ASAP to replace those expensive GX2 boards. SPECS: No fewer than 24 ROPs, like the 8800GTX. At least a 725MHz core thanks to 55nm. All other specs--shaders, TMUs, etc.--are increased by 50%, so that there are now 96 bilinear texels per clock, 48 bilinear FP16 texels per clock, and 192 shaders (stream processors). That should be enough to break the 1 TERAFLOPS record for a single chip. Of course, it should be overclockable to at least 800-850MHz on average. The jump from the 9800GTX to this D10U-30 should be very comparable to the jump from a 6800 Ultra to a 7800GTX. And the die size will be over 400mm^2--slightly larger than a 9800GTX, but still a tiny bit smaller than an 8800GTX. Power consumption is somewhat higher than the 8800GTX's--but when factoring in improved idle consumption, it is no worse than an 8800 Ultra overall. I would be surprised if Nvidia *EVER* made a chip bigger than the 8800GTX.
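    If G80-style shader arithmetic carries over, the 1 TFLOPS claim roughly checks out. A quick sketch (the ~1.75 GHz shader clock and the 3 FLOPs per SP per clock from G80's MADD+MUL dual issue are my assumptions, not confirmed specs):

```python
# Back-of-envelope peak shader throughput, G80-style:
# each scalar SP retires 3 FLOPs per shader clock (MADD + MUL).
def shader_gflops(num_sps, shader_clock_ghz, flops_per_clock=3):
    """Theoretical peak shader throughput in GFLOPS."""
    return num_sps * shader_clock_ghz * flops_per_clock

# 9800GTX for reference: 128 SPs @ 1.688 GHz shader clock
print(shader_gflops(128, 1.688))   # ~648 GFLOPS
# Rumored D10U-30: 192 SPs at a guessed ~1.75 GHz shader clock
print(shader_gflops(192, 1.75))    # ~1008 GFLOPS -- just over 1 TFLOPS
```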

    If ALL of the above is correct (of course either a or b for the first part), would I win something? Let's pool in a bet, like $1 each?
    Last edited by Bo_Fox; 04-05-2008 at 02:29 AM.

    --two awesome rigs, wildly customized with
    5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
    --SONY GDM-FW900 24" widescreen CRT, overclocked to:
    2560x1600 resolution at 68Hz!(from 2304x1440@80Hz)

    Updated List of Video Card GPU Voodoopower Ratings!!!!!

  2. #52
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
    "D10U-20 also in the works
    Q2 release



    Can it be that Nvidia will bring a real next generation product just a quarter after it released its 9800 series? Well, we don’t have the answer to that particular question, but as we reported here, Nvidia is working on a new GPU codenamed D10U-30. The D10U-30 will feature 1,024MB of GDDR3 memory and we learned that there will be one more SKU below it.

    The second one is codenamed D10U-20 and it will have 896MB of memory, again of the GDDR3 flavor. This new card indicates that Nvidia can play with the memory configuration and that the new chip might support more than the regular 256-bit memory interface.

    This one might support 384-bit or some other memory configurations, but we still don’t have enough details about it. It looks like Nvidia doesn’t feel that going for GDDR4 is necessary and it looks like the company will rather jump directly from GDDR3 to GDDR5."


    http://www.fudzilla.com/index.php?op...=6702&Itemid=1

    It sounds like a repeat of the 8800GTS/GTX G80 releases, but with 16 memory chips this time around. I am thinking that 16 memory chips will be used in a full 512-bit bus configuration, while 14 chips will allow for a 448-bit bus (because 14 chips at 64MB each add up to 896MB). It certainly looks like a PCB design similar to the one the 9800GTX is using now will be used for the D10U chips--otherwise, I do not see why Nvidia went to all the trouble of re-designing such a complex PCB after a quite successful G92 GTS that could already overclock to 800MHz (aside from adding Tri-SLI capability, the PCB looks far more complicated than that of an 8800 Ultra!)... Methinks this 9800GTX PCB will carry 16 memory chips (still only 64MB per chip) on the front and back, along with a slightly bigger chip. It took Nvidia 4 years to develop the G80 architecture that scales the memory bus width with the number of chips used, and it makes sense that Nvidia wants to keep taking advantage of that design.
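    The chip-count arithmetic above is easy to sanity-check, assuming (as in the post) that each GDDR3 chip contributes a 32-bit slice of the bus and holds 64MB:

```python
# Bus width and capacity implied by a given GDDR3 chip count.
def board_config(num_chips, mb_per_chip=64):
    bus_width_bits = num_chips * 32   # one 32-bit slice per chip
    capacity_mb = num_chips * mb_per_chip
    return bus_width_bits, capacity_mb

print(board_config(16))  # (512, 1024) -> rumored D10U-30: 512-bit, 1024MB
print(board_config(14))  # (448, 896)  -> rumored D10U-20: 448-bit, 896MB
print(board_config(12))  # (384, 768)  -> G80-style: 384-bit, 768MB
```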

    Hence, there is no need for more expensive GDDR4 memory or denser GDDR3 chips. Nvidia saves money in this area, at least. I can see why Nvidia wanted to dumb down the 9800GTX in nearly every aspect possible: to save all the thunder for Nvidia's next surprise, which is due out in a very short time. Nvidia knows that the R770 will not be a half-baked chip this time around.

    9900GTS and 9900GTX, perhaps?
    Last edited by Bo_Fox; 04-08-2008 at 12:18 PM.


  3. #53
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    what? more than 256-bit? surely that's a pipedream

    but it will have to compete with rv770's i spose, as you say.
    Last edited by adamsleath; 04-08-2008 at 12:41 PM.
    i7 3610QM 1.2-3.2GHz

  4. #54
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
    The only other way this 896MB of memory could make sense is if Nvidia used denser 128MB chips, but only 7 out of 8 of them. It could still use a 448-bit bus. And the 1GB version would use a 512-bit bus with only 8 memory chips, like the Radeon HD2900XT (which would explain the 9800GTX's incredibly complex PCB). Remember, there was a 256-bit version of the HD2900 that used the same PCB as the HD2900XT, so the 9800GTX could well be using the same PCB design that will be used for the D10U chips. If the codename is D10, perhaps Nvidia will actually call it the Geforce 10000 Ultra or something like that.

    I wouldn't be surprised if Nvidia did that just to up its stock value (that fell over 40% within the past few months---OUCH).


  5. #55
    Xtreme Member
    Join Date
    Dec 2005
    Location
    Kichroa !
    Posts
    231
    hmm sounds like its time to get rid of the 9800gx2's
    cpu : e6850 @ 3.5ghz
    cooler : boxed
    mem : balistix 8500 @ 1000mhz 4-4-4-4
    board : asus p5k-e wifi
    graphics : 7800gtx
    harddisk : wd raptor 74gb
    dvd : nec 7173A
    dvdrw : optiarc 18x
    psu : enermax liberty 620W
    screen : dell 3007wfp hc

  6. #56
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,838
    if it has a huge amount of ram, its gonna suck for ppl on 32 bit OS'es. less and less of that 4 gig system ram being usable
    DFI P965-S/core 2 quad q6600@3.2ghz/4gb gskill ddr2 @ 800mhz cas 4/xfx gtx 260/ silverstone op650/thermaltake xaser 3 case/razer lachesis

  7. #57
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
    Quote Originally Posted by grimREEFER View Post
    if it has a huge amount of ram, its gonna suck for ppl on 32 bit OS'es. less and less of that 4 gig system ram being usable
    Uhhhhh... I do not think so. Not even Crysis uses more than 2GB of system RAM. In DX10 mode at 1600x1200 w/ 4xFSAA, Crysis uses only 1GB, not even 1.5GB. Having 4GB of RAM just helps with map loading times and the swap buffers, so that when exiting the game back into Windows it can quickly resume normal operation. I do not think a 32-bit OS would be a problem for video cards with up to 2GB of video RAM.


  8. #58
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,550
    Quote Originally Posted by grimREEFER View Post
    if it has a huge amount of ram, its gonna suck for ppl on 32 bit OS'es. less and less of that 4 gig system ram being usable

  9. #59
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by Bo_Fox View Post
    The only other way this 896MB of memory could make sense is if Nvidia used more dense 128MB chips, but only 7 out of 8 chips. It could still use 448-bit bandwidth. And the 1GB version would use 512-bit bandwidth with only 8 memory chips, like Radeon's HD2900XT (which would explain 9800GTX's incredibly complex PCB). Remember, there was a 256-bit version of HD2900 that used the same PCB as HD2900XT, so the 9800GTX could be as well using the same PCB design that will be used for D10U chips. If the codename is D10, perhaps Nvidia will actually call it Geforce 10000 Ultra or something like that.

    I wouldn't be surprised if Nvidia did that just to up its stock value (that fell over 40% within the past few months---OUCH).
    GT200/G200/G100 is not going to be fitting on a 9800GTX PCB.
    You really think the GT200 will be a similar size compared to G92?

  10. #60
    The Doctor Warboy's Avatar
    Join Date
    Oct 2006
    Location
    Kansas City, MO
    Posts
    2,597
    Quote Originally Posted by Bo_Fox View Post
    Uhhhhh... I do not think so. Not even Crysis uses more than 2GB of system ram. In DX10 mode at 1600x1200 w/ 4xFSAA, Crysis uses only 1GB, not even 1.5GB. Having 4GB of ram just helps with map loading times and the swap buffers so that when exiting the game back into windows it can quickly resume normal operation. I do not think a 32-bit OS would be a problem for video cards with up to 2GB of video RAM.
    True, the game itself normally tops out at 1.3GB recorded, but you gotta remember how much memory Vista uses; it is normally around 30%-40% on any setup. Then there are the background apps the user is running, so I wouldn't say 4GB is useless at all. It's very much a benefit.
    My Rig can do EpicFLOPs, Can yours?
    Once this baby hits 88 TeraFLOPs, You're going to see some serious $@#%....

    Build XT7 is currently active.
    Current OS Systems: Windows 10 64bit

  11. #61
    Xtreme Member
    Join Date
    Dec 2005
    Location
    NC, USA
    Posts
    285
    Unless I'm mistaken (offtopic entirely), the video RAM is only addressed by the GPU memory controller anyway, most of which are 256 bits wide, so 2GB of addressing should be irrelevant to the OS you are using.

    Or did I miss a critical step somewhere?
    Current: 2500k@ 4.0 // 8g Gskill // TZ68k+ // 5850 @ 900
    Best way to catch me is by PM // AKA Harmavoidance0


  12. #62
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
    right... Sorrow13 said it nicely!

    @LordEC911, I could be wrong. But I do not see how the GT200 could be any bigger than 500 mm^2 (or less than 450mm^2 if made on 55nm)--and it is quite possible for Nvidia to fit that chip onto whatever PCB design.

    A rumor was quoted: "Nvidia will have a hard time fitting that chip onto some PCB"--now that I think about it, you are probably right that it will be a different PCB design.


  13. #63
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by Sorrow13 View Post
    unless im mistaken (offtopic entirely) The video ram is only addressed by the gpu memory controller anyway, most of which are 256 bit wide, so 2gb addressing should be irrelevant to the OS you are using.

    Or did i miss a critical step somewhere?
    Address space is address space.

    Start with 4GB of memory, then plug in 2 9800GX2's to your 32bit OS and watch how much system RAM you have according to Windows, before and after
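    A rough sketch of why that happens (the aperture and MMIO sizes here are illustrative assumptions, not measured values): RAM, VRAM apertures, and other device MMIO all compete for the same 4GB of physical address space on a 32-bit OS.

```python
# Illustrative 32-bit address-space budget: address space reserved for
# VRAM apertures and other MMIO is carved out of the same 4GB that
# system RAM would otherwise occupy, so a 32-bit OS sees less RAM.
ADDRESS_SPACE_MB = 4096

def visible_ram_mb(installed_ram_mb, vram_aperture_mb, other_mmio_mb=512):
    """RAM a 32-bit OS can address after device reservations (rough model)."""
    reserved = vram_aperture_mb + other_mmio_mb
    return min(installed_ram_mb, ADDRESS_SPACE_MB - reserved)

# Two 9800GX2s: 4 GPUs x 512MB each, assumed fully mapped (an assumption)
print(visible_ram_mb(4096, 4 * 512))  # ~1536MB visible out of 4096 installed
```

The exact number depends on how much of each card's VRAM the BIOS actually maps, which is why reported figures (1.7GB, 2.8GB) vary between setups.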

  14. #64
    The Doctor Warboy's Avatar
    Join Date
    Oct 2006
    Location
    Kansas City, MO
    Posts
    2,597
    Quote Originally Posted by Bo_Fox View Post
    right... Sorrow13 said it nicely!
    Nah, I think KoHaN69 said it better lol.

  15. #65
    I am Xtreme
    Join Date
    Oct 2004
    Location
    U.S.A.
    Posts
    4,743
    Quote Originally Posted by LyP0 View Post
    hmm sounds like its time to get rid of the 9800gx2's
    I don't think so. I wouldn't be surprised if evga offered another 120 day step up program.


    Asus Z9PE-D8 WS with 64GB of registered ECC ram.|Dell 30" LCD 3008wfp:7970 video card

    LSI series raid controller
    SSDs: Crucial C300 256GB
    Standard drives: Seagate ST32000641AS & WD 1TB black
    OSes: Linux and Windows x64

  16. #66
    Xtreme Addict
    Join Date
    Aug 2006
    Location
    eu/hungary/budapest.tmp
    Posts
    1,591
    Quote Originally Posted by Sorrow13 View Post
    unless im mistaken (offtopic entirely) The video ram is only addressed by the gpu memory controller anyway, most of which are 256 bit wide, so 2gb addressing should be irrelevant to the OS you are using.
    Quote Originally Posted by Bo_Fox View Post
    right... Sorrow13 said it nicely!
    Sorry, while that may sound logical, you are wrong. Windows does take the VRAM into account. Unfortunately I only know of a test in my language, but it showed that with cards with bigger buffers, the available memory is lower under 32-bit Windows.
    Usual suspects: i5-750 & H212+ | Biostar T5XE CFX-SLI | 4GB RAndoM | 4850 + AC S1 + 120@5V + modded stock for VRAM/VRM | Seasonic S12-600 | 7200.12 | P180 | U2311H & S2253BW | MX518
    mITX media & to-be-server machine: A330ION | Seasonic SFX | WD600BEVS boot & WD15EARS data
    Laptops: Lifebook T4215 tablet, Vaio TX3XP
    Bike: ZX6R

  17. #67
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,656
    Quote Originally Posted by Sr7 View Post
    Address space is address space.

    Start with 4GB of memory, then plug in 2 9800GX2's to your 32bit OS and watch how much system RAM you have according to Windows, before and after
    Yup, I run two video cards, and after installing the 2nd video card in Vista32 or XP, the available system memory was reduced; I have a max of about 2.8GB with 4GB installed.
    Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
    3x2048 GSkill pi Black DDR3 1600, Quadro 600
    PCPower & Cooling Silencer 750, CM Stacker 810

    Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
    3x4096 GSkill DDR3 1600, PNY 660ti
    PCPower & Cooling Silencer 750, CM Stacker 830

    AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
    2x2gb Patriot DDR2 800, PowerColor 4850
    Corsair VX450

  18. #68
    Xtreme Enthusiast
    Join Date
    Jan 2007
    Location
    Los Angeles, CA
    Posts
    628
    I'll tell you my experience with 4 gigs of ram installed on a 32 bit OS...

    After installing my 2 9800 GX2s, I had 1.7 gigs of available system memory
    1.7!!!

    Needless to say, I am now running x64
    2600k @ 5ghz / z68 Pro / 8gb Ripjaws X / GTX 580 SLi
    2x Inferno raid 0 / WD 1TB Black / Thermaltake 1200w
    Dell 3008 WFP / Dual Loop WC MM UFO

  19. #69
    Xtreme Member
    Join Date
    Sep 2006
    Posts
    498
    Nvidia First 55nm Desktop Graphics; GeForce 9800 GT

    VR-Zone first revealed Nvidia's plan to shift from 65nm to 55nm to lower costs, and we have also told you about a series of 55nm mobile graphics GPUs based on G94b and G96b. Now we have learned that the first desktop graphics card based on the 55nm G92b core will be the GeForce 9800 GT, and it will be launched in July along with the GeForce 9900 series. Apparently, the GeForce 9800 GTS will be OEM only, not for the channel.
    http://www.vr-zone.com/articles/Nvid...0_GT/5714.html
    Faceman


  20. #70
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    mo interesting gnu's

  21. #71
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
    Quote Originally Posted by sonofander View Post
    I'll tell you my experience with 4 gigs of ram installed on a 32 bit OS...

    After installing my 2 9800 GX2s, I had 1.7 gigs of available system memory
    1.7!!!

    Needless to say, I am now running x64
    Well, my SLI setup with 1GB of video RAM total is doing just fine with 2GB of memory in WinXP 32-bit. I did not notice any difference when I added a second card for SLI. The only game that came dangerously close to the 2GB limit was The Witcher, but that was fixed with patch 1.2, which cut memory usage by approximately 500MB. Guys, Crysis only uses up to 1.3GB of system memory at the very most. Setting the virtual memory to a fixed 3GB buffer eliminated all stability problems with such resource hogs (The Witcher, Oblivion with mods, etc.) when gaming in 32-bit.

    (Of course, Vista needs a 4GB sucker, whether 32-bit or 64-bit)
    Last edited by Bo_Fox; 04-13-2008 at 09:50 PM.


  22. #72
    Xtreme Mentor
    Join Date
    Oct 2005
    Location
    Portugal
    Posts
    3,410
    More rumors

    GT200-30 is taped out


    And works

    Looks like GT200 is also going to be ready for a late Q2 launch. We only have some limited data, but the chip that we call GT200 has taped out, and prototypes of the card are already up and running.

    The device is known as Nvidia GT200-300, and this naming scheme might imply that this is the same thing we've seen listed as D10U-30.

    We can only speculate whether this is a 55 or 65nm part, or whether this card has one or two chips, but we can confirm that this configuration has 1024MB of memory.

    The chip itself, however, is called GT200-300. The shocking part is that both the R700 and the new GT200 might be launched at roughly the same time, or very close to each other.

    http://www.fudzilla.com/index.php?op...=6786&Itemid=1

    Nvidia confirms GT-200 with 1 billion transistors


    Nvidia actually only wanted to talk about new initiatives around its Quadro graphics cards during a press meeting in Munich on 10 April 2008. Rather casually, however, it also confirmed some data about the next GPU generation, which is to follow the G92 chips.

    Jeff Brown stated openly, without being asked, that the next architecture is indeed called "GT-200". The GPU will consist of approximately one billion transistors and is "pure logic, no memory, like CPUs," said the Nvidia manager. This enormous amount of circuitry coincides with previous rumors, according to which the GT-200 will have about 200 shader units. Previous G80 and G92 GPUs carry a maximum of 128 of these computing engines.

    Furthermore, Brown said that Nvidia will say more about the GT-200 in May 2008. Whether that means an official architecture presentation or confidential briefings with selected press representatives under embargo, Jeff Brown left open.

    http://translate.google.com/translat...hl=en&ie=UTF-8



    GT200, also known as G100 or GeForce Next, is the next-generation flagship NVIDIA graphics core. The main specifications are as follows:

    Process: 65nm
    Transistor count: about 1.5 billion
    Core area: about 600mm^2
    Core frequency: 550-650 MHz
    Stream processors: 240
    Stream processor frequency: 1.5 GHz
    Texture units: 80
    ROPs: 32
    Memory interface: 512-bit
    Memory type: GDDR3
    Memory clock: 1.0-1.1 GHz
    Maximum thermal design power (TDP): above 200W
    Performance: ~100% increase over G80

    http://bbs.chiphell.com/viewthread.p...extra=page%3D1

    http://www.google.com/translate?u=ht...=pt-PT&ie=UTF8
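    Taking the rumored spec list above at face value, the implied theoretical numbers work out roughly like this (the 3 FLOPs per SP per clock is assumed from G80's MADD+MUL dual issue, not confirmed for GT200):

```python
# Theoretical throughput implied by the rumored GT200 spec list.
def peak_gflops(sps, shader_ghz, flops_per_clock=3):
    # assumes G80-style MADD+MUL dual issue (an assumption for GT200)
    return sps * shader_ghz * flops_per_clock

def bandwidth_gb_s(bus_bits, mem_ghz, pumps=2):
    # GDDR3 is double data rate, hence pumps=2
    return bus_bits / 8 * mem_ghz * pumps

print(peak_gflops(240, 1.5))     # 1080 GFLOPS -- comfortably past 1 TFLOPS
print(bandwidth_gb_s(512, 1.0))  # 128.0 GB/s at a 1.0 GHz memory clock
print(bandwidth_gb_s(512, 1.1))  # ~140.8 GB/s at 1.1 GHz
```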
    regards
    Last edited by mascaras; 04-15-2008 at 12:29 PM.

    [Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
    [Review] ASUS HD4870X2 TOP » Here!! «
    .....[Review] EVGA 750i SLi FTW » Here!! «
    [Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
    [Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «

  23. #73
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
    Umm yes please! Please let it be true! xD

    240 SPs sounds a bit much IMO,
    as well as a 200W+ TDP
    and a 100% perf boost over G80 (well, perhaps at 1920x1200+ or with AA & AF it would have such an advantage, or close to it anyway)

    But the rest, I dunno--it sounds highly plausible and like what many people seem to be expecting. At least it would be able to run Crysis properly at last lol.
    Last edited by RPGWiZaRD; 04-14-2008 at 08:43 AM.
    Intel? Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

