
Thread: UPDATE: 55nm GT206 GPU powers both GTX290 and Quadro FX 5800

  1. #1
    Xtreme Enthusiast
    Join Date
    Jul 2007
    Location
    Kuwait
    Posts
    616

    UPDATE: 55nm GT206 GPU powers both GTX290 and Quadro FX 5800


    The honor of being the first product powered by the 55nm G200-302 chip (a.k.a. GT206/212) went to the Quadro FX 4800/5800, products that launched with a lot of fanfare earlier today.

    Besides the Quadro FX 4800 and 5800, the new 55nm GPU will also power the GeForce GTX 270 and 290. Essentially, we’re talking about the same parts. The Quadro FX 4800 is nothing more than a GTX270 with double the amount of video memory, while the Quadro FX 5800 is equal to a GTX290, but with four times the video memory. ATI is not sleeping either, as the company is preparing an RV790 part, a beefed-up version of the already existing RV770 chip.

    The G200-302 Rev A2 began manufacturing back in September, and the first parts are now finding their way to mass production. The chip features a die size of 470mm², 107mm² less than the original G200 chip. This just goes to show the vast difference between 65nm and 55nm - if Nvidia had had the balls to go with a 55nm chip back in May, the GTX260/280 parts could have been way cheaper and offered much more flexibility, but we can’t cry over spilt milk. The 55nm part is here now, and it will consume much less power than the 65nm one.
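    Quick sanity check on those die-size numbers (just my own back-of-the-envelope Python, using the figures above and assuming an ideal optical shrink - not official specs):
    Code:
    # The article's numbers: 470 mm^2 for the 55nm chip, 107 mm^2 smaller than
    # the 65nm G200, which implies ~577 mm^2 for the original die.
    old_area = 470.0 + 107.0                      # implied 65nm G200 die area, mm^2
    ideal_area = old_area * (55.0 / 65.0) ** 2    # perfect optical shrink

    print(f"Ideal 55nm die area:    {ideal_area:.0f} mm^2")   # ~413 mm^2
    print("Reported 55nm die area: 470 mm^2")
    # The reported 470 mm^2 sits well above the ideal ~413 mm^2, which is normal:
    # half-node shrinks rarely achieve perfect area scaling.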

    The 55nm GPU consumes roughly 50% less power than the 65nm one, and that difference is more than massive. When I did quick power checks, the GTX280 at 650/1500/2200 would eat around 266W, while the default-clocked GTX280 (600/1300/2200) was spec’d at 238W.

    Well, the 55nm GPU will eat around 170W at 650/1500/2200, meaning the GTX290 just got about 100W of power to play with. If you’re into overclocking, you can now start dreaming about clocking those 240 shaders into the 1.7-1.8 GHz range (perhaps even 2.0 if the water-cooling setup is powerful enough) and achieving massive performance gains, all while consuming less power than a stock-clocked GTX280.
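    Here is that headroom claim as a rough Python sketch - all figures are the ones quoted above, and the assumption that power scales roughly linearly with shader clock at a fixed voltage is a simplification, not a measurement:
    Code:
    p_65nm_oc = 266.0   # W, 65nm GTX280 at 650/1500/2200 (figure from above)
    p_55nm_oc = 170.0   # W, claimed 55nm consumption at the same clocks
    p_stock   = 238.0   # W, stock-clocked 65nm GTX280 spec

    print(f"Headroom at the same clocks: {p_65nm_oc - p_55nm_oc:.0f} W")   # ~96 W

    # Naive estimate: scale the 170 W figure linearly with shader clock.
    for shader_ghz in (1.7, 1.8, 2.0):
        est = p_55nm_oc * shader_ghz / 1.5
        verdict = "under" if est < p_stock else "over"
        print(f"{shader_ghz} GHz shaders -> ~{est:.0f} W ({verdict} the 238 W stock GTX280)")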

    As far as the naming convention goes, Nvidia calls its chips NVxx (we’re at NV60 right now) or Gxx/Gxxx internally, while partners get the GT200-XXX name. But at the end of the day, the number that matters is the one on the chip.
    The GTX 260 and 280 came with G200-200 and G200-300 chips respectively, while the GTX270 and 290 will feature G206-202 and G206-302 chips. Essentially, there is no difference between the two, save for the hardwired part that decides how many shaders a certain chip has. If you’re brave enough, you’ll pop off the massive heatsink and play around with resistors. Who knows, perhaps you can enable 240 shaders on a GTX260/270… or maybe not.
    In any case, we can’t wait for these new babies to show up. FX4800, FX5800, GTX270 and 290 are all coming to market very, very soon.
    My personal take is that Nvidia will try to steal the limelight from the official Core i7 launch on 11/17 and ship the GTX270/290 to reviewers, trying to tell them that they’re still on top. All of ATI’s hopes lie with the upcoming 4890. But still, Nvidia does not offer a compelling $199 experience, and this is where ATI will take them to the cleaners.

    Of course… unless we see a GTX260-216 at a completely new price point, and a GTX270 costing just $50 more, dropping to $199 for Christmas. Crazy scenario, but competition brings out the best for us consumers.

    UPDATE: The picture that accompanied the story did not feature the GT206 chip, so I removed it. The rest of the info is pretty valid :-)
    Source
    Hmm, what do you guys think?

  2. #2
    Xtreme Member
    Join Date
    Feb 2008
    Location
    Alberta, Canada
    Posts
    147
    I'll probably be picking up a 260 core 216 or a 270... if the pricing is good... but yeah, not a lot of potential since the next step is in August
    Q9400@3.6 -- Thermaltake Water | ASUS P5K | 8800GTS 640 -- Thermalright HR-03 Plus | PC2-8500 Dominator 2x2GB@ 1092 5-5-5-13 | OCZ Core V2 30GB -- Raptor X 150GB -- Seagate 7200.10 250GB | Antec TruePower Quattro 850 | Silverstone FJ07 | Windows Vista Home Premium 64 -- Ubuntu 8.04


    Fold 1 | E8400 3.0 | ECS G31TM | PC2-6400 2x1GB | -- Fold 2 | 3800+ X2 | ASUS ??? | PC2-8500 2x1GB


    Folding...You Know You Want To

  3. #3
    Xtreme Addict
    Join Date
    Nov 2003
    Location
    Oslo, Norway
    Posts
    1,218
    I want three GTX290s now!! Hmmm, I might want to sell my 280GTX cards...

  4. #4
    Xtreme Member
    Join Date
    Jul 2007
    Posts
    371
    Going from 266W to 170W seems very unlikely. That'd mean GPU consumption would have roughly halved going from 65nm to 55nm - not likely at all.

  5. #5
    Xtreme Enthusiast
    Join Date
    Aug 2008
    Posts
    577
    And Nvidia wouldn't laser lock the shaders/cores for what reason?
    --Intel i5 3570k 4.4ghz (stock volts) - Corsair H100 - 6970 UL XFX 2GB - - Asrock Z77 Professional - 16GB Gskill 1866mhz - 2x90GB Agility 3 - WD640GB - 2xWD320GB - 2TB Samsung Spinpoint F4 - Audigy-- --NZXT Phantom - Samsung SATA DVD--(old systems Intel E8400 Wolfdale/Asus P45, AMD965BEC3 790X, Antec 180, Sapphire 4870 X2 (dead twice))

  6. #6
    Banned
    Join Date
    Jun 2008
    Posts
    763
    Well yeah, I guess those people spitting on the Quadro 5800 were not right after all. There was someone on this forum who noticed the power consumption and concluded that it's a 55nm chip. Props to him!

  7. #7
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
    Quote Originally Posted by Boissez View Post
    Going from 266W to 170W seems very unlikely. That'd mean GPU consumption would have roughly halved going from 65nm to 55nm - not likely at all.
    You don't know that - a GPU with a huge die will see large drops in power with a shrink; it's not a linear scale. And if they changed the PWM to accommodate a lower-wattage GPU, you can drop a lot of wasted consumption.
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  8. #8
    Xtreme Member
    Join Date
    Jul 2007
    Posts
    371
    Quote Originally Posted by zanzabar View Post
    u dont know that, the gpu with a huge die will see large drops in power with a shrink, its not a linear scale. and if they changed the pwm to accommodate a lower watt gpu u can drop alot of wasted consumption
    I know that - however, the 55nm 9800GTX+ hardly saw any power benefits over its predecessor, but it could clock higher within the same TDP as the 9800GTX.
    I can't see any reason why things should be different with GT206.

  9. #9
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
    Quote Originally Posted by Boissez View Post
    I know that - however, the 55nm 9800GTX+ hardly saw any power benefits over its predecessor, but it could clock higher within the same TDP as the 9800GTX.
    I can't see any reason why things should be different with GT206.
    The GT200 is more than 2.5x the size of the G92 - that's why it would scale much better in power. The G92 was already a nicely sized chip and wasn't previously limited by heat and power consumption, so with a chip that's large and runs hot, by taking the heat out you can run at a lower voltage, and by shrinking you can also cut consumption.

    Look at the Phenom - it's in the same range, going from about 120W to sub-80W. This may also be a sign that it's paired with GDDR4 or 5, since both use much less power than GDDR3, and with a 512-bit bus that's a lot of power given off by the massive IC count.
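    To put some rough numbers behind that (purely illustrative Python using the standard CMOS dynamic-power relation P ~ C*V^2*f - the values are made up, not actual GT200/GT200b figures):
    Code:
    def rel_power(c_ratio: float, v_ratio: float, f_ratio: float) -> float:
        """Relative dynamic power after scaling capacitance, voltage and clock."""
        return c_ratio * v_ratio ** 2 * f_ratio

    # Hypothetical shrink: ~10% less switched capacitance and ~8% lower voltage
    # at the same clock.
    print(f"{rel_power(c_ratio=0.90, v_ratio=0.92, f_ratio=1.0):.2f}x power")   # ~0.76x

    # Because voltage enters squared, a modest voltage drop buys far more absolute
    # watts on a big, hot, high-power die than on a small one that was never
    # voltage- or heat-limited in the first place.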
    Last edited by zanzabar; 11-14-2008 at 03:02 AM.
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  10. #10
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    Quote Originally Posted by zanzabar View Post
    The GT200 is more than 2.5x the size of the G92 - that's why it would scale much better in power. The G92 was already a nicely sized chip and wasn't previously limited by heat and power consumption, so with a chip that's large and runs hot, by taking the heat out you can run at a lower voltage, and by shrinking you can also cut consumption.

    Look at the Phenom - it's in the same range, going from about 120W to sub-80W. This may also be a sign that it's paired with GDDR4 or 5, since both use much less power than GDDR3, and with a 512-bit bus that's a lot of power given off by the massive IC count.
    It uses GDDR3.

    AMD's 65nm was messed up royally, and 45nm was slightly better than general expectations due to litho and strained silicon. That's a special case.
    Quote Originally Posted by radaja View Post
    so are they launching BD soon or a comic book?

  11. #11
    Xtreme Enthusiast
    Join Date
    Aug 2008
    Posts
    577
    Quote Originally Posted by zanzabar View Post
    Look at the Phenom - it's in the same range, going from about 120W to sub-80W. This may also be a sign that it's paired with GDDR4 or 5, since both use much less power than GDDR3, and with a 512-bit bus that's a lot of power given off by the massive IC count.
    Um, I'm confused - did you really mean the Phenom, the CPU?
    --Intel i5 3570k 4.4ghz (stock volts) - Corsair H100 - 6970 UL XFX 2GB - - Asrock Z77 Professional - 16GB Gskill 1866mhz - 2x90GB Agility 3 - WD640GB - 2xWD320GB - 2TB Samsung Spinpoint F4 - Audigy-- --NZXT Phantom - Samsung SATA DVD--(old systems Intel E8400 Wolfdale/Asus P45, AMD965BEC3 790X, Antec 180, Sapphire 4870 X2 (dead twice))

  12. #12
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
    Quote Originally Posted by Stukov View Post
    Um, I'm confused - did you really mean the Phenom, the CPU?
    Yes, it's the same kind of thing, and I'm not saying anything beyond the power consumption of large dies.

    Quote Originally Posted by Macadamia View Post
    It uses GDDR3.

    AMD's 65nm was messed up royally, and 45nm was slightly better than general expectations due to litho and strained silicon. That's a special case.
    It could have the same sort of efficiency gains.


    I'm also saying that new RAM would help the card's TDP by a lot - even if it's still GDDR3, it could be a 65nm build or something newer.
    Last edited by zanzabar; 11-14-2008 at 03:15 AM.
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  13. #13
    I am Xtreme
    Join Date
    Sep 2006
    Posts
    10,374
    Well, let's hope she performs as well as promised on paper and consumes less - that's always a good deal... Hopefully the GTX290 will appear on the market at a reasonable price (at the current GTX280 level...).
    Question : Why do some overclockers switch into d*ckmode when money is involved

    Remark : They call me Pro Asus Saaya yupp, I agree

  14. #14
    Xtreme Member
    Join Date
    Oct 2008
    Location
    Florida..Tampa and St Petersburg
    Posts
    430
    I've been waiting for these cards... I have a feeling these new 55nm parts are going to overclock a lot better - hoping for a 900MHz core without a hard mod... Once one of you guys gets one and makes a thread on the card, if I like what the card does then I'm gonna buy one.

  15. #15
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    While this is possible, it seems very unlikely.
    Nvidia was having problems and now they suddenly hit the jackpot?
    A ~30% TDP decrease while bumping clocks? That would be a record for a half-node shrink.

    I don't doubt that Nvidia is finally ramping production of 55nm chips, but the actual results/numbers aren't going to be that good.

    EDIT: I'm thinking they used Nvidia's TDP number for the GTX280 and the actual power consumption numbers for the G200b chips. Actual power consumption for the GTX280, according to AnandTech, is already ~180-190W.
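    Rough illustration of that mismatch (the numbers are the ones quoted in this thread, not my own measurements):
    Code:
    gtx280_tdp      = 238.0   # W, Nvidia's official TDP
    gtx280_measured = 185.0   # W, middle of the ~180-190 W AnandTech figure
    g200b_claimed   = 170.0   # W, the 55nm number from the article

    print(f"TDP vs claimed:      {(1 - g200b_claimed / gtx280_tdp) * 100:.0f}% lower")        # ~29%
    print(f"Measured vs claimed: {(1 - g200b_claimed / gtx280_measured) * 100:.0f}% lower")   # ~8%
    # Comparing a TDP against a measured figure makes the shrink look far more
    # dramatic than a measured-vs-measured comparison would.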
    Last edited by LordEC911; 11-14-2008 at 09:36 AM.
    Originally Posted by motown_steve
    Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.

    Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.

  16. #16
    Xtreme Mentor
    Join Date
    Oct 2005
    Location
    Portugal
    Posts
    3,410
    from FUD


    Nvidia already producing 55nm GT200



    Silent hill

    Nvidia’s CEO Jensen said it in the last Q3 conference call, and CFO Marvin Burkett confirmed it; we still had to go out and find out whether the 55nm production they mentioned referred to the G92b or a 55nm GT200.

    Well now, we can finally say that in Q4 Nvidia plans to ship many 55nm GT200 GPUs, and it looks like they will silently replace the old 65nm parts.

    Most new GTX 260 cards with 216 shaders and upcoming GTX 280 cards will be based on the new 55nm chips, but it looks like Nvidia does have a plan to increase the speed of the new products. Retail products based on 55nm chips have yet to ship, but we expect them in early December.

    We believe that the new SKUs should overclock much better than the previous ones, and at current prices they will certainly put a lot of pressure on the Radeon HD 4870 X2 and HD 4870 - and it looks like Nvidia is not finished yet. What started as an insanely slow Q4 for every market segment is turning into a rather interesting finish.


    http://www.fudzilla.com/index.php?op...10473&Itemid=1

    [Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
    [Review] ASUS HD4870X2 TOP » Here!! «
    .....[Review] EVGA 750i SLi FTW » Here!! «
    [Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
    [Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «

  17. #17
    Xtreme Member
    Join Date
    Jun 2008
    Location
    Finland
    Posts
    111
    theovalich.wordpress.com/2008/11/11/55nm-gt206-gpu-powers-both-gtx290-and-quadro-fx-5800/
    And there goes my trust in the "50% less power consumption". 30k+ 3DMark06 with a 3GHz+ Phenom, anyone?

  18. #18
    Xtreme Addict
    Join Date
    Jul 2007
    Location
    Canada
    Posts
    1,397
    It's not impossible that they managed to drop the voltage slightly in the process of going to 55nm, which would obviously help power consumption further. And process shrink aside, the 5800 isn't a carbon copy of the GTX280, so there might have been a few more watts to be saved. But yeah... it seems a little too good to be true.
    i7 2600K | ASUS Maximus IV GENE-Z | GTX Titan | Corsair DDR3-2133

  19. #19
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,356
    If these cards are just under-volted 55nm respins at the same speed, they'd have to overclock awfully well to be worth it imo.

    Hopefully Nvidia can pull through... As much as I respect ATI for pushing the envelope with the 4870X2, I mostly play older games lately, and when you're not dealing with newer titles it can be a bit buggy from what I've read.

    If these cards can really come out and represent a more "matured" design that will run more efficiently, I'm all for it.
    It'd be nice to tide me over until the next batch of "next-gen" cards at least, and give me some confidence back in NV.

  20. #20
    Xtreme Addict
    Join Date
    Aug 2008
    Posts
    2,036
    Never ever underestimate nVidia. That's when things like the 8800GTX come out. Yes sir! That one was a real shocker, wasn't it? That one lasted a long, long time. When you have a Company Master Ninja using terms like "Whoop Ass", that is an individual who is determined. Determined individuals will go to great lengths when pushed, and I'd say he's been pretty much pushed. hehe He will get it done. I can feel it.

    If they have a 290 coming out, I want it now. I don't just want it now, I need it now. I'm getting ready to buy within a week and will have to buy one or the other. I'd rather not go through having to step up, but will if I have to. After that I'll probably stick with what I have and add another, and may possibly go to Tri-SLI, but I suspect SLI is gonna be the sweet spot.

  21. #21
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,356
    Quote Originally Posted by T_Flight View Post
    Never ever underestimate nVidia. That's when things like the 8800GTX come out. Yes sir! That one was a real shocker, wasn't it?
    Agreed!

    I was still hanging onto my 6800 GT at the time, waiting for something that blew me away. And then it hit. G80 really is still great to this day imo.

    G92 and the recent $600-launch G200s just left a sour taste in my mouth. Dunno about you guys, but I viewed G92 as nothing more than profit farming. There was no real increase at the top end other than the GX2, and who really wanted that? Great for benching, not so much for real-world games.

    But like I said, I'm kinda optimistic about this respin. Logic tells me that the heat dump from being made in 65nm may have crippled G200 a fair bit.

    Nice to have some GPU related news after all the Shanghai ruckus though. I swear, CPU arguments bring out the worst in people (myself included).

  22. #22
    Xtreme Addict
    Join Date
    Jul 2004
    Location
    U.S of freakin' A
    Posts
    1,931
    Great, as soon as the GTX 290 comes out on market, I'm going to use my step up and upgrade.

    Talk about great timing eh?
    Intel Core i7 6900K
    Noctua NH-D15
    Asus X99A II
    32 GB G.Skill TridentZ @ 3400 CL15 CR1
    NVidia Titan Xp
    Creative Sound BlasterX AE-5
    Sennheiser HD-598
    Samsung 960 Pro 1TB
    Western Digital Raptor 600GB
    Asus 12x Blu-Ray Burner
    Sony Optiarc 24x DVD Burner with NEC chipset
    Antec HCP-1200w Power Supply
    Viewsonic XG2703-GS
    Thermaltake Level 10 GT Snow Edition
    Logitech G502 gaming mouse w/Razer Exact Mat
    Logitech G910 mechanical gaming keyboard
    Windows 8 x64 Pro

  23. #23
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by T_Flight View Post
    Never ever underestimate nVidia. That's when things like the 8800GTX come out. Yes sir! That one was a real shocker, wasn't it? That one lasted a long, long time. When you have a Company Master Ninja using terms like "Whoop Ass", that is an individual who is determined. Determined individuals will go to great lengths when pushed, and I'd say he's been pretty much pushed. hehe He will get it done. I can feel it.
    This has nothing to do with underestimating Nvidia - this has to do with breaking the laws of physics as we know them... Die shrinks usually give 10-15% at best; 30-50% is unheard of for an optical shrink.

    I am pretty sure they are comparing Nvidia's TDP numbers for the G200 against actual power consumption for the G200b. G200 vs G200b actual power consumption is going to be about the same, within 10-20W.
    Last edited by LordEC911; 11-14-2008 at 11:55 AM.
    Originally Posted by motown_steve
    Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.

    Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.

  24. #24
    Xtreme Addict
    Join Date
    Aug 2008
    Posts
    2,036
    Quote Originally Posted by Sly Fox View Post
    Agreed!

    I was still hanging onto my 6800 GT at the time, waiting for something that blew me away. And then it hit. G80 really is still great to this day imo.

    G92 and the recent $600-launch G200s just left a sour taste in my mouth. Dunno about you guys, but I viewed G92 as nothing more than profit farming. There was no real increase at the top end other than the GX2, and who really wanted that? Great for benching, not so much for real-world games.

    But like I said, I'm kinda optimistic about this respin. Logic tells me that the heat dump from being made in 65nm may have crippled G200 a fair bit.

    Nice to have some GPU related news after all the Shanghai ruckus though. I swear, CPU arguments bring out the worst in people (myself included).
    hehe I still have my 6800GT. See how bad I need the 290 now? It won't be long though.

  25. #25
    Xtreme Member
    Join Date
    Dec 2006
    Posts
    319
    Bring it on, Nvidia - I only have 42 days left on the EVGA step-up program. No time to waste.
    2x Asus P8Z68-V PRO Bios 0501
    i7 2600K @ 4.6GHz 1.325v / i5 2500K @ 4.4GHz 1.300v
    2x G.SKILL Ripjaws X Series 8GB DDR3 1600
    Plextor M5P 256GB SSD / Samsung 840 Pro 256GB SSD
    Seasonic X-1050 PSU / SeaSonic X Series X650 Gold PSU
    EVGA GTX 690 (+135%/+100MHz/+200MHz/75%) / EVGA GTX 680 SC Signature+ (+130%/+80MHz/+200MHz/70%)


