
Thread: 55nm GT200 (GT200-400) on the way?

  1. #1
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574

    55nm GT200 (GT200-400) on the way?

I don't know where they got it from, but Donanimhaber says that ForceWare 177.40 Beta has a "GT200-400" tag in it, alongside GT200-100 (GTX 260) and GT200-300 (GTX 280).

This is probably the 55nm GT200. I think it shouldn't be too far away now, since NVidia has already begun producing at 55nm (9800 GTX+).

  2. #2
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
i think it's more likely that it's a GTX 290, aka a GTX 280 Ultra.

Since most GTX 280s can clock 700MHz+ without any problem.

  3. #3
    Xtreme Addict
    Join Date
    Jun 2004
    Location
    near Boston, MA, USA
    Posts
    1,955
If it's a 55nm part, it's a long way off; that won't be around for months yet.

  4. #4
    Xtreme Enthusiast
    Join Date
    Oct 2006
    Posts
    658
    One can only hope. nVidia seriously needs to drop prices at the top end.

  5. #5
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
    Hmm.. no more than 4 months from now, since Nvidia has no other choice as ATI is winning this round.

    --two awesome rigs, wildly customized with
    5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
    --SONY GDM-FW900 24" widescreen CRT, overclocked to:
    2560x1600 resolution at 68Hz!(from 2304x1440@80Hz)

    Updated List of Video Card GPU Voodoopower Ratings!!!!!

  6. #6
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    yepp, sounds like an ultra... but damn, that thing will be HOTTTTT

  7. #7
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
If it's indeed an ultra model of the GTX 280, its rated TDP will be fun to look at. Plus its price.

And I don't think we're 4 months away from a 55nm GT200. I think it could be as early as two months, some time after the 4870 X2's launch, but not much after.

  8. #8
    Xtreme Addict
    Join Date
    Jun 2004
    Location
    near Boston, MA, USA
    Posts
    1,955
4 months would be Oct/Nov. I think that'd be on the early side for a die shrink.

Nvidia has yield problems on the current version. A die shrink isn't going to be a simple process: they're going to have to sort real design problems out from simply failed production samples. That takes time, and while you're right that AMD will eat them alive during that time, they really can't help themselves at this point.

The 260/280 might be less than they should be, but it's probably all you'll see for most if not all of the year.

  9. #9
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
I think the yield problems of GT200 chips are mostly due to its massive size, not problems with the chip design itself.

And if you're right and we don't see a 55nm GT200 until near the end of the year, NVidia is really doomed for the time in between. I don't see them lowering GT200 prices even more. The GTX 260 was already dropped from $500 to $400, and another $100 drop to make it actually competitive with the 4870 is unlikely. And the GTX 280 definitely needs to drop to around $450 at most if the 4870 X2 is to be released at $500. ATM NVidia can only compete at the $100-$160 price point.

ATI will sell millions of 4800s by that time; and by next January they will have announced the RV800s.

  10. #10
    Xtreme Mentor
    Join Date
    Apr 2007
    Location
    Idaho
    Posts
    3,200
Yields are bad enough that they might as well try. TSMC has perfected their 55nm process with AMD...
Bad yields = high price
    "To exist in this vast universe for a speck of time is the great gift of life. Our tiny sliver of time is our gift of life. It is our only life. The universe will go on, indifferent to our brief existence, but while we are here we touch not just part of that vastness, but also the lives around us. Life is the gift each of us has been given. Each life is our own and no one else's. It is precious beyond all counting. It is the greatest value we have. Cherish it for what it truly is."

  11. #11
    Xtreme Addict
    Join Date
    Jun 2004
    Location
    near Boston, MA, USA
    Posts
    1,955
Oh, they'll try, that's for certain. They aren't entirely stupid. But it won't be a fast transition, is all. And yes, AMD will take every advantage, but AMD has needed a comeback chip for a while now, no? It keeps things balanced for us.

And Crossfire on Intel chipsets; does it get any better?

  12. #12
    Xtreme Member
    Join Date
    Nov 2005
    Location
    Boise, ID
    Posts
    353
I have a feeling that nvidia has been working on their 55nm 280s for quite a while now. They already have 55nm G92-based chips, so I don't think the 280 is far behind. How far is anyone's guess.

  13. #13
    Xtreme Mentor
    Join Date
    Mar 2007
    Posts
    2,588
A question for the graphics card reviewers in here: do you think the GTX 290 or GTX 280 Ultra, whatever they decide to call it, will be able to scale any further than the speeds given out of the box?

Also, if the design will not be 55nm based, do you think there would be some improvements in core voltage, memory voltage & timings, and lastly the shader controls?

Oh, and your thoughts on pricing.

P.S. I've been busy lately and did not get a chance to fire up my 2 x 9800 GX2 PNY cards from almost 2 months back... and now there is supposed to be a successor to the GTX 280, lol. I think I might just sell my 2 PNY cards, since they are brand new and still sealed in their boxes.

  14. #14
    Xtreme Member
    Join Date
    Aug 2007
    Location
    San Diego, CA
    Posts
    330
    Quote Originally Posted by saaya View Post
    yepp, sounds like an ultra... but damn, that thing will be HOTTTTT
Like Jessica Alba in a swimsuit kind of hot, or like Rosie O'Donnell in a sauna sweating kind of hot...?
    Quote Originally Posted by Kunaak View Post
    High end videocards are like hot girls.

you're never gonna pick the fat girl with the nice personality over the smoking hot 10.

    E8400
    DFI X38
    ATI 4870 X2
    8 Gb GSkill
    2x 36 Gb Raptor RAID 0
    750 Gb WD
    CPU, NB Water Cooled

  15. #15
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by annihilat0r View Post
If it's indeed an ultra model of the GTX 280, its rated TDP will be fun to look at. Plus its price.
the gtx280 is about the same as the 2900xt i think, and that's around 160W, right? i wonder if the ultra might be the first to break the 200W barrier... and then larrabee will break the 250W barrier... insane... just when cpus are starting to reach reasonable TDPs, gpus double the TDP of the top end cpus

    Quote Originally Posted by annihilat0r View Post
And I don't think we're 4 months away from a 55nm GT200. I think it could be as early as two months, some time after the 4870 X2's launch, but not much after.
hard to tell, i bet even nvidia doesn't know... even if they can make some cards, it doesn't mean they can actually make enough to make money with them. same as with the current gpus: they can make them, a few per month, sure... but do they actually make money selling gtx cards? i don't think so...

  16. #16
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Atal View Post
Like Jessica Alba in a swimsuit kind of hot, or like Rosie O'Donnell in a sauna sweating kind of hot...?
my brain blanked out the second option, must be some protection mechanism kicking in, so i'll go for option no. 1.
*leans back and closes eyes* mmmhhhh, jessica alba... hmmmmm

  17. #17
    Banned
    Join Date
    Jan 2008
    Location
    Canada
    Posts
    707
    Quote Originally Posted by saaya View Post
    the gtx280 is about the same as the 2900xt i think, and thats around 160W, right?
I believe the GTX 280 is rated at a max of 235 watts. Of course, with any card you will rarely see the absolute max, although maxing out the GPU is a good thing; it means you're utilizing it as much as possible.

  18. #18
    Banned
    Join Date
    Jan 2008
    Location
    Canada
    Posts
    707
    Quote Originally Posted by saaya View Post
    my brain blanked out the second option, must be some protection mechanism kicking in
    same here

  19. #19
    Xtreme Member
    Join Date
    Nov 2007
    Location
    Surrey, UK
    Posts
    179
I think NVIDIA will skip 55nm and go with 45nm or even 40nm, because they have to think about the future RV800 or something like that.
We all know they don't like to fall behind; I hope they go with GDDR5.

  20. #20
    Xtreme Mentor
    Join Date
    Apr 2003
    Location
    Ankara Turkey
    Posts
    2,631
What is all this about GDDR5?

Does Nvidia really need GDDR5? I don't think so.


    When i'm being paid i always do my job through.

  21. #21
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
i just checked, and unfortunately xbitlabs isn't measuring vga power consumption anymore.
they had really professional results, modding the pciE slot on one board to measure all the power that flows to the card... really a shame...

according to techpowerup, a gtx 280 system is 293W under load (whole-system consumption), compared to 305W for a gx2 system.
the last vga power numbers from xbitlabs measured the gx2 card itself at 180W under load.

so the gtx 280 consumes around 10W less than the gx2, which means around 170W.

tpu also reviewed the gtx 280 amp edition from zotac, which is clocked at 700/1150 compared to 600/1100 for a normal gtx 280 card.
its system consumes 318W under load, almost 15W more than the gx2 system, which puts the card near 195W given the gx2's 180W from xbitlabs.

sooooo, the final conclusion: the oced gtx 280 cards are already scratching the 200W barrier!
i'm really curious if nvidia will really go for an ultra card... tdp-wise it would make more sense to go for a dual card with lower clocked gpus...
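
The back-of-the-envelope method above (take the whole-system delta between cards from TechPowerUp and apply it to Xbit Labs' card-only GX2 figure) can be sketched as follows. `card_estimate` is just an illustrative name; the numbers are the ones quoted in this post:

```python
# Estimate card-only power draw by combining two sources quoted above:
# - Xbit Labs' card-only measurement of the 9800 GX2 (180W under load)
# - TechPowerUp's whole-system load figures (GX2 system: 305W)

GX2_CARD_W = 180     # Xbit Labs: 9800 GX2, card only, under load
GX2_SYSTEM_W = 305   # TechPowerUp: whole system with a GX2, under load

def card_estimate(system_w):
    """Apply the system-level delta vs. the GX2 to the card-only baseline."""
    return GX2_CARD_W + (system_w - GX2_SYSTEM_W)

print(card_estimate(293))  # stock GTX 280 system -> 168W, the "around 170W" above
print(card_estimate(318))  # Zotac AMP GTX 280 system -> 193W, near the 200W barrier
```

This assumes the rest of the system draws the same power in both TechPowerUp setups, which is why it is only a rough estimate.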

  22. #22
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    407
    Quote Originally Posted by Anemone
    Nvidia has yield problems on the current version.
    Source? Or are you just talking out of your arse?

    Quote Originally Posted by saaya
    but do they actually make money selling gtx cards? i dont think so...
You're right. Nvidia is a charitable institution. They spent all that money on R&D in order to save the world. They don't want anything in return. Oh, by the way, what is your source for Nvidia's manufacturing costs? And AMD's? I would be very interested to read exactly how much everything costs them. I think what is clear is that AMD is taking a HUGE loss on every 4000 series card that uses GDDR5. Remember how much DDR3 cost when it was first released? Just try to imagine how much GDDR5 would cost you if it were released this week. A gig of that stuff might cost more than all the rest of their manufacturing combined. Even more than labor. GDDR5 just ramped up, and trust me, it probably costs about what a solid gold chip of the same size would cost. I would say AMD is taking about a $3299 loss on every card due to the expensive GDDR5. I guess they are just desperate to get that precious market share. At least AMD is used to working on negative margins. Maybe they can keep it up forever.
    Last edited by gojirasan; 06-30-2008 at 06:38 AM. Reason: quote attributed to wrong person

  23. #23
    Xtreme X.I.P.
    Join Date
    Aug 2004
    Location
    Chile
    Posts
    4,151
    Quote Originally Posted by saaya View Post
i just checked, and unfortunately xbitlabs isn't measuring vga power consumption anymore.
they had really professional results, modding the pciE slot on one board to measure all the power that flows to the card... really a shame...

according to techpowerup, a gtx 280 system is 293W under load (whole-system consumption), compared to 305W for a gx2 system.
the last vga power numbers from xbitlabs measured the gx2 card itself at 180W under load.

so the gtx 280 consumes around 10W less than the gx2, which means around 170W.

tpu also reviewed the gtx 280 amp edition from zotac, which is clocked at 700/1150 compared to 600/1100 for a normal gtx 280 card.
its system consumes 318W under load, almost 15W more than the gx2 system, which puts the card near 195W given the gx2's 180W from xbitlabs.

sooooo, the final conclusion: the oced gtx 280 cards are already scratching the 200W barrier!
i'm really curious if nvidia will really go for an ultra card... tdp-wise it would make more sense to go for a dual card with lower clocked gpus...
The GTX 280's TDP is 236 watts, but this seems to be inflated, as many reviews show a lot less.

  24. #24
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
Quote Originally Posted by gojirasan
Source? Or are you just talking out of your arse?
I think what is clear is that AMD is taking a HUGE loss on every 4000 series card that uses GDDR5.

Maybe you should stick to your own advice?

  25. #25
    Banned
    Join Date
    Jan 2008
    Location
    Canada
    Posts
    707
    Quote Originally Posted by saaya View Post
    so the gtx280 consumes around 10W less than the gx2 which means 170W
I don't think that's correct. The 4870 is 160 watts, with ~900 million transistors at 55nm. The 280 is 1.4 billion transistors at 65nm. It seems very unlikely to me that it could match the load power draw of the 4870.

The only source I could find is Fudzilla, which lists the 280 at 236 watts, but that's not the best source around.

