Page 2 of 6
Results 26 to 50 of 146

Thread: 55nm GT200 (GT200-400) on the way?

  1. #26
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Nvidia's own PR slides list a 236W TDP, but that doesn't mean it uses that much.

    http://www.hardwarecanucks.com/forum...review-20.html

    At least they tell us their methodology

  2. #27
    Banned
    Join Date
    Jan 2008
    Location
    Canada
    Posts
    707
    Quote Originally Posted by metro.cl View Post
    GTX 280 TDP is 236Watts but this seems to be inflated, as many reviews show a lot less.
    I think it's because most games don't stress the GPU completely. The few GTX's that had overheating problems would skyrocket to 105 degrees before shutting down, but only in certain games, and I think Vantage did it as well. So it seems the GPU is often not 100% stressed, maybe due to CPU bottlenecks.

  3. #28
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Well, look at the hardwarecanucks review, because they run the tests by stressing the GPUs with the same benchmark over and over (the only way to eliminate multiple variables).

    I'm against reviews that test power usage in situations where conditions can vary, such as in-game. Different cards react differently to situations in a game, so power draw will vary because the GPUs aren't being equally loaded. The only way to test is to run the same benchmark over and over, so that each GPU is stressed with the same load, and compare the relative power usage.

  4. #29
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    407
    Quote Originally Posted by zerazax
    maybe you should stick to your own advice?
    Whoosh!

  5. #30
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    Quote Originally Posted by Atal View Post
    Rosie O'Donnell in a sauna sweating kind of hot....?
    You did not just say that
    Quote Originally Posted by radaja View Post
    so are they launching BD soon or a comic book?

  6. #31
    Xtreme Mentor
    Join Date
    Aug 2006
    Location
    HD0
    Posts
    2,646
    I'd heard rumors that nVDA developed the chip for both 65nm AND 55nm

    it won't be months, it will be weeks, 9800GTX+ is 55nm supposedly...

  7. #32
    Xtreme Addict
    Join Date
    Jun 2004
    Location
    near Boston, MA, USA
    Posts
    1,955
    Could also check 280 power consumption via SLI vs single-card consumption; if the subtraction is done right, that should leave the consumption of a single card. However, that won't give TDP of course.
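The subtraction idea above can be sketched out in a few lines; a minimal illustration, with made-up wattage figures that are placeholders rather than measurements from any review:

```python
def estimate_card_power(system_single_w, system_sli_w):
    """Approximate one card's draw from two wall-power measurements.

    Assumes the second card adds only its own consumption; in reality
    PSU efficiency curves and extra CPU load skew this, which is why
    it approximates load power but says nothing about TDP.
    """
    return system_sli_w - system_single_w

# Hypothetical wall readings: 320 W with one card, 510 W in SLI.
print(estimate_card_power(320.0, 510.0))  # -> 190.0
```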

  8. #33
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    112
    Quote Originally Posted by xlink View Post
    I'd heard rumors that nVDA developed the chip for both 65nm AND 55nm

    it won't be months, it will be weeks, 9800GTX+ is 55nm supposedly...
    Source? And are you talking about GT200@55nm? Because that's what NVIDIA needs the most.

  9. #34
    Live Long And Overclock
    Join Date
    Sep 2004
    Posts
    14,058
    Quote Originally Posted by gojirasan View Post
    Whoosh!
    Don't you mean...

    Skadoosh!

    :p

    Pekram

  10. #35
    Xtreme Mentor
    Join Date
    Aug 2006
    Location
    HD0
    Posts
    2,646
    Quote Originally Posted by Barys View Post
    Source? And are you talking about GT200@55nm? Because that's what NVIDIA needs the most.
    somewhere on XS or OCN... too lazy to dig it up.

  11. #36
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    Won't the 55nm version be GT200b-400?

    This is essentially an Ultra chip, for stably running higher-clocked cards.
    Quote Originally Posted by radaja View Post
    so are they launching BD soon or a comic book?

  12. #37
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    407
    Quote Originally Posted by Macadamia
    This is essentially an Ultra chip, for stably running higher-clocked cards.
    Do you work for Nvidia or are you psychic? Just curious. I am so tired of people making predictions without citing any kind of source.

  13. #38
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    Quote Originally Posted by gojirasan View Post
    Do you work for Nvidia or are you psychic? Just curious. I am so tired of people making predictions without citing any kind of source.
    9600GSO/8800GS - G92-150
    8800GT- G92-200
    8800GTS- G92-400
    9800GTX- G92-420

    Guess what? I'm sick of hypocritical, raving people who call out others without even looking ANYTHING up. Happy succeeding Sr7 and his fate though.
    Quote Originally Posted by radaja View Post
    so are they launching BD soon or a comic book?

  14. #39
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    407
    Oh. The G92. I thought we were talking about the GTX 280 ultra or GX2 or some other unannounced product that none of us without any inside info have any focking clue about. Obviously, since you haven't even claimed any inside info and haven't revealed your source for what the GTX code names mean I would have to conclude that you are full of it. I wish you idiots would at least mention that what you are saying is nothing but idle speculation.

  15. #40
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955
    Quote Originally Posted by saaya View Post
    the gtx280 is about the same as the 2900xt i think, and thats around 160W, right? i wonder if the ultra might be the first to break the 200W barrier... and then larrabee will break the 250W barrier... insane... just when cpus start to reach reasonable TDPs gpus double the TDP of the top end cpus

    hard to tell, i bet even nvidia doesnt know... even if they can make some cards, it doesnt mean they can actually make enough to make money with them. same as with the current gpus, they can make them, a few per month, sure... but do they actually make money selling gtx cards? i dont think so...
    Lol, Intel will probably end up losing money with Larrabee because they'll have to give everyone a PCI-E 3.0 mobo and a 5 kW PSU with it just to get it to post, and man, once you start gaming....
    Quote Originally Posted by Hans de Vries View Post

    JF-AMD posting: IPC increases!!!!!!! How many times did I tell you!!!

    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    .....}
    until (interrupt by Movieman)


    Regards, Hans

  16. #41
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
    Quote Originally Posted by gojirasan
    Quote Originally Posted by annihilat0r
    Nvidia has yield problems in the current version.
    Source? Or are you just talking out of your arse?
    How can you manage to quote the wrong person for the wrong message? You just click QUOTE. I wasn't the one to say that. It was Anemone who said that.

    And yeah, it is correct. NVidia's GT200-related yield problems were in the rumors even before the launch, we even had figures (a single GT200 gpu costs around $150 to manufacture for NVidia).

  17. #42
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    407
    Quote Originally Posted by annihilat0r
    How can you manage to quote the wrong person for the wrong message? You just click QUOTE. I wasn't the one to say that. It was Anemone who said that.
    Ouch. I'm sorry annihilat0r. I better go edit that. I don't usually click quote to quote because I am usually responding to just a small part of a post.

    Those alleged "yield problems" are not based on anything factual. They're just rumors. Is there any actual hard data on Nvidia's costs? You know like actual evidence or at least a first hand account of it? I mean it's not like there has never been a widespread rumor that was later shown to be false. Although in this case I just don't see how we are ever going to get any hard data. Why would Nvidia publish their yields whether they are worse than expected or better than expected. I see speculation about Nvidia's yields to be fairly pointless. Either they will lower prices or they won't. Still it bothers me when people claim to know something that they couldn't possibly know.

  18. #43
    Xtreme Member
    Join Date
    May 2007
    Posts
    341
    Does NVIDIA really pay the full cost of bad dies? I presume that when a company orders a chip from a fab they don't own, the cost of bad dies is not entirely their fault, and thus not entirely their loss. TSMC is the one that buys the wafers and produces the dies, not NVIDIA. If TSMC accepts the job, then under whatever deal has been made they bear more responsibility for bad dies and faulty products than NVIDIA.

    This is based upon usual business practice as I see it, but anything computer related is seldom normal business and other rules usually apply.

    I think the cost figures mentioned here are totally off for various reasons, but some of the yield numbers may or may not be... This is just confusing to me.

    Just wondering.

  19. #44
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Well, if that were true, then say a company orders a ridiculously sized chip and only pays for the good ones... how would TSMC make any money if yields were terrible?

    I'm sure they have some contract in place for both sides, because after all, both sides are responsible for yields: TSMC for making sure the process is low in defects per area, and Nvidia for making sure the design is feasible for reasonable yields. Most likely what happens is that TSMC sets a price for a wafer and the process (often with a bulk discount, given that Nvidia is buying many of these), probably with contract terms to ensure that defects/cm^2 stay below a certain level and so on, but Nvidia is responsible for making sure their design will produce acceptable yields.

    After all, if you only had to pay for what was good, then a company could create a ridiculous chip where a 300mm wafer produces one good chip that the company would pay for, while TSMC eats the rest of the cost of the wafer. I doubt that's what happens, or else TSMC would be in big financial trouble.
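The yield economics being argued over here can be roughed out with the classic Poisson yield model. The die area, wafer size, and defect density below are purely illustrative assumptions, not actual NVIDIA or TSMC figures:

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Rough gross dies-per-wafer estimate: wafer area over die area,
    minus a standard correction term for partial dies at the edge."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(defect_density_per_cm2, die_area_mm2):
    """Classic Poisson yield model: Y = exp(-D * A).
    Bigger dies are hit exponentially harder by the same defect density."""
    return math.exp(-defect_density_per_cm2 * die_area_mm2 / 100.0)

# Assumed numbers: a ~576 mm^2 die (roughly GT200-class) on a 300 mm
# wafer, with a guessed defect density of 0.4 defects/cm^2.
gross = dies_per_wafer(300, 576)
good = gross * poisson_yield(0.4, 576)
print(gross, round(good))  # roughly 94 gross dies, ~9 good ones
```

This is why a huge die is so sensitive to yield: under these assumed numbers, only about one die in ten survives, so whoever bears the wafer cost cares enormously about that contract.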

  20. #45
    Xtreme Member
    Join Date
    May 2007
    Posts
    341
    Quote Originally Posted by zerazax View Post
    Well, if that were true, then say a company orders a ridiculously sized chip and only pays for the good ones... how would TSMC make any money if yields were terrible?

    I'm sure they have some contract in place for both sides, because after all, both sides are responsible for yields: TSMC for making sure the process is low in defects per area, and Nvidia for making sure the design is feasible for reasonable yields. Most likely what happens is that TSMC sets a price for a wafer and the process (often with a bulk discount, given that Nvidia is buying many of these), probably with contract terms to ensure that defects/cm^2 stay below a certain level and so on, but Nvidia is responsible for making sure their design will produce acceptable yields.

    After all, if you only had to pay for what was good, then a company could create a ridiculous chip where a 300mm wafer produces one good chip that the company would pay for, while TSMC eats the rest of the cost of the wafer. I doubt that's what happens, or else TSMC would be in big financial trouble.
    That makes much sense, and thanks for replying.

    Still, TSMC can refuse the job and aren't forced into anything, and I have always thought there are initial trials before any contract is sealed? Tests are run for both sides, and a contract is made based upon the results.

    I have no clue about all of this, of course

  21. #46
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    Quote Originally Posted by perkam View Post
    Don't you mean...

    Skadoosh!

    :p

    Pekram
    GOTCHA!

    Are we there yet?

  22. #47
    Xtreme Mentor
    Join Date
    Oct 2005
    Location
    Portugal
    Posts
    3,410
    Quote Originally Posted by xlink View Post
    somewhere on XS or OCN... too lazy to dig it up.

    maybe this:



    Seems like GT200 got two versions. One is 65nm and the other is 55nm.

    Probably 65nm is for first wave of GT200 cards and 55nm is for later batches.

    http://we.pcinlife.com/thread-929091-1-1.html



    http://forums.vr-zone.com/showpost.p...3&postcount=33

    regards
    Last edited by mascaras; 06-30-2008 at 07:27 AM.

    [Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
    [Review] ASUS HD4870X2 TOP » Here!! «
    .....[Review] EVGA 750i SLi FTW » Here!! «
    [Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
    [Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «

  23. #48
    Xtreme Member
    Join Date
    May 2005
    Posts
    193
    Wow, if they keep overclocking that heat monster, it'll explode ...

  24. #49
    Xtreme Enthusiast
    Join Date
    Mar 2007
    Location
    Portsmouth, UK
    Posts
    963
    Quote Originally Posted by zerazax View Post
    Well, look at the hardwarecanucks review, because they run the tests by stressing the GPUs with the same benchmark over and over (the only way to eliminate multiple variables).

    I'm against reviews that test power usage in situations where conditions can vary, such as in-game. Different cards react differently to situations in a game, so power draw will vary because the GPUs aren't being equally loaded. The only way to test is to run the same benchmark over and over, so that each GPU is stressed with the same load, and compare the relative power usage.
    I prefer seeing what sort of power the cards will draw while doing things I actually do, not in a lab environment (line conditioners etc). That said, it could consume 500W and if the performance was there I wouldn't care, I'd just slap some phase on it.

  25. #50
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    A great place again
    Posts
    2,589
    Quote Originally Posted by jam2k View Post
    Wow, if they keep overclocking that heat monster, it'll explode ...
    Do you own one?

