Nvidia's own PR slides list a 236W TDP, but that doesn't mean it actually draws that much.
http://www.hardwarecanucks.com/forum...review-20.html
At least they tell us their methodology.
I think it's because most games don't stress the GPU completely. The few GTX cards that had overheating problems would skyrocket to 105 degrees before shutting down, but only in certain games, and I think Vantage did it as well. So it seems the GPU is often not 100% stressed, maybe due to CPU bottlenecks.
Well, look at the hardwarecanucks review, because they run the tests by stressing the GPUs with the same benchmark over and over (the only way to eliminate multiple variables).
I'm against reviews that test power usage in situations that can vary, such as in-game. Different cards react differently to what happens in a game, so the power load will vary because the GPUs aren't being equally loaded. The only way to test is to run the same benchmark over and over, so each GPU is stressed with the same load, and then compare the relative power usage.
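Something like this rough sketch is what I mean -- the benchmark command and the watt-meter readout function are made up, purely to show the repeated-run idea, not anything from the review:

[code]
import statistics
import subprocess

def measure_card(benchmark_cmd, read_wall_watts, runs=5):
    """Run the same benchmark repeatedly and average the wall-power readings,
    so every card is compared under an identical, repeatable load."""
    readings = []
    for _ in range(runs):
        subprocess.run(benchmark_cmd, check=True)   # identical workload every pass
        readings.append(read_wall_watts())          # hypothetical watt-meter readout
    return statistics.mean(readings), statistics.stdev(readings)
[/code]

Same load, every run, every card -- then the relative numbers actually mean something.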
Whoosh!
Originally Posted by zerazax
I'd heard rumors that Nvidia developed the chip for both 65nm AND 55nm.
It won't be months, it will be weeks; the 9800GTX+ is supposedly 55nm...
You could also check GTX 280 power consumption in SLI vs. a single card; if done right, subtracting the two should leave the consumption of a single card. However, that won't give the TDP, of course.
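Back-of-the-envelope version of that subtraction, with made-up wall readings and an assumed PSU efficiency just to show the idea:

[code]
# Illustrative numbers only -- not measured values.
single_card_system = 380.0   # wall power (W) with one GTX 280 under load
sli_system = 610.0           # wall power (W) with two GTX 280s under the same load
psu_efficiency = 0.82        # assumed PSU efficiency at this load

# The second card's draw at the wall, then corrected for PSU losses.
extra_at_wall = sli_system - single_card_system
card_dc_draw = extra_at_wall * psu_efficiency
print(f"~{extra_at_wall:.0f} W at the wall, ~{card_dc_draw:.0f} W DC for the second card")
[/code]

And like you say, that's actual draw under one workload, not the TDP.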
Do you work for Nvidia or are you psychic? Just curious. I am so tired of people making predictions without citing any kind of source.
Originally Posted by Macadamia
Oh. The G92. I thought we were talking about the GTX 280 Ultra or GX2 or some other unannounced product that none of us without any inside info have any focking clue about. Obviously, since you haven't even claimed any inside info and haven't revealed your source for what the GTX code names mean, I would have to conclude that you are full of it. I wish you idiots would at least mention that what you are saying is nothing but idle speculation.
How can you manage to quote the wrong person for the wrong message? You just click QUOTE. I wasn't the one who said that. It was Anemone who said that.
Originally Posted by gojirasan
And yeah, it is correct. NVidia's GT200-related yield problems were in the rumors even before the launch; we even had figures (a single GT200 GPU costs around $150 for NVidia to manufacture).
Ouch. I'm sorry, annihilat0r. I'd better go edit that. I don't usually click quote to quote because I am usually responding to just a small part of a post.
Originally Posted by annihilat0r
Those alleged "yield problems" are not based on anything factual. They're just rumors. Is there any actual hard data on Nvidia's costs? You know, like actual evidence, or at least a first-hand account of it? It's not like there has never been a widespread rumor that was later shown to be false. Although in this case I just don't see how we are ever going to get any hard data. Why would Nvidia publish their yields, whether they are worse than expected or better than expected? I see speculation about Nvidia's yields as fairly pointless. Either they will lower prices or they won't. Still, it bothers me when people claim to know something that they couldn't possibly know.
Does NVIDIA really pay the full cost of bad dies? I presume that when a company orders a chip from a fab they don't own, the cost of bad dies isn't entirely their fault and thus not entirely their loss. TSMC is the one that buys the wafers and produces the dies, not NVIDIA. If TSMC accepts the job under whatever deal has been made, they bear more responsibility for bad dies and faulty products than NVIDIA does.
This is based on usual business practice as I see it, but anything computer-related is seldom normal business, and other rules usually apply.
I think the cost figures mentioned here are totally off for various reasons, but some of the yield numbers may or may not be... This is just confusing to me.
Just wondering.
Well, if that were true, then say a company orders a ridiculously sized chip and only pays for the good ones... how would TSMC make any money if yields were terrible?
I'm sure there's a contract covering both sides, because after all, both sides are responsible for yields: TSMC for keeping defects per area low in the process, and Nvidia for making sure the design is feasible for reasonable yields. Most likely TSMC sets a price for a wafer and the process (often with a bulk discount, given that Nvidia is buying many of them), probably with contract terms guaranteeing that defects/cm^2 stay below a certain level and so on, but Nvidia is responsible for making sure their design will produce acceptable yields.
After all, if you only had to pay for what was good, then a company could create a ridiculous chip where a 300mm wafer produces one good chip that the company pays for, while TSMC eats the rest of the cost of the wafer. I doubt that's what happens, or else TSMC would be in big financial trouble.
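Rough math to show why die size dominates this, using a simple Poisson yield model -- the 576 mm^2 die size is the commonly reported GT200 figure, but the wafer cost and defect density here are pure guesses:

[code]
import math

# All inputs except die size are assumptions for illustration, not confirmed figures.
wafer_cost = 5000.0        # assumed price of a 300 mm wafer, in USD
wafer_diameter_mm = 300.0
die_area_mm2 = 576.0       # GT200 die size as widely reported
defect_density = 0.2       # assumed defects per cm^2

wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
gross_dies = int(wafer_area / die_area_mm2)                   # ignores edge loss, so optimistic
yield_rate = math.exp(-defect_density * die_area_mm2 / 100)   # simple Poisson yield model
good_dies = int(gross_dies * yield_rate)

print(f"gross dies: {gross_dies}, yield: {yield_rate:.0%}, good dies: {good_dies}")
print(f"cost per good die: ${wafer_cost / good_dies:.0f}")
[/code]

Nudge the defect density up or down and the cost per good die swings wildly, which is exactly why the contract has to spell out who eats the bad dies.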
That makes a lot of sense, thanks for replying.
Still, TSMC can refuse the job and isn't forced into anything, and I've always thought there are initial trials before any contract is sealed? Tests are made for both sides, and a contract is drawn up based on the results of those.
I have no clue about all of this, of course!
Maybe this:
It seems the GT200 has two versions: one is 65nm and the other is 55nm.
Probably 65nm is for the first wave of GT200 cards and 55nm is for later batches.
http://we.pcinlife.com/thread-929091-1-1.html
http://forums.vr-zone.com/showpost.p...3&postcount=33
regards
Wow, if they keep overclocking that heat monster, it'll explode...