No, it doesn't make sense. You draw no patterns or anything of substance; you just state when certain companies had failures.
And I wouldn't call HD2900XT a train wreck...
The 2900 XT was a good 3DMark card in its day.
I agree, it wasn't that bad!
Sorry guys - this is all a little ridiculous to me.
First and foremost: yields are always kept on the down low (even in foundry situations). Information like this doesn't just float out of nowhere.
Secondly: this isn't so much NVIDIA's problem as it is TSMC's problem. If NVIDIA's chips follow TSMC's provided design rules, then it is up to TSMC to deliver whatever minimum yields they guarantee. Obviously things are a little more complex than that (since chip designs do influence yields in a measurable way), but you have to remember that NVIDIA outsources 100% of its chip manufacturing. It isn't like Intel or (formerly) AMD/ATI, who are directly tied to their own yields.
Yes, this may hurt NVIDIA if they can't push out their flagship product in a timely manner, but the effect is going to be less harsh than you may expect. And I can assure you TSMC will do whatever it takes to appease NVIDIA, as NVIDIA is their largest client: if they have to meet NVIDIA's supply quotas despite low yield, they'll be happy to take a loss as necessary (especially given that the top-end chips will be low-volume products; smaller chips invariably have higher yield, given that yield is related to defects per unit area).
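To put rough numbers on that last point, here is a minimal sketch of the standard Poisson defect-density yield model, Y = exp(-D*A). The defect density and die areas below are made-up illustrative values, not actual TSMC figures.
Code:
import math

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Classic Poisson yield model: Y = exp(-D * A)."""
    area_cm2 = die_area_mm2 / 100.0  # 1 cm^2 = 100 mm^2
    return math.exp(-defects_per_cm2 * area_cm2)

# Illustrative numbers only -- not real process data.
defect_density = 0.5  # defects per cm^2, assumed
for area in (100, 250, 500):  # small, mid-range, and flagship-sized dies (mm^2)
    print(f"{area:>4} mm^2 die -> ~{poisson_yield(area, defect_density):.0%} yield")
With the assumed 0.5 defects/cm^2 this gives roughly 61%, 29%, and 8% yield; the exponential dependence on area is why big flagship dies get hit so much harder than small ones.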
In short.. move along folks.
Just one thing...
Who revealed the most "crucial" part of G200's architecture first (well, actually it was the only correct pic until the launch :p:), along with several other pieces of 100% accurate info?... hmm
Anyway, I don't do c0cks...
Nonetheless, why should I expose my sources, or take the risk of posting NDA'd (or even non-NDA'd) stuff?
Until Nvidia shows their chips or gives some proof, they can't really do anything to stop us trusting these rumours.
I believe the 2% figure is not realistic (hey, it's Charlie, after all), but yields are still supposedly very low. Otherwise, since the competition has already revealed its cards, why not give your own fans something for their confidence?
Yeah, but you don't go to TSMC, hand them the design, and say "I want to buy 100 fully functional chips with this design, please"... you buy wafers, and TSMC will try their best at getting high yields, but there's no guarantee: you get the wafers, not working chips. So it's not TSMC's problem... if GT300 can't be made at commercially viable yields, Nvidia will suffer, not TSMC. TSMC doesn't need GT300; Nvidia does...
Below a certain chip size, going even smaller doesn't result in notably higher yield... there's a break point in chip size where a wafer goes from trash to usable. Getting the chip size right is a tricky game, because you have to design the chip and set its size a long time before the process is available for test runs, or at least reliable ones. The smart way is to implement plenty of fuses and some redundancy so you don't have to hit the sweet spot in chip size exactly: you aim to just barely clear an acceptable yield rate, then wait for yields to improve to get your ship out of bad waters. That's what Nvidia traditionally did. Knowing that TSMC's 40nm was supposed to be fixed but still has yield issues, I wouldn't be surprised if GT300 is above the acceptable-yield chip size, at least for now.
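The fuses-and-redundancy idea is easy to put numbers on. Below is a toy Monte Carlo sketch, assuming a die built from identical shader clusters where defective clusters can be fused off for a cut-down SKU; the cluster count, areas, and defect density are all hypothetical, not real GT300 figures.
Code:
import math, random

def salvage_yield(n_blocks, block_area_cm2, d0, max_bad, trials=100_000):
    """Fraction of dies sellable when a chip built from n_blocks identical
    blocks may ship as a cut-down SKU with up to max_bad defective blocks
    fused off. Assumes independent Poisson defects with density d0."""
    p_block_good = math.exp(-d0 * block_area_cm2)
    sellable = 0
    for _ in range(trials):
        bad = sum(random.random() > p_block_good for _ in range(n_blocks))
        if bad <= max_bad:
            sellable += 1
    return sellable / trials

# Hypothetical numbers: 10 clusters of 0.4 cm^2 each, 0.5 defects/cm^2.
full = salvage_yield(10, 0.4, 0.5, max_bad=0)
cut = salvage_yield(10, 0.4, 0.5, max_bad=2)
print(f"fully enabled dies: ~{full:.0%}, sellable with up to 2 fused off: ~{cut:.0%}")
Under these assumptions only about 13% of dies come out fully enabled, but roughly 73% are sellable once two bad clusters can be fused off, which is exactly why big dies lean so heavily on salvage parts.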
I'm not convinced yields are that bad... but if they are, it's a problem for Nvidia. Still, not a major one: Nvidia could survive on GT200 for another year without losing too much money, I think. Q3-Q4 2010 is where it gets critical... if they don't have a really nice product out by then, it could break their neck. But that's plenty of time, and I'm sure they have more planned than just GT300. By then, TSMC and GF should have 28nm done, so even if Nvidia never gets proper 40nm parts out, they still have a second chance at 28nm...
Means that GT200 wouldn't be too big? :) Name me 5 chips bigger than GT200. I will name you 500 chips smaller than GT200 in return.
Source?
Quote:
G300 is smaller than GT200 but larger than GT200b.
Intel's fabs are somewhat ahead of TSMC's, hence they can produce bigger chips somewhat more easily, and hence a direct comparison is somewhat flawed. Though it holds some value. :)
Quote:
Larrabee is supposed to be bigger than GT200.
IF this is causing problems for Nvidia, I am sure they have a backup plan. Get GT300 working, slash 200-series prices, do some PR stuff, rename, driver tricks, rename some more, more PR stuff. It is amazing how much Nvidia can do (and has done, as history suggests) without having new chips to show.
:exclaim:
The following is the most sensible text about this matter (posted earlier in this thread, from Anandtech):
Quote:
Can it be that bad? Sure, it can always be zero.
Let's just assume ALL of Charlie's numbers and sources are 100% correct... it's four wafers.
Getting low yields on four wafers is not exactly uncommon. And it especially comes as no surprise for a hot lot as typically hot lots have nearly all the inline inspection metrology steps skipped in order to reduce the cycle-time all the more.
Those inline inspections are present in the flow for standard priority wip for a reason, related to both yield (reworks and cleanups) as well as cost reduction (eliminate known dead wip earlier in the flow).
I really pity anyone who is wasting their time attempting to extrapolate the future health of an entire product lineup from tentative results on four hot-lotted wafers. That's not a put-down of anyone who is actually doing just that, including Charlie; it's an honest, empathetic response I have for them, because they really are chasing after something with error bars so wide they can't see the ends of the whiskers from where they stand at the moment.
Now if we were talking about results averaged from, say, 6-8 lots and a minimum of 100-200 wafers run through the fab at standard priority (i.e. with all the standard yield enhancement options in play), then I'd be more inclined to start divining something from the remnants of the tea leaves here.
But just four wafers? Much ado about nothing at the moment, even IF all of the claimed details themselves are true.
This would have been far more interesting had the yields on those four wafers come back at 60% or 80%. Again, not that such yield numbers could be used to say anything about the average or the stdev of the yield distribution, but they would speak to process capability, and where there is proven capability there is an established pathway to moving the mean of the distribution into that yield territory.
But getting zero, or near-zero, yield is the so-called trivial result, it says almost nothing about process yield to get four wafers at zero yield. All it takes is one poorly performing machine during one process step and you get four wafers with yield killing particles spewed on them.
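The error-bars point in that quote is easy to make concrete. Here is a minimal sketch of a Student-t confidence interval on mean yield from just four wafers; the per-wafer yields are hypothetical numbers consistent with the ~2% rumour, not real data.
Code:
import math, statistics

def mean_and_ci(samples, t_crit):
    """Mean per-wafer yield with a rough two-sided 95% confidence interval.
    t_crit is the Student-t critical value for len(samples)-1 degrees of
    freedom, hard-coded below to avoid pulling in scipy."""
    n = len(samples)
    mean = statistics.mean(samples)
    half = t_crit * statistics.stdev(samples) / math.sqrt(n)
    return mean, max(mean - half, 0.0), mean + half

# Hypothetical per-wafer yields for four hot-lot wafers. t(3, 95%) ~= 3.18.
mean, lo, hi = mean_and_ci([0.00, 0.02, 0.01, 0.05], 3.18)
print(f"4 wafers: mean {mean:.1%}, 95% CI roughly [{lo:.1%}, {hi:.1%}]")
With these numbers the interval runs from 0% to about 5.4%, nearly triple the point estimate; with n = 4 you simply cannot tell a broken process from ordinary hot-lot scatter.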
Sounds the most plausible, leaving any conjecture moot while they dial in the litho for the die, refine their angles, etc.
But will Nvidia have a $299 DX11 part...? That is the question. If not, they lose!
Extreme high-end cards are awesome, but only a few people ever buy those.
Xmas is coming; if people are going to upgrade to a new system, they will do it for Xmas and Windows 7, and Microsoft is going to make sure the whole world knows about it come October 22nd.
So, after millions of Xmas shoppers buy new computers with ATi DX11 cards... who's left waiting for Nvidia re-branders and $499 GT300s?
The 100k people who waited for Nvidia DX11 cards to go on sale in January aren't enough to bring in profits for Nvidia!
Sigh, more of Charlie's faerie tales...
Can someone close Fudzilla?
http://www.fudzilla.com/content/view/15535/1/
NV is already about three months behind; if it's really that bad, they'd better scrap GT300, lower the 2xx model prices, and develop something new from the ground up...
If you believe Fudzilla, GT300 is already built "from the ground up" ;)
http://www.fudzilla.com/content/view/15535/1/
Quote:
We can only confirm that GT300 is not a GT200 in 40nm with DirectX 11 support. It’s a brand new chip that was designed almost entirely from the ground up. Industry sources believe that this is the biggest change since G80 was launched and that you can expect such level of innovation and change.
Makes me wonder if it's really a completely new chip. Or: what does it take for a chip to be "completely new"? If you think about the steps from G80 -> GT200 -> GT300, all have been completely new chips (if the rumour @ Fud holds water), whereas ATI/AMD focused on improving their older chips and adding new features, etc. (or does that count as a "completely new chip" as well? :rolleyes:).
The reason I'm thinking about this is that I'm wondering why Nvidia stopped improving existing chips (like it did from, e.g., 6800 to 7800) and aims for completely new ones instead. That has to be way more expensive... or maybe Nvidia wasn't able to get more out of their current chips? I don't know...
You mean you're not sure? It's pretty obvious, isn't it?
On Nvidia's recent roadmaps GT300 is a Q1 part, so there is no way they will have any DX11 part for Christmas, i.e. in late November, which is when shops and distributors stock up for Christmas.
Even if it were out by then, it certainly wouldn't be $299...
But that doesn't mean Nvidia loses... they lose a lot of potential sales, but is it really that much? What games do you need a card faster than a 285 for these days? Unless you're on a 30" monitor, that number can be counted on one hand, especially if you only count the demanding games any one person would actually want to play; nobody is going to play all of them...
Nobody needs or can make any use of DX11 for now and for the next couple of months, and even then it'll be more like a patched-on tech demo than a really notable difference...
What Nvidia really needs is a cheaper GT200; they don't NEED DX11, and they don't need a monster-performance GT300 chip... I hope Nvidia realizes this as well and doesn't put all their effort into GT300 :D
GT300 will most likely do better than a 5870; hell, the GT300 part isn't supposed to be priced around the 5870's cost anyway, it will be higher, so it would most likely perform better too.
Holy crap, that's bad news for all of us.