So you think that when nVidia and ATi contract with TSMC to manufacture the GPU, TSMC charges them per wafer? I doubt it. Yields affect TSMC's profits.
I believe I heard that the BOM of a GTX card is around US$220-240 (it was mentioned on the B3D forum), and those numbers were somewhat confirmed privately by a representative of one of nVidia's AIBs on my local forum (where I'm a mod). So yes, IMO the HD 4800 series has a HUGE cost advantage over the GTX series; my hunch is that only the R700 would have a BOM comparable to a GT200 series card (perhaps up to 10% more expensive).
"What about cost. This stuff is going to cost a fortune, right? Well, yes and no. High-speed GDDR3 and GDDR4 memory is certainly expensive. We're told to expect GDDR5 to initially cost something like 10–20% more than the really high speed GDDR3. Of course, you don't buy memories, you buy a graphics card. Though the memory will cost more, it will be offset somewhat in other places on the product you buy. You don't need as wide a memory interface which means a smaller chip with fewer pins. The board doesn't need to contain as many layers to support wider memory busses, and the trace wire design can be a bit more simple and straightforward, reducing board design costs. As production ramps up, GDDR5 could be as cost effective overall as GDDR3. It will only be appropriate for relatively high-end cards at first, but should be affordable enough for the $80–150 card segment over the next couple years."
Source: http://www.extremetech.com/article2/...2309888,00.asp
Guys, you are just pulling numbers out of your arses. Show me a scanned document or an official press release and I will believe it. You really expect me to believe rumors on an internet forum about Nvidia's or AMD's manufacturing costs? I can't believe you guys are for real. Even if the Indonesian guy is right about the cost of a GTX card (and I have absolutely no reason to believe him), AMD's costs are still a complete mystery.

The bottom line is that cost speculation is irrelevant. Even MSRPs aren't all that important. What matters is actual street prices; that is what determines the bang for buck. I am at least basing my GDDR5 costs on what DDR3 cost versus DDR2 upon release. IIRC, DDR3 was 2-4 times more expensive than DDR2. You really think the memory manufacturers are only going to charge a 10% premium over the old-tech memory? That's just ridiculous. However, none of us have any hard numbers. If someone does, then fine, I will believe it.

I realize that some of you probably think I am some kind of Nvidia fanboy, but I am not. I have no horse in this race. My next card is almost certainly going to be an HD4870X2. I just think many of you are being unrealistic and over-hyping things in AMD's favor. The assumption is that Nvidia's card costs significantly more to manufacture, but that assumption is based on very little. And "significantly more" could mean $120 per card instead of $70 per card. It may only cost an extra $10-20 per die, and it seems reasonable to assume that the GDDR3 is going to be significantly cheaper. There is a very limited supply of GDDR5 and demand for it is very high. If it were the same price and widely available, don't you think Nvidia would be using it too?

As for the GTX280 being so expensive, it is not any more expensive than most of Nvidia's high-end releases in the past few years. Actually, it's about the same street price that the 8800GTX was selling for, and now you can buy those cards for just over $200. So yes, I think these companies make a large profit on their new cards. They are in business to make money, so they charge as much as they think they can. And they need to recoup their R&D costs for a new arch.
It wouldn't surprise me at all if:
1) the memory on both the AMD and Nvidia cards cost more than the GPU.
2) both the HD4870 and the GTX280 cost under $100 to manufacture.
3) recouping the large R&D costs is more relevant to the prices than the actual manufacturing costs.
So which company spends more on R&D may be more relevant to the MSRPs of the high end cards than the actual cost of manufacturing the card itself. This is the point. I don't think anyone posting here really knows why each company prices a particular card a certain way. All we have are guesses and assumptions. I am willing to bet that Nvidia could sell the GTX280 for $189 and still make a profit, but probably not enough to recoup their research costs.
Yes, but most of the blurb is about justifying the retail price!
AFAIK TSMC does charge them per wafer. They charge for the initial manufacturing setup (I'm not sure of the exact term anymore; it was hard for my little brain to remember all that) and after that it's just per wafer. I believe I read that in some interview with a CEO or such, not sure where though.
Bro...
You're just one person! If you choose not to acknowledge or accept the information, then don't. There are many smart people here, and some of them are connected.
Nobody here has to demonstrate to you where they got their specific numbers, or carry a burden of proof, just so you can believe them. It only takes remedial mathematics to get a close enough figure to understand the underlying discussion.
e.g.:
$5000 per 300mm wafer = 94 dies
100% yield = ~$54 per die
50% yield = ~$108 per die
40% yield = $?? (you try it)
These are just rough figures, but they easily demonstrate how far off you are in your thinking. I have been here for a very short time and have read a lot of your posts... you really don't have a firm grasp on the business side of things.
Yields come down to "logic". We might be off by a few % or $, but nobody is blatantly abusing these figures. You need to stop being such a naysayer and just learn.
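If anyone wants to play with the numbers themselves, here's a minimal sketch of that arithmetic in Python. The $5000 wafer price and 94 candidate dies per wafer are just the figures quoted above, and the yield values are placeholders; none of it is a confirmed TSMC or Nvidia number.
[CODE]
# Rough cost-per-good-die arithmetic. All inputs are guesses:
# the $5000/wafer price and 94 candidate dies are the forum figures
# quoted above, and nobody outside TSMC/Nvidia knows the real yields.

WAFER_COST = 5000.0      # assumed price of one 300mm wafer, USD
DIES_PER_WAFER = 94      # assumed GT200-sized candidates per wafer

def cost_per_good_die(yield_fraction):
    """Wafer price divided by the number of dies that actually work."""
    good_dies = DIES_PER_WAFER * yield_fraction
    return WAFER_COST / good_dies

for y in (1.00, 0.50, 0.40):
    print(f"{y:.0%} yield -> ${cost_per_good_die(y):,.0f} per good die")
# Prints roughly $53, $106 and $133 -- in line with the $54/$108
# figures above, give or take rounding.
[/CODE]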
Yes, Nvidia might have a lot of sway, but if TSMC were going to lose money, even they wouldn't take Nvidia's business. Remember, both sides are businesses, so even if Nvidia carries a lot of weight, TSMC wouldn't accept a contract that guaranteed a loss.
TSMC is considered one of the best in the world for volume production outside of Intel (probably the clear leader) and AMD (still good, believe it or not). If they won't accept a company's terms on the grounds that they (TSMC) would lose money, I doubt many other foundries out there could accept that same contract and make money.
Originally Posted by Xoulz: "You need to stop being such a naysayer and just learn."
^he's vewy disappointed with gt200 yields
where's my profits?
"65nm gt200 cancelled" - no that's not right
...next...
"...which just goes to show what great value gt200 is"
engage 'hype protocols'; reboot.
No one here has to do anything. But any logical, rational, non-AMD fanboy is going to want to see some evidence for these assumptions. There is none. Nothing. Zero. There is no evidence whatsoever of how much any of these cards cost. Even with your assumption of $5000 per wafer, you are left guessing the yields. We don't know them. You don't know them. So it's a pointless exercise. This kind of thing reminds me of the Drake equation: when you don't know any of the variables, you cannot solve for anything.

Originally Posted by Xoulz: "It only takes remedial mathematics to get a close enough figure to understand the underlying discussion."
Well, I have a degree in EE, so I think I should be able to manage the math.

Originally Posted by Xoulz: "These are just rough figures, but they easily demonstrate how far off you are in your thinking."
You have demonstrated nothing with those figures. You don't know the yields. You don't know the cost of the wafer. You don't have any numbers at all.

Originally Posted by Xoulz: "We might be off by a few % or $, but nobody is blatantly abusing these figures."
What figures? I don't see any. All I see are wild arse guesses.
I think everybody is entitled to their own opinion; no need to call out any arses, unless it's Beyoncé's.
Some of the arses here are more beautiful than Beyoncé's.
When I'm being paid, I always see my job through.
You're incredible...
It's a KNOWN FACT that TSMC charges about $5,000 per wafer for their 300mm process. Do you even read other websites or tech blogs? Secondly, there is a whole science dedicated to determining and improving yields.
This has been mentioned over and over, not just in this particular thread but in many others here. We also know that the law of averages and simple logic dictate that a bigger die = more to go wrong = lower yields.
There are exact formulas that give a very detailed, realistic expectation of such things. Just because you cannot grasp these concepts, or are incapable of doing the math, doesn't give you the right to constantly thread-crap and become an incessant naysayer.
You simply refuse to accept anything because you want proof, but you yourself haven't even bothered to investigate. The simple figures in my last posts are KNOWN! Anyway... it doesn't matter if TSMC charges $5K per wafer or $3.5K; if you play with the numbers, it's easy to see you cannot add or even reason. You're simply trolling now.
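Just to show what one of those yield formulas looks like: the simplest textbook model is the Poisson defect-density model, where yield = exp(-die area × defect density). Here is a minimal sketch, assuming a defect density made up purely for illustration; TSMC's real numbers for this process are not public, so treat the output as the shape of the curve, not a prediction.
[CODE]
import math

# Simple Poisson defect-density yield model: Y = exp(-A * D0).
# D0 (defects per cm^2) is an illustrative guess, NOT a published TSMC
# figure; the die areas are the GT200/RV770 sizes discussed in this thread.

D0 = 0.3  # assumed defects per cm^2 (pure guess for illustration)

def poisson_yield(die_area_mm2, defect_density_per_cm2=D0):
    """Fraction of dies expected to come out defect-free."""
    area_cm2 = die_area_mm2 / 100.0
    return math.exp(-area_cm2 * defect_density_per_cm2)

for name, area_mm2 in (("GT200 (~576 mm^2)", 576), ("RV770 (~275 mm^2)", 275)):
    print(f"{name}: ~{poisson_yield(area_mm2):.0%} defect-free")

# The takeaway is only the shape: yield falls off exponentially with die
# area, so a die twice as large loses disproportionately more parts to
# the same defect density.
[/CODE]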
I would have thought your 'degree in EE' would have taught you some common sense. Certainly you can request 'proof' (which you'll never get, so maybe you're just stalling your inevitable admission of nvidia-hardonism), but I can't believe you need 'proof' just to accept some well-grounded predictions, which are not the 'guesses' you make them out to be.
It's quite sad to see so many people calling themselves engineers these days. The quality of university education has fallen so far.
Also, the only 'demand' for GDDR5 right now is ATI's 4870, and those aren't all fabbed in one day. You should also factor in volume and 'partnership' discounts; unfortunately for your argument, AMD gets the chips much cheaper than a retail buyer ever would.
You AMD fanboys are the ones who don't listen to reason. AMD, right or wrong! I checked that link and there was no mention of how much Nvidia or AMD are paying for their TSMC wafers. It was also a "known fact" that the R600 was going to blow away the competition... until it was released. Then all the AMD fanboys were proven wrong. I have yet to see any facts here. If they are so well known, then why is it so difficult to cite your sources? And "well-grounded predictions"? That is totally idiotic. There's nothing "well-grounded" about your wild guesses. Nothing. I think you are morons for making the assumptions that you do. You may not lack "common sense", but you certainly lack critical reasoning skills.
Take it easy, guys.
@gojirasan
Attack the message but not the person who writes it, and if you do attack the message, make sure it's clear whose message(s) you are attacking. Being offensive to individuals is not a good idea in any topic; being offensive towards 'groups' (as you perceive them) is worse than that.
I do hope you can see further than just my avatar; it is basically a joke from 4-5 years ago and now I only use it because people recognize me by it. My sig and posts tell something about who I am (note: just a little part), not my avatar.
I completely fail to see your point that "the manufacturing costs of the 4870 and GTX 280 should be very close to each other."
You don't need exact numbers to prove this wrong. Here is a very nice way of doing so:
1. The cost of a graphics card is more or less dependent on its GPU cost. GT200 is 576mm². RV770 is 275mm². Without factoring in yields (or manufacturing defects), which will obviously be lower on the GT200, a 576mm² die is more than two times more expensive than a 275mm² chip.
2. The other major cost of a graphics card (that differs between these two cards) is the memory and the memory bus. A 512-bit bus PCB is more expensive than a 256-bit one, and a GDDR5 chip is more expensive than a GDDR3 one. However, the 4870 has only 512MB of GDDR5 whereas the GTX 280 has 1024MB of GDDR3.
So there you have it! The only thing in an RV770 card that costs more (per unit) than in a GT200 card is its GDDR5 RAM. For your pointless assumptions to be correct, a GDDR5 chip would have to be so much more expensive than a GDDR3 chip that it offsets two times the RAM size, two times the memory bus width, and two times the GPU cost. Is a GDDR5 chip really around 2,000% (that's two thousand percent) more expensive than a GDDR3 chip? I would think it is more like 20%. Are you serious?
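If you want to sanity-check the die-size half of this argument yourself, here is a rough sketch using the standard dies-per-wafer approximation. The 576mm²/275mm² areas are the figures above, and the $5000 wafer price is the unconfirmed number from earlier in the thread; yield is deliberately left out, which only understates the gap.
[CODE]
import math

# Standard dies-per-wafer approximation:
#   DPW ~= pi*(d/2)^2 / A  -  pi*d / sqrt(2*A)
# (wafer area over die area, minus an edge-loss correction).
# Die areas are the 576/275 mm^2 figures quoted above; the $5000
# wafer price is the unconfirmed number from earlier in the thread.

WAFER_DIAMETER_MM = 300.0
WAFER_COST = 5000.0  # assumed, not a confirmed TSMC price

def dies_per_wafer(die_area_mm2, wafer_d_mm=WAFER_DIAMETER_MM):
    return int(math.pi * (wafer_d_mm / 2) ** 2 / die_area_mm2
               - math.pi * wafer_d_mm / math.sqrt(2 * die_area_mm2))

gt200 = dies_per_wafer(576)   # ~94 candidates per wafer
rv770 = dies_per_wafer(275)   # ~216 candidates per wafer

print(f"GT200: {gt200} dies -> ${WAFER_COST / gt200:.0f} each before yield")
print(f"RV770: {rv770} dies -> ${WAFER_COST / rv770:.0f} each before yield")
# Roughly a 2.3x raw silicon cost gap before defects are even counted;
# factoring in yield only widens it in RV770's favour.
[/CODE]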
I couldn't care less about the R600; I barely knew anything about the computer hardware world until I started seriously browsing, maybe around Sept 2007.
Now you're sounding more and more like someone with a pocket degree. It's almost pathetic to see you rambling on like this, pointing fingers without even raising a comprehensible rebuttal.