
Thread: 55nm GT200 (GT200-400) on the way?

  1. #76
    Xtreme Addict
    Join Date
    May 2008
    Posts
    1,192
    So you think that when nVidia and ATi contract with TSMC to manufacture the GPU, TSMC charges them per wafer? I doubt it. Yields affect TSMC's profits.
    Quote Originally Posted by alacheesu View Post
    If you were consistently able to put two pieces of lego together when you were a kid, you should have no trouble replacing the pump top.

  2. #77
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
    Quote Originally Posted by xlimit View Post
    It's all about yields. At 40% yield on 65nm, each chip will cost around USD 110; if NV can move to 55nm with yields of at least 60%, each chip will cost around USD 70, much closer to the ideal USD 50 that every high-end chip should cost. I think the GTX280 chip costs about USD 175, and 55nm could bring that down to USD 135, leaving more room for lower prices.
    Heard it was around $150.
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"

  3. #78
    Xtreme Enthusiast
    Join Date
    Sep 2007
    Location
    Jakarta, Indonesia
    Posts
    924
    I believe I heard the BOM of a GTX card is around US$220-240 (it was mentioned on the B3D forum), and those numbers were somewhat confirmed privately by a representative of one of nVidia's AIBs on my local forum (where I'm a mod). So yes, IMO, the HD 4800 series has a HUGE cost advantage over the GTX series; my hunch tells me only the R700 would have a BOM similar to a GT200-series card (perhaps up to 10% more expensive).

  4. #79
    Xtreme Member
    Join Date
    Aug 2004
    Location
    rutgers
    Posts
    465

    Re: GDDR5 vs GDDR3

    "What about cost. This stuff is going to cost a fortune, right? Well, yes and no. High-speed GDDR3 and GDDR4 memory is certainly expensive. We're told to expect GDDR5 to initially cost something like 10–20% more than the really high speed GDDR3. Of course, you don't buy memories, you buy a graphics card. Though the memory will cost more, it will be offset somewhat in other places on the product you buy. You don't need as wide a memory interface which means a smaller chip with fewer pins. The board doesn't need to contain as many layers to support wider memory busses, and the trace wire design can be a bit more simple and straightforward, reducing board design costs. As production ramps up, GDDR5 could be as cost effective overall as GDDR3. It will only be appropriate for relatively high-end cards at first, but should be affordable enough for the $80–150 card segment over the next couple years."

    Source: http://www.extremetech.com/article2/...2309888,00.asp
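    To put rough numbers on that bus-width tradeoff, here is a back-of-the-envelope sketch; the per-pin data rates are illustrative guesses for early GDDR5 versus high-speed GDDR3, not figures from the article:

    # Back-of-the-envelope: why a narrow bus of fast memory can roughly match
    # a wide bus of slower memory. Data rates below are illustrative guesses.

    def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
        """Peak memory bandwidth in GB/s: bytes per transfer times per-pin rate."""
        return bus_width_bits / 8.0 * data_rate_gbps

    # A 512-bit bus of ~2.2 Gbps GDDR3 and a 256-bit bus of ~3.6 Gbps GDDR5
    # land in the same ballpark, with half the pins and a simpler PCB.
    print(bandwidth_gb_s(512, 2.2))  # ~140.8 GB/s
    print(bandwidth_gb_s(256, 3.6))  # ~115.2 GB/s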

  5. #80
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    407
    Guys. You are just pulling numbers out of your arses. Show me a scanned document or an official press release and I will believe it. You really expect me to believe rumors on an internet forum about Nvidia's or AMD's manufacturing costs? I can't believe you guys are for real. Even if the Indonesian guy is right about the cost of a GTX card (and I have absolutely no reason to believe him), AMD's costs are still a complete mystery.

    The bottom line is that cost speculation is irrelevant. Even MSRPs aren't all that important. What matters is actual street prices; that is what determines bang for buck. I am at least basing my GDDR5 cost estimate on what DDR3 cost versus DDR2 upon release. IIRC, DDR3 was 2-4 times more expensive than DDR2. You really think the memory manufacturers are only going to charge a 10% premium over the old-tech memory? That's just ridiculous. However, none of us have any hard numbers. If someone does, then fine, I will believe it.

    I realize that some of you probably think I am some kind of Nvidia fanboy, but I am not. I have no horse in this race. My next card is almost certainly going to be an HD4870X2. I just think many of you are being unrealistic and over-hyping things in AMD's favor. The assumption is that Nvidia's card costs significantly more to manufacture, but that assumption is based on very little. And "significantly more" could mean $120 per card instead of $70 per card. It may only cost an extra $10 - $20 per die, and it seems reasonable to assume that the GDDR3 is going to be significantly cheaper. There is very limited supply of GDDR5 and demand for it is very high. If it were the same price and widely available, don't you think Nvidia would be using it too?

    As for the GTX280 being so expensive: it is not any more expensive than most of Nvidia's high-end releases of the past few years. Actually, it's at about the same street price the 8800GTX was selling for, and now you can buy those cards for just over $200. So yes, I think these companies make a large profit on their new cards. They are in business to make money, so they charge as much as they think they can. And they need to recoup their R&D costs for a new arch.

    It wouldn't surprise me at all if:
    1) the memory on both the AMD and Nvidia cards cost more than the GPU.
    2) both the HD4870 and the GTX280 cost under $100 to manufacture.
    3) recouping the large R&D costs is more relevant to the prices than the actual manufacturing costs.

    So which company spends more on R&D may matter more to the MSRPs of the high-end cards than the actual cost of manufacturing the card itself. That is the point: I don't think anyone posting here really knows why each company prices a particular card a certain way. All we have are guesses and assumptions. I am willing to bet that Nvidia could sell the GTX280 for $189 and still make a profit, but probably not enough to recoup their research costs.

  6. #81
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    yes, but most blurb is about justifying the retail price
    i7 3610QM 1.2-3.2GHz

  7. #82
    Xtreme Member
    Join Date
    Mar 2008
    Location
    Germany
    Posts
    351
    Quote Originally Posted by gojirasan View Post
    I am willing to bet that Nvidia could sell the GTX280 for $189 and still make a profit
    Personally, I doubt that. I do think some of the prices claimed here for the GPU itself sound plausible. Nvidia's problem, though...
    X3350 | DFI LP X38 T2R | d9gkx
    9800gtx | Raptor1500AHFD/5000AACS/WD3201ABYS
    Corsair 620HX | Coolermaster CM690

  8. #83
    Xtreme Member
    Join Date
    Mar 2008
    Location
    Germany
    Posts
    351
    Quote Originally Posted by Aberration View Post
    So you think that when nVidia and ATi contract with TSMC to manufacture the GPU, TSMC charges them per wafer? I doubt it. Yields affect TSMC's profits.
    AFAIK TSMC does charge them per wafer. They charge for the initial manufacturing setup (not sure of the exact term anymore; it was hard for my lil brain to remember all that) and after that it's just per wafer. I believe I read that in some interview with a CEO or such, not sure where though.
    X3350 | DFI LP X38 T2R | d9gkx
    9800gtx | Raptor1500AHFD/5000AACS/WD3201ABYS
    Corsair 620HX | Coolermaster CM690

  9. #84
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by gojirasan View Post
    Guys. You are just pulling numbers out of your arses. Show me a scanned document or an official press release and I will believe it. [...] I am willing to bet that Nvidia could sell the GTX280 for $189 and still make a profit, but probably not enough to recoup their research costs.



    Bro...

    You're just one person! If you choose not to acknowledge or accept the information, then don't. There are many smart people here, and some of them are connected.

    Nobody here has to demonstrate to you where they got their specific numbers, or carry a burden of proof, just so you can believe them. It only takes remedial mathematics to get a close enough figure to understand the underlying discussion.

    e.g.:
    $5000 per 300mm wafer = 94 cores
    100% yield = $54
    50% yield = $108
    40% yield = $?? (you try it)

    These are just rough figures, but it easily demonstrates how far off you are in your thinking. I have been here for a very short time and have read a lot of your posts... you really don't have a firm grasp on the business side of things.

    Yields come down to "logic". We might be off by a few % or $, but nobody is blatantly abusing these figures. You need to stop being such a naysayer and just learn.
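
    A minimal sketch of that arithmetic, assuming the same (unverified) figures quoted above, a $5,000 wafer and 94 candidate dies:

    WAFER_COST = 5000.0    # USD per 300mm wafer -- the forum-quoted figure
    DIES_PER_WAFER = 94    # GT200-sized candidate dies -- also a quoted figure

    def cost_per_good_die(yield_fraction):
        """Spread the wafer cost across the dies that actually work."""
        return WAFER_COST / (DIES_PER_WAFER * yield_fraction)

    for y in (1.0, 0.5, 0.4):
        print("%.0f%% yield -> $%.0f per good die" % (y * 100, cost_per_good_die(y)))
    # 100% yield -> $53 per good die (the post rounds this to $54)
    # 50% yield  -> $106
    # 40% yield  -> $133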




    Last edited by Xoulz; 07-01-2008 at 04:43 AM.

  10. #85
    Xtreme Enthusiast
    Join Date
    Jul 2005
    Posts
    586
    Quote Originally Posted by Seraphiel View Post
    That makes much sense, and thanks for replying.

    Still, TSMC can refuse the job and isn't forced into anything, and I have always thought there are initial trials before any contract is sealed? Tests are made for both parties, and a contract is drawn up based on the results of those.

    I have no clue about all of this, of course
    True, but this is an NVIDIA contract we're talking about. Even with a bad design and poor yields, they couldn't let Nvidia go to someone else...

  11. #86
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,326
    Quote Originally Posted by Dave_Sz View Post
    True, but this is an NVIDIA contract we're talking about. Even with a bad design and poor yields, they couldn't let Nvidia go to someone else...
    QFT.
    At least for the time being.

  12. #87
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Yes, Nvidia might have a lot of sway, but if TSMC is going to lose money, even they wouldn't take Nvidia's business. Remember, both sides are businesses, so even if Nvidia carries a lot of weight, TSMC wouldn't accept a contract that guaranteed a loss.

    TSMC is considered one of the best volume-production fabs in the world outside of Intel (probably the clear leader) and AMD (still good, believe it or not). If TSMC won't accept a company's terms on the grounds that they (TSMC) would lose money, I doubt many other foundries out there could accept that same contract and make money.

  13. #88
    Xtreme Enthusiast
    Join Date
    Sep 2007
    Location
    Jakarta, Indonesia
    Posts
    924
    Quote Originally Posted by Dave_Sz View Post
    True, but this is an NVIDIA contract we're talking about. Even with a bad design and poor yields, they couldn't let Nvidia go to someone else...
    But where could nVidia go besides TSMC? Especially for an enthusiast-class GPU chip, where bleeding-edge capability is required to make them. UMC and Chartered certainly don't seem to be the answer to that particular question.

  14. #89
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    You need to stop being such a naysayer and just learn.

    ^he's vewy disappointed with gt200 yields
    where's my profits?

    "65nm gt200 cancelled" - no that's not right

    ...next...

    "...which just goes to show what great value gt200 is"

    engage 'hype protocols'; reboot.
    Last edited by adamsleath; 07-01-2008 at 06:58 AM.
    i7 3610QM 1.2-3.2GHz

  15. #90
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    407
    Quote Originally Posted by Xoulz
    Nobody here has to demonstrate to you where they got their specific numbers, or carry a burden of proof, just so you can believe them.
    No one here has to do anything. But any logical, rational non-AMD fanboy is going to want to see some evidence for these assumptions. There is none. Nothing. Zero. There is no evidence whatsoever of how much any of these cards cost. Even with your assumption of $5000 per wafer, you are left guessing the yields. We don't know them. You don't know them. So it's a pointless exercise. This kind of thing reminds me of the Drake equation: when you don't know any of the variables, you cannot solve for anything.

    It only takes remedial mathematics to get a close enough figure to understand the underlying discussion.
    Well I have a degree in EE so I think I should be able to manage the math.

    These are just rough figures, but it easily demonstrates how far off you are in your thinking.
    You have demonstrated nothing with those figures. You don't know the yields. You don't know the cost of the wafer. You don't have any numbers at all.

    We might be off by a few % or $, but nobody is blatantly abusing these figures.
    What figures? I don't see any. All I see are wild arse guesses.

  16. #91
    Xtreme Enthusiast
    Join Date
    Sep 2007
    Location
    Jakarta, Indonesia
    Posts
    924
    I think everybody is entitled to their own opinion; no need to bring arses into it, unless it's Beyoncé's.

  17. #92
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    Quote Originally Posted by spursindonesia View Post
    I think everybody is entitled to their own opinion; no need to bring arses into it, unless it's Beyoncé's.
    Are we there yet?

  18. #93
    Xtreme Mentor
    Join Date
    Apr 2003
    Location
    Ankara Turkey
    Posts
    2,631
    Some of the arses here are more beautiful than Beyoncé's.


    When i'm being paid i always do my job through.

  19. #94
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    Quote Originally Posted by kromosto View Post
    Some of the arses here are more beautiful than Beyoncé's.
    :fear:
    Are we there yet?

  20. #95
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by gojirasan View Post
    No one here has to do anything. But any logical, rational non-AMD fanboy is going to want to see some evidence for these assumptions. [...] What figures? I don't see any. All I see are wild arse guesses.



    You're incredible...

    It's a KNOWN FACT that TSMC charges about $5,000 per wafer for their 300mm process. Do you even read other websites or tech blogs..? Secondly, there is a whole science dedicated to determining and improving yields.

    This has been mentioned over and over, and not just in this particular thread but in many others here. We also know that the law of averages and logic dictate that a bigger die = more to go wrong = lower yields.

    There are exact formulas that give a very detailed, realistic expectation of such things. Just because you cannot grasp these concepts or are incapable of doing the math doesn't give you the right to constantly thread-crap and become an incessant naysayer.

    You simply refuse to accept anything because you want proof, but you yourself haven't even bothered to investigate. The simple figures in my last post are KNOWN! Anyway, it doesn't matter if TSMC charges $5K per wafer or $3.5K... if you play with the numbers, it's easy to see you cannot add or even reason. You're simply trolling now.





  21. #96
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    Quote Originally Posted by gojirasan View Post
    No one here has to do anything. But any logical, rational non-AMD fanboy is going to want to see some evidence for these assumptions. [...] What figures? I don't see any. All I see are wild arse guesses.
    I would have thought your 'degree in EE' would have taught you some common sense. Certainly you can request 'proof' (which you'll never get, so maybe you're just stalling the inevitable admission of nvidia-hardonism), but I can't believe you need 'proof' just to verify some well-grounded predictions, not your so-called 'guesses.'

    It's quite sad to see so many people calling themselves engineers these days. The quality of university education has fallen so far.

    Also, the only 'demand' for GDDR5 is ATI's 4870, and those cards aren't all fabbed in one day. And you should factor in volume and 'partnership' discounts - unfortunately, AMD gets the chips much cheaper than you would.
    Last edited by cegras; 07-01-2008 at 01:34 PM.
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  22. #97
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    407
    You AMD fanboys are the ones who don't listen to reason. AMD right or wrong! I checked that link and there was no mention of how much Nvidia or AMD are paying for their TSMC wafers. It was also a "known fact" that the R600 was going to blow away the competition. Until it was released. Then all the AMD fanboys were proven wrong. I have yet to see any facts here. If they are so known, then why is it so difficult to cite your sources? And "well grounded predictions"? That is totally idiotic. There's nothing "well grounded" about your wild guesses. Nothing. I think you are morons for making the assumptions that you do. You may not lack "common sense" but you certainly lack critical reasoning skills.

  23. #98
    The Blue Dolphin
    Join Date
    Nov 2004
    Location
    The Netherlands
    Posts
    2,816
    Take it easy guys

    @gojirasan

    Attack the message, but not the person who writes it, and if you do attack the message then make sure it's clear whose message(s) you are attacking. Being offensive to individuals is not a good idea in any topic; being offensive towards 'groups' (in your opinion) is worse than that.

    I do hope you can see further than just my avatar; it is basically a joke from 4-5 years ago, and now I only use it because people recognize me by it. My sig and posts tell something about who I am (note: just a little part ), not my avatar.
    Blue Dolphin Reviews & Guides

    Blue Reviews:
    Gigabyte G-Power PRO CPU cooler
    Vantec Nexstar 3.5" external HDD enclosure
    Gigabyte Poseidon 310 case


    Blue Guides:
    Fixing a GFX BIOS checksum yourself


    98% of the internet population has a Myspace. If you're part of the 2% that isn't an emo bastard, copy and paste this into your sig.

  24. #99
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
    Quote Originally Posted by gojirasan View Post
    You AMD fanboys are the ones who don't listen to reason. [...] You may not lack "common sense" but you certainly lack critical reasoning skills.
    Your point, as I understand it, is that "the manufacturing costs of the 4870 and the GTX 280 should be very close to each other."

    You don't need exact numbers to prove this wrong. Here is a very nice way of doing so:

    1. The cost of a graphics card depends largely on its GPU cost. GT200 is 576mm²; RV770 is 275mm². Even without factoring in yields (i.e., manufacturing defects), which will obviously be lower on the GT200, a 576mm² die is more than twice as expensive as a 275mm² chip.

    2. The other major cost of a graphics card (one that differs between these two cards) is the memory and memory bus. A 512-bit-bus PCB is more expensive than a 256-bit one, and a GDDR5 chip is more expensive than a GDDR3 one. However, the 4870 has only 512MB of GDDR5, whereas the GTX 280 has 1024MB of GDDR3.

    So there you have it! The only thing in an RV770 card that costs more (per unit) than in a GT200 card is its GDDR5 RAM. For your pointless assumption to be correct, a GDDR5 chip would have to be so much more expensive than a GDDR3 chip that it offsets twice the RAM size, twice the memory bus, and more than twice the GPU cost. Is a GDDR5 chip really around 2,000% (that's two thousand percent) more expensive than a GDDR3 chip? I would think it's more like 20%. Are you serious?
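
    A minimal sketch of point 1 under a simple Poisson yield model; the defect density is an assumed, illustrative value, not a known TSMC figure:

    from math import exp

    def poisson_yield(area_cm2, defect_density):
        """Fraction of dies with zero defects under a Poisson model."""
        return exp(-area_cm2 * defect_density)

    def relative_die_cost(area_cm2, defect_density):
        """Silicon consumed per good die: area divided by yield."""
        return area_cm2 / poisson_yield(area_cm2, defect_density)

    D = 0.25  # defects per cm^2 -- an assumed, illustrative value
    print(5.76 / 2.75)                                              # ~2.1x on area alone
    print(relative_die_cost(5.76, D) / relative_die_cost(2.75, D))  # ~4.4x once yield is factored in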
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"

  25. #100
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    Quote Originally Posted by gojirasan View Post
    You AMD fanboys are the ones who don't listen to reason. [...] You may not lack "common sense" but you certainly lack critical reasoning skills.
    I couldn't care less about the R600; I barely knew anything about the computer hardware world until I started seriously browsing, maybe around Sept 2007.

    Now you're sounding more and more like someone with a pocket degree. It's almost pathetic to see you rambling on like this, pointing fingers without even raising a comprehensible rebuttal.

    Quote Originally Posted by savantu
    Au contraire, mon ami. Don't jump so fast on what I know.

    GT200 is 24mm × 24mm, a 576mm² chip.

    You can fit 93 dies on a 300mm wafer.

    How do you calculate yield? You need to know the defect density and distribution.

    There are several distribution models: the Poisson model, the Murphy model, the exponential model, and the Seeds model.

    Poisson is the most pessimistic, with defects spread randomly. Murphy assumes triangular or rectangular defect densities and sits in the middle. Exponential assumes clustered defects and is the most optimistic. Seeds is best suited to small chips, which is not the case here.

    With that in mind, how about defect densities? Intel claims world-class yields at 0.22-0.25 defects/cm². AMD claimed "lower than 0.5", so it must be in the 0.4-0.5 defects/cm² range.

    Is TSMC better than Intel or AMD? I'd say hell no, but that's my opinion.

    So let's see the results:

    0.25 defects/cm²:
    Exponential: 38 good dies, 41% yield
    Murphy: 26 good dies, 28.1% yield
    Poisson: 22 good dies, 23.7% yield
    Average ((best+worst)/2): 30 good dies, 32.3% yield

    0.5 defects/cm²:
    Exponential: 23 good dies, 25.8% yield
    Murphy: 9 good dies, 10.7% yield
    Poisson: 5 good dies, 5.6% yield
    Average: 14 good dies, 15% yield

    I would say that TSMC is as good as AMD at best, i.e. 0.5 defects/cm². In that case they get around 10-14 fully working dies per wafer, a 12-15% yield.

    Does that tell us how many working chips NVIDIA salvages? No. The chip has a lot of redundancy, since it is built from hundreds of very simple parallel cores.

    What it tells us is that NVIDIA gets 10-14 GTX280 (or whatever it's called) premium chips per wafer. The rest become lesser models with fewer shader/vertex/etc. units.

    Out of those 10-14 GTX280s, some might fail to run at the required frequency or within the envisioned TDP. So I'd venture to say that they get fewer than 10 full-fledged chips per wafer.
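
    A minimal sketch, not savantu's own code, that reproduces those figures from the standard formulas for each model:

    from math import exp

    A = 5.76             # GT200 die area: 24mm x 24mm = 5.76 cm^2
    DIES_PER_WAFER = 93  # candidate dies on a 300mm wafer, per the quote

    # Classic yield models as a function of ad = area x defect density:
    def poisson(ad):      return exp(-ad)                     # random defects, pessimistic
    def murphy(ad):       return ((1 - exp(-ad)) / ad) ** 2   # middle ground
    def exponential(ad):  return 1 / (1 + ad)                 # clustered defects, optimistic

    for D in (0.25, 0.5):
        ad = A * D
        print("defect density %.2f/cm^2:" % D)
        for name, model in (("exponential", exponential), ("Murphy", murphy), ("Poisson", poisson)):
            y = model(ad)
            print("  %-11s yield %5.1f%% -> ~%d good dies" % (name, y * 100, DIES_PER_WAFER * y))
    # At 0.25/cm^2 this gives 41.0% / 28.1% / 23.7% (38 / 26 / 22 dies); at
    # 0.5/cm^2 it gives 25.8% / 10.7% / 5.6% (23 / 9 / 5 dies) -- matching the
    # quoted table.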
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

