
Thread: 55nm GT200 (GT200-400) on the way?

  1. #51
    Xtreme Member
    Join Date
    May 2005
    Posts
    193
    Quote Originally Posted by dengyong View Post
    Do you own one?
    No, thanks ...

Lots of GTX280s were RMA'ed because of overheating issues: Link & Link


  2. #52
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    ^
and the thread crapping continues...

  3. #53
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,838
If I may venture a guess, I'll say possibly 55nm, more stream processors or tweaks to the stream processors, and a 256-bit bus.
MSRP of $450, maybe a release in December?
Or it could be the release of the GeForce version of the latest Quadro card that is out: an overclocked GTX280 with 4 gigs of RAM.
    DFI P965-S/core 2 quad q6600@3.2ghz/4gb gskill ddr2 @ 800mhz cas 4/xfx gtx 260/ silverstone op650/thermaltake xaser 3 case/razer lachesis

  4. #54
    Xtreme Member
    Join Date
    Aug 2006
    Posts
    215
    Quote Originally Posted by grimREEFER View Post
If I may venture a guess, I'll say possibly 55nm, more stream processors or tweaks to the stream processors, and a 256-bit bus.
MSRP of $450, maybe a release in December?
Or it could be the release of the GeForce version of the latest Quadro card that is out: an overclocked GTX280 with 4 gigs of RAM.
That would be a step backwards; the 8800GT & GTS already get severely choked at 2560x1600 in Crysis on high. The 8800GTX fared better purely because it had the 384-bit bus.

  5. #55
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955
That would actually be highly likely if they decided to use GDDR5, except that Nvidia's ROP count would suffer if they went back to 256-bit (on GT200-class chips the ROP partitions are tied to the 64-bit memory channels, so halving the bus width halves the ROPs).
    Quote Originally Posted by Hans de Vries View Post

    JF-AMD posting: IPC increases!!!!!!! How many times did I tell you!!!

    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    .....}
    until (interrupt by Movieman)


    Regards, Hans

  6. #56
    Xtreme Enthusiast
    Join Date
    Sep 2007
    Location
    Jakarta, Indonesia
    Posts
    924
IMO, it's on the way, but won't show up until November at the earliest. Nvidia is having trouble with 55nm node development; they couldn't even make a preemptive move by launching a 55nm G92 BEFORE the RV770 launched, and that's a chip with half the transistor count of the behemoth GT200.

  7. #57
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955
What are you talking about? NVIDIA has no fabs; it is using the same TSMC 55nm process the RV670 was built on. Also, they were a year behind ATI on 65nm, so it is to be expected that they would take longer to get to 55nm.
    Quote Originally Posted by Hans de Vries View Post

    JF-AMD posting: IPC increases!!!!!!! How many times did I tell you!!!

    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    .....}
    until (interrupt by Movieman)


    Regards, Hans

  8. #58
    Registered User
    Join Date
    May 2005
    Posts
    3,691
55nm is a half-node shrink compared to 65nm, and isn't nearly as difficult as going from 90nm to 65nm. The shrink should be pretty easy.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  9. #59
    Xtreme Addict
    Join Date
    Mar 2007
    Location
    United Kingdom
    Posts
    1,597
It is interesting to see that another GTX280 card is on the way; as to what it is, that's a complete mystery.
I wager the predictions above are right: a slightly higher clocked card (aka an Ultra) with perhaps higher quality RAM. Say a 650MHz core and 2.5GHz RAM with 1450MHz on the shaders, something like that; maybe higher on the shaders and the core IF they can get the manufacturing process perfected, and perhaps a cool heatpipe-based cooler.
55nm is coming, but I would not expect the GTX280 to be 55nm until October/November. The 9800GTX+ will most likely be the "test run" on the 55nm process; once they have perfected that, they will start scaling the GTX280 down to 55nm.
A real surprise, and an unexpected "rain on ATi's HD4870X2 parade", would be 55nm, high clocks, high shader counts and fast RAM WITH 256SP. That would give ATi a bloody nose, especially if prices were slashed across the board. However, I cannot see that happening..............yet
    John
    Stop looking at the walls, look out the window

  10. #60
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by Hornet331 View Post
    ^
and the thread crapping continues...
He made a very valid point. Nvidia is having low yields and some problems at their current fab, which indicates that they will not be moving to 55nm for a few more months.

Things like this are planned months in advance and the fabbing has to be scheduled. Don't expect a 55nm GTX200 series until late September or October.

  11. #61
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by Xoulz View Post
He made a very valid point. Nvidia is having low yields and some problems at their current fab, which indicates that they will not be moving to 55nm for a few more months.

Things like this are planned months in advance and the fabbing has to be scheduled. Don't expect a 55nm GTX200 series until late September or October.
And it goes on and on and on.

What do overheating GTX280s have to do with a 55nm shrink?

It's just like posting in an AMD Shanghai (45nm) thread that Phenom power consumption sucks...

  12. #62
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Toon
    Posts
    1,570
    Quote Originally Posted by Anemone
    Nvidia has yield problems on the current version.
    Quote Originally Posted by gojirasan View Post
    Source? Or are you just talking out of your arse?
Yield on a 576mm² chip (94 dice per wafer) will be considerably lower, possibly as low as 1/2 that of the RV770's 260mm², assuming that yield is dominated by silicon defectivity rather than process defects. Hence, double the cost.

You also need to bear in mind that the uniformity of semiconductor processes is not ideal; there is usually either a central cluster or a ring of 'best dice', so the percentage of dice capable of performing at GTX280/4870 levels may be significantly lower, much like Intel's QX range and AMD's Black Edition chips.

And before you ask if I am talking out of my ass: I have seen this for myself (via test gear) in both research and industrial fabs, on Si and Si/SiGe (strained-silicon CMOS) wafers, on 4" and 8" processes.
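To put rough numbers on the "half the yield" claim, here is a minimal sketch assuming a simple Poisson defect-density yield model with an illustrative defect density of 0.2 defects/cm² (the real TSMC figures are not public):

```python
import math

# Minimal Poisson yield model: Y = exp(-D0 * A), where D0 is the defect
# density and A the die area. D0 = 0.2 defects/cm^2 is an illustrative
# assumption; the real TSMC numbers are not public.
D0 = 0.2

def poisson_yield(area_mm2, d0=D0):
    """Fraction of dice expected to be defect-free for a given die area."""
    return math.exp(-d0 * area_mm2 / 100.0)  # convert mm^2 to cm^2

gt200 = poisson_yield(576)   # GT200, ~576 mm^2
rv770 = poisson_yield(260)   # RV770, ~260 mm^2
print(f"GT200 ~{gt200:.0%}, RV770 ~{rv770:.0%}, ratio {gt200 / rv770:.2f}")
# -> GT200 ~32%, RV770 ~59%, ratio 0.53: roughly half, as claimed above
```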
    Intel i7 920 C0 @ 3.67GHz
    ASUS 6T Deluxe
    Powercolor 7970 @ 1050/1475
    12GB GSkill Ripjaws
    Antec 850W TruePower Quattro
    50" Full HD PDP
    Red Cosmos 1000

  13. #63
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    407
So you are saying that a 576mm² chip will cost about $5, whereas a 260mm² chip would cost only $2.50? Am I getting that right, initialised? My point is that all you can say is that Nvidia has a higher cost per GPU than AMD. You are saying about double. Fair enough. But double of what exactly? It's not like we have any numbers at all for either company.

AMD fanboys always seem to have a hard-on for the silicon wafer costs but ignore the cost of the brand new, just-ramped-up GDDR5. In fact I would guess that the real reason Nvidia chose not to use it is the cost. GDDR3 is probably less than half the cost, maybe even 1/3 the cost. And these cards have a lot more silicon in the memory chips than in the GPUs. I would guess that the HD4870 and the GTX280 are within $50 of each other in terms of manufacturing cost.

And yes, now I am talking out of my arse just to get into the spirit of things. Without facts that is all we can do. The difference is I am willing to admit it, as opposed to all these AMD wnkers who pretend they just finished going over the company's financial documents.

  14. #64
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
    Wait a minute, are you really claiming that the difference between the manufacturing cost of a card with a 576mm² GPU and a 270mm² GPU is only $50?

Whereas one has 1.4 billion transistors, and the other has 950 million?

Just because "one uses GDDR5 and the other GDDR3"? Forgetting the transistor count and the 512-bit memory bus? You can't be serious.

The difference between the 512MB 4870 and the 1GB 4870 is about $50, for your information. $50 per 512MB.
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"

  15. #65
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
RAM will not cost nearly as much as a GPU, given that RAM chips are much smaller (so more per wafer) and are made in much larger volumes. As the year goes on, prices will go down with 3 manufacturers out there. By the beginning of next year, GDDR5 will probably begin to phase GDDR3 out, to the point where GDDR5 will cost nearly the same as GDDR3. It happened with DDR1 -> DDR2 and is now slowly happening for DDR2 -> DDR3. The difference in this case is that DDR2 and DDR3 give users a choice via the motherboard, whereas GPU buyers can't actively choose which memory they get when they buy a certain card.

Purely from a die-per-wafer standpoint though, the RV770 gets a LOT more per wafer than the GT200, so if all things were equal, the RV770 has a 2:1 ratio of chips. But in product sales, volume means huge discounts, so the cost ratio is probably greater than 2:1. And that's assuming both have equal yield rates. Keep in mind that defect-free yield falls off exponentially with die area, not linearly, so doubling the area more than doubles the effective cost per good die. However, we don't know if both sides are using the same process or what TSMC's rates are (and that's why these things are kept secret by everyone; it's very bad publicity if it gets out officially).

So using the bare minimum of 2:1 RV770:GT200 (based on the number of dice per wafer) isn't a bad start. See the die-count sketch below.

    Just a picture of the RV770 die:
    http://www.rage3d.com/reviews/video/...ure/rv7701.jpg

    For every 3 x 3 block of 9 RV770 dice, you can fit 4 GT200.
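The die-per-wafer arithmetic itself can be sketched with the usual first-order estimate (wafer area over die area, minus an edge-loss term); exact counts also depend on die aspect ratio and scribe-line width:

```python
import math

def dice_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """First-order gross die count: wafer area / die area, minus an
    edge-loss correction term. Real counts also depend on die aspect
    ratio and scribe-line width."""
    r = wafer_diameter_mm / 2
    return math.floor(math.pi * r ** 2 / die_area_mm2
                      - math.pi * wafer_diameter_mm
                      / math.sqrt(2 * die_area_mm2))

print(dice_per_wafer(576))  # GT200: 94 dice per 300 mm wafer
print(dice_per_wafer(260))  # RV770: 230 dice, i.e. roughly a 2.4:1 ratio
```

Note that the 94-dice figure matches the number quoted earlier in the thread.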

  16. #66
    The Blue Dolphin
    Join Date
    Nov 2004
    Location
    The Netherlands
    Posts
    2,816
I'm pretty sure it's the same chip manufactured on 55nm with less redundancy (and thus fewer transistors, making it a little smaller even) and slightly higher core and shader clocks. The other option is that the chip is the same, built entirely on 55nm (maybe with some extra transistors for redundancy) with more active components. The first option is more likely, as Nvidia really wants to make this chip smaller. If the second option is chosen it is only to save face, and Nvidia will lose some money over it.
    Blue Dolphin Reviews & Guides

    Blue Reviews:
    Gigabyte G-Power PRO CPU cooler
    Vantec Nexstar 3.5" external HDD enclosure
    Gigabyte Poseidon 310 case


    Blue Guides:
    Fixing a GFX BIOS checksum yourself


    98% of the internet population has a Myspace. If you're part of the 2% that isn't an emo bastard, copy and paste this into your sig.

  17. #67
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Quote Originally Posted by annihilat0r View Post
Wait a minute, are you really claiming that the difference between the manufacturing cost of a card with a 576mm² GPU and a 270mm² GPU is only $50?

Whereas one has 1.4 billion transistors, and the other has 950 million?

Just because "one uses GDDR5 and the other GDDR3"? Forgetting the transistor count and the 512-bit memory bus? You can't be serious.

The difference between the 512MB 4870 and the 1GB 4870 is about $50, for your information. $50 per 512MB.
Yeah, don't forget that the PCB needs more routing for the 512-bit interface, RAM on both sides, better PWM for power, a longer PCB, etc.

Each item might cost little compared to the rest of the card (such as the actual RAM and chips), but you need to put them all together, and they add up given the number of cards you produce.

Also, from the looks of it, ATI more or less reused the 3800's PCB design for the 4850, so there are definitely savings in production there.

  18. #68
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Quote Originally Posted by alexio View Post
I'm pretty sure it's the same chip manufactured on 55nm with less redundancy (and thus fewer transistors, making it a little smaller even) and slightly higher core and shader clocks. The other option is that the chip is the same, built entirely on 55nm (maybe with some extra transistors for redundancy) with more active components. The first option is more likely, as Nvidia really wants to make this chip smaller. If the second option is chosen it is only to save face, and Nvidia will lose some money over it.
I think the bigger reason for the push to 55nm is to save money anyway. You can fit a lot more dice per wafer and improve yields dramatically.

Look at ATI's move from the 80nm R600 to the 55nm RV670. Costs were slashed dramatically and yields improved a lot; rumors of terrible R600 yields were later alluded to by engineers after the RV670.

65nm to 55nm won't provide that dramatic a boost, but it allows Nvidia to price things more competitively.

  19. #69
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by gojirasan View Post
    AMD fanboys always seem to have a hardon for the silicon wafer costs but ignore the cost of the brand new just ramped up GDDR5. In fact I would guess that the real reason Nvidia chose not to use it is the cost. GDDR3 is probably less than half the cost. Maybe even 1/3 the cost. And these cards have a lot more silicon in the memory chips than in the GPUs.
You are forgetting a few things:
1- The GTX 280 requires 16 GDDR3 chips vs 8 GDDR5 chips on the HD 4870
2- The HD 4870 has 50% less memory: 512MB vs 1GB
3- GTX 280 = 512-bit PCB, HD 4870 = 256-bit PCB
4- Add the NVIO production costs to the GTX 280

In the end, the HD 4870 is less expensive in every single way. A rough tally is sketched below.
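As a back-of-envelope illustration of that list (the per-chip memory prices below are pure assumptions; 2008 contract prices were never public), even if GDDR5 cost twice as much per chip, the memory bill roughly evens out:

```python
# Back-of-envelope memory bill for both cards. The per-chip prices are
# ASSUMPTIONS purely for illustration; real contract prices were not public.
GDDR3_CHIP = 3.0  # USD per GDDR3 chip (assumed)
GDDR5_CHIP = 6.0  # USD per GDDR5 chip (assumed ~2x GDDR3)

gtx280_mem = 16 * GDDR3_CHIP  # 16 chips for 1 GB on a 512-bit bus
hd4870_mem = 8 * GDDR5_CHIP   # 8 chips for 512 MB on a 256-bit bus

print(f"GTX 280 memory: ${gtx280_mem:.0f}, HD 4870 memory: ${hd4870_mem:.0f}")
# -> $48 vs $48: even at double the per-chip price the memory bill evens
# out, and the GTX 280 still pays extra for the 512-bit PCB and NVIO.
```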
    Last edited by v_rr; 06-30-2008 at 04:41 PM.
    Quote Originally Posted by Shintai View Post
    And AMD is only a CPU manufactor due to stolen technology and making clones.

  20. #70
    The Blue Dolphin
    Join Date
    Nov 2004
    Location
    The Netherlands
    Posts
    2,816
    Quote Originally Posted by zerazax View Post
I think the bigger reason for the push to 55nm is to save money anyway. You can fit a lot more dice per wafer and improve yields dramatically.
Surely that is true, but don't forget that Nvidia has been very arrogant lately. They really, really don't want the HD4870X2 to be the fastest graphics card available. It is possible that they will sell a very limited number of cards based on the 55nm GT200-400 (with more active components) just to save face. I doubt it, but you never know.
    Blue Dolphin Reviews & Guides

    Blue Reviews:
    Gigabyte G-Power PRO CPU cooler
    Vantec Nexstar 3.5" external HDD enclosure
    Gigabyte Poseidon 310 case


    Blue Guides:
    Fixing a GFX BIOS checksum yourself


    98% of the internet population has a Myspace. If you're part of the 2% that isn't an emo bastard, copy and paste this into your sig.

  21. #71
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    407
Wait a minute, are you really claiming that the difference between the manufacturing cost of a card with a 576mm² GPU and a 270mm² GPU is only $50?
    Maybe you're right. $50 seems like a bit much. I was trying to be conservative. Make it a $33.89 per card difference once you factor in the much more expensive GDDR5.

  22. #72
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    I think you got his point backwards.

  23. #73
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
gojirasan, you're making absolutely no sense at all.

    So a GTX 280 is only $50 more expensive to produce, but NVidia wants $350 more, because they are greedy greedy people?

    How much more expensive can 512MB of DDR5 be than 1024MB of DDR3?

    Also, have you considered the fact that an additional 512MB of DDR5 makes the 4870 only $50 more expensive? 1GB 4870 = around $350.

Wow, 512MB of GDDR5 for $50. So expensive that it brings the manufacturing cost of a 256-bit 270mm² chip level with a 512-bit 576mm² one. Uh huh.
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"

  24. #74
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by gojirasan View Post
So you are saying that a 576mm² chip will cost about $5, whereas a 260mm² chip would cost only $2.50? Am I getting that right, initialised? My point is that all you can say is that Nvidia has a higher cost per GPU than AMD. You are saying about double. Fair enough. But double of what exactly? It's not like we have any numbers at all for either company.

AMD fanboys always seem to have a hard-on for the silicon wafer costs but ignore the cost of the brand new, just-ramped-up GDDR5. In fact I would guess that the real reason Nvidia chose not to use it is the cost. GDDR3 is probably less than half the cost, maybe even 1/3 the cost. And these cards have a lot more silicon in the memory chips than in the GPUs. I would guess that the HD4870 and the GTX280 are within $50 of each other in terms of manufacturing cost.

And yes, now I am talking out of my arse just to get into the spirit of things. Without facts that is all we can do. The difference is I am willing to admit it, as opposed to all these AMD wnkers who pretend they just finished going over the company's financial documents.

It doesn't matter what the cost of GDDR5 is, because the chips are smaller and the actual cost is nowhere near the astronomical per-chip cost of the GTX.

You're talking pennies when we are talking dollars.

  25. #75
    Registered User
    Join Date
    Jul 2006
    Location
    Klaten, Indonesia
    Posts
    24
It's all about yields. With 40% yields on 65nm, each chip costs around USD 110; if NV can go to 55nm with yields of at least 60%, each chip would cost around USD 70, which is getting close to the ideal USD 50 that every high-end chip should cost. I think the cost of the GTX280 is about USD 175 now, and with 55nm it can get down to USD 135, leaving more room for lower prices.
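A quick sketch of that arithmetic (cost per good chip = wafer cost / (dice per wafer × yield); the wafer price and the ~470mm² 55nm die size are assumptions, not published figures):

```python
def cost_per_good_die(wafer_cost_usd, dice_per_wafer, yield_fraction):
    """Effective silicon cost of one sellable chip."""
    return wafer_cost_usd / (dice_per_wafer * yield_fraction)

WAFER = 4000.0  # USD per 300 mm wafer -- an assumed figure

# 65 nm GT200: ~94 candidate dice per wafer at 40% yield
print(f"65nm: ${cost_per_good_die(WAFER, 94, 0.40):.0f}")   # ~$106
# 55 nm shrink: assuming a ~470 mm^2 die (~119 dice) at 60% yield
print(f"55nm: ${cost_per_good_die(WAFER, 119, 0.60):.0f}")  # ~$56
# Same ballpark as the USD 110 -> USD 70 figures above.
```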

