Page 2 of 4 FirstFirst 1234 LastLast
Results 26 to 50 of 96

Thread: GT300 just taped out?

  1. #26
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Quote Originally Posted by saaya View Post
    everybody is biased and every site is biased...
    i prefer an obvious bias over a hidden pretending to be unbiased bias...
    its easier to tell apart the actual information and the bias that way...

    and not verified by facts... then the only news would be what people either already know or forwarding press releases and anouncements... thats not really news in MY opinion
    Haha, you're right on that one. But what I mean is... I don't trust at all the news (rumors) about how football (soccer, for the USA people here) transfers and negotiations are developing that get published in the local sports press (if I had believed everything published in MARCA, the Real Madrid roster would have had around 50 or 60 players, with every international star on it, this year). I read them, and comment on them, and even think about them... but I don't trust them. At all. When the news is more verifiable (Real Madrid makes the transfer of player X official, for example), yes, I can trust it. And when such news appears in some other, more reliable media, I give it some more weight (even if I don't absolutely trust it). I was more along those lines...

  2. #27
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by saaya View Post
    ok, so what news do you trust?
    And that's the problem right there: you consider this News.

    This isn't news, nor was it made out to be that. It is an article written based off of the usual "mysterious" sources that just so happen to confirm rumors posted in an older article. Without documented sources and references, it is nothing more than rumor mongering.

    What news should we trust? Pieces with references to sources and without known bias (Charlie versus Nvidia, etc.). Dailytech does this well, for one. So do Gizmodo, Ars and Engadget.

  3. #28
    Xtreme Mentor
    Join Date
    Sep 2007
    Location
    Ohio
    Posts
    2,977
    Quote Originally Posted by saaya View Post
    everybody is biased and every site is biased...
    i prefer an obvious bias over a hidden pretending to be unbiased bias...

    I bet that's why we get along so well.
    Asus Maximus SE X38 / Lapped Q6600 G0 @ 3.8GHz (L726B397 stock VID=1.224) / 7 Ultimate x64 /EVGA GTX 295 C=650 S=1512 M=1188 (Graphics)/ EVGA GTX 280 C=756 S=1512 M=1296 (PhysX)/ G.SKILL 8GB (4 x 2GB) SDRAM DDR2 1000 (PC2 8000) / Gateway FPD2485W (1920 x 1200 res) / Toughpower 1,000-Watt modular PSU / SilverStone TJ-09 BW / (2) 150 GB Raptor's RAID-0 / (1) Western Digital Caviar 750 GB / LG GGC-H20L (CD, DVD, HD-DVD, and BlueRay Drive) / WaterKegIII Xtreme / D-TEK FuZion CPU, EVGA Hydro Copper 16 GPU, and EK NB S-MAX Acetal Waterblocks / Enzotech Forged Copper CNB-S1L (South Bridge heat sink)

  4. #29
    Xtreme Mentor
    Join Date
    Jun 2008
    Location
    France - Bx
    Posts
    2,601
    Quote Originally Posted by SKYMTL View Post
    And that's the problem right there: you consider this News.

    This isn't news, nor was it made out to be that. It is an article written based off of the usual "mysterious" sources that just so happen to confirm rumors posted in an older article. Without documented sources and references, it is nothing more than rumor mongering.

    What news should we trust? Pieces with references to sources and without known bias (Charlie versus Nvidia, etc.). Dailytech does this well, for one. So do Gizmodo, Ars and Engadget.
    Totally agree with you SKYMTL

  5. #30
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by Olivon View Post
    Totally agree with you SKYMTL
    Why?
    Speculation and rumors are just that: estimated and/or leaked info.
    Just because this "news" doesn't have sources or doesn't use references doesn't mean there isn't info to be learned from it.

    You take your pool from ALL the different sites and use the hints that good sources drop to put the pieces together while filtering out the BS.

    There is a reason why some of us knew what RV740 was going to be at the beginning of Dec...
    Originally Posted by motown_steve
    Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.

    Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.

  6. #31
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by SKYMTL View Post
    And that's the problem right there: you consider this News.

    This isn't news, nor was it made out to be that. It is an article written based off of the usual "mysterious" sources that just so happen to confirm rumors posted in an older article. Without documented sources and references, it is nothing more than rumor mongering.

    What news should we trust? Pieces with references to sources and without known bias (Charlie versus Nvidia, etc.). Dailytech does this well, for one. So do Gizmodo, Ars and Engadget.
    well rumors are news too...
    excuse me but how many reports on tv news are rumors?
    even in newspapers...
    and they too refer to "anonymous sources" many times...

    lets face it, imo xs is pretty much the heart of the enthusiast community, its the no1 hangout for tech lovers... as such its a no brainer that it attracts people who work in the industry, and that members of this community end up working in/with the tech industry, macci, shamino, fugger, hicookie, me, etc etc...

    there is a lot of information available to many members here that is not public, and unless they are stupid they wont make it public. luckily most tech companies either dont care or dont notice that there are a lot of hints getting dropped here. either way, thats the origin of many rumors, and theres just no way that this could ever be official or verified.

    if you're working in the it news scene you'd be an idiot to ignore those rumors and hints and not report on them... you might end up reporting official news from an old roadmap you received officially that everybody in the scene has long known was cancelled/altered. you would lose your readers to news sources that DO report on those rumors.

    as long as a news site makes clear that this and that is a RUMOR and unofficial, i dont see any problems with this whatsoever, and it doesnt make that news site unreliable, even if the rumor turns out to be completely wrong...

    i agree that dailytech is pretty reliable, but even they report rumors and even they are wrong about things... and they are far far far from unbiased. they do hide their bias well in most articles though and make it subtle... something i dont like, as you may not notice some subtle bias in a story without knowing the actual news, which basically manipulates you and your opinion to be more that of the writer of the article.

    Quote Originally Posted by Talonman View Post
    I bet that's why we get along so well.
    i wouldnt say we get along great, i barely know you, but yeah i dont have any problems with you. we greatly disagree on some subjects but unlike most people you dont try to hide and at the same time justify your bias, you make it clear that its your opinion and thats something nobody can argue about...
    and i try to do the same, not sure how good i am at that
    Last edited by saaya; 07-31-2009 at 08:10 AM.

  7. #32
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by saaya View Post
    well rumors are news too...
    excuse me but how many reports on tv news are rumors?
    even in newspapers...
    and they too refer to "anonymous sources" many times...

    lets face it, IMO xs is pretty much the heart of the enthusiast community, its the no1 hangout for tech lovers... as such its a no brainer that it attracts people who work in the industry, and that members of this community end up working in/with the tech industry, macci, shamino, fugger, hicookie, me, etc etc...

    there is a lot of information available to many members here that is not public, and unless they are stupid they wont make it public. luckily most tech companies either dont care or dont notice that there are a lot of hints getting dropped here. either way, thats the origin of many rumors, and theres just no way that this could ever be official or verified.

    if you're working in the it news scene you'd be an idiot to ignore those rumors and hints and not report on them... you might end up reporting official news from an old roadmap you received officially that everybody in the scene has long known was cancelled/altered. you would lose your readers to news sources that DO report on those rumors.

    as long as a news site makes clear that this and that is a RUMOR and unofficial, i don't see any problems with this whatsoever, and it doesn't make that news site unreliable, even if the rumor turns out to be completely wrong...

    i agree that dailytech is pretty reliable, but even they report rumors and even they are wrong about things... and they are far far far from unbiased. they do hide their bias well in most articles though and make it subtle... something i dont like, as you may not notice some subtle bias in a story without knowing the actual news, which basically manipulates you and your opinion to be more that of the writer of the article...
    That sums it up, folks. I find it odd that some still try to treat news like this as legit, confirmed, factual reporting, then turn around and get so upset when none of it is confirmed to their satisfaction. Why they do that will never make sense to me; at times it comes off as very selective (i.e., depending on what is being reported).
    Last edited by Eastcoasthandle; 07-31-2009 at 08:29 AM.

  8. #33
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Charlie says 4 GT300 variants on the way soon after big daddy drops.

    http://www.semiaccurate.com/2009/08/...-variants-tip/

  9. #34
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    more details too, gt300 is supposedly around 23x23mm that means 529mm^2 ... thats veeery big...
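That die-size figure can be sanity-checked with the standard gross-dies-per-wafer approximation. This is an editorial sketch, not from the article: it assumes a 300mm wafer and ignores defect density, scribe lines, and reticle limits, so it only shows the order of magnitude.

```python
import math

def gross_dies_per_wafer(die_w_mm, die_h_mm, wafer_d_mm=300.0):
    """Rough gross-die estimate: wafer area over die area,
    minus an edge-loss term proportional to the wafer circumference."""
    die_area = die_w_mm * die_h_mm
    wafer_area = math.pi * (wafer_d_mm / 2) ** 2
    # Classic approximation: subtract partial dies lost at the wafer edge.
    return int(wafer_area / die_area
               - math.pi * wafer_d_mm / math.sqrt(2 * die_area))

print(gross_dies_per_wafer(23, 23))  # 23x23mm -> 529mm^2, roughly 104 gross dies
```

Even before any yield losses, only about a hundred candidate dies per 300mm wafer is low, which lines up with the yield worries discussed in this thread.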
    im pretty sure one of the reasons why nvidia is preparing so many gt300 derived parts is because yields are bad...
    so they want to make sure they have as many gt300 parts in the yield comfort zone at any given time while yields improve.
    depending on how well ati does, nvidia might actually only launch a cut down gt300 initially to compete if thats enough to beat ati and the big chip has yield problems... which is very likely...

    and ati wont have the x2 at launch i think, so nvidias cut down gt300 might still beat atis fastest single gpu solution...

    charlie has figures on the wafer costs too, very nice!
    he says 40nm wafers atm cost 30% more than 55nm wafers... wow, so they DO cost more...
    according to his math gt300 will cost 42% more than gt200 as a result of this plus the increased die size...
    at the same yields...

    gt200 had terrible yields too initially, they launched for around 600$ a piece didnt they?
    so gt300 launch price will most likely be at LEAST that if not more...

    lets see how the cut down gt300 parts do, sounds like a very smart move from nvidia to prepare several parts, for gt200 they didnt have any cut down parts at all, just fully fledged gt200 and broken gt200s they sold as gtx260... that caused them a lot of problems with the mainstream market and they had to lower prices and cut down their margins big time on the 260s, and at the same time were forced to recycle g92 once more... with 4 cut down parts that wont happen for sure...
    i read 4 cut down gt300 parts as NO MORE RECYCLING!!! Whohooooo!


    but 4 parts... isnt that a bit much?
    unless one of those is mobile, its really too much i think... in the past 1 highend and one mainstream worked well for nvidia, and ati roughly did it like that as well...
    4 cards wont be easy on the inventory... but who knows if they will actually launch all 4 parts... maybe they just prepare 4 so as soon as they see what the biggest chip is they can get decent yields on in 40nm ...

    gt300 will have terrible yields for sure, but they dont need good yields, at least initially, cause its a marketing part mostly... there arent too many people who buy a 600$+ card that will be available for 400$ a few months later... so gt300 will basically be marketing, and then the cut down versions will compete with whatever ati and intel have, sounds like a price war is coming in q1, should be very very interesting for hw lovers like us!!!!

  10. #35
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Woof, 529mm^2, I don't know if that's the best approach to take when switching to a new manufacturing process which is giving bad yields even on much smaller chips...

    I really like the part of the news about the cut-down ones though, because I really look more to the 200-300€ bracket than anything higher... and I don't think the full G300 ones (being 529mm^2) are designed to fall inside that bracket

    Quote Originally Posted by saaya View Post
    charlie has figures on the wafer costs too, very nice!
    he says 40nm wafers atm cost 30% more than 55nm wafers... wow, so they DO cost more...
    AFAIK, usually the smaller (and more advanced) the manufacturing process is, the more expensive the wafers are. But notice that the smaller process allows you to fit the same number of transistors in a much reduced surface, so you should get many more chips (provided they're the same architecture) on each wafer...

    For example, the reduction from 55nm to 40nm should mean a theoretical increase of around 89% (55^2=3025, 40^2=1600, 3025/1600=1.89) in the number of transistors which can be fit in the same surface (ignoring other variables that may -and will- affect it), so if the cost per wafer (and therefore per same surface) is only increased by around 30%, the cost per transistor is much lower.

    Of course, if you end up making chips of the same size, you'll have a higher cost, but also a much higher number of transistors in there...
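That back-of-the-envelope scaling can be written out in a couple of lines (a sketch of the same arithmetic; the 30% wafer-cost premium is the figure quoted from Charlie's article earlier in the thread):

```python
# Naive area scaling from a 55nm to a 40nm node (linear dimensions squared).
density_gain = 55**2 / 40**2           # ~1.89x transistors per mm^2

wafer_cost_premium = 1.30              # 40nm wafer ~30% dearer, per the article

# Transistors per dollar improve by the density gain over the cost premium.
transistors_per_dollar_gain = density_gain / wafer_cost_premium

print(round(density_gain, 2))                 # 1.89
print(round(transistors_per_dollar_gain, 2))  # 1.45
```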

    gt200 had terrible yields too initially, they launched for around 600$ a piece didnt they?
    so gt300 launch price will most likely be at LEAST that if not more...
    I would bet that the GT300 launch price will be in the same bracket as GT200's (the launch one, prior to the forced drop due to the launch of the similarly performing HD4800 series); those monstrous chips can only be intended for the higher enthusiast market segment...

    But I don't think yields by themselves have anything to do with the pricing to the end user. Remember that prices always depend on what potential customers are willing to pay for the product, and not on the production costs.
    Last edited by Farinorco; 08-06-2009 at 03:27 AM.

  11. #36
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Farinorco View Post
    Woof, 529mm^2, I don't know if that's the best approach to take when switching to a new manufacturing process which is giving bad yields even on much smaller chips...
    its definitely not a smart move... they originally planned to shrink G92 (yes! lol) to 40nm and then gt200, but those all failed i think... idk, normally that should be a warning sign that you should NOT try to build something complex on that process, at least not yet... but nvidia did it anyways, probably cause they felt they had to, cause they know larrabee is coming and ati will have new highend parts...

    in the end, as funny as it sounds, if you do something insane like this, you can at least be sure you will be the only one doing it, and it means you will have the fastest gpu for at least 1 quarter

    theres just a risk it will be insanely expensive... but since this is more of a pr part anyways that you only need a few dozen or hundred of for the first quarter, thats something you can tolerate as a company... i guess...

    i think financially this is a bad idea, and nvidia is buying their performance pr crown for more than its worth... its not like everybody would go "huh? nvidia? whos that?" if they lost the perf crown for half a year or even 1 year... they managed to make good money even when ati had the performance crown so i dont understand why they are so paranoid about losing the perf war and are willing to go through so much trouble to come out on top, even if it costs them hundreds of millions...

    Quote Originally Posted by Farinorco View Post
    I really like the cut down ones part of the news though, because I really look more to the 200-300€ bracket than anything higher... and I don't think the full G300 (being 529mm^2) ones are designed to fall inside that bracket
    yes, that was the problem in 2008 and 2009, wasnt it? there was not that much price competition cause ati focussed on mainstream and nvidia on highend... as a result nvidia kept their prices high saying "we are the fastest, pay extra" and ati didnt lower their prices either saying "we are fast enough and cheaper than nvidia"

    of course there was competition but not that much... if nvidia had had a direct competitor to the 3800 and 4800 series and not just a cut down gt200, then prices would have been lower i think...

    so this is great news for q1!
    lots of parts from nvidia and ati = low prices in all segments!
    and as a cream topping we get larrabee and ray tracing
    yummy!

    Quote Originally Posted by Farinorco View Post
    AFAIK, usually the smaller (and more advanced) the manufacturing process is, the more expensive are the wafers. But notice that the smaller process allow you to fit the same number of transistors in a much reduced surface, so you should get many more chips (provided they're the same architecture) on each wafer...
    well, lets assume the transistor density really is double, then it means a chip is now half the size in 40nm as it was in 55nm. half the size means you get roughly 2x as many chips per wafer. so the cost per chip is half... but if the wafer costs 30% more, then the cost saving is only 20%, and you ALWAYS have worse yields on the new wafer than the old, so... it means you basically save no money, especially if yields are as bad as on tsmcs 40nm...

    so it seems that moving to a newer manufacturing process is mostly just to save power and reach higher clockspeeds, or to be able to produce a chip with lots of transistors that would be impossible on an older node... cost savings seem to only happen when a process has matured, not when its new.

    like if you want to build a chip now, 65nm will be cheaper than 90nm for sure...

    Quote Originally Posted by Farinorco View Post
    I would bet that GT300 launch price will be in the same bracket than GT200 (the launch one, prior to the forced drop due to the launch of the closely performant HD4800 series), those monstrous chips can only be intended to be on the higher enthusiast market segment...
    i dont think so, i think gt300 will cost even more...
    1. gt200 was sold out and in short supply for quite a while, even at 599$ launch price
    2. gt300 is bigger and more expensive to make than gt200
    3. nvidia will have very few chips available i think, less than gt200 when it launched those

    so i think gt300 will launch at 649$ or maybe even 699$...
    all depends highly on what ati and intel do of course...

    Quote Originally Posted by Farinorco View Post
    But I don't think yields by themselves have nothing to do with the pricing to the end user. Remember that prices depends always on what the potential customers are willing to pay for the product, and not on the production costs.
    yeah, but that also depends on the amount of customers...
    you will usually have a pyramid, with some few people who are willing to pay even 5x the normal price, just because...
    then there are maybe 50 people who are willing to pay 3x the normal price
    then 500 that are willing to pay 2x and then for the normal price there are 5000 potential customers... something roughly like that.

    if you only have a few parts available, and you only want to create some pr and claim the product is available, just launch it at a very high price... you can cash in and make quite some extra money and at the same time you make it look like the product is available, its just too expensive...

    so then people have the idea stuck in their head, gt300=awesome but i cant afford it, then a couple of weeks later gt300 drops in price and cut down chips arrive and people go WOW, must buy!!!!

  12. #37
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Quote Originally Posted by saaya View Post
    well, lets assume the transistor density really is double, then it means a chip is now half the size in 40nm as it was in 55nm. half the size means you get roughly 2x as many chips per wafer. so the cost per chip is half... but if the wafer costs 30% more, then the cost saving is only 20%, and you ALWAYS have worse yields on the new wafer than the old, so... it means you basically save no money, especially if yields are as bad as on tsmcs 40nm...

    so it seems that moving to a newer manufacturing process is mostly just to save power and reach higher clockspeeds, or to be able to produce a chip with lots of transistors that would be impossible on an older node... cost savings seem to only happen when a process has matured, not when its new.

    like if you want to build a chip now, 65nm will be cheaper than 90nm for sure...
    I think you're doing something wrong in your calcs: if the number of transistors increases by 89% (x1.89) and the cost by 30% (x1.3), the number of transistors per dollar (or euro or whatever we want, it's the same) is increased by (1.89/1.3=1.45) 45%, so the price reduction for a given amount of transistors would be (1/1.45=0.69) 31%, not 20%. And I'm using x1.89 (89% increase of transistors/mm^2), not x2 (100% increase).

    Then of course, yields (and lots of other things) can make this vary hugely...

    yeah, but that also depends on the amount of customers...
    you will usually have a pyramid, with some few people who are willing to pay even 5x the normal price, just because...
    then there are maybe 50 people who are willing to pay 3x the normal price
    then 500 that are willing to pay 2x and then for the normal price there are 5000 potential customers... something roughly like that.

    if you only have a few parts available, and you only want to create some pr and claim the product is available, just launch it at a very high price... you can cash in and make quite some extra money and at the same time you make it look like the product is available, its just too expensive...

    so then people have the idea stuck in their head, gt300=awesome but i cant afford it, then a couple of weeks later gt300 drops in price and cut down chips arrive and people go WOW, must buy!!!!
    Yeah, maybe yields could affect the final price through availability, that's true...

  13. #38
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Farinorco View Post
    I think you're doing something wrong in your calcs: if the number of transistors increases by 89% (x1.89) and the cost by 30% (x1.3), the number of transistors per dollar (or euro or whatever we want, it's the same) is increased by (1.89/1.3=1.45) 45%, so the price reduction for a given amount of transistors would be (1/1.45=0.69) 31%, not 20%. And I'm using x1.89 (89% increase of transistors/mm^2), not x2 (100% increase).

    Then of course, yields (and lots of other things) can make this vary hugely...
    dont know what you're doing math wise there, but 40nm and 55nm are just marketing names for the nodes, they dont directly relate to the transistor density afaik... i think the transistor density improvement from 55 to 40nm is actually bigger than it would be if it really were 55nm to 40nm.

    anyways, IF you assume the transistor density increases and the same chip is half the size in 40nm as in 55nm, that means double the chips per wafer, which means the chip price is cut in half.

    right?
    if you assume the transistor density is increased less than that, the chip price drops even less... but lets assume it is cut in half...

    then we add the premium of 30% for the 40nm wafer and the chip isnt 50% the price of the 55nm version anymore, but its 80% the price of the 55nm version.

    so you save 20% on 40nm compared to 55nm... at the same yields...

    or am i making a mistake here somewhere?
    Last edited by saaya; 08-06-2009 at 07:39 AM.

  14. #39
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by saaya View Post
    am i making a mistake here somewhere?
    Yes, you are. His math is correct

  15. #40
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by trinibwoy View Post
    Yes, you are. His math is correct
    wheres my mistake then?

  16. #41
    Xtreme Guru
    Join Date
    Aug 2005
    Location
    Burbank, CA
    Posts
    3,766
    I hope the GT300 comes before december. Sold my GTX 295's, waiting on next gen cards.

  17. #42
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Quote Originally Posted by saaya View Post
    dont know what your doing math wise there, but 40nm and 55nm are just marketing names for the nodes, they dont directly relate to the transistor density afaik... i think the transistor density improvement from 55 to 40nm is actually bigger than it would be if its really 55nm to 40nm.
    Well, regarding 40nm and 55nm, I'm just making an approximation based on extrapolation. Not very accurate, of course, but it's the best I can do to get an idea about it.

    The 40nm or 55nm aren't just marketing names, they're a standard used to define the size of the components implemented with them. So those 40/55nm have to be the measure of something. That something is, if I remember correctly, half the average distance between DRAM memory cells, or something similar (I don't have very accurate knowledge about this kind of thing, it's mostly a hobby).

    Of course, that is a pretty vague description, but I use it to extrapolate and get an idea of what each process node supposes...

    Quote Originally Posted by saaya View Post
    wheres my mistake then?
    Now that I've seen your maths, I know it

    Percentages are multiplicative modifiers, not additive, so you can't add them the way you're doing. When something gives you -50% and another thing gives +30%, you can't add them to get a resulting -20%. -50% is equivalent to multiplying by 0.5 (really, it's the equivalent of multiplying by (100-50)/100, which is 0.5), and +30% is equivalent to multiplying by 1.3 ((100+30)/100=1.3). So that results in a total multiplication by 0.5*1.3=0.65, that is, 65% of the original value, or -35%
    Last edited by Farinorco; 08-06-2009 at 09:46 AM.

  18. #43
    Xtreme Addict
    Join Date
    Mar 2007
    Location
    United Kingdom
    Posts
    1,597
    Quote Originally Posted by GAR View Post
    I hope the GT300 comes before december. Sold my GTX 295's, waiting on next gen cards.
    Blimey GAR,
    What are you gaming on in the meantime?
    I hope you haven't dusted off an S3 Virge or something....
    John
    Stop looking at the walls, look out the window

  19. #44
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by saaya View Post
    wheres my mistake then?
    Do it on a unit cost basis and it's very simple.

    30% increase in wafer cost = 30% increase in per chip cost = 1.3x
    100% increase in chips per wafer = 50% decrease in per chip cost = 0.5x

    so 1.3x * 0.5x = 0.65x

    So a chip will cost 65% of its previous cost, or 35% less. Farinorco got 31% because he used a more conservative estimate of the increase in chips per wafer.
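The multiplicative (not additive) combination the posters are debating is easy to verify with the same numbers:

```python
# A full shrink halves per-chip cost (2x chips per wafer),
# while the 40nm wafer itself costs 1.3x as much.
per_chip_cost_factor = 0.5 * 1.3
print(per_chip_cost_factor)  # 0.65 -> chips cost 35% less, not 20%

# With the more conservative 1.89x density gain instead of a clean 2x:
conservative_factor = 1.3 / (55**2 / 40**2)
print(round(conservative_factor, 2))  # 0.69 -> ~31% less
```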

  20. #45
    Xtreme Guru
    Join Date
    Aug 2005
    Location
    Burbank, CA
    Posts
    3,766
    Quote Originally Posted by JohnZS View Post
    Blimey GAR,
    What are you gaming on in the meantime?
    I hope you haven't dusted off an S3 Virge or something....
    John
    I'm using a gigabyte 4650 1gb, it plays COD4 and TF2 at high settings with no AA @ 1440x900 resolution

  21. #46
    Xtreme Addict
    Join Date
    Jan 2004
    Posts
    1,313
    FACT

    Intel ramped up production of 65nm faster than preceding die shrinks.
    And the 45nm ramp-up was even faster. Whereas the switch-over transition used to take many years, it's now down to months.

    Yet 45nm is the most difficult process. And 32nm is even more complicated. More and more special techniques are required.
    Back in P3/P4 days, there was no worry about static power, interconnect material, gate material, short channel effects, stresses from straining of silicon, or the recent use of double patterning and immersion lithography.

    Amazingly, even though 32nm, 22nm, etc. are even more difficult and present more challenges, Intel and others will be mass producing TRILLIONS of such transistors.

    24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
    1 GB OCZ Gold (='.'=) 240 2-2-2-5
    Giga-byte NF3 (")_(") K8NSC-939
    XFX 6800 16/6 NV5 @420/936, 1.33V

  22. #47
    Xtreme Enthusiast
    Join Date
    Feb 2007
    Location
    So near, yet so far.
    Posts
    737
    Quote Originally Posted by ***Deimos*** View Post
    FACT

    Intel ramped up production of 65nm faster than preceding die shrinks.
    And the 45nm ramp-up was even faster. Whereas the switch-over transition used to take many years, it's now down to months.

    Yet 45nm is the most difficult process. And 32nm is even more complicated. More and more special techniques are required.
    Back in P3/P4 days, there was no worry about static power, interconnect material, gate material, short channel effects, stresses from straining of silicon, or the recent use of double patterning and immersion lithography.

    Amazingly, even though 32nm, 22nm, etc. are even more difficult and present more challenges, Intel and others will be mass producing TRILLIONS of such transistors.

    Translation:

    When AMD/ATI can get their wafers out a bit early, ahead (albeit with yield concerns), nVidia cannot. I don't know if they (nV) started late or are just suffering from bad yields.
    Or perhaps they're too hooked on old card rebrandings.
    [[Daily R!G]]
    Core i7 920 D0 @ 4.0GHz w/ 1.325 vcore.
    Rampage II Gene||CM HAF 932||HX850||MSI GTX 660ti PE OC||Corsair H50||G.Skill Phoenix 3 240GB||G.Skill NQ 6x2GB||Samsung 2333SW

    flickr

  23. #48
    Xtreme Enthusiast
    Join Date
    Dec 2008
    Posts
    752
    I want them to just be released

    thats all i want...

    BOTH series look like MASSIVE improvements to the current gen if speculation and even some rumoured benchmarks are even close to true



    I'm getting the best performer either way but i hope gt300 succeeds for some reason even though I personally like amd more... maybe its just the underdog mentality because if r800 (cypress or w/e this gen is called) is MCM then gt300 has a hard cookie coming its way

  24. #49
    Xtreme Addict
    Join Date
    Aug 2005
    Location
    Germany
    Posts
    2,247
    agree @orangekiwii.

    it's about time we get to see some new tech! ever since the g80 came out, performance didn't increase that much, imo. even though i love my 4850, it's just a tad faster than a 8800gtx, or on par with the 9800gtx. i'm talking about single-gpu performance btw. i couldn't care less about multi-gpu graphics cards as i hate being dependent on driver profiles and experiencing things like microstuttering and what not.
    1. Asus P5Q-E / Intel Core 2 Quad Q9550 @~3612 MHz (8,5x425) / 2x2GB OCZ Platinum XTC (PC2-8000U, CL5) / EVGA GeForce GTX 570 / Crucial M4 128GB, WD Caviar Blue 640GB, WD Caviar SE16 320GB, WD Caviar SE 160GB / be quiet! Dark Power Pro P7 550W / Thermaltake Tsunami VA3000BWA / LG L227WT / Teufel Concept E Magnum 5.1 // SysProfile


    2. Asus A8N-SLI / AMD Athlon 64 4000+ @~2640 MHz (12x220) / 1024 MB Corsair CMX TwinX 3200C2, 2.5-3-3-6 1T / Club3D GeForce 7800GT @463/1120 MHz / Crucial M4 64GB, Hitachi Deskstar 40GB / be quiet! Blackline P5 470W

  25. #50
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,476
    Quote Originally Posted by trinibwoy View Post
    Charlie says 4 GT300 variants on the way soon after big daddy drops.

    http://www.semiaccurate.com/2009/08/...-variants-tip/
    "If you take yield into account, that cost will be much higher. We shall see."

    LOL, yeah, that's been going really well for Nvidia the last couple of years. Maybe they are hoping for ATI's card to flop so they can charge us $650+ for another 8800GTX type card again.
    i3 2100, MSI H61M-E33. 8GB G.Skill Ripjaws.
    MSI GTX 460 Twin Frozr II. 1TB Caviar Blue.
    Corsair HX 620, CM 690, Win 7 Ultimate 64bit.
