
Thread: Nvidia 270, 290 and GX2 roll out in November

  1. #1
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084

    Nvidia 270, 290 and GX2 roll out in November

    Why you shouldn't get too excited

    IF YOU WERE wondering why Nvidia put out the GTX260-216 with a stupid-sounding name instead of the saner 270 moniker, here is your answer. There is a 270 coming, it will have a big brother called the 290, and a dual card code-named "China Syndrome"(1).

    Yeah, NV is in deep doo-doo right now. The card that was meant to power its way to profits, the GTX280, went from $649 at launch to $499 a few weeks later. A quarter or so on, it is selling retail for sub-$400 prices here and there. AIBs tell us that the 260 costs an ironic $260 to make, which closely matches the teardown numbers we have seen. Toss in the mandatory 15 per cent markup at the retail level, and if you see one for sale at under $300, someone is eating money. Basically, if you can make money on these parts, something is wrong.
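    If you want to check that maths, it works out like this (the $260 build cost and 15 per cent markup are the figures above; the little Python sketch is just our back-of-envelope):

        # Break-even shelf price from the numbers above.
        build_cost = 260.00     # reported AIB cost to build a GTX260, in USD
        retail_markup = 0.15    # the "mandatory" retail-level markup
        breakeven = build_cost * (1 + retail_markup)
        print(f"break-even shelf price: ${breakeven:.2f}")  # -> $299.00
        # Anything on a shelf under $300 means someone in the chain eats money.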



    On the up side, the 280 is the single fastest GPU on the market. On the down side, people don't buy GPUs, they buy graphics cards, and the 280 is not the single fastest graphics card on the market. That honour goes to the ATI 4870X2 by a large margin. With the new-gen GT200 parts, Nvidia loses on all fronts: performance, performance per dollar, and performance per watt. They simply aren't competitive.

    That brings us to the new parts, the 270 and 290. They popped up on a PNY price list a few weeks ago, and then were pulled immediately. This part is what we were calling the GT200b in May, but the public code name is GT206. It is simply an optically shrunk GT200, so clock for clock, you won't get any speed boost out of it. It is meant to fatten up the margins by reducing cost. If the GT200 is a 576mm^2 die, and the 206 is around 460mm^2 (a ~21.4mm * 21.4mm die), even with the more expensive 55nm process, NV should save some money.
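    To put rough numbers on that saving, here is a dies-per-wafer sketch using the classic approximation (die sizes as above; the 300mm wafer is standard, defects are ignored entirely, so treat it as illustrative only):

        import math

        WAFER_DIAMETER = 300.0  # mm, standard wafer size at TSMC

        def dies_per_wafer(die_area_mm2):
            """Classic estimate: gross wafer area minus an edge-loss term."""
            d = WAFER_DIAMETER
            return int(math.pi * (d / 2) ** 2 / die_area_mm2
                       - math.pi * d / math.sqrt(2 * die_area_mm2))

        print(dies_per_wafer(576))  # 65nm GT200: ~94 candidate dies per wafer
        print(dies_per_wafer(460))  # 55nm GT206: ~122 candidate dies per wafer
        # Roughly 30 per cent more dies per wafer, which is where the margin
        # relief comes from even if 55nm wafers cost somewhat more.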

    One big problem, though, is yield. GT200s started out at 40 per cent yield, that is for the 280 and the yield-salvage 260, and were up to a hair above 60 per cent last time we looked. Toss in the smoothly-named GTX260-216 and you screw up the binning a lot. When you transition to a new process, yield almost always goes down, so this part should be back at the bottom of the yield toilet in short order (not that 60 per cent is far out of the bowl).
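    Why the yield number matters so much: the wafer price is fixed, so the cost per good die scales inversely with yield. A quick sketch (the wafer price here is our own illustrative guess, not a reported figure):

        WAFER_COST = 5000.0   # USD per wafer; purely illustrative, not reported
        CANDIDATE_DIES = 94   # GT200-sized dies per wafer, from the sketch above
        for yield_rate in (0.40, 0.60):
            good_dies = CANDIDATE_DIES * yield_rate
            print(f"yield {yield_rate:.0%}: ${WAFER_COST / good_dies:.0f} per good die")
        # yield 40%: ~$133 per good die; yield 60%: ~$89 per good die.
        # A process transition that knocks yield back down gives back much of
        # what the area shrink just saved.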

    So what are the 270 and 290? That is easy, they are 55nm GT200s, aka GT200b, aka GT206. Nothing new, nothing spectacular at all. Why the new name then, other than desperation, if you get nothing different clock for clock? Well, that is easy, Nvidia simply has to bump up margins, and the easiest way to do that is to snow customers.

    If you remember what they did when renaming the 8800GT to the 9800GT, it is the same thing. Partners can't make money, and a name change will make the stupid fanbois out there line up. Milking the stupid is a time-honoured tradition in the GPU world, and since the G92b, the 55nm shrink of the G92, sold less badly once it was renamed the 9800GTX+, it looks like they are trying it again. No, not the GT1xx; this time it is going to be less egregiously renamed to the 270 and 290.

    When you shrink a chip, there are three main benefits, area, power and speed, with the last two being a tradeoff of sorts. Going from 65nm to 55nm, you shrink to about 70 per cent of the area and save a bit of power. The power savings, however, are nowhere near the full 30 per cent off the top, and on top of that you lose more to leakage, as smaller structures leak more. People who have done the same shrink at TSMC tell us that you net far less than 20 per cent. Let's be generous and call the power savings 15 per cent.
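    The area claim is easy to check; the power figure, note, is a hedged guess rather than anything derived:

        # Area scales with the square of the linear shrink.
        area_scale = (55 / 65) ** 2   # ~0.716, i.e. about 70% of the 65nm area
        print(f"area: {area_scale:.1%} of the original")
        # Power is worse: leakage grows as structures shrink, and people who
        # have done this shrink at TSMC report netting well under 20 per cent,
        # hence the generous 15 per cent working figure.
        POWER_SAVING = 0.15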

    The 280 has a TDP of 236W and the 260 is at 182W, with the 260-216 somewhere in between. Since the RAM, external circuitry, power regulators and fans don't scale at all, we will ballpark the board-level power savings at 10 per cent. That would put theoretical 270s at 164W and the 290 at 212W, at the same clocks, but let's be overly generous and call it 160 and 200, just to give NV the benefit of the doubt.
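    Plugging those numbers in at board level (published TDPs, with the chip-level saving diluted to roughly 10 per cent because the RAM, VRMs and fans don't shrink):

        BOARD_SAVING = 0.10                      # ballpark board-level saving
        tdps = {"GTX280 -> '290'": 236, "GTX260 -> '270'": 182}  # watts
        for label, watts in tdps.items():
            print(f"{label}: ~{watts * (1 - BOARD_SAVING):.0f}W")
        # GTX280 -> '290': ~212W; GTX260 -> '270': ~164W at the same clocks,
        # rounded down further to 200W and 160W to give NV every benefit.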

    As always with transistors, you can take the power savings as such while keeping the clocks the same, or keep power consumption the same and jack up speed. Guess which one NV is going to do? Hint: "20W less power consumption!" on the side of the box won't sell many cards. You would be better off airbrushing bigger nipples on the chrome chick cover art.

    So, in the end, the 270 and 290 are simply slightly faster 260s and 280s. Whoopty-fricking-ding-dong. They will be relaunched as the second coming, and sites that are far too afraid of being cut off will parrot back the NV sermon. It won't be fast enough to beat a 4870X2, not even close, but it will let NV jack up prices just in time for Christmas, giving their partners a desperately-needed few points of margin. NV fanbois will line up for it, and wonder why they took such a bath on the $650 280s they bought.

    These cards were supposed to be out in late August or early September, but are now set for November. That is about time enough for a single respin, so something must have gone a bit pear-shaped. In any case, it is much less delayed than the GT212, but that is another article entirely. AIBs just got their 270/290 boards recently, so both are a few weeks out yet.

    If you are underwhelmed, then the dualie card is for you. We haven't seen a code name for it officially, but NV AIBs are talking about it. Take a 55nm GT200b/206 and put two PCBs together a la the 9800GX2, and you get the idea. There is one minor problem this time... heat.

    The G92 (8800GT/9800GTX/GT15x) was coolable, barely, with a single-slot cooler. The GT200 is not. Even with a theoretical 20 per cent lower power draw, you would be at about 290W for a dual 55nm 260 clone. If you use a 260-216 or jack the clock up, you are at 300+W in an instant, and we can see 350W without trying hard.
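    That 290W comes straight from the same arithmetic, this time with the overly generous 20 per cent chip-level saving:

        GTX260_TDP = 182              # watts, published TDP
        OPTIMISTIC_SAVING = 0.20      # the theoretical best case quoted above
        dual_draw = 2 * GTX260_TDP * (1 - OPTIMISTIC_SAVING)
        print(f"dual 55nm 260 clone: ~{dual_draw:.0f}W")  # -> ~291W
        # Start from a 260-216 or raise the clocks, and 300W arrives fast.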

    Normally, you would do what ATI did with the 3870X2 and downclock it a bit. Not much, just a little, and take the power savings. The problem is that the 260 already loses to the cheaper 4870, and two of them wouldn't be much of a fight against the 4870X2. At a minimum, you need two 260-216s, or better yet two upclocked 260-216s, to pip the 4870X2 and claim a hollow victory.

    NV is in a real bind here: it needs a halo part, but the silicon won't allow it. If it jacks up the clocks to get the performance it needs, it can't power the thing. Then there is the added complication of how the heck you cool the damn thing. With a dual PCB, you have less than one slot to cool something that runs hot with a two-slot cooler. In engineering terms, this is what you call a mess.

    Given NV's problems of late with cooling (here, here and here), it is in a bind, and there is no way out of this, none at all. The only thing it can do is resort to unethical tactics, and that is what we think it will do here: cherry-pick ultra-low-power GT200b parts and make a small run of GX2s that don't burn a hole in the bottom of your case on their way down to the center of the earth.

    If this launch follows past tactics, NV will make a very small run of parts and claim full production. Think 1,000 parts or so, most of which go to reviewers. The rest will go to high-profile retailers, think Newegg, and they will sell out. When people cry for more, the usual 'high demand' lines will be spun, and NV will dribble out 10-20 cards here and there to keep up appearances.

    Pricing will likely be right on top of the 4870X2, maybe $50 more in order to bolster the performance claims, and they will undoubtedly be sold at a loss. Then again, with the number likely to be made, it is chump change to take a small bath on each one to claim the lead for Christmas.

    In any case, if the three upcoming parts, the 270, 290 and GX2, look like naked, desperate attempts to grab at a halo, you are right. Upping the clocks will hit NV in the bottom line at the end of the quarter; margins will suffer because of this stupidity. Fanbois will love it, though; massively subsidised parts are great deals for consumers. In fact, you will likely see massively discounted 260s and 280s in a few weeks when the new parts hit the street. They won't save Nvidia's bacon, but they may help partner margins. For a short time. At a high price. µ

    (1) It was code-named "Smoking Seppuku", but Nvidia didn't want to publicly suggest that this strategy was vaguely related to anything honorable, so it was renamed. We saw the memo, it was poignant. (2)

    (2) We are, of course, making this all up.
    http://www.theinquirer.net/gb/inquir...0-290-gx2-roll
    Quote Originally Posted by Shintai View Post
    And AMD is only a CPU manufactor due to stolen technology and making clones.

  2. #2
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Puerto Rico
    Posts
    1,374
    I stopped reading when I saw the source...

  3. #3
    Xtreme Member
    Join Date
    Jun 2008
    Location
    Finland
    Posts
    111
    When I saw the headline I thought: yeah, finally some new info about the GT200b. While reading through the article it became clear that it was very anti-Nvidia, and I thought "it's gotta be the Inquirer". And guess what, it was. I didn't even bother to finish it.

  4. #4
    Xtreme Addict
    Join Date
    Jul 2008
    Location
    SF, CA
    Posts
    1,294
    yeah well i'm at work so i read the whole thing and was mildly entertained.
    i still don't lend it any credence, it's just another nVidia bashfest. Which I don't mind, but it's getting old.

  5. #5
    Xtreme Addict
    Join Date
    Feb 2007
    Location
    Italy
    Posts
    1,331
    I really can't stand the ATI fanboyism of the INQ.

    The perf. margin between the GTX 280 and the 4870X2 is like 20%, and not in every game title.

    SB Rig:
    | CPU: 2600K (L040B313T) | Cooling: H100 with 2x AP29 | Motherboard: Asrock P67 Extreme4 Gen3
    | RAM: 8GB Corsair Vengeance 1866 | Video: MSI gtx570 TF III
    | SSD: Crucial M4 128GB fw009 | HDDs: 2x GP 2TB, 2x Samsung F4 2TB
    | Audio: Cantatis Overture & Denon D7000 headphones | Case: Lian-Li T60 bench table
    | PSU: Seasonic X650 | Display: Samsung 2693HM 25,5"
    | OS: Windows7 Ultimate x64 SP1

    +Fanless Music Rig: | E5200 @0.9V

    +General surfing PC on sale | E8400 @4Ghz

  6. #6
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
    That was an excellent article by the INQ!!!

    And that is what I was thinking too! After Nvidia named the dual "8800GTS-512" solution a new number generation, the 9800GX2, there was a leak of the GTX-350. It was rumored to have 2GB of GDDR5 memory at 512-bit bus width. At first, I was wondering "what the heck"?!? Now, it all makes sense, given that the bandwidth is "doubled" on Nvidia's dual-GPU solution, and that Nvidia was always a sucker for making a dual-GPU solution since the 7900GX2 days. I knew that at 55nm, it was Nvidia's only chance of beating the 4870X2, by shrinking the GT200 cores and forcing them into a dual-GPU solution.

    The INQ did a good job at pointing out the power consumption and cooling issues. It is unquestionably going to be a huge challenge to cool this sandwiched thing!!

    GTX 290 (die shrink):
    - slightly lower power consumption and/or higher clocks
    - possibly GDDR5 memory

    GTX 350 (dual GT200b):
    - ~15-30% more power consumption than the 4870X2
    - probably has to be less than 600MHz per core due to cooling limits
    - still loses to the 4870X2 in some games
    - insane cooling fan noise
    Last edited by Bo_Fox; 10-09-2008 at 05:41 AM.

    --two awesome rigs, wildly customized with
    5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
    --SONY GDM-FW900 24" widescreen CRT, overclocked to:
    2560x1600 resolution at 68Hz!(from 2304x1440@80Hz)

    Updated List of Video Card GPU Voodoopower Ratings!!!!!

  7. #7
    Xtreme Enthusiast
    Join Date
    Sep 2007
    Location
    Jakarta, Indonesia
    Posts
    924
    Wow, a dual 460 mm^2 GPU card. While it might or might not take the performance crown, it sure would be one heck of a power hog and heat generator at the same time.

  8. #8
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    112
    Yeah, GX2 solution with 460mm2 die size!?!?

    Well, I think if the GTX270/290 are only slightly faster than the current models, the only way for NVIDIA to fight AMD is to give people prices comparable to AMD's HD48xx prices. For example, a GTX270 at the HD4870 level.

  9. #9
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
    Quote Originally Posted by Blacky View Post
    I stopped reading when I saw the source...
    Say the source grew fully-rounded breasts and matured with wide, curvy hips, would you still refuse to have sex with her? Oh well, your loss....


  10. #10
    Xtreme Member
    Join Date
    Mar 2008
    Location
    germany-münster
    Posts
    375
    there's always at least a minimum of truth in the INQ's articles

    but i guess they're right this time, it's not like nvidia hasn't done this before

    think availability of the 7800 GTX 512MB + major heat problems...
    system:

    Phenom II 920 3.5Ghz @ 1.4v, benchstable @ over 3,6Ghz (didnt test higher)
    xigmatek achilles
    sapphire hd4870 1gb @ 820 1020
    Gigabyte GA-MA790GP-DS4H
    8gb a-data 4-4-4-12 800
    x-fi xtrememusic
    rip 2x 160gb maxtor(now that adds up to 4...)
    320gb/250gb/500gb samsung

  11. #11
    Xtreme Cruncher
    Join Date
    Jun 2006
    Posts
    6,215
    So NV is going after the GX2 route again? I wonder if they will raise the clocks of the GPUs to try and match the 4870X2.

  12. #12
    Diablo 3! Who's Excited?
    Join Date
    May 2005
    Location
    Boulder, Colorado
    Posts
    9,412
    It's only natural for Nvidia to throw 2 GT200 variants into a GX2 card to try and re-capture the performance crown. That's going to be one expensive card though.. $600+ for a while.

  13. #13
    Xtreme Addict
    Join Date
    Mar 2008
    Location
    川崎市
    Posts
    2,076
    There is a shrunk version coming of the GTX260 and 280? Like anyone wouldn't have been able to guess that...

    And a dual card is not impossible; 2x the heat of a 55nm GTX260 probably won't be much more than what current OC'd GTX280s produce. Also, nvidia does have a history of taking cooling solutions to the extreme, just think of the 5xx0 Ultras.

  14. #14
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
    Quote Originally Posted by Barys View Post
    Yeah, GX2 solution with 460mm2 die size!?!?

    Well, I think if the GTX270/290 are only slightly faster than the current models, the only way for NVIDIA to fight AMD is to give people prices comparable to AMD's HD48xx prices. For example, a GTX270 at the HD4870 level.
    That would hurt GTX260 sales a bit too much, dropping it to something like say 150~170 EUR / 220~240 USD.

    Quote Originally Posted by informal View Post
    So NV is going after the GX2 route again? I wonder if they will raise the clocks of the GPUs to try and match the 4870X2.
    I seriously doubt it; like the INQ said, it's more likely that they have to drop the clocks a bit in order to keep power draw realistic.


    NV is not in a good situation right now; for customers there aren't any huge problems here, but as a company it ain't doing so well.
    Last edited by RPGWiZaRD; 10-09-2008 at 05:47 AM.
    Intel? Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  15. #15
    Xtreme Member
    Join Date
    Mar 2007
    Posts
    317
    If nVidia manages to put out a GX2 at competitive prices, then I don't see where INQ's bashing comes from. I mean, they could very well be paid by ATI to say nonsense, but there is no greater folly than mocking the engineering feat of a GX2 GT200 card (which BTW will obliterate the 4870X2 and make the INQ cry). If nVidia puts out those cards they're kings again, which is rather sad, but things won't get better for ATI by mocking nVidia; they would have to get their act together once again, as their share of the market will start dwindling shortly after.
    Last edited by Stevethegreat; 10-09-2008 at 06:03 AM.

  16. #16
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    The source is right about the die size at least. 10/100!

    P.S.: There IS a reason why even the simpler G80 never had a dual-PCB version.


    Plus, think of it: we're now 3.5 months after the launch of the current gen, and the next gen should come in 6 months. Why get these (which will launch in winter) when the next batch of chips is immensely better? (At least the RV870: near GT200 GX2 performance for cheap. And yes, that's by common guesstimates.)
    Last edited by Macadamia; 10-09-2008 at 05:48 AM.
    Quote Originally Posted by radaja View Post
    so are they launching BD soon or a comic book?

  17. #17
    Xtreme Addict
    Join Date
    Oct 2006
    Location
    England, UK
    Posts
    1,838
    Quote Originally Posted by RealTelstar View Post
    I really can't stand the ATI fanboyism of the INQ.

    The perf. margin between the GTX 280 and the 4870X2 is like 20%, and not in every game title.
    Depends on whether you use AA or not. From the benchmarks I've taken in Crysis against the GTX280 at high res with AA, it shows around a 50% increase in performance.

  18. #18
    Xtreme Enthusiast
    Join Date
    Jul 2007
    Location
    Kuwait
    Posts
    616
    really nice article by the INQ but they are raciest when it comes to Nvidia
    although they have a good point about watts vs. heat and how to cool a GX2 made of 260s

    but i can't wait for GTX290 reviews, i hope it's at least 30% better than the GTX280, if not i'm gonna pass on upgrading again

  19. #19
    Xtreme Member
    Join Date
    Mar 2007
    Location
    Pilipinas
    Posts
    445
    Quote Originally Posted by Stevethegreat View Post
    ...the engineering feat of a GX2 GT200 card (which BTW will obliterate the 4870X2 and make the INQ cry)...
    Engineering feat of slapping two PCBs together?

  20. #20
    Xtreme Member
    Join Date
    Mar 2007
    Posts
    317
    Quote Originally Posted by insurgent View Post
    Engineering feat of slapping two PCBs together?
    No, I was speaking of its cooling. Getting such a thing's heat output down to something air can still cool IS a feat, no matter how you spin it.

  21. #21
    Xtreme Addict
    Join Date
    Feb 2007
    Location
    Italy
    Posts
    1,331
    Quote Originally Posted by Stevethegreat View Post
    If nVidia manages to put out a GX2 at competitive prices, then I don't see where INQ's bashing comes from.
    From their stupidity...

    I mean, they could very well be paid by ATI to say nonsense.
    ...or more likely this


  22. #22
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Puerto Rico
    Posts
    1,374
    Quote Originally Posted by Bo_Fox View Post
    Say the source grew fully-rounded breasts and matured with wide, curvy hips, would you still refuse to have sex with her? Oh well, your loss....
    Shhhhh

    ...

    I lied lulz

  23. #23
    Xtreme Addict
    Join Date
    Feb 2007
    Location
    Italy
    Posts
    1,331
    Quote Originally Posted by Scubar View Post
    Depends on whether you use AA or not. From the benchmarks I've taken in Crysis against the GTX280 at high res with AA, it shows around a 50% increase in performance.
    Crysis is not the only game out there, and it is not the best-optimized engine either.

    Now, if you tell me that the 4870X2 has better image quality (anti-aliasing), then yes, I agree with you. But it was vaporware for over two months. If I had to buy a graphics card, let's say in September (when the 4870X2 was really available in Europe), I would probably have bought it.


  24. #24
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    Quote Originally Posted by C.Ron7aldo View Post
    really nice article by the INQ but they are raciest when it comes to Nvidia
    although they have a good point about watts vs. heat and how to cool a GX2 made of 260s

    but i can't wait for GTX290 reviews, i hope it's at least 30% better than the GTX280, if not i'm gonna pass on upgrading again
    I never knew the INQ was racy towards nVidia.


    Kinky sex, maybe?

  25. #25
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
    Quote Originally Posted by RPGWiZaRD View Post
    That would hurt GTX260 sales a bit too much, dropping it to something like say 150~170 EUR / 220~240 USD.



    I seriously doubt it; like the INQ said, it's more likely that they have to drop the clocks a bit in order to keep power draw realistic.


    NV is not in a good situation right now; for customers there aren't any huge problems here, but as a company it ain't doing so well.
    The GTX260s are already selling in that price range (as low as $200 after rebates).

    http://www.newegg.com/Product/Produc...82E16814127361

    It all depends on how much Nvidia gets out of the 55nm process. It certainly looks like Nvidia did a re-spin of the GT200b in order to optimize the clock speeds. If the core is at, say, 700MHz and the shaders at 1600MHz, Nvidia will have won most of us back over from ATI.


    Quote Originally Posted by Stevethegreat View Post
    If nVidia manages to put out a GX2 at competitive prices, then I don't see where INQ's bashing comes from. I mean, they could very well be paid by ATI to say nonsense, but there is no greater folly than mocking the engineering feat of a GX2 GT200 card (which BTW will obliterate the 4870X2 and make the INQ cry). If nVidia puts out those cards they're kings again, which is rather sad, but things won't get better for ATI by mocking nVidia; they would have to get their act together once again, as their share of the market will start dwindling shortly after.
    Nah, although the INQ does bash Nvidia *hard*, it was a great article nonetheless. It certainly does sound very plausible.

    Let's look at the reality for a minute: do you really think that 55nm is going to bring more than the generous "15%" power saving the INQ allowed for? It was generous of the INQ to say that, for Nvidia's sake. Do you think it would actually bring more than a 50 MHz boost in clock speed without increasing power usage? Apparently, 55nm only allowed the 9800GTX+ a 68 MHz clock increase in the same power envelope. The 9800GTX+ still failed to show a clear lead over the single-slot HD 4850, which was a great disappointment for Nvidia.


