
Thread: HD 4850 Previews

  1. #1
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    407
    Quote Originally Posted by zerazax
    G80 had a smaller die size, used less power and generated less heat and we didn't see a GX2 card with that core until the G92 at 65nm so I doubt we will see GT200 in a GX2 card form until 40nm at least
    Nah. I'm not buying that. You think Nvidia is just going to stand by while AMD takes the lead until they get to 40nm? No way. The 7950GX2 was a 7900GT sandwich at 90nm. There was no die shrink necessary to make that. Sure. Cooling will be a problem, but it always is with the GX2 cards. It's not like Nvidia has to even engineer a new card. They just need to stick two cards together and put them on a single PCIe slot. Nvidia is going to take back their lead no later than spring 2009 and a GX2 card is precisely how they are going to win it back.

    The fact is you can only get so much out of CF/SLI before diminishing returns kick in. It's basically a hack introduced by 3DFX. We're lucky it works at all. Now maybe the 4870X2 is going to revolutionize the world of GPUs by changing that. But I'll believe that when I see it. Even in a post-4870X2 world Nvidia is still going to have the advantage in a sense, because I doubt the 4870 is going to be faster than a GTX280 at 65nm, and it's even less likely once the GTX280 is shrunk down to 55nm. We can all enjoy AMD's victory this summer, but they are going to have to pull out quite a few rabbits if they want to keep it.

  2. #2
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by gojirasan View Post
    Nah. I'm not buying that. You think Nvidia is just going to stand by while AMD takes the lead until they get to 40nm? No way. The 7950GX2 was a 7900GT sandwich at 90nm. There was no die shrink necessary to make that. Sure. Cooling will be a problem, but it always is with the GX2 cards. It's not like Nvidia has to even engineer a new card. They just need to stick two cards together and put them on a single PCIe slot. Nvidia is going to take back their lead no later than spring 2009 and a GX2 card is precisely how they are going to win it back.
    How do you propose dissipating 450W+ of heat with a dual-slot cooler?
    The limit for current dual-slot coolers is obviously right around a 250W TDP.
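    The back-of-envelope math behind that 450W+ figure can be sketched out quickly. This assumes the widely cited ~236W board TDP for a single GTX 280; the per-GPU power reduction is an illustrative guess, not an official spec:

    ```python
    # Rough estimate for a hypothetical dual-GT200 "GX2" card.
    # Assumes ~236W TDP for one GTX 280 (65nm GT200); numbers are
    # illustrative, not official specifications.
    gtx280_tdp_w = 236                  # approximate single-GPU board TDP
    dual_gpu_tdp_w = 2 * gtx280_tdp_w
    print(dual_gpu_tdp_w)               # 472 -- roughly the "450W+" cited above

    # Even if binned chips at lower clocks cut per-GPU power by ~20%
    # (a hypothetical figure), the total still lands well above 250W:
    binned_tdp_w = round(2 * gtx280_tdp_w * 0.8)
    print(binned_tdp_w)                 # 378
    ```

    Either way the total sits far beyond what a ~250W dual-slot cooler is built to handle, which is the crux of the objection.
    
    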

  3. #3
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    407
    Quote Originally Posted by LordEC911 View Post
    How do you propose dissipating 450W+ of heat with a dual-slot cooler?
    The limit for current dual-slot coolers is obviously right around a 250W TDP.
    With a large heatpipe cooler like the TRUE or the Scythe Orochi, but engineered to attach to the video card. You have almost the same problem with GTX280 SLI, but I haven't heard of cards melting just from the stock cooling.

  4. #4
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    464
    Isn't ATI dropping to 45 or 40nm soon, like end of year or Q1 '09?
