
Thread: GeForce 9900 GTX & GTS Slated For July Launch


  1. #1
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    168
    If the 280 GTX performs as well as it sounds, I will definitely be buying it, depending on whether the GTX or the HD4870X2 performs best. Do you think my Corsair 520HX PSU will handle it? I'm running an HD2900XT and a quad, which are both doing fine. Buying a new PSU is something I'm not tempted to do.
    Last edited by Nuker_; 05-20-2008 at 01:12 PM.

  2. #2
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,970
    Quote Originally Posted by Nuker_ View Post
    If the 280 GTX performs as well as it sounds, I will definitely be buying it, depending on whether the GTX or the HD4870X2 performs best. Do you think my Corsair 520HX PSU will handle it? I'm running an HD2900XT and a quad, which are both doing fine. Buying a new PSU is something I'm not tempted to do.
    I'm banking on mine handling it (520HX Corsair). Time to sell off the SLI 8800GT 512s . I am kind of concerned still about whether a PCI-E 2.0 slot will be needed to get the most out of this though I'm sure it's fine, as I don't have one .
    Last edited by GoldenTiger; 05-20-2008 at 07:11 PM.

  3. #3
    Xtreme Addict
    Join Date
    Jun 2007
    Location
    above USA...below USSR
    Posts
    1,186
    The upcoming GeForce GTX 260/280 GPUs are based on the GT200 (NV60) core and will be built using a 65 nm manufacturing process at TSMC. However, we heard that not many chips will actually fit on a 300 mm wafer, since Nvidia has come up with a huge die measuring 24 x 24 mm, resulting in a die area of 576 mm2. This area is almost 100 mm2 larger than Nvidia’s previous 90 nm high-end GPU (G80) and a consequence of 16 processing blocks (G80 came with nine blocks, eight were enabled for GTX models, six for GTS versions) and a new 512-bit memory controller, which replaces the old 384-bit model (the GTX 260 will integrate a 448-bit version). The current G92 or GeForce 8800GT/8800GTS512/9800GTX/9800GX2 GPUs are built in a 65 nm process and end up at a die size of 330 mm2. With its new GPU generation, Nvidia is going to continue on the safe route and plan with enough spare transistors for 240 shader units (actually, 240FP+240MADD). The same will be the case with the GeForce GTX 280 and 260. The GPU will have 15 processing units (240 shader processors) available on the GTX280, while the GTX260 will come with 12 units for a grand total of 192 shader processors. Our sources state that the manufacturing cost of the GT200 die is somewhere between $100 and $110 per piece.

    Source: http://www.tgdaily.com/content/view/37554/135/
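    The article's "not many chips will actually fit on a 300 mm wafer" claim can be sanity-checked with the standard dies-per-wafer approximation. This is a hypothetical illustration, not from the thread; the formula and constant names (`WAFER_D`, `DIE_AREA`) are my own:

    ```python
    import math

    WAFER_D = 300.0          # wafer diameter in mm
    GT200_AREA = 24.0 * 24.0 # 576 mm^2, per the article
    G92_AREA = 330.0         # mm^2, per the article, for comparison

    def dies_per_wafer(diameter_mm, die_area_mm2):
        # Classic approximation: usable wafer area over die area,
        # minus a correction term for partial dies lost at the edge.
        return (math.pi * diameter_mm ** 2 / (4 * die_area_mm2)
                - math.pi * diameter_mm / math.sqrt(2 * die_area_mm2))

    print(dies_per_wafer(WAFER_D, GT200_AREA))  # ~95 gross candidates, before yield loss
    print(dies_per_wafer(WAFER_D, G92_AREA))    # ~177 for G92
    ```

    So even before defect yield, a GT200-sized die gets roughly half as many candidates per wafer as G92, which is consistent with the article's cost concern.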
    Case-Coolermaster Cosmos S
    MoBo- ASUS Crosshair IV
    Graphics Card-XFX R9 280X [out for RMA] using HD5870
    Hard Drive-Kingston 240Gig V300 master Seagate 160Gb slave Seagate 250Gb slave Seagate 500Gb slave Western Digital 500Gb
    CPU-AMD FX-8320 5Ghz
    RAM 8Gig Corsair c8
    Logitech 5.1 Z5500 BOOST22
    300Gb of MUSICA!!


    Steam ID: alphamonkeywoman
    http://www.techpowerup.com/gpuz/933ab/

  4. #4
    Xtreme Addict
    Join Date
    Jul 2007
    Location
    Canada
    Posts
    1,397
    Not unthinkable, but the 250W seems a little odd.

    When the 8800GTX hit the scene, people were claiming it pulled 225W, since that was the sum of the max wattages of the PCI-E slot+2 PCI-E connectors. They were also claiming that R600 needed 250W, since "apparently" ATI didn't think 2 6-pin connectors were adequate.

    Personally, I don't care if this thing wants 500W, as long as it uses all that power for performance. Mind, if it DOES use that kind of power, the PSU industry might just cream its collective pants.
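    The connector arithmetic behind those old wattage claims is simple to lay out. A quick sketch (the per-connector limits below are the PCI-E spec maximums, not anything measured on these cards):

    ```python
    # Maximum power delivery per source, per the PCI Express spec:
    PCIE_SLOT = 75   # W from the x16 slot itself
    SIX_PIN = 75     # W per 6-pin auxiliary connector
    EIGHT_PIN = 150  # W per 8-pin auxiliary connector

    # 8800GTX-style claim: slot + two 6-pin connectors
    print(PCIE_SLOT + 2 * SIX_PIN)          # 225 W ceiling

    # 6-pin + 8-pin configuration (as rumored for GTX 280)
    print(PCIE_SLOT + SIX_PIN + EIGHT_PIN)  # 300 W ceiling
    ```

    These are ceilings on what the connectors can deliver, not measurements of actual draw, which is exactly the TDP-vs-load distinction raised in the reply below.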
    i7 2600K | ASUS Maximus IV GENE-Z | GTX Titan | Corsair DDR3-2133

  5. #5
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by Monkeywoman View Post
    The upcoming GeForce GTX 260/280 GPUs are based on the GT200 (NV60) core and will be built using a 65 nm manufacturing process at TSMC. However, we heard that not many chips will actually fit on a 300 mm wafer, since Nvidia has come up with a huge die measuring 24 x 24 mm, resulting in a die area of 576 mm2. This area is almost 100 mm2 larger than Nvidia’s previous 90 nm high-end GPU (G80) and a consequence of 16 processing blocks (G80 came with nine blocks, eight were enabled for GTX models, six for GTS versions) and a new 512-bit memory controller, which replaces the old 384-bit model (the GTX 260 will integrate a 448-bit version). The current G92 or GeForce 8800GT/8800GTS512/9800GTX/9800GX2 GPUs are built in a 65 nm process and end up at a die size of 330 mm2. With its new GPU generation, Nvidia is going to continue on the safe route and plan with enough spare transistors for 240 shader units (actually, 240FP+240MADD). The same will be the case with the GeForce GTX 280 and 260. The GPU will have 15 processing units (240 shader processors) available on the GTX280, while the GTX260 will come with 12 units for a grand total of 192 shader processors. Our sources state that the manufacturing cost of the GT200 die is somewhere between $100 and $110 per piece.

    Source: http://www.tgdaily.com/content/view/37554/135/
    LoL... Someone was right...
    Interesting that they are saying 16 and 12 shader clusters. Weren't we thinking 10 and 8 clusters?
    Also, what is with the transistor count? 900m-1.1b?
    I'm still betting on 1.3-1.4b.

    Quote Originally Posted by MpG View Post
    Not unthinkable, but the 250W seems a little odd.

    When the 8800GTX hit the scene, people were claiming it pulled 225W, since that was the sum of the max wattages of the PCI-E slot+2 PCI-E connectors. They were also claiming that R600 needed 250W, since "apparently" ATI didn't think 2 6-pin connectors were adequate.

    Personally, I don't care if this thing wants 500W, as long as it uses all that power for performance. Mind, if it DOES use that kind of power, the PSU industry might just cream its collective pants.
    Ummm... certain people were stating that R600 was pulling 300W per card...
    TDP is 250W; that doesn't mean it is going to pull 250W at load.
    Last edited by LordEC911; 05-20-2008 at 10:24 PM.

  6. #6
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,692
    If those specs are true, I don't think ATI stands much of a chance to begin with.

    Intel Core i7-3770K
    ASUS P8Z77-I DELUXE
    EVGA GTX 970 SC
    Corsair 16GB (2x8GB) Vengeance LP 1600
    Corsair H80
    120GB Samsung 840 EVO, 500GB WD Scorpio Blue, 1TB Samsung Spinpoint F3
    Corsair RM650
    Cooler Master Elite 120 Advanced
    OC: 5Ghz | +0.185 offset : 1.352v

  7. #7
    Registered User
    Join Date
    Feb 2008
    Posts
    46
    So I guess that means it's going to be insanely hot? Man I was hoping it wouldn't be such a barbecue grill this time around.

    I'm excited to see benchmarks though... wonder when the NDA is lifted?

    Also... shouldn't this thread's title be changed to 280GTX and June 18 now?
    Enermax Infiniti 720W || ASUS P5K Deluxe || Intel Core 2 Quad Q6700 || 4GB OCZ Reaper DDR2 1066Mhz RAM || eVGA GeForce GTX 280 SC || WD Velociraptor 300GB || 2x Seagate 7200.11 1TB || WD Caviar SE16 640GB || Auzentech X-Fi Prelude || 2x LG DVD-RW || Samsung 245BW || Windows Vista Ultimate x64

    http://www.cuddlewar.com/

  8. #8
    Xtreme Member
    Join Date
    Apr 2007
    Posts
    386
    Quote Originally Posted by Lightning_Rider View Post
    Also... shouldn't this thread's title be changed to 280GTX and June 18 now?
    arrrgh im buying a house on the 17th
    Gaming Box:: q6600 @3.0 :: 9800gtx :: Abit IP35 :: 4gb :: 1.4TB :: akasa eclipse :: Win7
    Development:: PhenomII 955BE @3.2 :: 4200 :: asus M4A785 M Evo :: 1.25TB ::Win7
    Media Centre :: q6600 @3.0 :: x1950pro :: asus p35 epu :: 8gb :: 320 GB :: Lc17B :: Win7
    server:: I7 860 :: p55 gd65 :: 3450 :: 8 TB :: 8gb :: Rebel 12 :: server 2008 R2

  9. #9
    Xtreme Addict
    Join Date
    Dec 2006
    Location
    Malaysia
    Posts
    1,383
    Quote Originally Posted by GoldenTiger View Post
    I'm banking on mine handling it (520HX Corsair). Time to sell off the SLI 8800GT 512s . I am kind of concerned still about whether a PCI-E 2.0 slot will be needed to get the most out of this though I'm sure it's fine, as I don't have one .
    eh and what are u going do about the 8 pin pcie.. do some voodoo magic??
    kekeke

  10. #10
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    168
    Quote Originally Posted by cstkl1 View Post
    eh and what are u going do about the 8 pin pcie.. do some voodoo magic??
    kekeke
    What? HX series has multiple 8-pin pcie (6+2 pin).

  11. #11
    Xtreme Addict
    Join Date
    Dec 2006
    Location
    Malaysia
    Posts
    1,383
    Quote Originally Posted by Nuker_ View Post
    What? HX series has multiple 8-pin pcie (6+2 pin).
    damn.. just read the review.. didn't know that..
    always assumed only the high wattage models had it, other than PCP&C..

    ok my bad..

    one thing guys, if this card is as long as it sounds.. forget the P5Q Deluxe
    having so much trouble with it and the 9800GX2.. it blocks off one SATA port..

  12. #12
    Xtreme Member
    Join Date
    Jan 2008
    Posts
    379
    Last edited by ryan92084; 05-21-2008 at 12:21 PM. Reason: giving credit to actual original source
