Page 2 of 3 (Results 26 to 50 of 75)

Thread: 9800GTX pic and specs

  1. #26
    Xtreme Enthusiast
    Join Date
    Nov 2007
    Location
    Cincinnati, Ohio
    Posts
    614
    Ummm, who's to say that Nvidia CAN launch the 9800GTX that they leaked months ago? Maybe they just don't have G100 (or whatever was supposed to be on this card) in working order yet. The 3870X2 is kicking their butt, so they release a revamped GTS with Tri-SLI to fill the gap till they can make the next chip work. Everyone is saying that they don't have to release a much better card; what if they CAN'T!
    Aaron: intel i7 2600k, stock cooling, AsRock Extreme 4, 4gb 1600 DDR3, MSI 4890 1gb 950/999, Corsair HX620, NZXT 410 Gunmetal, X2gen 22" WS
    Wife: AMD 5000+ BE @ 3ghz, CM Vortex P912, GB GA-MA78GM-S2H 780G, OCZ Plat 2GB PC6400, Tt Toughpower 600w, inwin allure case, Acer 23" 1080p
    HTPC: AMD 4850+ BE @ 2.5ghz, Foxconn A7GM-S 780G, OCZ SLI 2gb PC6400, Avermedia A180 HDTV tuner, Saphire 4830, Coolmax 480w, LiteOn BD player, nMediaPC 1000B case

  2. #27
    Xtreme Mentor
    Join Date
    Sep 2006
    Posts
    2,834
    Weak.

    For my part I know nothing with any certainty, but the sight of the stars makes me dream.


  3. #28
    Xtreme Addict
    Join Date
    Jan 2004
    Location
    somewhere in Massachusetts
    Posts
    1,009
    Is it really that surprising? The 8800 GT and (new) GTS really should have been 9xxx-line cards, except nVidia didn't want to kill off sales of its old G8x parts by releasing them as such. Notice how the 8800 GTX & Ultra have been disappearing over the last few months?

    G9x is very much just a slightly improved G8x on the newer process.

    Anyone know if the next step is a G1x0, or if it's going for multiplicity with the G9x? Smaller die + less power seems to suggest GPUs might be going in the same direction as the CPU: there are limits to what a single die can do, and the way to progress is multi-core designs.

  4. #29
    Xtreme 3D Mark Team Staff
    Join Date
    Nov 2002
    Location
    Juneau Alaska
    Posts
    7,607
    256 bit?

    what in the hell is up with that?

    are they trying to de-evolve cards or something?
    I thought newer was supposed to mean better... not worse.




    "The command and conquer model," said the EA CEO, "doesn't work. If you think you're going to buy a developer and put your name on the label... you're making a profound mistake."

  5. #30
    Xtreme Addict
    Join Date
    Jul 2006
    Location
    Washington State
    Posts
    1,315
    Quote Originally Posted by SKYMTL View Post
    Sooooo.....it's an 8800GTS 512MB with a slight overclock and boasts Tri-SLI compatibility.

    That's the entire 9 series so far



    So far it looks like both companies were trying to saturate the market with DX10 cards. Focusing on the midrange like they did and closing the huge gap that used to be there was a good idea for both camps. We win
    Last edited by Jimmer411; 02-25-2008 at 01:34 PM.
    Phenom 9950BE @ 3.24Ghz| ASUS M3A78-T | ASUS 4870 | 4gb G.SKILL DDR2-1000 |Silverstone Strider 600w ST60F| XFI Xtremegamer | Seagate 7200.10 320gb | Maxtor 200gb 7200rpm 16mb | Samsung 206BW | MCP655 | MCR320 | Apogee | MCW60 | MM U2-UFO |

    A64 3800+ X2 AM2 @3.2Ghz| Biostar TF560 A2+ | 2gb Crucial Ballistix DDR2-800 | Sapphire 3870 512mb | Aircooled inside a White MM-UFO Horizon |

    Current Phenom overclock


    Max Phenom overclock

  6. #31
    Xtreme Addict
    Join Date
    Mar 2004
    Location
    Toronto, Ontario Canada
    Posts
    1,433
    No way! It'll have at least 192 shaders and 24 rops!


  7. #32
    Xtreme 3D Mark Team Staff
    Join Date
    Nov 2002
    Location
    Juneau Alaska
    Posts
    7,607
    maybe they got it mixed up with the 9600GT.

    I mean, 256 bit and 512 RAM?

    come on.... damn....
    what happened?





  8. #33
    Xtreme Addict
    Join Date
    Jan 2007
    Location
    Detroit, MI
    Posts
    1,048
    I too think that nvidia is just waiting to see what ATI comes out with. After launching the 8800GTX, 8800 Ultra, 8800 GTS 320, 640, and 512, and the 8800GT 256 AND 512, I doubt that the 9800 GTX and GX2 are the only cards nvidia has up their sleeves.

    But I hope I am wrong and ATI can rally! Too much green as of late, I need some red to believe in.


    My Custom Pressure/Temperature Charts for Various Refrigerants

    QX6700 @ 3900 | EVGA 680i |2 GB Corsair 8500 Dominator | 7800 GT SLi | Silverstone 1KW | Seagate 7200.9 Barracuda RAID 0 + Hitachi Deskstar 2TB RAID 0 | SS Phase w/ Cryostar Evap | MIPS Full Motherboard cooling
    My Fake Quad Core is better than your real Quad Core! **cough**Barcelona**cough**

    Lian Li Cube Case with Phase and Water DONE!!

  9. #34
    Xtreme Addict
    Join Date
    Nov 2005
    Location
    Where the Cheese Heads Reside
    Posts
    2,173
    Quote Originally Posted by Kunaak View Post
    maybe they got it mixed up with the 9600GT... I mean, 256 bit and 512 RAM? What happened?
    ATI, from everything we've heard, is going to use a 256-bit bus as well on the newer R700 cards. Faster memory means less need to widen the bus to get the same bandwidth, which means less money spent manufacturing the card.
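    The trade-off being described is simple arithmetic: peak memory bandwidth is the bus width (in bytes) times the effective transfer rate, so faster memory can make up for a narrower bus. A minimal sketch (the clock figures are illustrative, not confirmed R700 specs):

```python
def mem_bandwidth_gbs(bus_width_bits: int, effective_clock_mhz: int) -> float:
    # bytes per transfer (bus width / 8) * millions of transfers per second -> GB/s
    return bus_width_bits / 8 * effective_clock_mhz / 1000

# A narrow bus with fast memory matches a wide bus with slow memory:
print(mem_bandwidth_gbs(512, 2000))  # 128.0 GB/s
print(mem_bandwidth_gbs(256, 4000))  # 128.0 GB/s
```

    The narrow bus needs fewer PCB traces and fewer memory chips routed, which is where the manufacturing savings come from.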
    -=The Gamer=-
    MSI Z68A-GD65 (G3) | i5 2500k @ 4.5Ghz | 1.3875V | 28C Idle / 65C Load (LinX)
    8Gig G.Skill Ripjaw PC3-12800 9-9-9-24 @ 1600Mhz w/ 1.5V | TR Ultra eXtreme 120 w/ 2 Fans
    Sapphire 7950 VaporX 1150/1500 w/ 1.2V/1.5V | 32C Idle / 64C Load | 2x 128Gig Crucial M4 SSD's
    BitFenix Shinobi Window Case | SilverStone DA750 | Dell 2405FPW 24" Screen
    -=The Server=-
    Synology DS1511+ | Dual Core 1.8Ghz CPU | 30C Idle / 38C Load
    3 Gig PC2-6400 | 3x Samsung F4 2TB Raid5 | 2x Samsung F4 2TB
    Heat

  10. #35
    Xtreme Enthusiast
    Join Date
    Oct 2007
    Location
    Rochester, MN
    Posts
    718
    Quote Originally Posted by HKPolice View Post
    No way! It'll have at least 192 shaders and 24 rops!

    omg, that's what it should have
    Thermaltake Armor Series Black
    GIGABYTE GA-P35-DS3R
    Q6600 3.6 GHZ Thermalright Ultra 120 eXtreme
    4 GB Corsair XMS2 w/ OCZ XTX Ram Cooler 2 x 60mm
    9800GT 512MB
    18X Pioneer DVD-RW Burner
    720 Watt Enermax Infiniti
    4x640GB RAID 10
    Windows 7

  11. #36
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by DilTech View Post
    We can't call it crap until we see how it performs...

    Anyway, they're merely doing it to combat ATi's efforts. If they weren't comfortable, they'd just push forward and drop a real new lineup instead of playing sit and wait, all the while giving them time to set up to drop on a smaller process size.
    That doesn't make sense at all from a business perspective. If they were that comfortable they would just introduce their true next-gen video card, knowing they would beat whatever the R700 offered through sheer performance, price, or both. This would mirror the G80 release. Besides, they already own Ageia and are (presumably) planning on implementing the PhysX SDK using CUDA through their TWIMTB program. Dropping the true next gen ASAP with PhysX (through CUDA) would show confidence in their arch design, captured market share, and the overall price/performance ratio of their next-gen video card including PhysX. But we are not seeing this at all. IMO, it looks more like a reactionary move than a proactive one.

    Either:
    A. They aren't ready
    B. Truly concerned about what the R700 offers, so they can counter it
    C. Bluffing
    D. Worried about Intel and AMD/ATI as a whole

    I guess we have to wait and see...
    Last edited by Eastcoasthandle; 02-25-2008 at 02:25 PM.

  12. #37
    Xtreme Member
    Join Date
    Aug 2006
    Posts
    212
    A, B and D. Let's hope those are not the specs.

  13. #38
    Xtreme Enthusiast
    Join Date
    Feb 2005
    Posts
    970
    I think I'll pick C....

  14. #39
    Xtreme Enthusiast
    Join Date
    Oct 2007
    Location
    Rochester, MN
    Posts
    718
    I think that ATI has a pretty bright future if Nv continues on like this. I really expected them to keep dominating, seeing that they had 2 years to work on a new card. 192 shaders and 24 ROPs would kick ass.

  15. #40
    Xtreme Addict
    Join Date
    Jan 2005
    Location
    Grand Forks, ND (Yah sure, you betcha)
    Posts
    1,266
    As has been mentioned on Chiphell, the memory clock is going to be 2200MHz for the stock config, which presumably means it will use 0.8ns 2400MHz GDDR3 from Hynix... so I imagine we'll see overclocked parts from AIBs with the memory clock closer to an effective 2400MHz.

    The current G92/G94 chips use 1.0ns 2000MHz GDDR3, IIRC.

    While I don't like it any more than y'all, this should at least be some good news for an architecture starved for bandwidth. A 20% increase on paper will make the part look good in reviews, that much can be assumed.
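    The 20% figure checks out: on the same 256-bit bus, peak bandwidth scales linearly with memory clock, so going from the 2000MHz GDDR3 on current parts to fully rated 2400MHz 0.8ns chips is a straight 20% gain (quick sketch using the clocks quoted above):

```python
def mem_bandwidth_gbs(bus_width_bits, effective_clock_mhz):
    # bytes per transfer * millions of transfers per second -> GB/s
    return bus_width_bits / 8 * effective_clock_mhz / 1000

current = mem_bandwidth_gbs(256, 2000)  # 1.0ns GDDR3 on G92/G94: 64.0 GB/s
faster = mem_bandwidth_gbs(256, 2400)   # 0.8ns GDDR3 at rated speed: 76.8 GB/s
print(f"{(faster / current - 1) * 100:.0f}% more bandwidth")  # 20% more bandwidth
```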

    That all being said, it's a pretty weak way to go about things... I imagine anyone with a 9600GT/3800 and up will wait for the RV770 and the upcoming 512-bit generation from nvidia.
    Last edited by turtle; 02-25-2008 at 02:59 PM.
    That is all.

    Peace and love.

  16. #41
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    bleh, so it's official: the 9800GTX is worse than a factory-OC'ed 8800 Ultra...

    this is so boring
    i7 3610QM 1.2-3.2GHz

  17. #42
    Xtreme Member
    Join Date
    Oct 2007
    Location
    Sydney, Australia
    Posts
    466
    Have to say I was expecting this, but I'm still very disappointed (in an Xtreme sense). However, I still think that with the new SPs, where 128 will roughly equal 150 SPs (old), this thing will fly with the faster mem.

    Selfishly, I'm glad I bought an 8800GTX 2 months ago

  18. #43
    Xtreme Member
    Join Date
    Oct 2005
    Posts
    462
    Quote Originally Posted by Eastcoasthandle View Post
    That doesn't make sense at all from a business perspective. If they were that comfortable they would just introduce their true next-gen video card... I guess we have to wait and see...
    In semiconductor manufacturing you use the old process as long as you can, thanks to the bathtub-curve failure rates that come with new processes. This is purely due to lack of competition, nothing else.

    New EOCF SuperPi thread! Post your scores here
    PCProfile ClubOC ClubNBOC

  19. #44
    Xtreme Cruncher
    Join Date
    Mar 2005
    Posts
    861
    Quote Originally Posted by Eastcoasthandle View Post
    That doesn't make sense at all from a business perspective. If they were that comfortable they would just introduce their true next-gen video card... I guess we have to wait and see...
    From a business perspective, nobody should ever hold back a superior product in hopes of *future* sales. The only possible motivations for doing so would be if the current products cost substantially less to make than the superior ones, or to clear out inventory.
    Bloodrage || 920 @ 3.2Ghz || TRUE Black
    3x 2GB HyperX 2000 || @ 2000Mhz 7.7.7
    2x 300GB WD VR Raid 0 || 2x 2TB Samsung F3 Raid 0
    LG 10x BD-R || LG 22x DVD/RW
    MSI GE 470 || LG 246WP
    Sonar X-Fi || Klipsch 5.1
    Lycosa || Mamba || Exact Mat
    CM ATCS 840 || Seasonic M12D
    Server 2008 R2 x64

  20. #45
    Xtreme Addict
    Join Date
    Jul 2006
    Location
    Vancouver, BC
    Posts
    2,061
    I recall a story from about a year ago that the 8800GTX silicon actually had 160 shaders, but only 128 were enabled to improve yields. An engineer from nVidia actually confirmed this and indicated that the full 160 shaders might be utilized after a die shrink.

    So my guess is that the card will have at least 160 shaders. I'm also guessing there's no reason to go back to 512MB of memory, and they will stick with 768MB (also supported by the pics of the card and how long it is).

  21. #46
    Xtreme Mentor
    Join Date
    Apr 2007
    Location
    Idaho
    Posts
    3,200
    Quote Originally Posted by Eastcoasthandle View Post
    That doesn't make sense at all from a business perspective. If they were that comfortable they would just introduce their true next-gen video card... I guess we have to wait and see...
    QFT!

    GeForce 9 series is definitely a *yawn*
    "To exist in this vast universe for a speck of time is the great gift of life. Our tiny sliver of time is our gift of life. It is our only life. The universe will go on, indifferent to our brief existence, but while we are here we touch not just part of that vastness, but also the lives around us. Life is the gift each of us has been given. Each life is our own and no one else's. It is precious beyond all counting. It is the greatest value we have. Cherish it for what it truly is."

  22. #47
    Xtreme Enthusiast
    Join Date
    Jan 2006
    Location
    New Hampshire (USA)
    Posts
    998
    Quote Originally Posted by Kunaak View Post
    maybe they got it mixed up with the 9600GT... I mean, 256 bit and 512 RAM? What happened?
    Ditto...
    Asus Maximus III Formula (2001)
    Intel i7 860 (L924B516)
    Noctua D14
    Corsairs CMG4GX3M2A2000C2 (2 x 2GB) RAM
    eVGA GTX480
    DD-H20
    BIX GTX360
    MCP35X PWM
    Creative X-Fi Titanium PCI-e
    LG GGC-H20L Blu-Ray
    Toughpower 850w Modular
    GSkill Phoenix Pro SSD 120GB


    HEAT

  23. #48
    Registered User
    Join Date
    Oct 2005
    Posts
    18
    bah, disappointing

  24. #49
    Xtreme Mentor dengyong's Avatar
    Join Date
    Nov 2006
    Location
    A great place again
    Posts
    2,589
    The pic doesn't fit the description.

  25. #50
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    So my guess is that the card will have at least 160 shaders, and I'm also guessing there is no reason to go back to 512MB of memory and they will stick with 768MB (also supported by the pics of the card and how long it is).
    that's more like it...
