Page 6 of 7
Results 126 to 150 of 156

Thread: Nvidia Geforce 9800GTX Specifications released!

  1. #126
    YouTube Addict
    Join Date
    Aug 2005
    Location
    Klaatu barada nikto
    Posts
    17,574
    Quote Originally Posted by DataNusse View Post
    DAAMIT's ass is toast if those specs are true
Really? Because if I remember correctly, AMD/ATi does continue to spend money on R&D, and generally speaking such funding creates new and better products.
    Fast computers breed slow, lazy programmers
    The price of reliability is the pursuit of the utmost simplicity. It is a price which the very rich find most hard to pay.
    http://www.lighterra.com/papers/modernmicroprocessors/
    Modern Ram, makes an old overclocker miss BH-5 and the fun it was

  2. #127
    Xtreme Enthusiast
    Join Date
    Mar 2005
    Location
    North USA
    Posts
    670
    Guys, please consider the possibility that these specs are fake. The eDram piece is a huge indicator to me that someone made these specs up.
    Asus P6T-DLX V2 1104 & i7 920 @ 4116 1.32v(Windows Reported) 1.3375v (BIOS Set) 196x20(1) HT OFF
    6GB OCZ Platinum DDR3 1600 3x2GB@ 7-7-7-24, 1.66v, 1568Mhz
    Sapphire 5870 @ 985/1245 1.2v
    X-Fi "Fatal1ty" & Klipsch ProMedia Ultra 5.1 Speaks/Beyerdynamic DT-880 Pro (2005 Model) and a mini3 amp
    WD 150GB Raptor (Games) & 2x WD 640GB (System)
    PC Power & Cooling 750w
    Homebrew watercooling on CPU and GPU
    and the best monitor ever made + a Samsung 226CW + Dell P2210 for eyefinity
Windows 7 Ultimate x64

  3. #128
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,550
    Quote Originally Posted by MaxxxRacer View Post
lol.. they are copying the name of one of the most successful video cards ever created... Good ole' 9800XT..
    If NVIDIA's version is as successful as ATI's, I'm naming my kid 9800

  4. #129
    Xtreme Addict
    Join Date
    Sep 2006
    Location
    Surat, India.
    Posts
    1,309
    OMG.... this is unreal

    I will start saving up
    Sound: Asus Essense ST | Wharfedale Diamond 9.1 | Norge 2060 Stereo amp | Wharfedale SW150 sub (coming soon)
    Camera Gear: Canon 6D | Canon 500D | Canon 17-40L | Canon 24-105L | Canon 50mm f1.4 | Canon 85mm f1.8 | Rokinon 14mm f2.8 | Sigma 10-20EX HSM | Benro A3580F + Vanguard SBH250 | Bag full of filters and stuff

  5. #130
    Xtreme Member
    Join Date
    May 2005
    Location
    Sweden
    Posts
    125
    Quote Originally Posted by nn_step View Post
Really? Because if I remember correctly, AMD/ATi does continue to spend money on R&D, and generally speaking such funding creates new and better products.
Well of course they do. But can they keep up with nVidia's time schedule? And if they do, is there a possibility that they will mess up, again?

If those specs are true and the card is released at the end of this year, then ATI is toast, because by the end of this year I can't see ATI coming out with anything new besides a 65nm R600 card.

But then again, the specs of this 9800GTX could be fake.
    "The flames of freedom. How lovely. How just. Ahh, my precious anarchy.."

  6. #131
    YouTube Addict
    Join Date
    Aug 2005
    Location
    Klaatu barada nikto
    Posts
    17,574
    Quote Originally Posted by DataNusse View Post
Well of course they do. But can they keep up with nVidia's time schedule? And if they do, is there a possibility that they will mess up, again?

If those specs are true and the card is released at the end of this year, then ATI is toast, because by the end of this year I can't see ATI coming out with anything new besides a 65nm R600 card.

But then again, the specs of this 9800GTX could be fake.
ATi isn't going to be toast; it is going to counteract and outperform it
    Fast computers breed slow, lazy programmers
    The price of reliability is the pursuit of the utmost simplicity. It is a price which the very rich find most hard to pay.
    http://www.lighterra.com/papers/modernmicroprocessors/
    Modern Ram, makes an old overclocker miss BH-5 and the fun it was

  7. #132
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
We should edit the original post to let people know it's a comment left a month and a half ago, not any kind of fact...

  8. #133
    Tyler Durden
    Join Date
    Oct 2003
    Location
    Massachusetts, USA
    Posts
    5,623
    Quote Originally Posted by nn_step View Post
ATi isn't going to be toast; it is going to counteract and outperform it
    Yea, just like they did with R600 after they saw G80.
    Formerly XIP, now just P.

  9. #134
    Registered User
    Join Date
    Dec 2006
    Location
    UE , Romania, Baia Mare
    Posts
    72
KoHaN69 - why not name it HAL 9000?
    ------ NO SIGNATURE ------

  10. #135
    Xtreme Enthusiast
    Join Date
    Jan 2005
    Location
    Stafford, UK
    Posts
    810
    Quote Originally Posted by nn_step View Post
Really? Because if I remember correctly, AMD/ATi does continue to spend money on R&D, and generally speaking such funding creates new and better products.
Well, the way I remember it, AMD was toasting Intel for 3 years, but instead of working on K10 when they had the money, they've been...?

    I can't remember who said it here, but something like "Looks like AMD spent their R&D budget on booze and hookers for the last 3 years". Don't get me wrong I love AMD to bits, but because of their poor foresight they're getting spit-roasted by Intel and nV at the moment

    VENOM: DFI LP LT X38-T2R ~ Core 2 Duo E8600 @ 4.00GHz ~ 4GB OCZ Blade LV DDR2-1150 ~ Radeon R9 380 4GB ~ Crucial C300 64GB ~ Seasonic X-750 ~ Dell U2913WM 29" ~ Win 7 Ultimate x64
    LAIKA: Alienware Alpha R2 ~ Core i5-6400T @ 2.20GHz / 2.80GHz ~ 16GB Ballistix Sport LT DDR4-2133 ~ GeForce GTX 960 4GB ~ Crucial MX300 275GB ~ LG OLED55B7A 55" TV ~ Win 10 Home x64
    BLADE: Razer Blade 14" (2013) ~ Core i7-4702HQ @ 2.20GHz / 3.20GHz ~ 8GB DDR3-1600 ~ GeForce GTX 765M 2GB ~ Samsung 840 EVO mSATA 500GB ~ Win 7 Ultimate x64

  11. #136
    YouTube Addict
    Join Date
    Aug 2005
    Location
    Klaatu barada nikto
    Posts
    17,574
    Quote Originally Posted by Sumanji View Post
Well, the way I remember it, AMD was toasting Intel for 3 years, but instead of working on K10 when they had the money, they've been...?

    I can't remember who said it here, but something like "Looks like AMD spent their R&D budget on booze and hookers for the last 3 years". Don't get me wrong I love AMD to bits, but because of their poor foresight they're getting spit-roasted by Intel and nV at the moment
    http://redhill.net.au/c/c-8.html
Scroll down and look at the chart for a couple of minutes and realize that even though they haven't always had the performance crown, that never killed them, nor did it mean they weren't working their asses off.
Heck, their "failed" K5 was the key to their famous K6 procs, after they gleaned some alternate experience from NexGen (an acquisition the industry considered horrid but which proved to be genius). Or how about when they picked up the DEC Alpha team at huge expense, once again written off as an absolute failure, at least until K7 and K8 came out and changed the industry? And NOW AMD has bought ATi, and people are screaming about what a horrid idea it is, but I honestly question people who don't look at AMD's history and see how often such a move was exactly right for their future plans.
    Fast computers breed slow, lazy programmers
    The price of reliability is the pursuit of the utmost simplicity. It is a price which the very rich find most hard to pay.
    http://www.lighterra.com/papers/modernmicroprocessors/
    Modern Ram, makes an old overclocker miss BH-5 and the fun it was

  12. #137
    Xtreme Cruncher
    Join Date
    Apr 2006
    Posts
    3,012
    Quote Originally Posted by FghtinIrshNvrDi View Post
    I thought the unified arch fixed that. I'm not much of a microarchitecture specialist, so correct me if I'm wrong.

    Ryan
Well, I'm no expert either, but what I understood is that the unified shader architecture took the pixel pipelines and vertex shaders and made them a one-size-fits-all processor, while the ROPs and TMUs are still on their own. That seems to be the 2900XT's biggest weakness: it has fewer of them than the 8800GTX and GTS.

    Quote Originally Posted by Truckchase! View Post
eDram is nothing more than ram. You can't get "free" aniso from fast ram, as it's a filtering operation. Some could argue that you could get "free" AA, but with today's AA ops as complex as they are, that's overly simplistic. Also take into account that while a common target for eDram used to be the frame buffer, today's popular resolutions have grown too high to make it cost effective. Take for example 1680x1050x32bpp w/ NO AA, standard front and back buffer... the front buffer alone would be 56,448,000 bits uncompressed (~7 MB).

    eDram is nothing but a waste of transistors for any decent size resolution, and therefore this looks entirely fake. The only possibility is that this could be closer to realistic now that transistor counts in the last gen have already gotten ridiculous, but I still don't think anyone would waste space on eDram for PC resolutions.

    P.S. Both ATI and Nvidia designs have had L1 and L2 cache for quite some time now.
I see your point, and you seem to be right. I didn't know you would need that much eDram to make it work on higher-res screens. Surely eDram could be used to speed something up, but I guess it's far too "big" to fit on a die and offer good benefits compared to more stream processors, ROPs and TMUs.

On another note, it says "next gen unified shader"; could that combine the ROPs and TMUs into the stream processors?
    CPU: Intel Core i7 3930K @ 4.5GHz
    Mobo: Asus Rampage IV Extreme
    RAM: 32GB (8x4GB) Patriot Viper EX @ 1866mhz
    GPU: EVGA GTX Titan (1087Boost/6700Mem)
    Physx: Evga GTX 560 2GB
    Sound: Creative XFI Titanium
    Case: Modded 700D
    PSU: Corsair 1200AX (Fully Sleeved)
    Storage: 2x120GB OCZ Vertex 3's in RAID 0 + WD 600GB V-Raptor + Seagate 1TB
    Cooling: XSPC Raystorm, 2x MCP 655's, FrozenQ Warp Drive, EX360+MCR240+EX120 Rad's
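A side note on the arithmetic in the quoted post: 56,448,000 is the front-buffer size in bits; in bytes it's about 7 MB per buffer, which still supports the point that eDRAM at PC resolutions gets expensive once you add a back buffer and AA samples. A minimal check (pure arithmetic, no assumptions beyond the quoted resolution):

```python
# Sanity-checking the quoted frame-buffer figure: 1680x1050 at
# 32 bits per pixel, no AA, a single buffer.
width, height, bpp = 1680, 1050, 32

bits = width * height * bpp      # 56,448,000 bits (the quoted number)
nbytes = bits // 8               # 7,056,000 bytes
mib = nbytes / (1024 * 1024)     # ~6.73 MiB per buffer

print(bits, nbytes, round(mib, 2))  # 56448000 7056000 6.73
```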

  13. #138
    Xtreme Addict
    Join Date
    Nov 2006
    Posts
    1,402
    Quote Originally Posted by nn_step View Post
    http://redhill.net.au/c/c-8.html
Scroll down and look at the chart for a couple of minutes and realize that even though they haven't always had the performance crown, that never killed them, nor did it mean they weren't working their asses off.
Heck, their "failed" K5 was the key to their famous K6 procs, after they gleaned some alternate experience from NexGen (an acquisition the industry considered horrid but which proved to be genius). Or how about when they picked up the DEC Alpha team at huge expense, once again written off as an absolute failure, at least until K7 and K8 came out and changed the industry? And NOW AMD has bought ATi, and people are screaming about what a horrid idea it is, but I honestly question people who don't look at AMD's history and see how often such a move was exactly right for their future plans.


    +100000

  14. #139
    Xtreme Enthusiast
    Join Date
    Jan 2005
    Location
    Stafford, UK
    Posts
    810
    Quote Originally Posted by nn_step View Post
    http://redhill.net.au/c/c-8.html
Scroll down and look at the chart for a couple of minutes and realize that even though they haven't always had the performance crown, that never killed them, nor did it mean they weren't working their asses off....
And my point is that over the past 3 years we have seen NO major architectural improvements in AMD processors at all. Higher clocks, DDR2 support, a 65nm shrink (and not a very successful one, it seems). Maybe it's too much to expect revolutionary change, but that list is the bare minimum of evolution, imo.

    K10 should have been out a year ago, giving Intel a worthy competitor to Conroe.


By the way, these specs are quite obviously made up. We go through the same thing EVERY TIME we start to smell a new graphics card release. Some jackass pieces together rubbish from the Inq and Fudzilla, adds some "common sense" evolutionary bits to the spec along with an arbitrary clock speed guess, and this makes us all shout "zomfgwtfbbq" and cream our pants.

    Sorry I'm not biting on this one!

    VENOM: DFI LP LT X38-T2R ~ Core 2 Duo E8600 @ 4.00GHz ~ 4GB OCZ Blade LV DDR2-1150 ~ Radeon R9 380 4GB ~ Crucial C300 64GB ~ Seasonic X-750 ~ Dell U2913WM 29" ~ Win 7 Ultimate x64
    LAIKA: Alienware Alpha R2 ~ Core i5-6400T @ 2.20GHz / 2.80GHz ~ 16GB Ballistix Sport LT DDR4-2133 ~ GeForce GTX 960 4GB ~ Crucial MX300 275GB ~ LG OLED55B7A 55" TV ~ Win 10 Home x64
    BLADE: Razer Blade 14" (2013) ~ Core i7-4702HQ @ 2.20GHz / 3.20GHz ~ 8GB DDR3-1600 ~ GeForce GTX 765M 2GB ~ Samsung 840 EVO mSATA 500GB ~ Win 7 Ultimate x64

  15. #140
    YouTube Addict
    Join Date
    Aug 2005
    Location
    Klaatu barada nikto
    Posts
    17,574
    Quote Originally Posted by Sumanji View Post
And my point is that over the past 3 years we have seen NO major architectural improvements in AMD processors at all. Higher clocks, DDR2 support, a 65nm shrink (and not a very successful one, it seems). Maybe it's too much to expect revolutionary change, but that list is the bare minimum of evolution, imo.

    K10 should have been out a year ago, giving Intel a worthy competitor to Conroe.


By the way, these specs are quite obviously made up. We go through the same thing EVERY TIME we start to smell a new graphics card release. Some jackass pieces together rubbish from the Inq and Fudzilla, adds some "common sense" evolutionary bits to the spec along with an arbitrary clock speed guess, and this makes us all shout "zomfgwtfbbq" and cream our pants.

    Sorry I'm not biting on this one!
One silly, stupid thought, but have you considered that during those 3+ years AMD has been developing a new design, to such a degree that few to none of its features showed up in existing products?
Kinda like how K6 never had K7's improved floating-point performance.
    Fast computers breed slow, lazy programmers
    The price of reliability is the pursuit of the utmost simplicity. It is a price which the very rich find most hard to pay.
    http://www.lighterra.com/papers/modernmicroprocessors/
    Modern Ram, makes an old overclocker miss BH-5 and the fun it was

  16. #141
    Xtreme Enthusiast
    Join Date
    Jan 2005
    Location
    Stafford, UK
    Posts
    810
    Quote Originally Posted by nn_step View Post
One silly, stupid thought, but have you considered that during those 3+ years AMD has been developing a new design, to such a degree that few to none of its features showed up in existing products?
Kinda like how K6 never had K7's improved floating-point performance.
    A) AMD have already stated that K10 will be more evolutionary than revolutionary.
    B) Let's wait till September
    C) Let's get back on topic

    VENOM: DFI LP LT X38-T2R ~ Core 2 Duo E8600 @ 4.00GHz ~ 4GB OCZ Blade LV DDR2-1150 ~ Radeon R9 380 4GB ~ Crucial C300 64GB ~ Seasonic X-750 ~ Dell U2913WM 29" ~ Win 7 Ultimate x64
    LAIKA: Alienware Alpha R2 ~ Core i5-6400T @ 2.20GHz / 2.80GHz ~ 16GB Ballistix Sport LT DDR4-2133 ~ GeForce GTX 960 4GB ~ Crucial MX300 275GB ~ LG OLED55B7A 55" TV ~ Win 10 Home x64
    BLADE: Razer Blade 14" (2013) ~ Core i7-4702HQ @ 2.20GHz / 3.20GHz ~ 8GB DDR3-1600 ~ GeForce GTX 765M 2GB ~ Samsung 840 EVO mSATA 500GB ~ Win 7 Ultimate x64

  17. #142
    YouTube Addict
    Join Date
    Aug 2005
    Location
    Klaatu barada nikto
    Posts
    17,574
    Quote Originally Posted by Sumanji View Post
    A) AMD have already stated that K10 will be more evolutionary than revolutionary.
    B) Let's wait till September
    C) Let's get back on topic
    1) I wouldn't call significant modifications to instruction dispatch a minor evolution, rather a significant evolutionary step
    2) Works for me
    3) Wait you mean this isn't like the WCG?
    Fast computers breed slow, lazy programmers
    The price of reliability is the pursuit of the utmost simplicity. It is a price which the very rich find most hard to pay.
    http://www.lighterra.com/papers/modernmicroprocessors/
    Modern Ram, makes an old overclocker miss BH-5 and the fun it was

  18. #143
    Xtreme Enthusiast
    Join Date
    Jan 2003
    Location
    NYC
    Posts
    634
    we're off topic

    back on topic: when is the approx release date for this "9800GTX?"
    May 30th, 2007
    Quote Originally Posted by Haltech View Post
I've already predicted that within 3 years, Sony will be purchased and liquidated.
    i7 920 D0 @ 2.8ghz // GTX 670 // Rampage II Extreme // 14GBs DDR3 // 550w PSU // corsair H100i // Corsair 800D

    i7 3770k stock // Maximus V Gene // 16GB @1600 // OCZ z850 //

    i7 3930k @ 4.5ghz // 780 SLI // P9X79-E WS // 16GB @ 1600 // AX1200i // and a horrible swiftech h220 // Carbide Air 540

  19. #144
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Quote Originally Posted by KoHaN69 View Post
    If NVIDIA's version is as successful as ATI's, I'm naming my kid 9800
    That's one of the better posts I've read around here in a while.
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  20. #145
    Xtreme Member
    Join Date
    Jan 2007
    Posts
    295
    I can't believe people are already wetting themselves over these specs.

I'd say there's a 99.9 percent chance those specs are complete BS.
    "Shake off all the fears of servile prejudices, under which weak minds are servilely crouched. Fix reason firmly in her seat, and call on her tribunal for every fact, every opinion. Question with boldness even the existence of a God; because, if there be one, he must more approve of the homage of reason than that of blindfolded fear."
    Thomas Jefferson (1743-1826)

  21. #146
    Wanna look under my kilt?
    Join Date
    Jun 2005
    Location
    Glasgow-ish U.K.
    Posts
    4,396
    I was discussing this with Mav on MSN earlier...I thought the Audio chip and the GDDR4 sounded a bit dodgy, but maybe GDDR4 will be really decent by then?

How would a sound chip's outputs be dealt with? A ribbon cable and a PCI slot for the jacks? Adjust the heatsink so there's space for them on the 2nd half of the I/O-PCI plate? (Which would strongly suggest the HSF will not disperse heat out the back of the case.)
    Quote Originally Posted by T_M View Post
    Not sure i totally follow anything you said, but regardless of that you helped me come up with a very good idea....
    Quote Originally Posted by soundood View Post
    you sigged that?

    why?
    ______

    Sometimes, it's not your time. Sometimes, you have to make it your time. Sometimes, it can ONLY be your time.

  22. #147
    Xtreme Mentor
    Join Date
    Feb 2007
    Location
    Oxford, England
    Posts
    3,433
    Quote Originally Posted by K404 View Post
    I was discussing this with Mav on MSN earlier...I thought the Audio chip and the GDDR4 sounded a bit dodgy, but maybe GDDR4 will be really decent by then?

How would a sound chip's outputs be dealt with? A ribbon cable and a PCI slot for the jacks? Adjust the heatsink so there's space for them on the 2nd half of the I/O-PCI plate? (Which would strongly suggest the HSF will not disperse heat out the back of the case.)
The sound chip will be small and only there to carry sound over the HDMI cable, IMO, like how ATI does it.
    "Cast off your fear. Look forward. Never stand still, retreat and you will age. Hesitate and you will die. SHOUT! My name is…"
    //James

  23. #148
    Registered User
    Join Date
    Nov 2006
    Posts
    67


    Quote Originally Posted by Verisimilitude View Post
Generally makers go for a bus width (in bits) of half the RAM (in MB). Sometimes they do a full RAM-to-bus match on high-end parts, but I've only seen that in the X1950 series.

    768ram = 384bit
    640ram = 320bit

    the gts version of this part will most likely be 512 = 256bit


That's why it makes sense.
I still don't get it. For such a beast of a card, it doesn't make sense to have the GTS version of a newer card be 256-bit when the older is 320-bit. So either the GTS version of the 9-series is gonna be 768MB and the high-end 1GB, or it just wouldn't make sense at all to get the GTS version of the 9-series!!!
    Crymera
    Intel Quad Core Q6600 @3Ghz /artic silver ceramic, Zalman9700NT, MSI P6N Diamond mobo, Corsair XMS 4Gb Kit PC6400 @1066, 150Gb Raptor, 500Gb PR Hitachi, Pioneer 122D, Samsung SH-S183L, TT TP 700W Modular, TT Armor +25cm fan, Dell SP2008WFP LCD Monitor!
    Vista Ultimate 64-bit/XP 32-bit
    CRYSIS READY!!
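The "RAM/2" rule of thumb quoted in post #148 falls out of the board layout: G80-era cards paired one 64 MB GDDR3 chip with each 32-bit memory channel, so capacity and bus width scale together with the chip count. A minimal sketch (the 64 MB chip size is inferred from the 8800 GTX's 12-chip / 768 MB / 384-bit layout):

```python
# Why "RAM/2" holds for G80-era boards: one 512 Mbit (64 MB) GDDR3
# chip sits on each 32-bit memory channel, so capacity (MB) and bus
# width (bits) both scale with the number of chips.
CHIP_MB = 64        # 512 Mbit per chip (inferred from the 8800 series)
CHANNEL_BITS = 32   # one chip per 32-bit channel

def board(chips):
    """Return (total RAM in MB, bus width in bits) for a chip count."""
    return chips * CHIP_MB, chips * CHANNEL_BITS

print(board(12))  # (768, 384) -> 8800 GTX
print(board(10))  # (640, 320) -> 8800 GTS
print(board(8))   # (512, 256) -> a hypothetical 256-bit part
```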

  24. #149
    Xtreme Enthusiast
    Join Date
    Jan 2005
    Location
    Stafford, UK
    Posts
    810
    Quote Originally Posted by K404 View Post
    I was discussing this with Mav on MSN earlier...I thought the Audio chip and the GDDR4 sounded a bit dodgy, but maybe GDDR4 will be really decent by then?
They aren't the dodgy bits at all, mate... As mentioned already, the "sound" bit could just be HDMI audio pass-through as done on the HD2900, and GDDR4 is surely a logical progression with faster speeds etc?

The eDRAM bit is probably the fishiest item on the list!

    But the entire thing is made up anyway so it doesn't matter

    VENOM: DFI LP LT X38-T2R ~ Core 2 Duo E8600 @ 4.00GHz ~ 4GB OCZ Blade LV DDR2-1150 ~ Radeon R9 380 4GB ~ Crucial C300 64GB ~ Seasonic X-750 ~ Dell U2913WM 29" ~ Win 7 Ultimate x64
    LAIKA: Alienware Alpha R2 ~ Core i5-6400T @ 2.20GHz / 2.80GHz ~ 16GB Ballistix Sport LT DDR4-2133 ~ GeForce GTX 960 4GB ~ Crucial MX300 275GB ~ LG OLED55B7A 55" TV ~ Win 10 Home x64
    BLADE: Razer Blade 14" (2013) ~ Core i7-4702HQ @ 2.20GHz / 3.20GHz ~ 8GB DDR3-1600 ~ GeForce GTX 765M 2GB ~ Samsung 840 EVO mSATA 500GB ~ Win 7 Ultimate x64

  25. #150
    Xtreme Addict
    Join Date
    Jan 2005
    Location
    Grand Forks, ND (Yah sure, you betcha)
    Posts
    1,266
    Quote Originally Posted by dejavuxx View Post
I still don't get it. For such a beast of a card, it doesn't make sense to have the GTS version of a newer card be 256-bit when the older is 320-bit. So either the GTS version of the 9-series is gonna be 768MB and the high-end 1GB, or it just wouldn't make sense at all to get the GTS version of the 9-series!!!
Agreed. I believe the GTS will be 384-bit. Well, that or 448-bit. Considering performance parts usually replace last-gen flagships (and are a little faster), let's see why 384-bit works:

    How many shaders does the GTS now have compared to the GTX?

    A. 3/4.

    Where are most ATi performance products in specs compared to the flagship?

    A. 3/4.

    What is 384/512?

    A. 3/4

What is 12 RAM chips divided by 16 RAM chips? (4 fewer chips, saving nvidia moneys, and perhaps a PCB change from the current 8800s for the GTS)

    A. 3/4.

    56x? = The answer to the universe

    A. 3/4
    Last edited by turtle; 07-26-2007 at 04:48 PM.
    That is all.

    Peace and love.
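For what it's worth, the 3/4 pattern above checks out exactly, Hitchhiker's joke included. A quick verification (the 96/128 shader counts are the 8800 GTS/GTX figures):

```python
from fractions import Fraction

# The recurring 3/4 ratio from the post above, checked exactly.
ratios = {
    "GTS shaders / GTX shaders": Fraction(96, 128),
    "guessed GTS bus / flagship bus": Fraction(384, 512),
    "GTS RAM chips / flagship RAM chips": Fraction(12, 16),
}
assert all(r == Fraction(3, 4) for r in ratios.values())

# "56 x ? = the answer to the universe": 56 * 3/4 = 42
print(56 * Fraction(3, 4))  # 42
```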

