
Thread: The 9800 GTX Review Thread

  1. #126
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,550
    Why does the 9800GTX beat the 8800GTX clock for clock, when the 8800GTX has 24 ROPs?
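    One way to put numbers on this question is to tabulate the commonly quoted reference specs for both cards. The sketch below (Python) is a minimal back-of-the-envelope comparison; the unit counts and clocks are the usual published reference figures and should be treated as assumptions rather than vendor-verified data. One commonly cited reason for the clock-for-clock result is that G92 doubles the texture address units while G80 keeps the ROP and bandwidth advantage.

    Code:
    # Back-of-the-envelope comparison of the reference 8800 GTX (G80) and 9800 GTX (G92).
    # Unit counts and clocks are the commonly published reference specs, quoted from
    # memory here - treat them as assumptions, not vendor-verified data.

    CARDS = {
        # rops = ROPs, tau = texture address units, sps = stream processors,
        # core/shader in MHz, mem = effective (DDR) memory clock in MHz, bus in bits
        "8800 GTX (G80)": dict(rops=24, tau=32, sps=128, core=575, shader=1350, mem=1800, bus=384),
        "9800 GTX (G92)": dict(rops=16, tau=64, sps=128, core=675, shader=1688, mem=2200, bus=256),
    }

    def rates(rops, tau, sps, core, shader, mem, bus):
        """Theoretical peak rates; real games land well below these."""
        return {
            "pixel fill (Gpix/s)":   rops * core / 1000,    # ROP count x core clock
            "texture addr (Gtex/s)": tau * core / 1000,     # address units x core clock
            "shader (GFLOPS)":       sps * 3 * shader / 1000,  # MADD+MUL per SP per clock
            "bandwidth (GB/s)":      bus / 8 * mem / 1000,  # bytes per clock x effective mem clock
        }

    for name, spec in CARDS.items():
        print(name)
        for metric, value in rates(**spec).items():
            print(f"  {metric:>22}: {value:7.1f}")

    On those theoretical peaks the 9800 GTX trails only in pixel fill (10.8 vs 13.8 Gpix/s) and bandwidth (70.4 vs 86.4 GB/s), which fits the pattern discussed later in the thread of it losing ground at high resolution with AA.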

  2. #127
    Xtreme Addict
    Join Date
    Nov 2005
    Location
    PHX
    Posts
    1,494
    It doesn't mean I won't jump whenever I read:

    "512-bit" and "Nvidia" in the same sentence. I'd also like to see 1024MB and 256 shaders at 3GHz. Oh, and I'd like a nice video transcoding tool as well.

  3. #128
    Xtreme Guru
    Join Date
    Aug 2005
    Location
    Burbank, CA
    Posts
    3,766
    The 9800 is nothing more than an overclocked 8800GTS G92. nVidia should have named the G92s as the 9 series; they messed up by naming them 8800GTS again.

  4. #129
    Banned
    Join Date
    May 2006
    Location
    Skopje, Macedonia
    Posts
    1,716
    Quote Originally Posted by GAR View Post
    The 9800 is nothing more than an overclocked 8800GTS G92.
    The 9800GTX has a new PCB, so nVidia decided to release it as a new series.

    Quote Originally Posted by GAR View Post
    nVidia should have named the G92s as the 9 series; they messed up by naming them 8800GTS again.
    I agree. The 8800GT should have been named the 9700GTS, the 8800GTS (G92) should have been skipped, the 9800GTX should have been named the 9800GTS, and the 9800GTX should have been released with GDDR4 @ 2.4GHz, the core @ 700MHz and the shader @ 1800MHz.

  5. #130
    Xtreme Mentor
    Join Date
    Oct 2005
    Location
    Portugal
    Posts
    3,410
    Last edited by mascaras; 03-26-2008 at 09:05 AM.

    [Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
    [Review] ASUS HD4870X2 TOP » Here!! «
    .....[Review] EVGA 750i SLi FTW » Here!! «
    [Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
    [Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «

  6. #131
    Registered User
    Join Date
    May 2007
    Location
    Germany
    Posts
    71
    Quote Originally Posted by mascaras View Post
    Those benchmarks are really hard to believe.
    I think UT3 is a game that really benefits from SLI, but the 8800 GT OC hits nearly the same score as the GX2. Something must be wrong there.
    Desktop and Gaming PC

    CPU: AMD Phenom II 940 - TESTING
    Motherboard: ASUS M3A79-T Deluxe - BIOS0703
    RAM: Transcend aXeRam 2x2GB 5-5-5-15
    GPU: HD4870 512MB
    Case: Coolermaster HAF
    PSU: Enermax Revolution85+ 1050W
    Storage: 2x250GB Seagate SATA2 Raid0 + Samsung Spinpoint 1TB
    Cooling: Custom Water with Triple Radi for CPU+Mainboard

  7. #132
    Registered User
    Join Date
    Jan 2007
    Posts
    32
    Here's a datapoint from my rig:
    3 GHz Q6600
    8800GTS G92 GPU: 760 MHz Memory: 972 MHz Shader: 1900 MHz
    XP 3DMark06 1280x1024: 14,400
    http://service.futuremark.com/compare?3dm06=5934528

    Tweaktown measured
    3 GHz Q6600
    9800GTX GPU: 675 MHz Memory: 1100 MHz Shader:1688 MHz
    XP 3DMark06 1280x1024: 14,927
    http://www.tweaktown.com/articles/13..._xp/index.html

  8. #133
    Banned
    Join Date
    May 2006
    Location
    Skopje, Macedonia
    Posts
    1,716
    It is obvious that the RAM is becoming a bottleneck for the 8800GTS 512, so the high shader and core clocks can't make up for the lack of bandwidth.

    I managed to squeeze 17645 3DMarks out of a Q6600 @ 4.1GHz and an 8800GTS 512 at GPU: 820MHz, Memory: 1100MHz, Shader: 2000MHz:
    http://service.futuremark.com/result...&resultType=14
    Last edited by gOJDO; 03-26-2008 at 04:13 PM.
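    The bandwidth point is easy to quantify. Below is a minimal sketch (Python), assuming the published 256-bit/384-bit bus widths and the memory clocks quoted in this thread; the 8800GTS 512's stock 970 MHz memory clock is taken from the reference spec and is an assumption.

    Code:
    # Memory bandwidth = (bus width in bytes) x (effective, double-data-rate memory clock).
    # Bus widths are the published figures; memory clocks are the ones quoted in this thread,
    # with the 8800GTS 512 stock clock assumed to be the 970 MHz reference value.

    def bandwidth_gbs(bus_bits, mem_mhz_effective):
        return bus_bits / 8 * mem_mhz_effective / 1000  # GB/s

    configs = [
        ("8800GTS 512 stock (256-bit, 970 MHz)",  256, 1940),
        ("8800GTS 512 OC    (256-bit, 1100 MHz)", 256, 2200),
        ("9800GTX stock     (256-bit, 1100 MHz)", 256, 2200),
        ("8800GTX stock     (384-bit, 900 MHz)",  384, 1800),
    ]

    for name, bus, mem in configs:
        print(f"{name}: {bandwidth_gbs(bus, mem):.1f} GB/s")

    Even at 1100 MHz the 256-bit cards top out around 70 GB/s, still short of the stock 8800 GTX's 86 GB/s, so core and shader overclocks on the G92 quickly outrun the memory.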

  9. #134
    Registered User
    Join Date
    Jul 2006
    Location
    Romania
    Posts
    31
    8800GTS 512 MB vs. 9800GTX
    Attached Thumbnails: 3DMark2006 3600 775 2150 x.JPG, 3DMark2006 9800GTX oc x.JPG
    i7 920, Zalman CNPS10X Flex, DFI LP DK X58, 3x1024 Corsair 1333 MHz, EVGA GTX460 FTW 1024MB, Hitachi 40/120/320/500 GB, Recom PWS-EVO-BW, Sirtec 500W old style, HP P275 21"

  10. #135
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    ^ deliberately bottlenecked by a 2.7GHz CPU.
    but the 8800 GT OC hits nearly the same score as the GX2. Something must be wrong there.
    ???

    http://www.tweaktown.com/articles/13..._xp/index.html
    126fps for the 9800GTX vs 58fps for the 8800GT OC in UT3 @ 1920x1200, all settings maxed, according to the article.

    notable absence of the 8800GTS 512 in the article's comparisons
    Last edited by adamsleath; 03-26-2008 at 04:53 PM.
    i7 3610QM 1.2-3.2GHz

  11. #136
    Xtreme Enthusiast
    Join Date
    Jul 2007
    Posts
    668
    Quote Originally Posted by adamsleath View Post
    ^ deliberately bottlenecked by a 2.7GHz CPU.

    ???

    http://www.tweaktown.com/articles/13..._xp/index.html
    126fps for the 9800GTX vs 58fps for the 8800GT OC in UT3 @ 1920x1200, all settings maxed, according to the article.

    notable absence of the 8800GTS 512 in the article's comparisons
    That's due to higher clocks and also more shaders (128 on the GTX vs 112 on the GT).

  12. #137
    Xtreme Enthusiast
    Join Date
    Jan 2006
    Posts
    559
    Quote Originally Posted by adamsleath View Post
    ^ deliberately bottlenecked by a 2.7GHz CPU.
    I think he has SpeedStep/EIST or whatever enabled, because 3DMark shows it @ 3.6GHz.

    @dsaraolu: Can you downclock the 9800GTX to 8800GTS clocks and bench?
    x6.wickeD

  13. #138
    Registered User
    Join Date
    Jul 2006
    Location
    Romania
    Posts
    31
    adamsleath: E6750 @ 3600 MHz, 450x8, with C1E enabled.

    Nicksterr: screenshot attached, 9800GTX = 8800GTS 512 MB at identical clocks.
    Attached Thumbnails: 3DMark2006 9800GTX oc 8800GTS.JPG
    i7 920, Zalman CNPS10X Flex, DFI LP DK X58, 3x1024 Corsair 1333 MHz, EVGA GTX460 FTW 1024MB, Hitachi 40/120/320/500 GB, Recom PWS-EVO-BW, Sirtec 500W old style, HP P275 21"

  14. #139
    Xtreme Enthusiast
    Join Date
    Jan 2006
    Posts
    559
    Thanks dsaraolu, finally a direct comparison. 9800GTX = 8800GTS + more o/c headroom + 9 series HD features.
    x6.wickeD

  15. #140
    Registered User
    Join Date
    May 2007
    Location
    Germany
    Posts
    71
    It's strange that Tweaktown has deleted their article about the 9800 GTX.
    Desktop and Gaming PC

    CPU: AMD Phenom II 940 - TESTING
    Motherboard: ASUS M3A79-T Deluxe - BIOS0703
    RAM: Transcend aXeRam 2x2GB 5-5-5-15
    GPU: HD4870 512MB
    Case: Coolermaster HAF
    PSU: Enermax Revolution85+ 1050W
    Storage: 2x250GB Seagate SATA2 Raid0 + Samsung Spinpoint 1TB
    Cooling: Custom Water with Triple Radi for CPU+Mainboard

  16. #141
    Xtreme Member
    Join Date
    Mar 2005
    Location
    France
    Posts
    100
    Dsaraolu, thank you very much for these results.

    I am hesitating between an 8800 GTX and a 9800 GTX, and what I find strange is the weak gain when the 9800 GTX is overclocked: according to your results, the GPU clock is increased by 7%, the RAM by 16% and the shader by 7%.

    And in 3DMark06, the gains are far weaker:
    GT1: +2.2%
    GT2: +5%
    HDR1: +9%
    HDR2: +6%

    It looks like the 9800 GTX architecture is not well balanced; something prevents it from showing gains roughly proportional to the clock increases.

    For example, my 2900 Pro 1 GB sees roughly proportional gains when overclocked from 600 to 900 MHz (GPU) and 925 to 1240 MHz (RAM):

    GT1: +42%
    GT2: +44.9%
    HDR1: +43.2%
    HDR2: +45%

    The CPU is an E6850 @ 4140 MHz. I am not trying to show that the 2900 architecture is better than the 9800 one.

    Would it be possible for an 8800 GTX owner to post 3DMark06 results at different GPU and RAM frequencies, without changing the CPU clock? I would like to check whether the results are roughly proportional to the clock increases. In that case, the weak 9800 GTX memory bandwidth would be the bottleneck in 3DMark06.

    Which would mean that the 8800 GTX would be a better purchase today than the 9800 GTX.

    Thanks.
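    To make the proportionality check above concrete, here is a minimal sketch (Python) using only the clock and 3DMark06 deltas quoted in this post; the "band" rule of thumb in the comments is just a sanity check, not a GPU performance model.

    Code:
    # Quick proportionality check for the overclocking results quoted in this post.
    # If a card is purely GPU/memory limited, each test's gain should land somewhere
    # between the smallest and largest of its clock increases; gains well below that
    # band point at another limiter (bandwidth on the narrow bus, or the CPU).

    def check(card, clock_gains_pct, test_gains_pct):
        lo, hi = min(clock_gains_pct.values()), max(clock_gains_pct.values())
        print(f"{card}: clocks up {lo}%..{hi}%")
        for test, gain in test_gains_pct.items():
            verdict = "roughly proportional" if gain >= lo else "well below the clock increase"
            print(f"  {test}: +{gain}%  -> {verdict}")

    # 9800 GTX figures quoted above: core +7%, shader +7%, memory +16%
    check("9800 GTX", {"core": 7, "shader": 7, "memory": 16},
          {"GT1": 2.2, "GT2": 5, "HDR1": 9, "HDR2": 6})

    # 2900 Pro 1 GB: GPU 600->900 MHz (+50%), RAM 925->1240 MHz (+34%)
    check("2900 Pro 1GB", {"gpu": 50, "ram": 34},
          {"GT1": 42, "GT2": 44.9, "HDR1": 43.2, "HDR2": 45})

    On these numbers most of the 9800 GTX sub-tests gain less than even the smallest clock increase, while the 2900 Pro's gains land inside its clock band, which is exactly the asymmetry being described.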
    Intel 9900K @ 4800 Mhz
    Gigabyte Z390 Aorus Pro bios F9
    4x8 GB Gskill TridentZ 4000 18-19-19-39 @ 3866 Mhz
    ASUS RTX 2080 TI
    ASUS Xonar Essence STX
    Intel Optane 900p 480 GB
    Crucial MX300 2TB
    Crucial MX500 2TB
    Corsair AX 1500i
    Windows 10 x64
    Custom case and watercooling

  17. #142
    Registered User
    Join Date
    Jul 2006
    Location
    Romania
    Posts
    31
    Quote Originally Posted by oc_junkY View Post
    It's strange that Tweaktown has deleted their article about the 9800 GTX.
    Probably the NDA ends on April 1 (not March 25).

    9800GTX overclock: the default is 675/2200 MHz (13890 in 3DMark2006); overclocked to 830/2500 MHz it scores 14960. GPU and shader +~23%, memory +~13.6%:
    GT1: +11.9%
    GT2: +10.8%
    HDR1: +17.8%
    HDR2: +11.2%
    Last edited by dsaraolu; 03-27-2008 at 03:03 AM.
    i7 920, Zalman CNPS10X Flex, DFI LP DK X58, 3x1024 Corsair 1333 MHz, EVGA GTX460 FTW 1024MB, Hitachi 40/120/320/500 GB, Recom PWS-EVO-BW, Sirtec 500W old style, HP P275 21"

  18. #143
    Xtreme Member
    Join Date
    Mar 2005
    Location
    France
    Posts
    100
    Yes, you are right, but I am right too: I compared the 3DMark06 results posted just above:
    14960 with 830/2075/1250 MHz and 14365 with 775/19385/1075 MHz.

    Whatever frequencies you choose - 775/19385/1075 MHz or the default clocks - the result is the same: the 3DMark06 gains are far weaker than the frequency increases.
    Intel 9900K @ 4800 Mhz
    Gigabyte Z390 Aorus Pro bios F9
    4x8 GB Gskill TridentZ 4000 18-19-19-39 @ 3866 Mhz
    ASUS RTX 2080 TI
    ASUS Xonar Essence STX
    Intel Optane 900p 480 GB
    Crucial MX300 2TB
    Crucial MX500 2TB
    Corsair AX 1500i
    Windows 10 x64
    Custom case and watercooling

  19. #144
    Registered User
    Join Date
    Jan 2008
    Posts
    12
    Quote Originally Posted by dsaraolu View Post
    adamsleath: E6750 @ 3600 MHz, 450x8, with C1E enabled.

    Nicksterr: screenshot attached, 9800GTX = 8800GTS 512 MB at identical clocks.
    Thanks for the comparison. Unless the GTX comes really cheap, I'll go for the 8800 GTS.

  20. #145
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Why are sites still using the canned benchmark for Crysis, where the resolution on ATI cards is limited to below 1920x1200? With the Crysis benchmarking tool all you have to do is record a custom timedemo, select the proper resolution and voila... a real comparison of how an ATI card does against the Nvidia cards in Crysis at high res.

  21. #146
    Xtreme Member
    Join Date
    Jul 2007
    Location
    Kentucky, USA
    Posts
    154
    If he used a Q6600 at anything over 3GHz and that score is in the 16,000s... that's pretty doggone good.

    CPU - Q6600 "G0" @ 2.8
    MB - ASUS P5N32-E SLI
    GPU - BFG 8800GT OC2 @ 740/951
    RAM - 2GB Crucial Ballistix 800
    PSU - OCZ GameXStream 700W PS
    Case - Coolermaster RC690
    Fans - 5 x 120mm Coolermaster Case fans
    CPU Cooling - Zalman 9500 HSF
    HDD - WD 160GB System/WD 250GB Storage
    Media - LG SATA 16x DVD-RW
    Monitoring - NZXT Controller Panel
    OS - Vista Ultimate 64
    3DMark06 - 13,425

  22. #147
    Xtreme Member
    Join Date
    May 2005
    Location
    San Antonio, TX
    Posts
    297
    I just got my GTS 512 in today and I'm currently running 800 on the core, 1100 on the memory and 1998 on the shaders, and that's 100% stable with no artifacts. Honestly, I'm not seeing any reason for the GTX if the GTS 512 has the same GPU headroom for $100 less. The memory will have more headroom, but that's about it (which is admittedly handy with the 256-bit interface). Sadly, my 3GHz Opteron 165 with 2x1GB DDR500 is my main bottleneck. I'm more concerned about 45nm Intel than I am about the '9800' GTX.

  23. #148
    Registered User
    Join Date
    Mar 2007
    Location
    Vancouver, BC
    Posts
    51
    One thing I believe: when you crank up the AA and AF and make that memory bus really count, you're going to see the G80-based 8800s outperform the 9800s consistently. Being a year late doesn't help.

    Pisses me right the F off.

  24. #149
    Registered User
    Join Date
    Dec 2006
    Posts
    6
    You guys reckon it's worth going from a GeForce 8800GTS 640MB to a GeForce 9800GTX? Considering I play at 1680x1050.
    Main rig:
    Antec p160w
    core2duo e6400 @ 2.8ghz
    asus p5n32 e sli bios 0802
    2x1gig ddr 2 800 mhz team extreem
    xfx 8800 gts 640mb
    logitech oem keyboard
    logitech mx310 mouse
    logitech rumble pad 2
    Samsung syncmaster 753s ( upgrading to 226bw)
    windows vista home premium 32 bit
    aopen 700 watt
    Zalman 9500at cpu cooler

  25. #150
    Xtreme Member
    Join Date
    Jul 2007
    Location
    Kentucky, USA
    Posts
    154
    Quote Originally Posted by stefan9 View Post
    You guys reckon it's worth going from a GeForce 8800GTS 640MB to a GeForce 9800GTX? Considering I play at 1680x1050.
    Only if you really want Crysis to be as good as possible. The 8800GT and GTS G92 can handle any game at 1680x1050 with full settings except Crysis. Personally, I run Crysis all Very High in DX10 and I can still play it; it looks great, a little choppy but liveable. Save 120-150 and get a G92 instead of a G94.

    CPU - Q6600 "G0" @ 2.8
    MB - ASUS P5N32-E SLI
    GPU - BFG 8800GT OC2 @ 740/951
    RAM - 2GB Crucial Ballistix 800
    PSU - OCZ GameXStream 700W PS
    Case - Coolermaster RC690
    Fans - 5 x 120mm Coolermaster Case fans
    CPU Cooling - Zalman 9500 HSF
    HDD - WD 160GB System/WD 250GB Storage
    Media - LG SATA 16x DVD-RW
    Monitoring - NZXT Controller Panel
    OS - Vista Ultimate 64
    3DMark06 - 13,425
