Page 2 of 4 FirstFirst 1234 LastLast
Results 26 to 50 of 79

Thread: My performance simulation of GeForce 9800GX2

  1. #26
    Xtreme Member
    Join Date
    Jan 2006
    Location
    Montreal, Quebec, Canada
    Posts
    460
    Quote Originally Posted by Eastcoasthandle View Post
    Wait, isn't the 9800 GTX nothing more than an 8800 GTX die shrink?
    I think it's more like an 8800GS 512MB (G92) die shrink rather than a G80 die shrink. Anyway, this thing doesn't deserve the 98XX series name; it's not a new architecture. We want a new chip, G100 or G98 or whatever the hell it's called.

    Still, nice little "preview," but this is based on what we know. I'm sure NVIDIA has something more than a die shrink on its hands to at least have a product that deserves its place in the high-end market.
    Last edited by shogo_ca; 01-04-2008 at 11:30 AM.


    CPU Intel C2D 6400 - 6600 L631B Aircooled 4010 MHz (wip) 3850 MHz daily
    COOLER Big Typhoon 120 - Ultra 120/Panaflow 103cfm
    MOBO Asus p5w dh - Badaxe 2
    RAM 2gb Team 667 cl3 844mhz 4-4-3-4 2.0v (wip)
    GPU Bfg 7900GT - XFX GX2 600/1600 3Dmark06 10554 (wip) 207K AM3
    PSU Gamestream 600w - Powerstream 520 - Silverstone Zeus 750w
    Chassis Lian-Li V2000b Plus2 Lian-Li v1100b

  2. #27
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Cairo
    Posts
    2,366
    Someone should compare the 7950GX2 to underclocked 7900GTX SLI to see if they follow the same pattern.

  3. #28
    Xtreme Enthusiast
    Join Date
    Apr 2004
    Posts
    703
    Nice simulation, but I would imagine the 9800GX2 would only shine and be "30%" faster than the Ultra in AA/AF performance.

    And I don't think BioShock is a quality benchmark for showing the performance difference, since you have to test manually with FRAPS. Any chance of showing CoH, WiC, and Lost Planet with their built-in benchmarks?
    A wiseman once said, "If Bible proves the existence of God, then comic books prove the existence of Superheros."

  4. #29
    Xtreme X.I.P. Soulburner's Avatar
    Join Date
    Oct 2003
    Location
    Lincoln, NE
    Posts
    8,868
    You can show framerate with Rivatuner.
    System
    ASUS Z170-Pro
    Skylake i7-6700K @ 4600 Mhz
    MSI GTX 1070 Armor OC
    32 GB G.Skill Ripjaws V
    Samsung 850 EVO (2)
    EVGA SuperNOVA 650 G2
    Corsair Hydro H90
    NZXT S340

  5. #30
    Xtreme Member
    Join Date
    Jan 2006
    Location
    Montreal, Quebec, Canada
    Posts
    460
    Benchmarks should be run as timedemos so the exact same frames are rendered by different hardware, but overall it still gives a good overview.


    CPU Intel C2D 6400 - 6600 L631B Aircooled 4010 MHz (wip) 3850 MHz daily
    COOLER Big Typhoon 120 - Ultra 120/Panaflow 103cfm
    MOBO Asus p5w dh - Badaxe 2
    RAM 2gb Team 667 cl3 844mhz 4-4-3-4 2.0v (wip)
    GPU Bfg 7900GT - XFX GX2 600/1600 3Dmark06 10554 (wip) 207K AM3
    PSU Gamestream 600w - Powerstream 520 - Silverstone Zeus 750w
    Chassis Lian-Li V2000b Plus2 Lian-Li v1100b

  6. #31
    Xtreme Enthusiast
    Join Date
    Apr 2004
    Posts
    703
    Quote Originally Posted by Soulburner View Post
    You can show framerate with Rivatuner.
    But BioShock still requires manually running the app; it doesn't have a timedemo function built in. So the results aren't exactly perfect, since you've now introduced human error into the equation.
    A wiseman once said, "If Bible proves the existence of God, then comic books prove the existence of Superheros."

  7. #32
    Xtreme X.I.P. Soulburner's Avatar
    Join Date
    Oct 2003
    Location
    Lincoln, NE
    Posts
    8,868
    Quote Originally Posted by thephenom View Post
    But BioShock still requires manually running the app; it doesn't have a timedemo function built in. So the results aren't exactly perfect, since you've now introduced human error into the equation.
    All you have to do is save the game at the point where you wish to measure performance.

    Hit "Continue" and load up the game each time. Don't move, just take down the numbers. It's a very repeatable procedure, with no variables to account for unless a Splicer happens to walk on screen, which is unlikely. If that happened, though, the test would be repeated.
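    The save-and-reload procedure above is basically repeated sampling, so its quality can be checked numerically. A quick sketch of that check, with made-up FRAPS readings standing in for real runs (none of these numbers are measured results):

```python
from statistics import mean, stdev

# Hypothetical FRAPS readings from five repeated loads of the same save.
runs_fps = [62.1, 61.8, 62.4, 61.9, 62.2]

avg = mean(runs_fps)
spread = stdev(runs_fps)

# If the spread is a tiny fraction of the average, the manual test is
# repeatable enough to compare cards; otherwise rerun the pass (say, a
# Splicer wandered on screen during one of them).
print(f"avg {avg:.1f} fps, stdev {spread:.2f} ({100 * spread / avg:.2f}% of avg)")
assert spread / avg < 0.01  # under 1% run-to-run variation in this sample
```

    With this sample data the run-to-run spread works out well under 1% of the average, which is the kind of consistency that makes a manual benchmark usable.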
    System
    ASUS Z170-Pro
    Skylake i7-6700K @ 4600 Mhz
    MSI GTX 1070 Armor OC
    32 GB G.Skill Ripjaws V
    Samsung 850 EVO (2)
    EVGA SuperNOVA 650 G2
    Corsair Hydro H90
    NZXT S340

  8. #33
    Xtreme Member
    Join Date
    Aug 2007
    Location
    Montenegro
    Posts
    333
    Quote Originally Posted by EnJoY View Post
    Remember, this was just a simulation, not real results.
    Thanks for reminding me. But just like with the last GX2, the performance was less than what two of the same cards did in pure SLI mode. So this simulation is probably within 3% of what to expect when it finally comes out. My guess is it'll cost around $599, and no smart buyer would even consider this card given the alternatives.
    Last edited by RedBull78; 01-04-2008 at 01:52 PM.
    Internet will save the World.

    Foxconn MARS
    Q9650@3.8Ghz
    Gskill 4Gb-1066 DDR2
    EVGA GeForce GTX 560 Ti - 448/C Classified Ultra
    WD 1T Black
    Theramlright Extreme 120
    CORSAIR 650HX

    BenQ FP241W Black 24" 6ms
    Win 7 Ultimate x64

  9. #34
    Xtreme X.I.P. Soulburner's Avatar
    Join Date
    Oct 2003
    Location
    Lincoln, NE
    Posts
    8,868
    Quote Originally Posted by RedBull78 View Post
    Thanks for reminding me. But just like with the last GX2, the performance was less than what two of the same cards did in pure SLI mode. So this simulation is probably within 3% of what to expect when it finally comes out. My guess is it'll cost around $599, and no smart buyer would even consider this card given the alternatives.
    Which is why he underclocked both cards.
    System
    ASUS Z170-Pro
    Skylake i7-6700K @ 4600 Mhz
    MSI GTX 1070 Armor OC
    32 GB G.Skill Ripjaws V
    Samsung 850 EVO (2)
    EVGA SuperNOVA 650 G2
    Corsair Hydro H90
    NZXT S340

  10. #35
    Xtreme Enthusiast
    Join Date
    Apr 2004
    Posts
    703
    Quote Originally Posted by Soulburner View Post
    All you have to do is save the game at the point where you wish to measure performance.

    Hit "Continue" and load up the game each time. Don't move, just take down the numbers. It's a very repeatable procedure, with no variables to account for unless a Splicer happens to walk on screen, which is unlikely. If that happened, though, the test would be repeated.
    I guess that would be a test with little variance, but I personally still prefer a timedemo, which gives you a good average of the game being tested.
    A wiseman once said, "If Bible proves the existence of God, then comic books prove the existence of Superheros."

  11. #36
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by saaya View Post
    The 9800GTX is an 8800GTS G92 but with higher clocks, AFAIK.
    Only three things are known about the D9E (which is assumed to be the 9800GTX):

    1.) 512-bit bus (this alone should make quite the difference)
    2.) 32 ROPs (see 1). AA will be a breeze for this thing.
    3.) 65nm

    It's also assumed to be the 1-TFLOP core NVIDIA talked about last year.

    I don't think that's an 8800GTS G92 with higher clocks. It may be based on that design, but the memory bus alone tells us it cannot be the same core.

    As for the 9800GX2, I don't think it's going to be NVIDIA's flagship. Either that, or because NVIDIA doesn't fear the 3870X2, they're just sitting on the D9E for now and will release it as a 9900. Why play the ace of spades when the queen will still win the book?
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  12. #37
    Xtreme Guru
    Join Date
    Aug 2005
    Location
    Burbank, CA
    Posts
    3,766
    NVIDIA losing its touch? You've got to be kidding.

  13. #38
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by GAR View Post
    NVIDIA losing its touch? You've got to be kidding.
    They aren't; this is just the effect of a lack of competition.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  14. #39
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,550
    Quote Originally Posted by saaya View Post
    You forgot one thing: the 9800GX2 will have 16 lanes for two cards, so each GPU will have only 8 lanes, so it'll be even slower than that.
    That's what I was thinking too.

    I guess 2x 8800 in 8x PCIe slots could simulate that.

  15. #40
    Xtreme Mentor
    Join Date
    May 2005
    Location
    Westlake Village, West Hills
    Posts
    3,046
    cool simulation
    PC Lab Qmicra V2 Case SFFi7 950 4.4GHz 200 x 22 1.36 volts
    Cooled by Swiftech GTZ - CPX-Pro - MCR420+MCR320+MCR220 | Completely Silent loads at 62c
    GTX 470 EVGA SuperClocked Plain stock
    12 Gigs OCZ Reaper DDR3 1600MHz) 8-8-8-24
    ASUS Rampage Gene II |Four OCZ Vertex 2 in RAID-0(60Gig x 4) | WD 2000Gig Storage


    Theater ::: Panasonic G20 50" Plasma | Onkyo SC5508 Processor | Emotiva XPA-5 and XPA-2 | CSi A6 Center| 2 x Polk RTi A9 Front Towers| 2 x Klipsch RW-12d
    Lian-LI HTPC | Panasonic Blu Ray 655k| APC AV J10BLK Conditioner |

  16. #41
    Banned
    Join Date
    Dec 2005
    Location
    Everywhere
    Posts
    1,715
    Quote Originally Posted by saaya View Post
    you forgot one thing, the 9800gx2 will have 16lanes for 2 cards, so each gpu will have 8 lanes only, so itll be even slower than that
    16 lanes of PCIe 2.0 has the bandwidth of 16+16 lanes of PCIe 1.1! Don't forget this is PCIe 2.0, on this chipset and these cards!

  17. #42
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,692
    How is this news?!

    Intel Core i7-3770K
    ASUS P8Z77-I DELUXE
    EVGA GTX 970 SC
    Corsair 16GB (2x8GB) Vengeance LP 1600
    Corsair H80
    120GB Samsung 840 EVO, 500GB WD Scorpio Blue, 1TB Samsung Spinpoint F3
    Corsair RM650
    Cooler Master Elite 120 Advanced
    OC: 5Ghz | +0.185 offset : 1.352v

  18. #43
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by DilTech View Post
    Only three things are known about the D9E (which is assumed to be the 9800GTX):

    1.) 512-bit bus (this alone should make quite the difference)
    2.) 32 ROPs (see 1). AA will be a breeze for this thing.
    3.) 65nm

    It's also assumed to be the 1-TFLOP core NVIDIA talked about last year.

    I don't think that's an 8800GTS G92 with higher clocks. It may be based on that design, but the memory bus alone tells us it cannot be the same core.

    As for the 9800GX2, I don't think it's going to be NVIDIA's flagship. Either that, or because NVIDIA doesn't fear the 3870X2, they're just sitting on the D9E for now and will release it as a 9900. Why play the ace of spades when the queen will still win the book?
    Well, who says G92 doesn't have more ROPs and a 512-bit bus?
    The yields are probably just not there yet, hence they started with the 8800GT, then the 8800GTS, and soon the top part will be an 8800GTX G92 while the rest will be 8800GTS and GT.

  19. #44
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by OBR View Post
    16 lanes of PCIe 2.0 has the bandwidth of 16+16 lanes of PCIe 1.1! Don't forget this is PCIe 2.0, on this chipset and these cards!
    But the 8800 is already PCIe 2.0, isn't it?
    So it's still the same... the GX2 will have half the chipset/RAM/CPU bandwidth of a real SLI solution.

  20. #45
    Xtreme Addict
    Join Date
    Feb 2006
    Location
    Vienna, Austria
    Posts
    1,940
    Quote Originally Posted by saaya View Post
    But the 8800 is already PCIe 2.0, isn't it?
    So it's still the same... the GX2 will have half the chipset/RAM/CPU bandwidth of a real SLI solution.
    But there's no effect on performance; just look at the 8800GTX: PCIe 1.0, and similar scaling to the 8800GT in SLI.

    PCIe x8 2.0 = PCIe x16 1.0

    I bet the 9800GX2 will be the first card with lower performance on PCIe 1.0 mobos.
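    The equivalence above checks out against the standard per-lane rates (250 MB/s per lane per direction for PCIe 1.x with its 8b/10b encoding, 500 MB/s for PCIe 2.0). A tiny sketch of the arithmetic:

```python
# Per-direction bandwidth per lane, in MB/s, for each PCIe generation.
PER_LANE_MB_S = {"1.0": 250, "2.0": 500}

def bandwidth_mb_s(gen: str, lanes: int) -> int:
    """Aggregate one-direction bandwidth for a link of `lanes` lanes."""
    return PER_LANE_MB_S[gen] * lanes

# x8 PCIe 2.0 matches x16 PCIe 1.0: both 4000 MB/s per direction...
assert bandwidth_mb_s("2.0", 8) == bandwidth_mb_s("1.0", 16)

# ...so each GPU behind the GX2's shared x16 2.0 link (8 lanes apiece)
# still sees as much bandwidth as a full x16 PCIe 1.0 slot, but half of
# what a real SLI pair gets from two x16 2.0 slots.
for gen, lanes in [("1.0", 16), ("2.0", 8), ("2.0", 16)]:
    print(f"x{lanes} PCIe {gen}: {bandwidth_mb_s(gen, lanes)} MB/s")
```

    Whether 4000 MB/s per GPU is actually a bottleneck is exactly what the rest of this page argues about.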
    Core i7 2600k|HD 6950|8GB RipJawsX|2x 128gb Samsung SSD 830 Raid0|Asus Sabertooth P67
    Seasonic X-560|Corsair 650D|2x WD Red 3TB Raid1|WD Green 3TB|Asus Xonar Essence STX


    Core i3 2100|HD 7770|8GB RipJawsX|128gb Samsung SSD 830|Asrock Z77 Pro4-M
    Bequiet! E9 400W|Fractal Design Arc Mini|3x Hitachi 7k1000.C|Asus Xonar DX


    Dell Latitude E6410|Core i7 620m|8gb DDR3|WXGA+ Screen|Nvidia Quadro NVS3100
    256gb Samsung PB22-J|Intel Wireless 6300|Sierra Aircard MC8781|WD Scorpio Blue 1TB


    Harman Kardon HK1200|Vienna Acoustics Brandnew|AKG K240 Monitor 600ohm|Sony CDP 228ESD

  21. #46
    Xtreme Member
    Join Date
    Oct 2007
    Location
    Sydney, Australia
    Posts
    466
    I really don't understand why all the benches are without AA. I mean, does anyone on here play with no AA if they can avoid it?
    Cool simulation regardless. NVIDIA is only competing with itself at the moment.

  22. #47
    Banned
    Join Date
    Dec 2005
    Location
    Everywhere
    Posts
    1,715
    benches updated ...

  23. #48
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by generics_user View Post
    But there's no effect on performance; just look at the 8800GTX: PCIe 1.0, and similar scaling to the 8800GT in SLI.

    PCIe x8 2.0 = PCIe x16 1.0

    I bet the 9800GX2 will be the first card with lower performance on PCIe 1.0 mobos.
    So you mean 16x PCIe 2.0 is too much bandwidth, hence no scaling, and 8x PCIe 2.0 is the same as 16x PCIe 1.0, so it should be enough?

    I don't think so... don't forget that the top SLI scores are set with an overclocked PCIe bus to increase the bandwidth... so 16x PCIe 1.0 is definitely not enough to supply even one card, let alone two! Hence 8x PCIe 2.0 won't be enough for one card either... so 16x 2.0 won't be enough for two.

    All speculation, but that's using logic based on the results we know.
    I'm pretty sure the 9800GX2 will be slower than two 8800GT/GTS cards with the same specs and clock speeds but with each card getting full 16x bandwidth.
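    The thread's whole underclocked-SLI "simulation" boils down to a back-of-the-envelope estimate like the sketch below. Every number in it (the clock ratio, the SLI scaling factor, the bandwidth penalty, the baseline fps) is a hypothetical placeholder chosen for illustration, not a measured or leaked figure:

```python
def estimate_gx2_fps(single_card_fps: float,
                     clock_ratio: float = 600 / 650,
                     sli_scaling: float = 1.7,
                     bandwidth_penalty: float = 0.97) -> float:
    """Crude GX2 estimate: one card's fps, scaled down to the assumed
    GX2 clocks, multiplied by a typical SLI scaling factor, then docked
    a few percent for the shared x16 link (8 lanes per GPU), which is
    the effect argued about above. All factors are guesses."""
    return single_card_fps * clock_ratio * sli_scaling * bandwidth_penalty

# Example with a made-up 50 fps single-card baseline:
print(f"estimated GX2: {estimate_gx2_fps(50.0):.1f} fps")
```

    The point of the model is not the absolute number but that the clock and bandwidth factors both sit below 1, which is why an on-card dual-GPU part should land a bit under a full-bandwidth SLI pair of the same chips.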

  24. #49
    Banned
    Join Date
    Dec 2005
    Location
    Everywhere
    Posts
    1,715
    Quote Originally Posted by saaya View Post
    All speculation, but that's using logic based on the results we know.
    I'm pretty sure the 9800GX2 will be slower than two 8800GT/GTS cards with the same specs and clock speeds but with each card getting full 16x bandwidth.

    Yeah, that's why I underclocked the cards...

  25. #50
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,386
    Quote Originally Posted by saaya View Post
    So you mean 16x PCIe 2.0 is too much bandwidth, hence no scaling, and 8x PCIe 2.0 is the same as 16x PCIe 1.0, so it should be enough?

    I don't think so... don't forget that the top SLI scores are set with an overclocked PCIe bus to increase the bandwidth... so 16x PCIe 1.0 is definitely not enough to supply even one card, let alone two! Hence 8x PCIe 2.0 won't be enough for one card either... so 16x 2.0 won't be enough for two.

    All speculation, but that's using logic based on the results we know.
    I'm pretty sure the 9800GX2 will be slower than two 8800GT/GTS cards with the same specs and clock speeds but with each card getting full 16x bandwidth.
    According to several reviews, the 8x PCIe 1.0 versions have plenty of bandwidth except for overclocked Ultras and SLI configs. 16x (and 16x for an X2) is PLENTY of bandwidth for every card under every condition possible. The best part about PCIe 2.0 right now is the increase in power delivered to the slot. Until cards start maxing out 16x bandwidth, the extra it gives is useless.

    Kind of like ATI's decision to bring DX10.1 and SM4.1 to their cards now, when it's only a minor update... an update that won't even be usable until SP1 is officially released... and devs won't even be using these features until at least the end of '08 into '09.

    Can anyone tell I HATED that ATI tried to market those features?!
