
Thread: My performance simulation of GeForce 9800GX2

  1. #51
    Xtreme Enthusiast
    Join Date
    Jun 2007
    Location
    Victoria, Australia
    Posts
    948
Man, I like the theory behind your tests, I certainly do... but I truly believe the card itself will have a fair amount of variation from your findings. It's not that you're doing anything wrong, but I think if NVIDIA wanted people to get results like this, they would have just left it to people to use two GTSes.

  2. #52
    Xtreme Enthusiast
    Join Date
    Feb 2006
    Location
    UK
    Posts
    545
    Not everyone has a SLI mobo though.
Q9650 @ 4GHz - 1.216v
E8600 @ 4.5GHz - 1.34v - Q822A435
Q6600 @ 4.014GHz & E2160 @ 3.6GHz
    Silverstone Zeus 850w, Maximum formula,
    Custom Water Cooling, 4GB OCZ 9200 Flex II,
Xfire 4850's 800/1150, 3DMark06, Vantage,
    Frontlineforce

  3. #53
    Xtreme Enthusiast
    Join Date
    Jun 2007
    Location
    Victoria, Australia
    Posts
    948
    Quote Originally Posted by Devious View Post
    Not everyone has a SLI mobo though.
Yeah, that's a good point... I guess I'm actually hoping this card performs a bit better than this, to be honest.

  4. #54
    Xtreme Enthusiast
    Join Date
    May 2007
    Posts
    831
Just get good drivers, a 512-bit interface (is that 256+256 across the cards, no?), and 1GB of memory, underclock it all you want, and leave it to the Xtreme guys to get good cooling and clock it way up.

And that's how it should be done.
At least, that's what she said.
    Gigabyte P35-DQ6 | Intel Core 2 Quad Q6700 | 2x1GB Crucial Ballistix DDR2-1066 5-5-5-15 | MSI nVIDIA GeForce 7300LE

  5. #55
    Xtreme Mentor
    Join Date
    Jul 2004
    Location
    Ontario
    Posts
    2,780
Ah no, actually the cards don't double their bus width and memory amount. Each card will still run at 256-bit with 512MB of memory. Unfortunately, with SLI you cannot double them; the specs of one card are what both run at, they are just running in parallel with each other.
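To put that in rough numbers, here's a back-of-envelope sketch of the point (my own illustration; the memory clock is a hypothetical effective GDDR3 figure, not a confirmed spec):

Code:
# Why two 256-bit/512MB GPUs in SLI don't behave like one
# 512-bit/1GB card. Illustrative numbers, not confirmed specs.
BUS_WIDTH_BITS = 256   # per GPU
MEM_PER_GPU_MB = 512   # per GPU
MEM_CLOCK_MHZ = 1940   # hypothetical effective GDDR3 data rate

# Each GPU reads only its own local memory, so bandwidth is per GPU:
bandwidth_gb_s = BUS_WIDTH_BITS / 8 * MEM_CLOCK_MHZ / 1000

# SLI mirrors textures and buffers into BOTH pools, so the usable
# framebuffer stays at the per-card 512MB, not 1024MB:
usable_mem_mb = MEM_PER_GPU_MB

print(f"per-GPU bandwidth: {bandwidth_gb_s:.1f} GB/s")
print(f"usable memory: {usable_mem_mb} MB (mirrored, not pooled)")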
    Silverstone Temjin TJ-09BW w/ Silverstone DA750
    Asus P8P67
    2600K w/ Thermalright Venomous X Black w/ Sanyo Denki San Ace 109R1212H1011
    8GB G.Skill DDR-1600 7-8-7-24
    Gigabyte GTX 460 1G
    Modded Creative X-Fi Fatal1ty w/ Klipsch Promedia 2.1
    1 X 120GB OCZ Vertex
    1 X 300GB WD Velociraptor HLFS
    1 X Hitachi 7K1000 1TB
    Pioneer DVR-216L DVD-RW
    Windows 7 Ultimate 64


    Quote Originally Posted by alexio View Post
    From the hip and aim at the kitchen if she doesn't approve your purchases. She'll know better next time.

  6. #56
    Xtreme Addict
    Join Date
    Jun 2007
    Location
    above USA...below USSR
    Posts
    1,186
Why would you underclock? They reduced the size of the GPU to lower the heat output. My money is on the clocks being the same, if not a little higher.
    Case-Coolermaster Cosmos S
    MoBo- ASUS Crosshair IV
    Graphics Card-XFX R9 280X [out for RMA] using HD5870
    Hard Drive-Kingston 240Gig V300 master Seagate 160Gb slave Seagate 250Gb slave Seagate 500Gb slave Western Digital 500Gb
CPU-AMD FX-8320 5GHz
RAM 8GB Corsair C8
    Logitech 5.1 Z5500 BOOST22
    300Gb of MUSICA!!


    Steam ID: alphamonkeywoman
    http://www.techpowerup.com/gpuz/933ab/

  7. #57
    The Doctor Warboy's Avatar
    Join Date
    Oct 2006
    Location
    Kansas City, MO
    Posts
    2,597
    Quote Originally Posted by cantankerous View Post
Ah no, actually the cards don't double their bus width and memory amount. Each card will still run at 256-bit with 512MB of memory. Unfortunately, with SLI you cannot double them; the specs of one card are what both run at, they are just running in parallel with each other.
    Quote Originally Posted by MuffinFlavored View Post
Just get good drivers, a 512-bit interface (is that 256+256 across the cards, no?), and 1GB of memory, underclock it all you want, and leave it to the Xtreme guys to get good cooling and clock it way up.

And that's how it should be done.
At least, that's what she said.
The card is rumored to have a full 512-bit bus with 1GB of memory; it's not shared, from my understanding.
    My Rig can do EpicFLOPs, Can yours?
    Once this baby hits 88 TeraFLOPs, You're going to see some serious $@#%....

    Build XT7 is currently active.
    Current OS Systems: Windows 10 64bit

  8. #58
    Xtreme Guru
    Join Date
    Aug 2003
    Location
    Athens, Greece
    Posts
    3,656
Why can't people just wait...?
    Project ZEUS II

    Asus Rampage II Extreme
    Intel I7 920 D0 3930A @ 4.50GHz (21 X 214mhz)
    3 x 2GB G.Skill Trident 1600 @ 1716MHz (6-8-6-20-1N)
    2 x Asus HD 6870 CrossFire @ 1000/1100MHz
    OCZ Vertex 2 60GB | Intel X25-M 120GB | WD Velociraptor 150GB | Seagate FreeAgent XTreme 1.5TB esata
    Asus Xonar DX | Logitech Z-5500 | LG W2600HP 26" S-IPS LCD

    Watercooling setup:
    1st loop -> Radiator: 2 x ThermoChill PA120.3 | Pump: Laing DDC-3.25 with Alphacool HF 38 top | CPU: Swiftech Apogee XT | Chipset: Swiftech MCW-NBMAX | Tubing: Masterkleer 1/2" UV
    2nd loop -> Radiator: ThermoChill PA120.3 | Pump: Laing DDC-3.2 with Alphacool HF 38 top | GPU: 2 x EK FC-6870 | Tubing: Masterkleer 1/2" UV


    Assembled in Mountain Mods Ascension Trinity
    Powered by Corsair Professional Series Gold AX1200

  9. #59
    Xtreme Addict
    Join Date
    Mar 2004
    Location
    Toronto, Ontario Canada
    Posts
    1,433
The 9800GTX will not be a 'new' chip. It will still be a G92, but this time FULLY unlocked, probably with 144 or 160 shaders and 24 ROPs (384-bit RAM) + even higher default clocks.

G80 has 690 million transistors (including the 9 million in the NVIO chip).
G92 has 754 million transistors, but the current 8800GTS only has 16 ROPs compared to 24 on the G80.

This makes no sense: why would NV make a chip that is basically identical to the G80 design with 64 million extra transistors but 8 fewer ROPs?

Sure, the G92 has double the TMUs of the G80, but look at the 7800GTX vs the 7900GTX: they have identical chip functionality, but the 7800GTX is 302 million transistors vs 278 million in the 7900GTX! Nvidia was able to shave 24 million transistors off the 7800GTX while keeping all the features. I have no doubt that they were able to pull off a similar trick with the G80 vs G92, since they had about a year to work on the design.

Obviously, there are still some parts of the G92 chip that are hard locked. This would also explain the delays in supplying G92 chips for the 8800GT and 8800GTS: each chip is being binned so thoroughly. It also makes more sense this way because designing a new chip specifically for the 9800GTX line would be too costly, especially since it's one of the ultra-high-end cards and volumes will be very low.

Nvidia is very smart taking this approach because it gives the highest yields possible. With the RV670 chip, if only 319 of the 320 shaders are working, then the whole chip is useless and scrapped. With the G92 chip, if all the shaders work, they bin it as a 9800GTX; if only 129 of them work: 8800GTS; if 113 of them work: 8800GT; and if only 97 of them work: 8800GS! Very few wasted chips.
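The binning idea reads naturally as a sort-into-buckets rule. A minimal sketch of it (the 128/112/96 thresholds are the shipping SP counts of those SKUs; the 160-SP top bin is the poster's speculation, not a confirmed spec):

Code:
# Toy model of the die-harvesting logic described above.
def bin_g92(working_shaders: int) -> str:
    """Map the number of functional shader units to a salvage SKU."""
    if working_shaders >= 160:
        return "9800GTX (speculative fully unlocked part)"
    if working_shaders >= 128:
        return "8800GTS 512"
    if working_shaders >= 112:
        return "8800GT"
    if working_shaders >= 96:
        return "8800GS"
    return "scrap"

# A die with a few defective units still sells as a lower SKU:
for shaders in (160, 130, 115, 100, 80):
    print(shaders, "->", bin_g92(shaders))

The contrast with the single-SKU RV670 claim above is exactly the yield argument: with only one bucket, any defect turns the die into scrap.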

  10. #60
    Xtreme X.I.P.
    Join Date
    Apr 2005
    Posts
    4,475
    Quote Originally Posted by Warboy View Post
The card is rumored to have a full 512-bit bus with 1GB of memory; it's not shared, from my understanding.
You are correct - those are rumors. Technologically this is not possible with the current implementation; it still remains 2x 256-bit and 2x whatever-memory-amount-per-card.

  11. #61
    Xtreme Enthusiast Shocker003's Avatar
    Join Date
    Jul 2007
    Location
    Germany
    Posts
    725
    Quote Originally Posted by EnJoY View Post
    Remember, this was just a simulation, not real results.
You are damn right. We shouldn't throw punches over a simulation; wait until these 9800 cards hit the retail shops.


    MAIN RIG--:
    ASUS ROG Strix XG32VQ---:AMD Ryzen 7 5800X--Aquacomputer Cuplex Kryos NEXT--:ASUS Crosshair VIII HERO---
    32GB G-Skill AEGIS F4-3000C16S-8GISB --:MSI RADEON RX 6900 XT---:X-Fi Titanium HD modded
    Inter-Tech Coba Nitrox Nobility CN-800 NS 800W 80+ Silver--:Cyborg RAT 8--:Creative Sound BlasterX Vanguard K08

  12. #62
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Quote Originally Posted by HKPolice View Post
The 9800GTX will not be a 'new' chip. It will still be a G92, but this time FULLY unlocked, probably with 144 or 160 shaders and 24 ROPs (384-bit RAM) + even higher default clocks.

G80 has 690 million transistors (including the 9 million in the NVIO chip).
G92 has 754 million transistors, but the current 8800GTS only has 16 ROPs compared to 24 on the G80.

This makes no sense: why would NV make a chip that is basically identical to the G80 design with 64 million extra transistors but 8 fewer ROPs?

Sure, the G92 has double the TMUs of the G80, but look at the 7800GTX vs the 7900GTX: they have identical chip functionality, but the 7800GTX is 302 million transistors vs 278 million in the 7900GTX! Nvidia was able to shave 24 million transistors off the 7800GTX while keeping all the features. I have no doubt that they were able to pull off a similar trick with the G80 vs G92, since they had about a year to work on the design.

Obviously, there are still some parts of the G92 chip that are hard locked. This would also explain the delays in supplying G92 chips for the 8800GT and 8800GTS: each chip is being binned so thoroughly. It also makes more sense this way because designing a new chip specifically for the 9800GTX line would be too costly, especially since it's one of the ultra-high-end cards and volumes will be very low.

Nvidia is very smart taking this approach because it gives the highest yields possible. With the RV670 chip, if only 319 of the 320 shaders are working, then the whole chip is useless and scrapped. With the G92 chip, if all the shaders work, they bin it as a 9800GTX; if only 129 of them work: 8800GTS; if 113 of them work: 8800GT; and if only 97 of them work: 8800GS! Very few wasted chips.
I hope that's true, and it would make sense, but I thought the G92 was 128 SPs max unless they derived another chip from it.

  13. #63
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by jas420221 View Post
According to several reviews, the 8x PCIe 1.0 versions have plenty of bandwidth except for o/c'd Ultras and SLI configs. 16x and 16x x2 are PLENTY of bandwidth for every card under every condition possible. The best part about PCIe 2.0 right now is the increase in power to the slot. Until cards start maxing out 16x bandwidth, the extra it gives is useless.

Kind of like ATI's decision to bring DX10.1 and SM4.1 to their cards now when it's only a minor update... and an update that won't even be out until SP1 is officially released... and devs won't even be using these features until at least the end of '08 into '09.

Can anyone tell I HATED that ATI tried to market those features?!?!?
Yeah, the difference is small, but people here go nuts if they find a way to gain 0.1% extra performance... We can only guess what the difference between 2x 16x 2.0 and 1x 16x 2.0 is for two cards in SLI, but I'd say it's 1-5% depending on the app. And if you use professional apps like 3ds Max and CATIA, then you DO see big differences between PCIe 16x and 8x, and between PCIe 1.0 and 2.0, AFAIK.
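For reference, the raw numbers behind that guess; the per-lane rates below are from the PCIe spec (250 MB/s per lane per direction for 1.x, 500 MB/s for 2.0), the rest is just multiplication:

Code:
# Theoretical one-direction PCIe bandwidth:
# PCIe 1.x = 2.5 GT/s, 8b/10b coding -> 0.25 GB/s per lane
# PCIe 2.0 = 5.0 GT/s, 8b/10b coding -> 0.50 GB/s per lane
def pcie_bw_gb_s(lanes: int, gen: int) -> float:
    per_lane = {1: 0.25, 2: 0.50}[gen]
    return lanes * per_lane

for gen, lanes in [(1, 8), (1, 16), (2, 8), (2, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: {pcie_bw_gb_s(lanes, gen):.1f} GB/s")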

About DX10.1... AFAIK it only makes some features of DX10.0 mandatory... it's just ATI's way of showing off that they support one or two features Nvidia doesn't, or doesn't guarantee.

    Quote Originally Posted by HKPolice View Post
    With the RV670 chip, if only 319 of the 320 shaders are working, then the whole chip is useless and scrapped.
That doesn't make much sense: why would ATI throw away GPUs?
I'm sure they are recycled and used as something else.
And what makes you think the RV670 only has 320 shader processors, after all?
I wouldn't be surprised if it has 360-400 shader processors overall.

About G92, I think it has more than 128 stream processors...
and H and other sources already confirmed this:
256 Stream Processors Total

  14. #64
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Quote Originally Posted by saaya View Post

About G92, I think it has more than 128 stream processors...
and H and other sources already confirmed this:
I thought they meant the 9800GX2 would have 256 SPs total, which is correct if it is 2x 8800GTS-512 (full G92) cores put together, as that's 2 x 128...

It's the same as saying there's 1GB of RAM when it's really just 2 x 512MB, 512 for each core.

  15. #65
    Xtreme Addict
    Join Date
    Mar 2004
    Location
    Toronto, Ontario Canada
    Posts
    1,433
    Quote Originally Posted by zerazax View Post
    I hope that's true, and it would make sense, but I thought the G92 was 128SP's max unless they derived another chip from it
Yeah, well, everyone thought that the G92 only had 112 SPs when the 8800GT first came out.

  16. #66
    Xtreme Addict
    Join Date
    Mar 2004
    Location
    Toronto, Ontario Canada
    Posts
    1,433
    Quote Originally Posted by saaya View Post
Yeah, the difference is small, but people here go nuts if they find a way to gain 0.1% extra performance... We can only guess what the difference between 2x 16x 2.0 and 1x 16x 2.0 is for two cards in SLI, but I'd say it's 1-5% depending on the app. And if you use professional apps like 3ds Max and CATIA, then you DO see big differences between PCIe 16x and 8x, and between PCIe 1.0 and 2.0, AFAIK.

About DX10.1... AFAIK it only makes some features of DX10.0 mandatory... it's just ATI's way of showing off that they support one or two features Nvidia doesn't, or doesn't guarantee.


That doesn't make much sense: why would ATI throw away GPUs?
I'm sure they are recycled and used as something else.
And what makes you think the RV670 only has 320 shader processors, after all?
I wouldn't be surprised if it has 360-400 shader processors overall.

About G92, I think it has more than 128 stream processors...
and H and other sources already confirmed this:
Uhhh, do you even understand how chips are made? Not every single chip on a wafer of silicon is perfect. Many of them have defects, so parts of the chip have to be disabled, or the whole chip gets thrown away if there are too many defects.

There are no other ATI cards based on the RV670 chip that have fewer than 320 shaders. Maybe they're saving up all the defective chips with fewer than 320 shaders and ATI might release a lower-end card using them later, but so far there has been nothing.

H only confirmed that the 9800GX2 has 256 shaders, NOT the 9800GTX. You are confused.

  17. #67
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
Quote Originally Posted by HKPolice View Post
Yeah, well, everyone thought that the G92 only had 112 SPs when the 8800GT first came out.
Uh, who thought it was just 112 SPs? I sure as hell didn't, and most people knew that they wouldn't cut the real SP count down from 128, which is what the 8800GTX had. After all, if the G92 is a derivative of the G80, why would they cut SPs?

And most reviewers and people who went to the unveiling knew that the architecture was mostly the same, and knew it was still 128 but with parts disabled.

  18. #68
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Quote Originally Posted by Warboy View Post
The card is rumored to have a full 512-bit bus with 1GB of memory; it's not shared, from my understanding.
    If it's anything like the 7950GX2, it's not shared.

Current GPU architecture and design hasn't gotten around that yet. We'll see what the R700 does, though, if it lives up to all its hype.

  19. #69
    The Doctor Warboy's Avatar
    Join Date
    Oct 2006
    Location
    Kansas City, MO
    Posts
    2,597
    Quote Originally Posted by zerazax View Post
    If it's anything like the 7950GX2, it's not shared.

Current GPU architecture and design hasn't gotten around that yet. We'll see what the R700 does, though, if it lives up to all its hype.
Yes, I know. But if it really does have a 512-bit bus and 1GB of memory, then it will be an 8800GTX/Ultra killer.
    My Rig can do EpicFLOPs, Can yours?
    Once this baby hits 88 TeraFLOPs, You're going to see some serious $@#%....

    Build XT7 is currently active.
    Current OS Systems: Windows 10 64bit

  20. #70
    Registered User
    Join Date
    Nov 2005
    Location
    Ho Chi Minh City, Viet Nam
    Posts
    128
I think the 9800GTX will still be a G92. It will have 128 SPs, 64 TMUs, 768MB of RAM on a 384-bit bus, and 24 ROPs.

  21. #71
    I am Xtreme
    Join Date
    Sep 2006
    Posts
    10,374
    Quote Originally Posted by STaRGaZeR View Post
    8800GTS 512 SLI slower than 8800GT SLI?
Didn't see that? One of us needs glasses or a bigger screen.

Secondly, I hope the driver support for the 8800GX2 is better than on the 7950GX2 I had, as it really was slower in many games than my 7900GTX...

Do you like the Asus OBR? I might sell my Maximus and go for the P5N-T Dlx.
    Last edited by Leeghoofd; 01-07-2008 at 11:44 AM.
    Question : Why do some overclockers switch into d*ckmode when money is involved

    Remark : They call me Pro Asus Saaya yupp, I agree

  22. #72
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    Quote Originally Posted by Leeghoofd View Post
Didn't see that? One of us needs glasses or a bigger screen.
I was talking about the simulation: 8800GTS 512 SLI with lowered clocks to match the future 9800GX2 specifications... One of us seems to need interpretation lessons.

First, in the simulation it is slower than 8800GT SLI and 8800GTS 512 SLI. Second, NVIDIA will suck at the driver development, so maybe we will see an even bigger performance hit. All of this 9800GX2 crap looks like it exists only to face the 3870X2, at the marketing level.
    Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)


    Quote Originally Posted by PerryR, on John Fruehe (JF-AMD) View Post
    Pretty much. Plus, he's here voluntarily.

  23. #73
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by bakalu View Post
I think the 9800GTX will still be a G92. It will have 128 SPs, 64 TMUs, 768MB of RAM on a 384-bit bus, and 24 ROPs.
No, 512MB of RAM per GPU only, AFAIK... It makes sense: to get more memory you basically have to pay twice as much as with a normal video card, because the memory isn't shared. So going from 512MB to 1024MB on a single card means $50 extra, but on a dual card it would mean $100 extra...

But it's disappointing... I thought the GX2 is supposed to be a high-end card, the fastest possible card... For that you would expect at least 768MB per GPU to have reasonable fps at high res with high AA. A 256 vs 512 vs 1024 comparison has already shown that at high res plus high AA, 1024MB per G92 GPU means fps boosts of 50%+...
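The cost argument in numbers (a toy calculation using the poster's own $50-per-512MB guess, not a quoted BOM price):

Code:
# Mirrored memory pools mean every extra megabyte is bought twice
# on a dual-GPU card. The $50 step is a guessed figure.
STEP_COST_USD = 50  # going from 512MB to 1024MB on one GPU

for gpus in (1, 2):
    print(f"{gpus}-GPU card: +${STEP_COST_USD * gpus} for 1024MB per GPU")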

  24. #74
    Xtreme Member
    Join Date
    Jan 2005
    Posts
    411
Thanks for the benches. Any idea where CrossFire 2900XT 512MBs would fit in there? I've got them at 16x/16x; I can do my own benching, but my config is different.
    System
Intel C2Q Q6600 @ 3.78GHz 1.536V (1.55V BIOS)
    TC PA 120.3 , D-TEK Fusion, MCP655, Micro Res, 6 Yate Fans
    Mushkin HP2-6400 2X2GB @ 100Mhz 5-4-4-12
    Asus Maximus Formula 420x9
    4 Samsung Spinpoint 250GB SATA II in raid 0
    Crossfire HD 2900XT 512MB 900/900 1.25V
    Pioneer DVD-RW
830 Mobo Tray, Waiting on MM Duality
PC Power and Cooling 750W: mobo/cpu/gpus/cdrom; Powmax 380W: hd, fans, pump
Acer AL2616W 1920x1200 100Hz (75Hz native)

  25. #75
    I am Xtreme
    Join Date
    Sep 2006
    Posts
    10,374
    Quote Originally Posted by STaRGaZeR View Post
One of us seems to need interpretation lessons
Really not much to interpret from that phrase, "Originally Posted by STaRGaZeR: 8800GTS 512 SLI slower than 8800GT SLI?" I'm just a Belgian, you know, from that country where we had elections and 140 days later we still don't have a real government, lol...
Just a joke, mate, I knew what you meant.

On topic now: another thread came up at Tom's Hardware (reliable or not) saying this card is posted to hit the shelves at a retail price of 450 dollars???? lol, my 8800GTS 512 is outdated again and it didn't even reach my house yet, looooooooooooooool

Link to that news (no review or performance numbers though) here:

    http://www.xtremesystems.org/forums/...d.php?t=172440

Could that price be real? This card could do some heavy pounding in SLI config in the 3DMarks... if they get their drivers sorted, and hopefully most games will support it... (heavy doubts there, though)
    Last edited by Leeghoofd; 01-08-2008 at 07:02 AM.
    Question : Why do some overclockers switch into d*ckmode when money is involved

    Remark : They call me Pro Asus Saaya yupp, I agree

