Page 2 of 4 FirstFirst 1234 LastLast
Results 26 to 50 of 96

Thread: [Chiphell]nVidia 9800GX2 PCB Photo

  1. #26
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by ownage View Post
    I trust W1zzard the most on power consumption.



    So the HD3870X2 takes 51 watts more than the 8800GTX. That's not bad for 2 GPUs.

    Why is the title named nVidia 9800GX2? It's not dual-GPU, so I guess this isn't a GX2 card.
    I hope you are joking on that last part...
    As for that chart, everything seems normal except for the last 4 cards.

    HD2900XT using less than the GTX?
    1024mb using less than the 512mb?
    HD3870 using 30w more than the HD2900XT?

    That is very hard to believe. Link to the review?
    Does he state how he finds the power consumption? My guess is a Kill-a-watt.
    Read why most wattage meters mean nothing, with the exception of the Chroma.
    Last edited by LordEC911; 01-27-2008 at 11:53 PM.

  2. #27
    Xtreme Member
    Join Date
    Dec 2007
    Location
    Seattle, WA
    Posts
    496
    Do you guys think the Swiftech MCW60 waterblock is gonna fit this? Maybe with some small mounting mods, or do you think the core part is too big to be cooled by it?

  3. #28
    Xtreme Addict
    Join Date
    Aug 2006
    Location
    The Netherlands, Friesland
    Posts
    2,244
    Quote Originally Posted by LordEC911 View Post
    I hope you are joking on that last part...
    As for that chart, everything seems normal except for the last 4 cards.

    HD2900XT using less than the GTX?
    1024mb using less than the 512mb?
    HD3870 using 30w more than the HD2900XT?

    That is very hard to believe. Link to the review?
    Does he state how he finds the power consumption? My guess is a Kill-a-watt.
    It's the average consumption, so nothing strange.
    Look here for the max peak power consumption, which gives a better comparison.
    http://i3.techpowerup.com/reviews/HI...power_peak.gif

  4. #29
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by ownage View Post
    It's the average consumption, so nothing strange.
    Look here for the max peak power consumption, which gives a better comparison.
    http://i3.techpowerup.com/reviews/HI...power_peak.gif
    Average power consumption means nothing... which is why I was so confused.
    Awesome that he doesn't tell us how he gets those numbers.

  5. #30
    Xtreme Mentor
    Join Date
    May 2005
    Location
    Westlake Village, West Hills
    Posts
    3,046
    If you read the review, he awesomely explains it: an average of a reading taken every 2 seconds during 3DMark03. Garbage benchmark, but that's what he used.
    PC Lab Qmicra V2 Case SFFi7 950 4.4GHz 200 x 22 1.36 volts
    Cooled by Swiftech GTZ - CPX-Pro - MCR420+MCR320+MCR220 | Completely Silent loads at 62c
    GTX 470 EVGA SuperClocked Plain stock
    12 Gigs OCZ Reaper DDR3 1600MHz 8-8-8-24
    ASUS Rampage Gene II |Four OCZ Vertex 2 in RAID-0(60Gig x 4) | WD 2000Gig Storage


    Theater ::: Panasonic G20 50" Plasma | Onkyo SC5508 Processor | Emotiva XPA-5 and XPA-2 | CSi A6 Center| 2 x Polk RTi A9 Front Towers| 2 x Klipsch RW-12d
    Lian-LI HTPC | Panasonic Blu Ray 655k| APC AV J10BLK Conditioner |
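
    As a side note, the averaging scheme described above (one watt-meter reading every 2 seconds, averaged over the benchmark run) is easy to sketch. The function name and the readings below are made up for illustration; this also shows why an average and a peak from the same run can differ a lot:

```python
# Hypothetical sketch of the sampling method described above: a watt meter
# is polled every 2 seconds during a benchmark run, and the reported figure
# is either the average or the peak of those samples. The readings are
# invented for the example, not taken from any review.

def summarize_power(samples_watts):
    """Return (average, peak) of a list of power readings in watts."""
    average = sum(samples_watts) / len(samples_watts)
    peak = max(samples_watts)
    return average, peak

# Made-up readings, one every 2 seconds during a 3D benchmark:
readings = [118, 131, 145, 152, 149, 137, 160, 143]

avg, peak = summarize_power(readings)
print(f"average: {avg:.1f} W, peak: {peak} W")
```

    The gap between the two numbers is the whole argument in this thread: a card with brief load spikes can post a modest average while its peak (what your PSU actually has to survive) is much higher.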

  6. #31
    Turkey Man
    Join Date
    Mar 2005
    Location
    Jakarta (ex-Australia)
    Posts
    2,560
    I feel embarrassed asking, but looking at the second PCB, it appears from the placement of the power connector that this PCB is actually the reverse of the other.
    So does the second PCB actually mirror the first, back to back? If so, then extreme cooling ftw!

  7. #32
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by Nanometer View Post
    If you read the review, he awesomely explains it: an average of a reading taken every 2 seconds during 3DMark03. Garbage benchmark, but that's what he used.
    No, he explains the software he used to load the cards...
    He does NOT explain what he used to measure the power consumption. My guess is that he is using a cheap Kill-a-watt, which isn't anywhere close to accurate. If you had read that link, you would have figured that out, and also that every other device up until the Chroma has its own inaccuracies.

  8. #33
    Xtreme Member
    Join Date
    Jun 2005
    Posts
    470
    So... this is 2 G92 cores, attached to one tiny, thin heatsink, that blows hot air at the back of the case? This is a good idea... how?

  9. #34
    Turkey Man
    Join Date
    Mar 2005
    Location
    Jakarta (ex-Australia)
    Posts
    2,560
    I count two heatsinks.....

  10. #35
    Registered User
    Join Date
    Oct 2005
    Location
    Norway
    Posts
    35
    There is only one heatsink; the cores are facing each other, so the cooler will be sandwiched between the PCBs.
    Core i7 920@4.3GHz TRUE, 2xHD4890, 6GB OCZ@1640MHz

  11. #36
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by [cTx]Warboy View Post
    well that's a combined TDP.
    8800Ultra is TDP 190W
    ATi 3870 is 135W
    So I'm guessing that the 3870x2 will run hotter than this.
    3870 is 90W and 8800gtx is 130W, almost 50% more
    http://www.xbitlabs.com/articles/vid..._13.html#sect0

    8800gt and 3870 are almost identical in power consumption, but if you look at the different approaches to a dual gpu card, the ati version is definitely superior. it only has a longer pcb and a bigger heatsink, while the 9800gx2 is made up of 2 custom pcbs that need to be assembled with the heatsinks in between.

    both cooling and mfg cost wise the ati version of a dual card seems much better.

    Quote Originally Posted by [cTx]Warboy View Post
    well, I think if K|ngp|n modded his LN2 pots, It would be easy.
    sure you can build a pot that cools both gpus, but it would be ONE pot, and it would have to make contact with a gpu on each side, which would be a pita... how do you want to fasten the pot to the cards on each side? it's incredibly messy...

    i couldn't think of a worse approach than what nvidia did to build a dual gpu card, actually... having both gpus face INSIDE, in between the cards, is just plain stupid. where do you get the fresh air from, and where do you exhaust it to?
    you drill a hole in the pcb and bend the heatsink fins so the air moves at a 90 degree angle and then leaves on top.

  12. #37
    Xtreme Member
    Join Date
    Dec 2007
    Location
    Seattle, WA
    Posts
    496
    I was going to step up to this card from my 8800gt when it comes out, but now I think I may just get the 8800gts. This sort of looks like a disaster closely related to the 7950GX2...

  13. #38
    Xtreme Enthusiast
    Join Date
    Jan 2007
    Posts
    579
    So this card is two G92 GTSes with DX10.1 support? Wouldn't it be better just to get two G92 8800GTS cards? Wouldn't they OC better?

  14. #39
    Turkey Man
    Join Date
    Mar 2005
    Location
    Jakarta (ex-Australia)
    Posts
    2,560
    Ahhh, it all makes sense to me now thanks.

    1 pot for two cards could actually be a good thing for SLI

  15. #40
    Xtreme Member
    Join Date
    Dec 2007
    Location
    Seattle, WA
    Posts
    496
    Looks like we will have to wait till March to get these anyway.
    http://64.233.179.104/translate_c?hl...om.microsoft:*

  16. #41
    Xtreme Mentor
    Join Date
    Mar 2007
    Posts
    2,588
    think of the inside space consumption for your chassis. with this kind of 9800gx2, the quad sli idea might be feasible without having to sacrifice large amounts of chassis space.

    Of course, this concern is more than likely irrelevant to the average gamer and/or enthusiast.

    Please post some more pics of the 9800gx2 if you can get them.

  17. #42
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    Why do NVIDIA VGAs always look crappy as hell? Compared to that PCB, the 3870X2's one is pure pr0n.
    Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)


    Quote Originally Posted by PerryR, on John Fruehe (JF-AMD) View Post
    Pretty much. Plus, he's here voluntarily.

  18. #43
    Xtreme Enthusiast
    Join Date
    May 2007
    Posts
    831
    Quote Originally Posted by Zytek_Fan View Post
    They must have received a board partner non-working sample
    It's enough to make me
    Gigabyte P35-DQ6 | Intel Core 2 Quad Q6700 | 2x1GB Crucial Ballistix DDR2-1066 5-5-5-15 | MSI nVIDIA GeForce 7300LE

  19. #44
    I am Xtreme
    Join Date
    Apr 2005
    Location
    Upstate, NY
    Posts
    5,425
    Holy crap ma, there's a hole in my new video card!
    Core i3-550 Clarkdale @ 4.2GHz, 1.36v (Corsair A50 HS/F) LinX Stable
    MSI H55-GD65 Motherboard
    G.Skill 4GBRL DDR3-1600 @ 1755, CL9, 1.55v
    Sapphire Radeon 5750 1GB
    Samsung F4 320GB - WD Green 1TB
    Xigmatek Utgard Case - Corsair VX550

  20. #45
    Registered User
    Join Date
    Sep 2007
    Location
    Pittsburgh, PA
    Posts
    98
    Quote Originally Posted by malik22 View Post
    So this card is two G92 GTSes with DX10.1 support? Wouldn't it be better just to get two G92 8800GTS cards? Wouldn't they OC better?
    The only 'advantage' of this card for consumers is that it can run on any chipset. It doesn't require an SLI-specific mobo.

    For NVidia the advantage is capturing the performance crown for a single card solution.
    Asus Rampage Formula
    Intel Core2Quad QX9650 @ 3.3GHz
    Corsair H50
    GSkill 4x2GB DDR2 F2-8000CL5D-4GBPQ @ 5-5-5-12
    Corsair 750HX 750W PSU
    Sapphire Vapor-X Radeon HD 5870
    AuzenTech AZT-FORTE X-Fi Forte
    Audio-Technica ATH-A700 Headphones
    Intel X25-M G2 160GB SSD
    ASUS DRW-2014L1T
    Corsair Obsidian 800D
    Microsoft Windows 7 x64

  21. #46
    Xtreme Enthusiast
    Join Date
    Apr 2006
    Location
    Brasil
    Posts
    534
    Coolest VGA sandwich ever.
    Not as elegant as the 3870X2, but way better than the 7950GX2.
    Since it has an SLI bridge, it must support Quad SLI.

  22. #47
    Banned
    Join Date
    May 2005
    Location
    Belgium, Dendermonde
    Posts
    1,292
    Quote Originally Posted by seamumc View Post
    The only 'advantage' of this card for consumers is that it can run on any chipset. Doesn't require a SLI specific mobo.

    For NVidia the advantage is capturing the performance crown for a single card solution.
    i don't call this a single card, just 2 cards slammed together

  23. #48
    Diablo 3! Who's Excited?
    Join Date
    May 2005
    Location
    Boulder, Colorado
    Posts
    9,412
    I personally think the cooling solution is rather elegant... dual GPUs in a compact chassis. I'd like to see what waterblock vendors will do with this setup; two-faced waterblocks would be very cool, but sadly limited to this single card. Sure, it isn't as neat as the 3870X2, but don't knock the card until it's out in the wild.

  24. #49
    Xtreme Enthusiast
    Join Date
    Mar 2005
    Posts
    738
    Quote Originally Posted by LordEC911 View Post
    I hope you are joking on that last part...
    As for that chart, everything seems normal except for the last 4 cards.

    HD2900XT using less than the GTX?
    1024mb using less than the 512mb?
    HD3870 using 30w more than the HD2900XT?

    That is very hard to believe. Link to the review?
    Does he state how he finds the power consumption? My guess is a Kill-a-watt.
    Read why most wattage meters mean nothing, with the exception of the Chroma.


    well, of course the 1024mb card is going to use less power than the 512mb: one has gddr3, the other has gddr4.
    Quote Originally Posted by Manicdan View Post
    real men like the idea of packing lots of stuff into a very small space, which is what the mac mini is
    ----------------------------------------------------

    Quote Originally Posted by Baron_Davis View Post
    PS. I'm even tougher IRL.

  25. #50
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    234
    Quote Originally Posted by [XC] gomeler View Post
    I personally think the cooling solution is rather elegant... dual GPUs in a compact chassis. I'd like to see what waterblock vendors will do with this setup; two-faced waterblocks would be very cool, but sadly limited to this single card. Sure, it isn't as neat as the 3870X2, but don't knock the card until it's out in the wild.
    Quite easy for water block makers, really: just make a block the same thickness as the current cooler, double-sided, with entry/exit flow holes obviously in the top. Take the current GTS/GTX block, for instance, and slap another one on it facing the opposite way. After all, that's all nVidia themselves have done: took 2 cards and slapped one against the other, opposite facing.
    Intel Core i7 920 2.66ghz @ 4.3ghz HT Enabled, core @ 1.34v, VTT @ 1.28v, NB @ 1.30v (212 x 21) Batch No: 3910A369
    6gb (3x2gb) G.Skill RipJawsX DDR3 17000C9 2133MHz @ 1640mhz 8-8-8-24-1T
    Gigabyte G1.Sniper Rev 1.0 @ 205fsb Beta Bios
    1 x OCZ Agility 3 120gb SSD, AHCI
    1 x Western Digital 500gb SATAIII 16mb Cache HDD, AHCI
    Sony DVD +/- R/RW/RAM x22 Dual Layer, AHCI
    2 x Saphire ATI Radeon R9 270 2gb (Crossfire)
    Gigabyte Odin 850w Modular PSU (Software Controlled)
    EK Supreme HF Copper LGA1366, Laing D5 Pump, 2 x 240mm Radiator & 120mm Radiator
    Cooler Master HAF XB LAN Box
    Windows Se7en Pro x64 bit

