Page 2 of 5
Results 26 to 50 of 114

Thread: G80 to get 12K in 3Dmark 06

  1. #26
    Xtreme Guru
    Join Date
    Dec 2003
    Location
    Vancouver, Canada
    Posts
    3,858
    Quote Originally Posted by fade2green514
    I guess it depends on system configuration - what if there's a PhysX card? 3DMark06 supports PhysX, doesn't it?
    Anyway, my X2 3800+ @ 2.6GHz and 7900GTX 512MB scored 6166 in 3DMark06... if they doubled the amount of power it could put out, and then made it a dual card like the 7950GX2... that would be SO SICK lol

    Personally I prefer ATI, but if a deal comes along then I'll go with the flow.. lol
    Considering how hot and big the single card is, it doesn't look too likely that we'll see a multi-GPU card anytime soon.
    i5 750 4.20GHz @ NH-D14 | 8GB | P7P55DLE | 8800U | Indilinx SSD + Samsung F3 | HAF922 + CM750W
    Past: Q6600 @ 3.60 E6400 @ 3.60 | E6300 @ 3.40 | O165 @ 2.90 | X2 4400+ @ 2.80 | X2 3800+ @ 2.70 | VE 3200+ @ 2.80 | WI 3200+ @ 2.75 | WI 3000+ no IHS @ 2.72 | TBB 1700+ @ 2.60 | XP-M 2500+ @ 2.63 | NC 2800+ @ 2.40 | AB 1.60GHz @ 2.60
    Quote Originally Posted by CompGeek
    The US is the only country that doesn't use [nuclear weapons] to terrorize other countries. The US is based on Christian values, unlike any other country in the world. Granted we are straying from our Christian heritage, but we still have a freedom aimed diplomatic stance.

  2. #27
    Xtreme Member
    Join Date
    Jan 2005
    Posts
    435
    Considering how hot and big the single card is, it doesn't look too likely that we'll see a multi-GPU card anytime soon.
    Yeah, but they will do a die shrink and put them together (maybe even use GDDR4) in time for the R600.

    Actually, what die size is the G80, lol.
    i7 920 @ 4GHz
    EVGA Classified X58
    6GB Corsair 1600
    2X GTX 285 @ 756/1550/2700
    Corsair 1000WT
    ASUS 25.5" LCD / 58" Plasma TH-58PZ800U
    Custom H2O cooling
    Oh, that's a pic of my other rig.

  3. #28
    Xtreme Guru
    Join Date
    Dec 2003
    Location
    Vancouver, Canada
    Posts
    3,858
    Quote Originally Posted by rodman
    Yeah, but they will do a die shrink and put them together (maybe even use GDDR4) in time for the R600.

    Actually, what die size is the G80, lol.
    No, they won't do a die shrink in 3 months, unless they have already been working on it for a while.

    Even then, the power reduction would probably not be enough unless they clock it down a LOT.

    Yields are already horrible, so no way they would have enough chips for dual-chip cards anyway.
    i5 750 4.20GHz @ NH-D14 | 8GB | P7P55DLE | 8800U | Indilinx SSD + Samsung F3 | HAF922 + CM750W
    Past: Q6600 @ 3.60 E6400 @ 3.60 | E6300 @ 3.40 | O165 @ 2.90 | X2 4400+ @ 2.80 | X2 3800+ @ 2.70 | VE 3200+ @ 2.80 | WI 3200+ @ 2.75 | WI 3000+ no IHS @ 2.72 | TBB 1700+ @ 2.60 | XP-M 2500+ @ 2.63 | NC 2800+ @ 2.40 | AB 1.60GHz @ 2.60
    Quote Originally Posted by CompGeek
    The US is the only country that doesn't use [nuclear weapons] to terrorize other countries. The US is based on Christian values, unlike any other country in the world. Granted we are straying from our Christian heritage, but we still have a freedom aimed diplomatic stance.

  4. #29
    Xtreme Member
    Join Date
    Jan 2005
    Posts
    435
    What die size is the G80?
    i7 920 @ 4GHz
    EVGA Classified X58
    6GB Corsair 1600
    2X GTX 285 @ 756/1550/2700
    Corsair 1000WT
    ASUS 25.5" LCD / 58" Plasma TH-58PZ800U
    Custom H2O cooling
    Oh, that's a pic of my other rig.

  5. #30
    Xtreme Addict
    Join Date
    May 2005
    Location
    Sugar Land, TX
    Posts
    1,418
    I think both have been working on smaller die variants for some time. The G80/R600 cards were never supposed to be energy efficient. They were designed to be fast at all costs and get to market ASAP. You're going to have to wait until later next year for something that takes less energy.

  6. #31
    Xtreme Member
    Join Date
    Jan 2005
    Posts
    435
    I wonder if it will really have 2 power connectors. (edit) Nope, I guess the retail versions will only have one; the prototype cards had them for testing. Either that, or the GTX will have 2 and the GTS one.
    Last edited by rodman; 10-20-2006 at 08:57 AM.
    i7 920 @ 4GHz
    EVGA Classified X58
    6GB Corsair 1600
    2X GTX 285 @ 756/1550/2700
    Corsair 1000WT
    ASUS 25.5" LCD / 58" Plasma TH-58PZ800U
    Custom H2O cooling
    Oh, that's a pic of my other rig.

  7. #32
    Xtreme Addict
    Join Date
    Jan 2004
    Posts
    1,313
    Quote Originally Posted by ewitte
    I think both have been working on smaller die variants for some time. The G80/R600 cards were never supposed to be energy efficient. They were designed to be fast at all costs and get to market ASAP. You're going to have to wait until later next year for something that takes less energy.
    bingo

    Quote Originally Posted by ewitte
    Whatever the CPU, it's probably stock clocks, and the video is also probably not overclocked at all and on beta drivers. This card may make 15k single card pretty easy.
    Sure, sure.. just like how each year the 9700 Pro gained 1000 points in 3DMark05... right?

    The only time you ever really see any significant performance increase from drivers is when a new game has just come out, it's all rough, rushed, and unoptimized, and they're patching it and the drivers to fix the mistakes. DX10 is new. Vista is new. We may very well see big driver performance improvements.. but I think that 12000 (or some say 11000) score came from a WinXP setup.

    24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
    1 GB OCZ Gold (='.'=) 240 2-2-2-5
    Giga-byte NF3 (")_(") K8NSC-939
    XFX 6800 16/6 NV5 @420/936, 1.33V

  8. #33
    Xtreme Addict
    Join Date
    May 2005
    Location
    Sugar Land, TX
    Posts
    1,418
    Quote Originally Posted by ***Deimos***
    bingo


    Sure, sure.. just like how each year the 9700 Pro gained 1000 points in 3DMark05... right?

    The only time you ever really see any significant performance increase from drivers is when a new game has just come out, it's all rough, rushed, and unoptimized, and they're patching it and the drivers to fix the mistakes. DX10 is new. Vista is new. We may very well see big driver performance improvements.. but I think that 12000 (or some say 11000) score came from a WinXP setup.
    Usually the first 3-4 months see pretty big gains. Then it's pretty minimal from there. There were several times I got a 500-1000 point gain. It tapered back to 100, 50, or even nothing or negative after that. Usually that's with the newer versions, while the older versions drop a little.
    Last edited by ewitte; 10-20-2006 at 02:33 PM.

  9. #34
    Xtreme Enthusiast
    Join Date
    Oct 2004
    Location
    Old Vizima
    Posts
    952
    The 12k number floating around for 06 supposedly came from a system with an E6600 @ 3.6, or 400 x 9. If you look @ the 06 rankings here on the forum, that's as fast as X1900 Crossfire @ around 750|850 with the same CPU support. For example, the number 7 score:

    7. rob[GL] - 12004.00 - dual Radeon X1900 XT @ 756/828mhz - Intel Conroe 3.75GHz X1900XT Crossfire 756/828

    So if it really puts up 12k for a single card, I'll certainly be looking @ it.
    Last edited by Blacklash; 10-20-2006 at 05:27 PM.

  10. #35
    Registered User
    Join Date
    Oct 2006
    Posts
    9
    Quote Originally Posted by Master_G
    I care more about how many watts this thing is chucking out; scores at this point are little more than speculation. Reality will probably be good scores with a ridiculous heat output.

    G
    I agree! Sounds like it's going to drain stupid amounts of power. When are they going to work on improving the efficiency of these things?
    E6600 @ 3.5ghz 24/7 (Ultra120 / Panaflo) / 2GB OCZ Plat PC6400 (4-4-4-12) / Asus P5B-D / Seagate 320GB 7200.10 / XFX 7900 GTX 512MB @ 705/1.88 / Antec Truepower 550W / Antec Sonata II / BenQ FP202W 20" LCD / Altec Lansing MX5021 / 3DMark05 12617 / Folding for Team OCAU

  11. #36
    Admin
    Join Date
    Feb 2005
    Location
    Ann Arbor, MI
    Posts
    12,338
    The stock cooler can deal with the heat and the stock cooler doesn't look like anything TOO special. No doubt this will have the heartiest power consumption of any card, ever.....but it won't be 300W, 250W, or even pushing the 225W envelope nV gave themselves at stock....but it will eat up a lot of power, and OCing should only push that higher.

  12. #37
    Xtreme Addict
    Join Date
    Jan 2004
    Posts
    1,313
    Quote Originally Posted by Vapor
    The stock cooler can deal with the heat and the stock cooler doesn't look like anything TOO special. No doubt this will have the heartiest power consumption of any card, ever.....but it won't be 300W, 250W, or even pushing the 225W envelope nV gave themselves at stock....but it will eat up a lot of power, and OCing should only push that higher.
    Let's assume hypothetically for a minute that we're taking the existing control logic from the 7900, doubling the execution units, and using similar 90nm fabrication. The 7900GT uses 1.2V set at 450MHz, and OCs to about 550 without overvolting. The GTX uses 1.4V set at 650MHz, and OCs only a little bit (700) without overvolting. The increased MHz only slightly raises the power consumption, but the voltage makes the big difference... the GTX draws almost twice the power of the GT.

    The G80 will be a big, complex chip. Like I rationalized earlier, they will use lower voltage and lower clocks to rein in the power consumption. Let's say, for example, 1.2V. But you get Macci, PCIce, OPPainter etc.. over-volting one of these devils on cascade... 1.3, 1.4, 1.5.. maybe even 1.7V!! That alone is going to be a HUGE increase in power. Will certainly put those 1000W PSUs to good use. And the increased clockrate on such an SLI system might just push even the high-end PSUs past the breaking point.
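    Deimos's voltage point can be put in numbers with the usual dynamic-power rule of thumb, P ≈ C·V²·f. This is a reader's sketch: the switched-capacitance term is unknown (so only ratios are meaningful), and the clocks/voltages are the figures quoted in the post above, not official specs.

```python
# Rough dynamic-power comparison using P ~ C * V^2 * f.
# C (switched capacitance) is unknown, so only the ratio is meaningful.

def relative_power(volts, mhz, ref_volts, ref_mhz):
    """Power relative to a reference voltage/clock, assuming P ~ V^2 * f."""
    return (volts / ref_volts) ** 2 * (mhz / ref_mhz)

# 7900GT (1.2V @ 450MHz) vs 7900GTX (1.4V @ 650MHz), figures from the post
ratio = relative_power(1.4, 650, 1.2, 450)
print(f"GTX ~ {ratio:.2f}x the GT's dynamic power")  # ~1.97x, i.e. almost double
```

    The voltage enters squared, which is why a cascade-cooled overvolt from 1.2V to 1.7V alone would roughly double power draw before any clock increase.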

    24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
    1 GB OCZ Gold (='.'=) 240 2-2-2-5
    Giga-byte NF3 (")_(") K8NSC-939
    XFX 6800 16/6 NV5 @420/936, 1.33V

  13. #38
    Xtreme Member
    Join Date
    Jan 2005
    Posts
    435
    i7 920 @ 4GHz
    EVGA Classified X58
    6GB Corsair 1600
    2X GTX 285 @ 756/1550/2700
    Corsair 1000WT
    ASUS 25.5" LCD / 58" Plasma TH-58PZ800U
    Custom H2O cooling
    Oh, that's a pic of my other rig.

  14. #39
    Xtreme Member
    Join Date
    Jan 2005
    Posts
    435
    Do a Google search for PC-Welt and it should translate.

    Graphics chip: GeForce 8800 GTX
    Code name: G80
    Street price: approx. 650 euros
    Transistors: approx. 700 million
    Process: 90 nm
    Chip clock: 575 MHz
    Streaming processors (SPs): 128
    SP clock: 1350 MHz
    Theoretical pixel fill rate: 36,800 MPix/s
    Memory: 768 MB GDDR3
    Number of memory chips: 12
    Memory clock: 900 MHz
    Memory interface: 384-bit
    Memory bandwidth: 86.4 GB/s
    Shader Model: 4.0
    Direct3D version: 10
    OpenGL version: 2.0
    SLI: yes
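    The bandwidth entry in that table is just the interface width times the effective memory clock; a quick sanity check (GDDR3 moves two transfers per clock):

```python
# Sanity-check the table's 86.4 GB/s: 384-bit bus, 900 MHz GDDR3 (DDR = 2x).
bus_bits = 384
mem_clock_mhz = 900
transfers_per_clock = 2  # GDDR3 is double data rate

bytes_per_transfer = bus_bits // 8                                   # 48 bytes
bandwidth_gb_s = bytes_per_transfer * mem_clock_mhz * transfers_per_clock / 1000
print(f"{bandwidth_gb_s} GB/s")  # 86.4 GB/s, matching the table
```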

    What does 650 euros come to in U.S. dollars?
    Last edited by rodman; 10-21-2006 at 11:08 AM.
    i7 920 @ 4GHz
    EVGA Classified X58
    6GB Corsair 1600
    2X GTX 285 @ 756/1550/2700
    Corsair 1000WT
    ASUS 25.5" LCD / 58" Plasma TH-58PZ800U
    Custom H2O cooling
    Oh, that's a pic of my other rig.

  15. #40
    Xtreme Enthusiast
    Join Date
    Jan 2004
    Location
    London, UK
    Posts
    650
    €650.00 = $819.848

    http://www.xe.com/ucc

    The US retail price likely won't be a direct currency conversion, though.

    G
    Last edited by Master_G; 10-22-2006 at 12:41 AM.

  16. #41
    Xtreme Addict
    Join Date
    Jan 2004
    Posts
    1,313
    Does anybody else here think that 1350MHz for the streaming processors is a really odd number? Why isn't it the chip clock of 575MHz? Why isn't it at least a simple fractional multiple? How is nVidia able to run a portion of the chip at such a drastically different clockrate? How has nVidia managed to double the clockspeed compared to the 7900's pixel/vertex shaders? If it's true, what kind of IPC sacrifices were required (i.e. P4 vs Athlon)?
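    The "simple fraction" question is easy to check directly: the exact ratio of the two rumored clocks reduces to 54:23, so it is rational but nothing like a clean divider ratio. (Just a reader's arithmetic aside, not anything from nVidia.)

```python
from fractions import Fraction

# How does the rumored 1350 MHz shader clock relate to the 575 MHz core clock?
ratio = Fraction(1350, 575)
print(ratio)         # 54/23 -- rational, but hardly a simple divider ratio
print(float(ratio))  # ~2.35x the core clock
```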

    24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
    1 GB OCZ Gold (='.'=) 240 2-2-2-5
    Giga-byte NF3 (")_(") K8NSC-939
    XFX 6800 16/6 NV5 @420/936, 1.33V

  17. #42
    Admin
    Join Date
    Feb 2005
    Location
    Ann Arbor, MI
    Posts
    12,338
    In tech....Euros are usually 1:1 with USD.

    In regards to the varying clocks, they could do it with the G7x as well, and did. As for why the difference is so much....we'll probably find out in a few weeks....

  18. #43
    Xtreme Member
    Join Date
    Oct 2006
    Location
    Ontario
    Posts
    494
    Yikes 0_0

  19. #44
    Xtreme Addict
    Join Date
    May 2005
    Location
    Sugar Land, TX
    Posts
    1,418
    It's time to make a trip with 4 of those guys with me

  20. #45
    Registered User
    Join Date
    Aug 2006
    Posts
    34
    good news

    AMD ATHLON 3000+ 512+512+256+256 KINGSTON DDR400 RAM +EPOX NF4 SLI+ SEAGATE 160G

  21. #46
    Xtreme Member
    Join Date
    Jan 2005
    Posts
    435
    Why do I get the strange feeling that a Kentsfield, rather than a Conroe, will be needed to get the most out of the G80? I wonder if Intel and Nvidia worked something out with how the Intel chipset/CPU work together with the G80 (like using the 3rd and/or 4th core to help with rendering somehow) to get the most performance.

    They say the scores are like 1500 points higher with quad core, but I wonder if it's just because the CPU score is higher, thus giving you a higher final score. 05 and 03 do not include the CPU mark in the final score, so I say 05 would be a better bench for pure GPU performance.
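    That suspicion matches how the 3DMark06 total is computed: the CPU score goes into a weighted harmonic mean with the graphics score, so a faster CPU lifts the total even with an identical GPU result. A sketch using the formula commonly cited from Futuremark's 3DMark06 whitepaper; treat the exact weights as my recollection rather than gospel, and the scores below are invented illustrative numbers.

```python
# 3DMark06 total as commonly cited: weighted harmonic mean of the graphics
# score GS = 0.5*(SM2 + SM3/HDR) and the CPU score. Weights are an assumption
# from memory of Futuremark's whitepaper; the scores below are made up.

def total_3dmark06(sm2, sm3, cpu):
    gs = 0.5 * (sm2 + sm3)                     # combined graphics score
    return 2.5 / ((1.7 / gs) + (0.3 / cpu))    # weighted harmonic mean

same_gpu_dual = total_3dmark06(4500, 4500, 2300)  # dual-core-ish CPU score
same_gpu_quad = total_3dmark06(4500, 4500, 4600)  # quad-core-ish CPU score
print(round(same_gpu_quad - same_gpu_dual))       # same GPU, hundreds of points from CPU alone
```

    If the weights are right, doubling the CPU score alone is worth several hundred points, which supports the idea that much of the quad-core gap is CPU score rather than GPU performance.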
    i7 920 @ 4GHz
    EVGA Classified X58
    6GB Corsair 1600
    2X GTX 285 @ 756/1550/2700
    Corsair 1000WT
    ASUS 25.5" LCD / 58" Plasma TH-58PZ800U
    Custom H2O cooling
    Oh, that's a pic of my other rig.

  22. #47
    Xtreme Member
    Join Date
    Sep 2005
    Posts
    408
    Quote Originally Posted by rodman
    Why do I get the strange feeling that a Kentsfield, rather than a Conroe, will be needed to get the most out of the G80? I wonder if Intel and Nvidia worked something out with how the Intel chipset/CPU work together with the G80 (like using the 3rd and/or 4th core to help with rendering somehow) to get the most performance.

    They say the scores are like 1500 points higher with quad core, but I wonder if it's just because the CPU score is higher, thus giving you a higher final score. 05 and 03 do not include the CPU mark in the final score, so I say 05 would be a better bench for pure GPU performance.
    If by *getting the most out of it* you mean running 3DMark, then yeah, maybe. A single-core A64 @ 2.4GHz+ (preferably 1MB L2) will be all you need for gaming. Don't! Let's not talk about Alan Wake, YET.
    I don't check my PMs very often.

  23. #48
    Xtreme Member
    Join Date
    Jan 2005
    Posts
    435
    Last edited by rodman; 10-23-2006 at 11:09 AM.
    i7 920 @ 4GHz
    EVGA Classified X58
    6GB Corsair 1600
    2X GTX 285 @ 756/1550/2700
    Corsair 1000WT
    ASUS 25.5" LCD / 58" Plasma TH-58PZ800U
    Custom H2O cooling
    Oh, that's a pic of my other rig.

  24. #49
    Xtreme Addict
    Join Date
    Jan 2004
    Posts
    1,313
    Quote Originally Posted by rodman
    WOW...

    that PCB looks completely different.
    It's shorter. Still has 2 PCIe power connectors. But many more electrolytic capacitors. I can count quite a few large inductors too (as expected for a high-power device). Can't see under the heatsink to check the 12-memory-chip thing, though. Dual-slot dense-fin heatsink similar to the 7900GTX heatsink.. probably a bit bigger/heavier. I certainly hope that nVidia can continue the tradition of the excellent 7900GTX heatsink.. elegant, quiet, and runs the card cool.

    24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
    1 GB OCZ Gold (='.'=) 240 2-2-2-5
    Giga-byte NF3 (")_(") K8NSC-939
    XFX 6800 16/6 NV5 @420/936, 1.33V

  25. #50
    Xtreme Addict
    Join Date
    May 2005
    Location
    Sugar Land, TX
    Posts
    1,418
    From what I see elsewhere, it's 12000 with a quad core and about 10500 with an X6800. I'm almost thinking it would be a good idea to grab a used 7950GX2 for cheap off someone upgrading and wait for the R600. I already know I'll probably not be able to resist waiting without something to keep me occupied.
    Last edited by ewitte; 10-24-2006 at 02:19 AM.

