Today's prices here:
5970 - 541 euros
GTX480 - 479 euros
5870 - 339 euros
So if you can compare the 5870 vs the 480, you can do the same with the 480 vs the 5970!
No. Listen, I'm running a 4830 myself right now and my recommendation for a high end desktop GPU is and will probably remain the 5850 for the near future, but you have to give credit where credit is due. Nv has managed to jam a ton of forward looking arch improvements and GPGPU stuff into this thing and still managed to make it fairly competitive in pure GPU terms right now. That's a major accomplishment that's going to pay off in the long run. Charlie can't seem to see anything beyond the current state of the high end desktop GPU market, and his predictions of financial d00m for Nv are totally off the mark. Nv is actually way ahead of the curve on a lot of things right now, and that is going to become painfully obvious to just about everyone over the next year or two.
Just when you thought your 850W PSU was enough for your stock PC:
It gets worse:
Power consumption figures in benchmarks can sometimes be quite revealing:
It's probable that the 5870 isn't using a good portion of its resources in Unigine Heaven.
Whether it's poor driver optimization or a design limitation is unknown.
Surely, with such crazy high power draw, the GTX480 is being pushed to 100% by the driver!?
It's very possible, like with the i7/Phenom, that typical scenarios use much less power than special cases (recall how AMD advertised "typical" rather than "max" Phenom power).
nVidia might then purposely slow down shader execution on Fermi in these special non-game scenarios to stay within the advertised 250W spec.
24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
1 GB OCZ Gold (='.'=) 240 2-2-2-5
Giga-byte NF3 (")_(") K8NSC-939
XFX 6800 16/6 NV5 @420/936, 1.33V
Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
3x2048 GSkill pi Black DDR3 1600, Quadro 600
PCPower & Cooling Silencer 750, CM Stacker 810
Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
3x4096 GSkill DDR3 1600, PNY 660ti
PCPower & Cooling Silencer 750, CM Stacker 830
AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
2x2gb Patriot DDR2 800, PowerColor 4850
Corsair VX450
My 6600GT also ran at 100C. But it was only like 50W, and the fan was clogged with dust. The 4850 runs near 100C on purpose, to keep the fan speed as low as possible. You can trade off much lower temps at higher fan speed for a bit of rustling noise.
There is a HUGE difference.
On the GTX480 you cannot trade off fan speed for lower temps. The fan is already at 70%+!!
If you dial it down to 30%, the thing will have a meltdown = FAIL.
If you crank it up to 100%, you *might* get down to 80C - the highest I've reached when overclocking a GTS250 = FAIL.
Power normalized to 5850=max (+/-20):
Code:
card     max   heaven  furmark  Crysis  comments
5970     300   264     315      258     looks pretty accurate
GTX295   289   342     287              xbitlabs OCCT:GPU is smoking!
GTX480   250   321     329      301     somebody is fibbing
5870     189   192     225      199     ~200 in games vs 189 advertised
5850     158   158     158      158     ---------------------
heaven=toms, furmark=anand, crysis=anand
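For what it's worth, here's a minimal sketch of how I'm reading that normalization, assuming each benchmark column is simply rescaled so the 5850 lands at its 158W figure. The raw readings in the example are placeholders, not the actual review data.
Code:
# Sketch of the normalization (plain Python).
# Assumption: each benchmark column is scaled so the 5850 comes out at 158W.
# The 'raw' numbers below are placeholders, NOT the actual review readings.
BASELINE_5850 = 158.0

def normalize(raw_readings):
    """Scale one benchmark's raw power readings so the 5850 equals 158W."""
    scale = BASELINE_5850 / raw_readings["5850"]
    return {card: round(watts * scale) for card, watts in raw_readings.items()}

# Example with made-up raw readings for a single benchmark:
example_raw = {"5850": 180, "5870": 260, "GTX480": 360}
print(normalize(example_raw))   # {'5850': 158, '5870': 228, 'GTX480': 316}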
Last edited by ***Deimos***; 03-26-2010 at 10:00 PM.
24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
1 GB OCZ Gold (='.'=) 240 2-2-2-5
Giga-byte NF3 (")_(") K8NSC-939
XFX 6800 16/6 NV5 @420/936, 1.33V
Well, you can subtract 851 - 479 = 372W more for the extra card, assuming nothing else was changed. That's at the wall of course... taking 0.85 efficiency, that's 316W (at 0.80 efficiency, it's 298W). So either way, the cards are drawing WAY over the 250W Nvidia advertised.
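To make that arithmetic explicit, a quick sketch; the 0.85/0.80 PSU efficiencies are the same assumptions as above, not measured figures.
Code:
# Rough sketch of the wall-power math above (plain Python).
wall_with_card = 851.0   # watts at the wall with the extra card
wall_without   = 479.0   # watts at the wall without it
delta_wall = wall_with_card - wall_without   # 372W extra at the wall

for eff in (0.85, 0.80):
    dc_power = delta_wall * eff              # power actually delivered to the card
    print(f"PSU efficiency {eff:.2f}: extra card draws ~{dc_power:.0f}W DC")
# ~316W at 0.85, ~298W at 0.80 -- both well above the advertised 250W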
Holy moly!
Best performance per watt ...
In ...
Unigine ...
....
E7200 @ 3.4 ; 7870 GHz 2 GB
Intel's atom is a terrible chip.
did they have the usual pci express sticker on their box????
Last edited by Sn0wm@n; 03-26-2010 at 10:02 PM.
Well, if you take 5 reviews from top websites and average them, this card eats a lot of power: on average about 147 watts more than a 5870 under full load, and about 92°C in-game temps.
Around 150W for everything in the system except video cards.
This is based on the total system power draw measured with 57xx and 58xx cards.
+/- maybe 20W.
Thus, the two 480s are sharing 700W between them. Allow maybe 20-40W for error and another 20W of extra power for the SLI chipset... that's still 640W!
It's impossible to justify the rest of the system using 350W, which is what it would have to draw if the cards really stayed at the advertised 250W x2.
The system by itself can't use more than 226W, and that figure already included a 5750.
850W probably because EVERYTHING - PSU, CPU, DRAM, HDD, and of course the 480s - is LESS EFFICIENT because of the VERY HIGH TEMPS caused by the 480s. I would never store critical data on hard drives in there.
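A rough back-of-the-envelope version of that budget, using the 850W total and the allowances from the post above; the per-card split at the end is just the obvious halving.
Code:
# Back-of-the-envelope budget for the 480 SLI power reading (plain Python).
# Assumptions: ~850W measured total, ~150W for everything except the GPUs,
# plus the error/SLI-chipset allowances mentioned above.
total_system   = 850.0   # measured total draw
rest_of_system = 150.0   # CPU, board, drives, etc. (+/- 20W estimate)
error_margin   = 40.0    # measurement error allowance
sli_overhead   = 20.0    # extra power for the SLI chipset

two_cards = total_system - rest_of_system                     # ~700W left for both 480s
two_cards_low = two_cards - error_margin - sli_overhead        # ~640W after allowances
print(f"Two GTX 480s: ~{two_cards:.0f}W, or ~{two_cards_low:.0f}W "
      f"even after allowances ({two_cards_low/2:.0f}W each)")
# ~320W per card vs. the advertised 250W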
Yes, the GTX480 is about 100W (45%) more than the 5870.
The only card worse in perf/Watt is the GTX295... sorry GTX480, #2 place.
Last edited by ***Deimos***; 03-26-2010 at 10:14 PM.
24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
1 GB OCZ Gold (='.'=) 240 2-2-2-5
Giga-byte NF3 (")_(") K8NSC-939
XFX 6800 16/6 NV5 @420/936, 1.33V
Kind of an eye opener, that's for sure. Too bad TSMC doesn't have a high-k metal gate process at 40nm; 28nm should be high-k metal gate, or at least have it as an option at TSMC later this year.
I'm wondering how big a role leakage may be playing, given the high transistor count.
Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
3x2048 GSkill pi Black DDR3 1600, Quadro 600
PCPower & Cooling Silencer 750, CM Stacker 810
Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
3x4096 GSkill DDR3 1600, PNY 660ti
PCPower & Cooling Silencer 750, CM Stacker 830
AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
2x2gb Patriot DDR2 800, PowerColor 4850
Corsair VX450
Did anyone compare 2x 5970s to 480 SLI?
I'm starting to think that might be the only concrete Nvidia win.
IMO every other matchup is so close that you can argue for either, for the most part.
Last edited by grimREEFER; 03-26-2010 at 10:19 PM.
DFI P965-S/core 2 quad q6600@3.2ghz/4gb gskill ddr2 @ 800mhz cas 4/xfx gtx 260/ silverstone op650/thermaltake xaser 3 case/razer lachesis
I'd be more interested in seeing 3x 5870. Quad scaling usually sucks. However, it seems that 480 SLI is hit or miss right now. In the games with good scaling, the scaling is *really* good. In the games with poor scaling, you may as well turn off your heater and save on natural gas...
Feedanator 7.0
CASE:R5|PSU:850G2|CPU:i7 6850K|MB:x99 Ultra|RAM:8x4 2666|GPU:980TI|SSD:BPX256/Evo500|SOUND:2i4/HS8
LCD:XB271HU|OS:Win10|INPUT:G900/K70 |HS/F:H115i
Nvidia partners are prolly going nerdrage right now over this massive fail, after waiting so long for this...
░█▀▀ ░█▀█ ░█ ░█▀▀ ░░█▀▀ ░█▀█ ░█ ░█ ░░░
░█▀▀ ░█▀▀ ░█ ░█ ░░░░█▀▀ ░█▀█ ░█ ░█ ░░░
░▀▀▀ ░▀ ░░░▀ ░▀▀▀ ░░▀ ░░░▀░▀ ░▀ ░▀▀▀ ░
TSMC 40nm is NOT the problem. Somehow, magically, AMD made a DUAL-GPU card with less power and lower idle power.
Last time I checked, the 5970 = 4B transistors, the GTX = 3B. 4 > 3, isn't it?
The problem is the design. AMD put a LOT of focus on maintaining the 4870's very high performance/TDP. nVidia was focused on top performance regardless of the means.
It's exactly the same way the P4 fell from grace at 3.8GHz.
(This is why Intel doesn't just hope for 1% performance per 1% power; it requires that, and much more.)
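As a toy illustration only: the actual threshold Intel uses isn't public, so the 1.0 ratio and the feature_acceptable helper below are just a sketch of the "1% perf per 1% power" bar quoted above.
Code:
# Toy version of a perf-per-power design rule (plain Python).
MIN_RATIO = 1.0   # hypothetical threshold; real design bars are higher

def feature_acceptable(perf_gain_pct, power_cost_pct, min_ratio=MIN_RATIO):
    """Return True if a feature's perf/power ratio clears the bar."""
    if power_cost_pct <= 0:
        return True                      # free (or power-saving) features always pass
    return perf_gain_pct / power_cost_pct >= min_ratio

print(feature_acceptable(8.0, 5.0))      # 1.6x ratio -> True
print(feature_acceptable(3.0, 10.0))     # 0.3x ratio -> False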
Now, where are all those dual-Fermi folks? Or the Fermi laptop weirdos... I wanna see them spin their way out.
Who wants to be the first to offer a 5-year warranty... anybody? What about 3 years? Any takers? EVGA? BFG? They will provide SOME warranty (and house-fire insurance), right?
Originally Posted by tomshardware
Code:
idle    core   mem    shader   power
5870    157    300    157      27
480     50     67     100      60 !?
Last edited by ***Deimos***; 03-26-2010 at 10:31 PM.
24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
1 GB OCZ Gold (='.'=) 240 2-2-2-5
Giga-byte NF3 (")_(") K8NSC-939
XFX 6800 16/6 NV5 @420/936, 1.33V
Also, in the case of your 6600GT, it was getting hot, but it wasn't really heating up the room much, because there wasn't much energy actually being put out.
In the case of a 480, it will be able to heat up a room pretty well, because it's certainly not getting hot just due to an incompetent cooler.
DFI P965-S/core 2 quad q6600@3.2ghz/4gb gskill ddr2 @ 800mhz cas 4/xfx gtx 260/ silverstone op650/thermaltake xaser 3 case/razer lachesis