The Cardboard Master Crunch with us, the XS WCG team
Intel Core i7 2600k @ 4.5GHz, 16GB DDR3-1600, Radeon 7950 @ 1000/1250, Win 10 Pro x64
Can someone translate their testing methodology? IE: how was power draw on a single 8-pin determined and how was overall power consumption determined?
You pop up at the best times Sky.
I can only go as far as the Babelfish translator, which is terrible.
As described in the previous chapter, the voltage and current of the different supply lines must be measured in order to determine the power draw. We begin with the unproblematic voltage measurement. For this, a meter from Temna was used in this test, which agrees with our calibrated reference instrument to one decimal place. The respective voltages were tapped directly at the PCI Express slot and/or at the PCIe 6-pin or 8-pin plug contacts.
For the current measurement we relied on contactless measurement with a clamp ammeter, again from Temna. Measuring the current supplied through the additional 6-pin and 8-pin connectors posed no particular problems this way. For measuring the power drawn through the PEG slot, however, some modifications were necessary: in order to determine the currents with the clamp ammeter, current loops had to be soldered to the appropriate pins of the PEG slot.
The clamp ammeter is a TrueRMS instrument, i.e. it measures the true root-mean-square value. This is necessary because a graphics card is not a purely ohmic load but a reactive one, which exhibits a power draw with a high transient component.
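In plain terms: the power on each rail is voltage times current, and because a graphics card is not a purely ohmic load but a bursty, transient-heavy one, you need a true-RMS reading of the current rather than a simple averaging meter, and the actual power is the average of the instantaneous v(t)*i(t) product. A minimal Python sketch with made-up waveforms (nothing to do with HT4U's actual tooling) to illustrate:

```python
import numpy as np

# Made-up 12 V rail waveforms, purely illustrative (not HT4U's data).
fs = 100_000                                            # assumed sample rate, 100 kHz
t = np.arange(0, 0.1, 1 / fs)                           # 100 ms capture window
v = 12.0 + 0.1 * np.sin(2 * np.pi * 300 * t)            # rail voltage with some ripple
i = 8.0 + 12.0 * (np.sin(2 * np.pi * 120 * t) > 0.8)    # bursty, transient-heavy current

power  = np.mean(v * i)             # actual average power: mean of v(t) * i(t)
i_mean = np.mean(i)                 # simple average current (underweights the spikes)
i_rms  = np.sqrt(np.mean(i ** 2))   # what a true-RMS clamp meter reports
i_peak = np.max(i)                  # short transient peaks

print(f"average power : {power:6.1f} W")
print(f"mean current  : {i_mean:5.2f} A   true-RMS: {i_rms:5.2f} A   peak: {i_peak:5.2f} A")
```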
Originally Posted by motown_steve
Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.
Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.
That is a big WTF from me on that one. Was it a clamp meter or a daughter board used like they use at ATI and Nvidia? Also, did whatever they are talking about cause any resistance between the card and whatever measuring device was used? So many questions and not a UN translator in sight.
Clamp meter
All along the watchtower the watchmen watch the eternal return.
If you look at total power consumption, these cards draw a lot of power. But if you're folding, these cards are way more energy efficient than your typical CPU.
MOBO Biostar TpowerX58 | CPU Intel i7 920 | GPU eVGA GTX 295 SLI | RAM Super Talent 12GB DDR3-1333 CL8 | SSD Super Talent 64GB | PSU Coolmax 1200W
PC Lab Qmicra V2 SFF Case | i7 950 4.4GHz 200 x 22 1.36 volts
Cooled by Swiftech GTZ - CPX-Pro - MCR420+MCR320+MCR220 | Completely Silent loads at 62c
GTX 470 EVGA SuperClocked Plain stock
12 Gigs OCZ Reaper DDR3 (1600MHz) 8-8-8-24
ASUS Rampage Gene II |Four OCZ Vertex 2 in RAID-0(60Gig x 4) | WD 2000Gig Storage
Theater ::: Panasonic G20 50" Plasma | Onkyo SC5508 Processor | Emotiva XPA-5 and XPA-2 | CSi A6 Center| 2 x Polk RTi A9 Front Towers| 2 x Klipsch RW-12d
Lian-LI HTPC | Panasonic Blu Ray 655k| APC AV J10BLK Conditioner |
Furmark is a GPU power virus. Of course it can cause the card to exceed its TDP.
The HD4870X2 consumes >100W more in Furmark than it peaks at in games / other benchmarks. The GTX285 peaks at a bit less than 150W in real-world situations.
HT4U's measurements are useless.
You were not supposed to see this.
Agreed, the whole article, this thread, it's all completely useless. It's like hooking a huge yacht to the back of a sports car, driving it up a big mountain, and saying "OMFG, look how bad it is on gas."
The sports car wasn't made to pull a yacht.
The video card wasn't made for the software equivalent of pulling a yacht.
The video card is designed for what? To play games. So find the most power-hungry game on the market, and that is your test. If it's not a game, it doesn't apply, simple as that.
That being said, I have a 1kW supply, and I waste my fair share of watts rendering. I agree, as long as it's stable and fast, I don't care how much power it draws (to an extent). There are efficient video cards out there for people who want them. I am not one of those people.
Wow! Just wow! I don't even know where to start with this, but since it's not worth my time I won't, except to say they do not have the capability to measure power on these cards.
That said, what they draw is irrelevant. These are performance cards. It takes power to run them. They are not for the meek or the penny pincher. They are neither cheap, nor cheap to run. Blasting performance cards is about as ridiculous as saying a top fuel dragster is inefficient... well, DUH! What do you expect? It's almost as if these people think they are gonna get that kind of power for free, like it will just magically appear from osmosis.
It's irrelevant how much power these things consume. I couldn't care less. The only thing I look at with a video card is what is gonna give me the best performance. That is the only thing I look at. If I wanted "efficiency" and cheap running costs I sure as heck would not be looking at $400-500 max-performance video cards.
Man, I'm cracking up here on this. It just defies logic.
How many motherboards have auxiliary PCIe power connectors? This article lets you know that they could be of some use if you are really stressing the VGA. My board has a regular 4-pin auxiliary power connector by the first PCIe slot and a SATA-style auxiliary connector right by the second.
i7 920...Intel DX58SO
12gb Patriot Viper Xtreme
Palit GTX 560 ti 2gb SLI
Antec Quattro 1000...Twelve Hundred Case
Intel X25-M 80gb x 4 Raid 0...Hitachi 7K3000 2tb
Sony GDM-FW900...Z-5500...X-Fi Forte
I wonder what a 4870x2 would draw with a voltmod and oc, lol.
TDP doesn't always mean MAXIMUM power draw; it can also be interpreted as AVERAGE power draw. There has also been talk about how AMD's TDP and Intel's TDP are not measured the same way... This also explains why TDP can be lower than actual power consumption.
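To make the average-versus-maximum distinction concrete, here is a hedged little sketch over an invented one-minute power trace; the numbers mean nothing, only the point that a card can stay under an average-style TDP while its peaks exceed it:

```python
# Hypothetical power trace in watts (a few samples per minute, invented numbers)
trace = [162, 171, 158, 224, 183, 239, 168, 176, 298, 172,
         160, 165, 230, 170, 182, 240, 168, 300, 175, 166]

tdp = 190                                   # imaginary "average-style" TDP rating
avg_power  = sum(trace) / len(trace)        # what an average-based TDP is judged against
peak_power = max(trace)                     # what a worst-case measurement catches

print(f"average: {avg_power:.0f} W, peak: {peak_power} W, rated TDP: {tdp} W")
# The card can sit under its TDP on average while individual peaks exceed it.
```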
Intel Core i7 920 4 GHz | 18 GB DDR3 1600 MHz | ASUS Rampage II Gene | GIGABYTE HD7950 3GB WindForce 3X | WD Caviar Black 2TB | Creative Sound Blaster Z | Altec Lansing MX5021 | Corsair HX750 | Lian Li PC-V354
Super silent cooling powered by (((Noiseblocker)))
The whole article shouldn't be named "real world power consumption of latest GPUs", but "What's the worst case scenario for latest GPUs concerning out of spec power draw with no relation to everyday use". I don't know how many of us play Furmark all day long, but it's interesting to read that the cards actually exceed specs instead of shutting down to minimize the chance of damaging other components.
1) TDP is one thing, power consumption is another
2) Furmark does not represent real-world use, so it's useless for drawing conclusions about the real world.
3) Anyway, I found the article useful because it confirms what I was thinking about the power consumption of the 4870X2 and GTX 295: the 4870X2 wastes more energy; it's significantly hotter, and thus louder, under load.
/OFF TOPIC
Hybrid cars are the most energy-efficient cars available on the market that have decent performance and can satisfy the average Joe's needs.
"There is a plan here in Denmark to have 100,000 (from TV news) pure electric cars running in 2011, together with an American organization and the biggest power company here. I'm not sure how many cars there are here; something around 1 or 1.5 million."
That plan is very ambitious. I don't believe that it will be realised by 2011.
The key is to let better technologies enter the market. There are batteries with several times larger energy capacity than the best Li-ion batteries, capable of recharging to full capacity within 30 min. The key would be standardized batteries. Cars will have a 160 km (100 mile) range on a charge, and charging stations would simply swap the battery. It's faster than refuelling your car today. You can of course also charge it at home.
Hey, but let's not forget that there is an engine using pure water as fuel. The only reason why we are still driving cars on fossil fuels is the big bosses who want to milk as much money as they can from the old and outdated tech.
Mostly in the nuclear power plants.
"Will the currently installed electrical grid supply the demand?"
I doubt it, but it's a piece of cake to upgrade the power network, of course if you have the money.
I see that most of you don't care about power consumption, but these cards draw power way above the specs of the cables and connectors.
About using Furmark: Isn't it my decision what software I run on my GPU? If I decide to run Furmark on it, I expect that the card still operates according to the power connector specs, and that the on card power regulation can handle all the load a software can generate (according to the test, the GPU voltage regulation got really hot).
Also, if I want to use the GPU for computation - do I have to limit the load to (let's say) 80% to not burn the card/my computer?
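For reference, the nominal PCI-SIG budgets usually quoted are 75 W from the PEG slot, 75 W per 6-pin and 150 W per 8-pin connector. A trivial tally (the 340 W figure below is an invented example, not a measurement from the article):

```python
# Nominal PCI-SIG power budgets per source, in watts
budgets = {"PEG slot": 75, "6-pin": 75, "8-pin": 150}

# Example: a card fed from the slot plus one 6-pin and one 8-pin connector
sources = ["PEG slot", "6-pin", "8-pin"]
in_spec_ceiling = sum(budgets[s] for s in sources)      # 300 W for this combination

measured = 340        # invented Furmark-style reading, for illustration only
print(f"ceiling: {in_spec_ceiling} W, measured: {measured} W ->",
      "over spec" if measured > in_spec_ceiling else "within spec")
```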
Mainly windmills at night. And then there is hydropower from Norway as well at night. Both are "uncontrolled" and thereby have large excess capacity at night, and in the case of windmills also when it's very windy.
It's calculated that 700 windmills can keep the entire Danish car fleet running on electricity. Even turning gas into electricity in a large plant would be more efficient than burning it out in the car.
But again, we also burn biofuel for CO2 neutrality. So it's a win/win.
Crunching for Comrades and the Common good of the People.
Well when we talk about gaming performance, power consumption is irrelevant. Also, for somebody willing to spend $500 for a graphics card, the delta in the bills doesn't matter at all. And don't forget, this is XS.
The "specs for the connectors and the cables" are BS. The cables and the connectors of every PSU can carry significantly more current and operate at significantly higher voltage than the max the PSU can give in a worst case scenario and for a long period.but these cards draw power way above the specs of the cables and connectors.
On the other side, the connectors on every graphcis card are designed to carry on much more than the needed power flow. No wonder no body has issues with the electric cable installation on an overvolted and OC-ed cards which are consuming way more energy than @stock.
It's also your decission if you are going to OC the card, increase voltage and work in a high ambient temperature.About using Furmark: Isn't it my decision what software I run on my GPU?
Again these "specs" are a crapload of BS.If I decide to run Furmark on it, I expect that the card still operates according to the power connector specs
If there were issues, there were going to be lots of dead cards, especially 4870X2's due to thier very high operating temeperature., and that the on card power regulation can handle all the load a software can generate (according to the test, the GPU voltage regulation got really hot).
You can safely use your card @stock for whatever computing purpose you want. Heck, you can OC your card and limit it to (let's say) 120% and not burn the card/computer.Also, if I want to use the GPU for computation - do I have to limit the load to (let's say) 80% to not burn the card/my computer?![]()
You know the companies are not paying huge sums of money to their engineers without a reason.
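To put a rough number on that margin (my own back-of-the-envelope figures, not from HT4U's article): an 8-pin PCIe plug delivers its 150 W budget over three +12 V contacts, which works out to very little current per pin compared with what such contacts are typically rated for:

```python
# Rough per-pin margin check for one 8-pin PCIe plug (approximate figures, not from the article)
spec_watts = 150      # PCI-SIG budget for a single 8-pin PCIe connector
rail_volts = 12.0
power_pins = 3        # number of +12 V contacts in an 8-pin PCIe plug

amps_total   = spec_watts / rail_volts      # ~12.5 A across the whole connector
amps_per_pin = amps_total / power_pins      # ~4.2 A per contact
pin_rating   = 8.0                          # assumed conservative Mini-Fit-style contact rating, in A

print(f"{amps_per_pin:.1f} A per pin vs ~{pin_rating:.0f} A contact rating "
      f"-> about {pin_rating / amps_per_pin:.1f}x headroom")
```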
FurMark isn't a preferred benchmarking tool since it pushes GPUs above and beyond what they are specified to run at. You can't say that the GPUs are running out of spec just because FurMark produces abnormally high power consumption. It is the PROGRAM that causes the GPUs to run above and beyond, NOT the GPUs themselves.
I've seen the way ATI tests for power consumption: they run every game you can think of at multiple resolutions to determine their max board power, and I am sure Nvidia does the same thing. However, I would go so far as to say that FurMark is buggy in that it loads the GPU in ways it was not meant to be loaded.
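In pseudocode terms, that kind of qualification sweep boils down to something like the sketch below; launch_benchmark and read_board_power_watts are purely hypothetical stand-ins, since I obviously have no idea what rig ATI or Nvidia actually use:

```python
import itertools
import random

games = ["game_a", "game_b", "game_c"]                     # placeholder workload list
resolutions = ["1680x1050", "1920x1200", "2560x1600"]

def launch_benchmark(game: str, res: str) -> None:
    """Stand-in for starting a timed benchmark run on the test system."""
    pass

def read_board_power_watts() -> float:
    """Stand-in for sampling the instrumented power rig; returns fake data here."""
    return random.uniform(150.0, 250.0)

max_board_power = 0.0
for game, res in itertools.product(games, resolutions):
    launch_benchmark(game, res)
    samples = [read_board_power_watts() for _ in range(60)]   # ~one sample per second
    max_board_power = max(max_board_power, max(samples))

print(f"max observed board power: {max_board_power:.0f} W")
```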