Originally Posted by
SKYMTL
While I can't go into every one of the details, I trust my own results since we use the proper equipment and tests to accurately determine the power consumption of the full system (line conditioner, UPM power meter, etc.). In addition, since I personally spent weeks determining which 3D application would put the most constant load on the GPU, I am confident in any numbers we produce.
Many tests you see are heavily influenced by the CPU's power consumption fluctuating up and down. The trick to properly determining power consumption is to first find an application that fully loads the GPU while letting the CPU sit as close to idle as possible. Then make sure the program can put a near-constant load on the GPU and its memory, with minimal load times, in order to get an even testing field.
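One way to put numbers on that "constant load" requirement (my sketch, not the actual selection process described above) is to log wall readings while each candidate application runs and prefer the trace that is both high and flat, i.e. high mean draw with a low relative spread. The traces below are hypothetical:

```python
# Rank candidate test apps by how steady a load they produce.
# Traces are made-up illustrative wall readings, not measured data.
from statistics import mean, stdev

def load_steadiness(watts):
    """Return (mean draw, coefficient of variation) for a logged trace."""
    m = mean(watts)
    return m, stdev(watts) / m

steady_app = [310, 312, 309, 311, 310, 313]   # hypothetical flat load
bursty_app = [250, 340, 180, 330, 200, 320]   # hypothetical spiky load

for name, trace in [("steady", steady_app), ("bursty", bursty_app)]:
    m, cv = load_steadiness(trace)
    print(f"{name}: mean {m:.0f} W, variation {cv:.1%}")
```

The steady trace comes out well under 1% variation while the bursty one is around 25%, which is exactly the kind of CPU-driven fluctuation you want to screen out before testing.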
Finally, the test MUST be run for AT LEAST 45 minutes in order to determine a peak power consumption number. This is because many power meters have a sampling rate of about 250 ms to 750 ms, which means that peaks in the power draw may not be logged. That is why you need a very long test under constant load: so the power meter can pick up the peaks in consumption even if it misses them the first, second, third and so on time.
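The arithmetic behind the long run is worth spelling out. With hypothetical figures (mine, not measured): if the true peak is a 100 ms transient occurring once per 15 s of load, and the meter samples every 500 ms, each transient is caught with probability 100/500 = 0.2, so the chance an entire run never logs the true peak shrinks geometrically with run time:

```python
# Illustrative numbers only: how run length affects the odds that a
# coarse-sampling wall meter ever lands on a short power transient.
SPIKE_MS = 100        # assumed duration of a power transient
SAMPLE_MS = 500       # meter sampling interval (typical 250-750 ms)
SPIKE_PERIOD_S = 15   # assume one transient per 15 s of constant load

def p_peak_missed(run_minutes):
    """Probability the meter never samples during any transient."""
    p_catch_one = SPIKE_MS / SAMPLE_MS             # 0.2 per transient
    n_spikes = run_minutes * 60 // SPIKE_PERIOD_S  # transients in the run
    return (1 - p_catch_one) ** n_spikes

for minutes in (1, 5, 45):
    print(f"{minutes:>2} min: P(true peak never logged) = {p_peak_missed(minutes):.2e}")
```

Under these assumptions a 1-minute run misses the true peak about 41% of the time, while over 45 minutes the miss probability is vanishingly small, which is the point of the long constant-load run.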
I also have to say that it is extremely important to use a line conditioner for power consumption tests. As many of us know, input voltage from a household power outlet can fluctuate quite a bit. This voltage fluctuation can have a pretty large impact on the efficiency of a power supply, which in turn would skew the numbers generated by any GPU power consumption test.
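A quick back-of-the-envelope example of why that matters (the figures are assumptions for illustration, not measurements): the wall meter reads DC load divided by PSU efficiency, so if a mains sag shifts efficiency even a couple of points, the wall reading moves with the exact same GPU load.

```python
# Illustrative only: fixed 300 W DC-side load, PSU efficiency assumed
# 86% at nominal mains voltage and 84% when the outlet sags.
DC_LOAD_W = 300.0

def wall_power(efficiency):
    """AC power at the wall for a fixed DC-side load."""
    return DC_LOAD_W / efficiency

print(f"nominal mains: {wall_power(0.86):.1f} W at the wall")
print(f"sagging mains: {wall_power(0.84):.1f} W at the wall")
```

That is roughly an 8 W spread coming entirely from the supply rather than the GPU, which is exactly the error the line conditioner is there to hold out of the results.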
That is my story, that is how I tested and I am sticking behind my 70W difference statement 100%. :up: