Quote Originally Posted by SKYMTL:
How many runs were done?

As many know, Vantage produces power peaks in several different areas, many of which last less than a second and may not be picked up by a standard power meter.

In addition, CPU usage is a HUGE factor and can increase / decrease the numbers accordingly, and in a non-linear fashion.

Looking at that chart, it seems like the calculated figures for some cards are VERY high while others are low. It could be that the monitor is catching the moments where the CPU + GPU peaks converge for some cards and missing those moments entirely for others.
What's your opinion if someone ran a 1-hour loop of some game or graphical benchmark and looked at the total power consumption on, say, a Kill-A-Watt meter? Do you think spiky consumption would result in a horribly skewed watt-hour total? The display on a meter like that only updates every second or so, but does the watt-hour tally actually depend on that low update frequency?

You take power readings to heart a little more than most sites do, so your opinion is quite valued.