No, it is not. While TDP generally rises as power draw rises, rated TDP is not a good indicator of actual power draw; at best it tells you whether a part sits in the lower or higher range.
average power = heat
Every single watt that a circuit consumes is converted to heat. Where else would those watts go? Any other form of radiation is negligible compared to the heat.
Ok, energy consumption = heat output. Or power consumption = heat output per second.
No, average power = average rate of heat production.
There are no moving parts or anything else of that sort. Some power might be converted to sound through vibrations in certain systems, but in a CPU we can safely say that 99.9999% of the power used is converted to heat.
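To put some rough numbers on that, here is a minimal Python sketch; the 10 W average draw is just a made-up example, not a measured figure:

# Converting an average power draw into heat output.
avg_power_w = 10.0           # assumed average electrical power drawn by the chip (watts = joules/second)
duration_s = 3600.0          # one hour of operation

energy_j = avg_power_w * duration_s   # electrical energy consumed over that hour
heat_j = energy_j                     # essentially all of it (~99.9999%) ends up as heat

print(f"{energy_j/1000:.0f} kJ consumed, ~{heat_j/1000:.0f} kJ dissipated as heat")

So 10 W averaged over an hour means roughly 36 kJ of heat the cooling has to get rid of; the average power and the average rate of heat output are the same number.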
TDP = thermal design power. So it is roughly a maximum that the chip will consume. There is no point in quoting anything more precise than coarse TDP figures.
In the end, TDP is just some maximum that the chip will draw in a particular test case. The actual power used is usually less, even in "full" load situations.
TDP doesn't say much about power consumption, really. Indirectly, yes, but that's all. Products usually fall into certain categories by their TDP, e.g. 9 W, 14 W, 19 W, but that just means they use the same cooling solution, and that cooling solution is designed to dissipate that much heat without letting the chip being cooled reach the maximum junction temperature set by the manufacturer. We can say that a part with a TDP of 10 W will probably consume more than a part with a TDP of 6 W, but we cannot be certain of it; TDP does not prove it in any way. For example, it is possible that for some unknown reason the manufacturer slapped a higher TDP on a part just to make sure it runs cooler.
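To illustrate the point that TDP is a cooling spec rather than a power measurement, here is a rough back-of-the-envelope in Python; the junction-temperature and ambient figures are assumed example values, not from any datasheet:

# How a category TDP feeds into cooler selection (rough sketch, made-up numbers).
tdp_w = 19.0          # category TDP the cooling solution is designed around
tj_max_c = 100.0      # assumed maximum junction temperature
t_ambient_c = 35.0    # assumed worst-case air temperature inside the case

# The junction-to-ambient path must have at most this thermal resistance
# so the chip stays below Tj_max while dissipating the full TDP.
max_theta_ja = (tj_max_c - t_ambient_c) / tdp_w   # in C per watt
print(f"Cooler needs theta_ja <= {max_theta_ja:.2f} C/W")

Nothing in that calculation says how many watts the chip actually draws while you use it; it only bounds what the cooler must be able to handle.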
Being an HTPC tragic, this news changes everything for me. At last the right compromise (IMHO) of GPU/CPU power with a very small power draw, so true 24/7 HTPCs are finally doable :up: