rge wrote a great post a while ago about how to calibrate RealTemp.
http://www.xtremesystems.org/forums/...postcount=2429
The numbers he came up with were from testing his E8400, and right away the question was, "What about a Quad?"
I did some testing today at the recommended calibration point of ~1600 MHz and 1.10 volts. My E8400 was consuming 133 watts as measured with a Kill A Watt meter.
I swapped in my Q6600 (G0), and at the same settings and temperature the meter showed 137 watts. During a CPU Cool Down Test with a Tuniq Tower, I've been seeing a 3 watt change in power consumption at the wall equal a 1C change in core temperature. For a 45nm dual core, that works out to 1.5 watts per core per degree C.
If a Q6600 is only consuming an extra 4 watts at this setting (137 watts vs 133 watts), that works out to only 1 extra watt per core. At 1.5 watts per core per degree, that extra watt is worth roughly 0.7C, which means a Q6600 at ultra idle should have core temperatures no more than 1C higher than the numbers rge came up with.
I also put in my 65nm E2160, which only has 1MB of cache, and it was using 129 watts in this test. That's 4 watts less than the E8400, or 2 watts per core less, so in theory it should idle approximately 1C cooler than an E8400.
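
For anyone who wants to plug in their own Kill A Watt readings, here's a rough Python sketch of the arithmetic above. The function name and layout are just mine; the 1.5 watts per core per degree figure is my working assumption taken from the Cool Down Test observation (3 watts at the wall per 1C on a dual core), and the wattages are the readings from this test.

# 3 W at the wall per 1C on a dual core -> ~1.5 W per core per degree C
WATTS_PER_CORE_PER_C = 3.0 / 2

def idle_temp_offset(wall_watts, baseline_watts, cores):
    # Estimated core temperature difference vs. the E8400 baseline, in C
    extra_watts_per_core = (wall_watts - baseline_watts) / cores
    return extra_watts_per_core / WATTS_PER_CORE_PER_C

E8400_WATTS = 133  # Kill A Watt reading at ~1600 MHz and 1.10 volts

print(idle_temp_offset(137, E8400_WATTS, 4))  # Q6600: ~0.67C warmer
print(idle_temp_offset(129, E8400_WATTS, 2))  # E2160: ~1.33C cooler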
That's the thing about the ultra idle test: it doesn't matter too much what CPU you have; your results are going to be very similar to what rge found during his test, plus or minus a degree.