I think some people need a wake-up call regarding temperatures. There is a reason the 3 GHz Phenom is rated at 125 W TDP, or 95 W if you prefer the 2.8 GHz AM3 version.
AMD uses an uncalibrated sensor. Intel uses a calibrated one inside the hottest part of the core(s). On top of that there are the Tjmax values and so on. Do I even have to tell that story? It's only recently that we got an accurate read on 45nm Core 2s.
For some odd reason the fact that power consumption equals heat has been lost on a bunch of people. Let's be honest here: if CPU A draws 100 W and CPU B draws 100 W, one of them won't sit at 60°C while the other sits at 30°C unless their cooling solutions are very different. The difference only exists in the illusion the sensors report.
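To put a number on that point, here is a minimal sketch of the usual steady-state estimate, T = T_ambient + P × R_theta, where R_theta is the thermal resistance of the cooling solution in °C per watt. The function name and the 0.30 °C/W figure are illustrative assumptions, not measured values:

```python
def die_temp_c(ambient_c: float, power_w: float, theta_c_per_w: float) -> float:
    """Steady-state temperature estimate: T = T_ambient + P * R_theta.

    theta_c_per_w is the thermal resistance of the cooling solution
    (a property of the cooler, not of the CPU brand)."""
    return ambient_c + power_w * theta_c_per_w

# Two CPUs dissipating the same 100 W under the same cooler must settle
# at roughly the same temperature, whatever their sensors claim:
cpu_a = die_temp_c(25.0, 100.0, 0.30)  # 55.0 C
cpu_b = die_temp_c(25.0, 100.0, 0.30)  # 55.0 C
```

Same power in, same cooler, same temperature out; only the reported number differs.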
Here is a cute example: three programs, three different temperatures for the same chip.
http://download.intel.com/design/pro...nex/320837.pdf
PDF doc pg 46 wrote:
Intel does not test any third party software that reports absolute processor
temperature. As such, Intel cannot recommend the use of software that claims this
capability. Since there is part-to-part variation in the TCC (thermal control circuit)
activation temperature, use of software that reports absolute temperature can be
misleading.
See the processor datasheet for details regarding use of IA32_TEMPERATURE_TARGET
register to determine the minimum absolute temperature at which the TCC will be
activated and PROCHOT# will be asserted.
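The register the quote points at can be decoded with a bit of shifting. As a hedged sketch (assuming you obtain the raw MSR values yourself, e.g. via /dev/cpu/*/msr on Linux with root): per Intel's documentation, the Temperature Target (Tjmax) sits in bits 23:16 of IA32_TEMPERATURE_TARGET (MSR 0x1A2), and the Digital Thermal Sensor readout, in degrees below Tjmax, sits in bits 22:16 of IA32_THERM_STATUS (MSR 0x19C). The example MSR values below are made up for illustration:

```python
TEMPERATURE_TARGET_MSR = 0x1A2  # IA32_TEMPERATURE_TARGET
THERM_STATUS_MSR = 0x19C        # IA32_THERM_STATUS

def tjmax(msr_1a2: int) -> int:
    """Temperature Target (Tjmax), bits 23:16 of IA32_TEMPERATURE_TARGET."""
    return (msr_1a2 >> 16) & 0xFF

def dts_readout(msr_19c: int) -> int:
    """Digital readout (degrees below Tjmax), bits 22:16 of IA32_THERM_STATUS."""
    return (msr_19c >> 16) & 0x7F

def absolute_temp_c(msr_1a2: int, msr_19c: int) -> int:
    """Absolute core temperature = Tjmax minus the DTS readout."""
    return tjmax(msr_1a2) - dts_readout(msr_19c)

# Illustrative raw values: Tjmax = 100 C, readout = 35 below Tjmax -> 65 C
print(absolute_temp_c(0x00640000, 0x00230000))  # 65
```

This is exactly why third-party tools disagree: the readout is relative, so any tool that guesses the wrong Tjmax reports the wrong absolute temperature.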