Quote Originally Posted by Shintai
I think some people need a wakeup call regarding temperatures. There is a reason the 3GHz Phenom is 125W TDP, or even 95W if you want the 2.8GHz AM3 version.

AMD uses an uncalibrated sensor; Intel uses a calibrated one inside the hottest part of the core(s). Plus there are the Tjmax values and so on. Do I even have to tell that story? It's only recently that we got accurate readings on 45nm Core 2s.

For some odd reason, power consumption = heat has been lost on a bunch of people. Let's be honest here: if CPU A uses 100W and CPU B uses 100W, then one CPU won't be at 60°C and the other at 30°C unless their cooling solutions are very different. The difference is only an illusion.
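For what it's worth, the power-equals-heat point in the quote can be sanity-checked with a simple lumped thermal-resistance model. A minimal Python sketch follows; the 0.30°C/W cooler figure and the 100W load are invented purely for illustration:

```python
# Lumped thermal model: T_die = T_ambient + P * theta_ja, where
# theta_ja (degC per watt, junction to ambient) lumps the heatsink,
# fan and interface material.  The 0.30 degC/W value is invented
# purely for illustration.

def die_temp(power_w, theta_ja=0.30, t_ambient=25.0):
    """Steady-state die temperature for a given dissipated power."""
    return t_ambient + power_w * theta_ja

# Two CPUs each dissipating 100 W through the same cooler settle at
# the same temperature; any gap in the *reported* numbers has to come
# from the sensors.
print(die_temp(100.0))  # 55.0 degC for CPU A and CPU B alike
```

Same dissipated power through the same thermal resistance gives the same steady-state temperature, so a large reported gap points at the sensors or at different actual dissipation.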
Your point about power draw is very valid. However, what you fail to note is that there is a difference between rated power draw and actual heat dissipation. Just because a chip is specified at a low wattage does not guarantee it won't spew far more power as heat due to leakage, which itself rises with die temperature; a toy sketch of that feedback follows the comparison below.

Phenom = High Power Draw + SSDOI = Low Leakage = Low Temps
i7 = Lower Power Draw + High-k = High Leakage = High Temps
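Here is that leakage feedback as a toy Python sketch, solved self-consistently because leakage power grows with die temperature. Every constant (the 0.30°C/W cooler, the exponential scale, the 5W and 15W leakage floors) is made up for illustration and is not a characterized model of either chip:

```python
import math

# Toy model: leakage power grows roughly exponentially with die
# temperature, so temperature and total dissipated power must be
# solved self-consistently.  All constants are invented for
# illustration only.

def steady_state_temp(p_dynamic, p_leak_ref, t_ref=25.0, scale=40.0,
                      theta_ja=0.30, t_ambient=25.0):
    """Fixed-point iteration of T = T_amb + theta * (P_dyn + P_leak(T))."""
    t = t_ambient
    for _ in range(200):
        p_leak = p_leak_ref * math.exp((t - t_ref) / scale)
        t_next = t_ambient + theta_ja * (p_dynamic + p_leak)
        if abs(t_next - t) < 1e-6:
            break
        t = t_next
    return t

# Identical 90 W dynamic power and identical cooler, but different
# leakage floors (hypothetical stand-ins for a low- vs high-leakage
# process):
print(steady_state_temp(90.0, p_leak_ref=5.0))   # ~55 degC
print(steady_state_temp(90.0, p_leak_ref=15.0))  # ~64 degC
```

With identical 90W dynamic power and an identical cooler, the leakier part settles roughly 9°C hotter; note it also ends up dissipating more total power at equilibrium, which is still consistent with Shintai's power-equals-heat point.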

Also, you need to factor in that Intel and AMD have historically measured TDP differently: one is a theoretical TDP maximum (AMD), the other a normal operating environment (Intel).
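To make that concrete, here is a hypothetical sketch of how the same silicon could end up with two different TDP labels under the two conventions. The workloads and wattages are entirely invented, and neither vendor publishes a formula this simple:

```python
# Hypothetical per-workload power measurements for one chip (invented
# numbers -- neither AMD nor Intel publishes a formula this simple).
workload_watts = {
    "idle": 18.0,
    "office": 52.0,
    "gaming": 78.0,
    "power_virus": 124.0,  # pathological worst case
}

# "Theoretical maximum" convention: rate for the absolute worst case.
tdp_theoretical_max = max(workload_watts.values())

# "Normal operating environment" convention: rate for a realistic
# heavy load instead of the pathological one.
tdp_normal_operating = workload_watts["gaming"]

print(f"worst-case label:   {tdp_theoretical_max:.0f} W")   # 124 W
print(f"typical-load label: {tdp_normal_operating:.0f} W")  # 78 W
```

Under assumptions like these, the "max" convention yields a noticeably bigger number for the exact same chip, so comparing the two vendors' TDP figures head-to-head is misleading.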