At 0 °C ambient (outside air), my i950 at 4.4 GHz and 1.4 V Vcore loads at 50 °C and draws, and hence dissipates, ~160 W as heat (as read by the DES chip). At 25 °C ambient with otherwise identical settings, the CPU loads at ~75 °C and draws ~180 W with the same loading program. I've seen the same on multiple mobos and CPUs. Cooler CPUs consume less power, and since all power consumed in a CPU is released as heat, I don't get your statement.
And as an aside: unlike in metals (conductors), where resistance increases with temperature (added heat energy causes more unwanted collisions), in semiconductors such as silicon and germanium, and in insulators generally, resistance decreases as temperature increases. Here and scroll down, see the negative temperature coefficients for germanium and silicon. Insulators have a tightly bound lattice and don't conduct electricity well, so energy added as heat frees up charge carriers and allows more movement, i.e. more conductance, i.e. less resistance. The same energy in a metal, which already conducts well, allows "too much movement" and you start getting unwanted, interfering collisions.
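A quick sketch of that contrast, with illustrative numbers of my own (the copper alpha is a typical handbook value; the semiconductor B constant is a rough ballpark, not from any datasheet). Metals follow an approximately linear R(T); semiconductors are better described by the thermistor/Arrhenius form, where resistance drops steeply as temperature rises:

```python
import math

def metal_resistance(r0, alpha, t_c, t0_c=20.0):
    """Linear model R = R0*(1 + alpha*(T - T0)); alpha > 0 for metals."""
    return r0 * (1.0 + alpha * (t_c - t0_c))

def semiconductor_resistance(r0, b, t_c, t0_c=20.0):
    """Thermistor form R = R0*exp(B*(1/T - 1/T0)), temperatures in kelvin."""
    t, t0 = t_c + 273.15, t0_c + 273.15
    return r0 * math.exp(b * (1.0 / t - 1.0 / t0))

# Copper: alpha ~ +0.0039 per deg C. B ~ 3000 K is an assumed,
# germanium-ish value just to show the trend.
for t_c in (0.0, 20.0, 75.0):
    cu = metal_resistance(100.0, 0.0039, t_c)
    semi = semiconductor_resistance(100.0, 3000.0, t_c)
    print(f"{t_c:5.1f} C   copper {cu:6.1f} ohm   semiconductor {semi:6.1f} ohm")
```

Run it and you see the copper resistance climb with temperature while the semiconductor resistance falls, which is exactly the positive vs negative temperature coefficient being described.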
But the reason cool CPUs generate less heat has to do with leakage: hotter CPUs leak more current, hence more power consumption at the same voltage, clocks, and load.
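A toy model of that effect (my own assumption, not a measured CPU characteristic): treat total package power as roughly constant dynamic (switching) power plus a leakage term that grows roughly exponentially with die temperature. The 140 W / 20 W split and the "25 °C doubles leakage" constant below are made-up values chosen so the output lines up with the 160 W / 180 W figures quoted above:

```python
# All four constants are assumptions for illustration only.
P_DYNAMIC = 140.0   # W, switching power assumed fixed at 4.4 GHz / 1.4 V load
P_LEAK_REF = 20.0   # W of leakage at the reference die temperature
T_REF = 50.0        # deg C, reference die temperature
T_DOUBLE = 25.0     # deg C rise that doubles leakage (ballpark)

def total_power(t_die_c):
    """Dynamic power plus exponentially temperature-dependent leakage."""
    leak = P_LEAK_REF * 2.0 ** ((t_die_c - T_REF) / T_DOUBLE)
    return P_DYNAMIC + leak

for t in (50.0, 75.0):
    print(f"die at {t:.0f} C -> ~{total_power(t):.0f} W")
# -> ~160 W at 50 C, ~180 W at 75 C
```

So a ~20 W swing between a 50 °C and a 75 °C die needs no change in workload at all; a modest exponential leakage term accounts for it.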










