Gradient refers to the temperature difference between the hot core and ambient (or the IHS, etc.). These relationships can be calculated with Intel-supplied formulas, or they can be measured. The gradient depends on power: the higher the power, the higher the gradient, i.e. the temperature differential between hot core and ambient.
Here is thermal imaging of the gradient from the hot core to a cooler area, see figure 2.
http://domino.watson.ibm.com/comm/re...nnovation.html
And here is a pic (from a Stanford paper) of a sensor graph of a Pentium showing the gradient or temperature differential, in this case from hot spot to ambient (38C), at a ~78W load.

Intel defines this gradient with a formula based on TDP: the higher the wattage, the higher the gradient or temperature difference between hot core and ambient.
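That linear relationship can be sketched as a simple thermal-resistance model. This is not Intel's actual formula (which I'm not reproducing here); the theta value below is illustrative, picked to match the ~17C-at-50W example later in this post:

```python
# Linear thermal-resistance model: delta_T = power * theta.
# theta (C/W) is an assumed, illustrative core-to-ambient thermal
# resistance; real values depend on package, cooler, and airflow.

def gradient_c(power_w, theta_c_per_w=0.34):
    """Estimated core-to-ambient temperature difference in C."""
    return power_w * theta_c_per_w

def core_temp_c(power_w, ambient_c, theta_c_per_w=0.34):
    """Estimated hot-core temperature given ambient and power draw."""
    return ambient_c + gradient_c(power_w, theta_c_per_w)

print(gradient_c(50))       # ~17C gradient at 50 W
print(core_temp_c(50, 27))  # ~44C core at 50 W with 27C intake air
```

The point is only that the gradient scales roughly linearly with dissipated watts for a given cooling setup.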
Even at 3-6 watts (undervolted, underclocked, at idle), there will still be at least a 6-8C difference or gradient between hot core and ambient; for example, a 28-30C core would mean ~22C ambient, given excellent water cooling or very high-end air/fans.
When you get up to ~45+ watts, for example a Core i7 D0 at idle, 1.4 vcore, 4.4+ GHz, there will be a minimum gradient or temperature difference between hottest core and ambient of at least ~15-18C.
For example, if I boot up at 4.5 GHz, 1.4 vcore, my average RealTemp idle temperature across 4 cores is 44C. My ambient intake temp is 27C, so I have a 17C temperature difference (or gradient) from core to ambient, which is typical for ~50 watts. Others may differ by ~10-20%, i.e. a few C one way or the other, given similar cooling. But it is not possible, for example, to have only a 6-8C temperature difference between an idle core and ambient at 40-50W.
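As a back-of-envelope check on the numbers above (the temperatures and wattage are from my example; the helper function is just illustrative arithmetic, not a formula from Intel):

```python
def implied_theta(core_c, ambient_c, power_w):
    """Implied core-to-ambient thermal resistance (C/W)
    from measured temperatures and estimated power draw."""
    return (core_c - ambient_c) / power_w

# 44C average core, 27C intake ambient, ~50 W idle load:
print(implied_theta(44, 27, 50))  # 0.34 C/W, i.e. a 17C gradient at 50 W

# A claimed 6-8C gradient at 50 W would imply 0.12-0.16 C/W,
# which is not plausible for any conventional air or water cooler.
print(implied_theta(34, 27, 50))  # 0.14 C/W -- unrealistically low
```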
Last edited by rge; 06-13-2009 at 06:49 AM.