Dua|ist: I've been reading about this on the E8500 forum, but there's a problem with a lot of the data that's been gathered. Here's a link to the X-Bit article. It's easier to read in English!
Intel has documented that the 45nm sensors used in its Atom chips have plus or minus 10C of error at 50C. Slope error means that as a CPU cools down, the amount of error increases even further. If review sites or individual users do nothing to compensate for this, then their reported temperatures and conclusions are meaningless.
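To show what slope error does to a reading, here's a minimal sketch. It assumes a hypothetical linear model: the sensor is accurate at TjMax, the documented plus or minus 10C applies at 50C, and the error band keeps widening as the core cools. The TjMax value and the linear model are my assumptions for illustration, not anything Intel has published.

```python
TJMAX = 100  # assumed TjMax for illustration; actual value varies by model

def reported_temp(dts):
    """These sensors count down toward zero at TjMax; software reports TjMax - DTS."""
    return TJMAX - dts

def error_band(reported, error_at_50=10.0):
    # Hypothetical linear slope-error model: zero error at TjMax,
    # +/-10C at 50C, and growing wider the further below that you go.
    distance_from_tjmax = TJMAX - reported
    max_error = error_at_50 * distance_from_tjmax / (TJMAX - 50)
    return (reported - max_error, reported + max_error)
```

With these assumptions, a reading of 50C could really be anywhere from 40C to 60C, and a reading of 30C anywhere from 16C to 44C, which is why uncompensated idle temperatures are so hard to trust.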
X-Bit has decided to ignore the "slope error" and error at TjMax that these sensors have.
Their power consumption numbers are interesting but not consistent. It doesn't make sense to me that power consumption at idle and at full load is identical for C0 vs E0 at default MHz, yet there's a 9 watt difference when overclocked. I'd need to see more test data with more processors before I'd conclude anything. I'd use Prime95 small FFTs for consistent power usage. I have a wattage meter on my computer and I know it can float around by a watt or two.
If the 9 watt difference is 100% accurate, it might translate into about a 1C difference at full load, but probably not into the differences some users are seeing. Most of the differences I've seen posted come down to sensor error more than anything else.
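The 1C figure is easy to sanity check with the usual delta-T = power x thermal resistance rule of thumb. The 0.15 C/W core-to-ambient figure below is my assumption for a decent air cooler; it varies a lot between heatsinks.

```python
# Rough check: how much temperature difference does 9 extra watts buy you?
THETA_CA = 0.15  # assumed core-to-ambient thermal resistance in C/W

def delta_t(extra_watts, theta=THETA_CA):
    """Steady-state temperature rise from extra power dissipation."""
    return extra_watts * theta
```

Under that assumption, delta_t(9) works out to roughly 1.35C, in the same ballpark as the 1C estimate above and nowhere near the multi-degree gaps being posted.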
If you have a similar C0 and E0, why not run them at the same FSB frequency with the same multiplier? Use RealTemp to calibrate them at low MHz / low voltage so they both report about 5C above room temperature. Then set them back to fully overclocked at the same MHz and core voltage and compare Prime95 small FFTs full load temperatures. That would be interesting and a little more scientific than the X-Bit testing.
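The calibration step above boils down to simple offset arithmetic. This sketch is just my illustration of the idea, not RealTemp's actual calibration routine: at low MHz / low voltage the core should sit about 5C above room temperature, so whatever the sensor reports beyond that becomes the correction you apply later.

```python
def calibration_offset(reported_idle, room_temp, target_above_room=5.0):
    """Offset that makes the low-MHz idle reading sit ~5C above room temp.
    This mirrors the calibration idea described above, not RealTemp's internals."""
    return (room_temp + target_above_room) - reported_idle

def calibrated(reported, offset):
    """Apply the stored offset to any later reading."""
    return reported + offset
```

For example, if the C0 chip reads 32C at idle in a 24C room, the offset is -3C, and you'd subtract 3C from its full-load numbers before comparing against the E0.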