I don't think that's completely accurate. Keep in mind that not all chips are created equal, which is why they are binned by TDP; current draw also varies between good and bad processors. Even though a line of processors is rated at, say, 125W TDP, it doesn't mean they will all draw 125W at max load: some will draw far less than 125W, and some close to or at 125W.
This is why Intel charges more for the 65W line of Q9450s. These processors used to be part of the standard Q9450 line, but Intel decided to take advantage of the situation and separate the ones with extremely low power consumption from the average ones.
Also, low-VID chips are generally better because they tend to scale better: every chip has a point where voltage scaling goes from linear to exponential, and low-VID chips stay in the acceptable voltage range longer.
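To make the VID point concrete, here's a minimal sketch using the standard dynamic-power approximation P ≈ C·V²·f. The capacitance constant and voltage figures below are made up purely for illustration, not measured values for any real Q9450:

```python
# Why a lower VID can mean lower power (and heat) at the same clock:
# dynamic CPU power scales roughly with the square of core voltage.
# All numeric constants here are hypothetical, for illustration only.

def dynamic_power(c_eff, voltage, freq_hz):
    """Approximate dynamic power in watts: P = C * V^2 * f."""
    return c_eff * voltage ** 2 * freq_hz

C_EFF = 1.1e-8   # effective switched capacitance (made-up constant)
FREQ = 3.0e9     # 3 GHz clock

low_vid  = dynamic_power(C_EFF, 1.10, FREQ)   # low-VID chip at 1.10 V
high_vid = dynamic_power(C_EFF, 1.25, FREQ)   # high-VID chip at 1.25 V

print(f"low VID:  {low_vid:.1f} W")
print(f"high VID: {high_vid:.1f} W")
print(f"ratio:    {high_vid / low_vid:.2f}x")
```

Because voltage enters squared, even a modest VID difference (1.10 V vs. 1.25 V here) gives roughly a 29% difference in dynamic power at the same frequency, which is consistent with two same-model chips drawing noticeably different wattage at max load.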
I still want some proof of your claims. I just do not understand how a lower VID would lead to higher temps.
So again, can we see some proof?
~ Little Slice of Heaven ~
Lian Li PC-A05NB w/ Gentle Typhoons
Core i7 860 @ 3GHz w/ Thor's Hammer
eVGA P55 SLI
8GB RAM
Gigabyte 7970
Corsair HX850
OCZ Vertex 3 SSD 240GB
~~~~~~~~~~~~