Originally Posted by gtj
It's a little deceiving. CPU-Z, RMClock and Orthos are showing you the Voltage ID (VID) set by the processor, which instructs the Voltage Regulator Module (VRM) what voltage to supply. The processor can reduce that voltage automatically (C1E or the thermal monitors), or the OS can reduce it via SpeedStep. Neither the processor nor the OS EVER increases the voltage by itself. Each processor has its default VID hard-coded. Most of us see this as 1.3250v, but it can definitely vary from chip to chip. To find your processor's default VID, check the processor packaging if you got a retail kit, OR look at the processor voltage displayed at the bottom of the processor tuning BIOS page.
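If you're curious what VID the chip is actually requesting, independent of what those utilities display, one way (on Linux, with root and the msr module loaded) is to read the IA32_PERF_STATUS MSR yourself. This is just a rough sketch under those assumptions; the bit layout shown is the Core 2 era convention (low byte VID, next byte FID), and the raw VID still has to be mapped to volts against your VRM/VRD spec table:

[CODE]
import struct

MSR_PERF_STATUS = 0x198  # IA32_PERF_STATUS

def read_msr(msr, cpu=0):
    # Requires root and `modprobe msr` so /dev/cpu/N/msr exists.
    with open(f"/dev/cpu/{cpu}/msr", "rb") as f:
        f.seek(msr)
        return struct.unpack("<Q", f.read(8))[0]

val = read_msr(MSR_PERF_STATUS)
vid = val & 0xFF          # requested voltage ID (raw encoding, family-specific)
fid = (val >> 8) & 0xFF   # frequency ID / multiplier field
print(f"PERF_STATUS=0x{val:016x}  VID=0x{vid:02x}  FID=0x{fid:02x}")
[/CODE]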
Now the fun part... You can alter what the VRM actually delivers by changing the vcore (processor voltage override) in BIOS. Assuming your processor's default VID is 1.3250v and you set 1.4000v in BIOS, what you're actually doing is telling the VRM "when the processor asks for its default VID (1.3250v in my case), really give it 1.4000v". You'd think that if the processor asked for a lower voltage the VRM would scale accordingly, but it seems that if you set a vcore other than the default, the VRM always delivers that voltage even when the processor asks for a lower one.
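Just to make that concrete, here's a toy model of the behavior described above (pure illustration, not anything the firmware actually runs): with no override the VRM tracks whatever the CPU/OS requests, but once a vcore is set in BIOS, that's what gets delivered even when SpeedStep asks for less.

[CODE]
# Toy model of the vcore override behavior described above.
def delivered_vcore(requested_v, bios_override_v=None):
    if bios_override_v is None:
        return requested_v      # "Auto": VRM follows the requested VID
    return bios_override_v      # override set: lower VID requests are ignored

print(delivered_vcore(1.1625))                       # 1.1625 (SpeedStep honored)
print(delivered_vcore(1.1625, bios_override_v=1.4))  # 1.4    (override wins)
[/CODE]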
If you really want to see what's being delivered, you HAVE to use a program that reads the aSC7621 sensor chip: IDCC, IDU, Everest, SpeedFan, Sandra, etc. My personal favorite is Everest because you get EVERYTHING. You'll probably notice that the reported voltage is slightly lower than what you expect. If your vcore is set to 1.5000v and the processor is asking for the max voltage, you might see 1.49 or something. That's normal.
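For what it's worth, on Linux the same sensor readings those Windows tools show are exposed through lm-sensors / the hwmon sysfs files. A quick sketch that just dumps every voltage channel; which inX line is actually Vcore depends on the board and the sensor driver, so don't assume a particular label:

[CODE]
import glob

# hwmon voltage inputs are reported in millivolts.
for path in sorted(glob.glob("/sys/class/hwmon/hwmon*/in*_input")):
    with open(path) as f:
        mv = int(f.read().strip())
    print(f"{path}: {mv / 1000:.3f} V")
[/CODE]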
If you have a calibrated multimeter you can REALLY see what's going on by measuring the voltage at the output of the VRM. There's a post somewhere back at the beginning of this thread that shows the correct point on the MB.