If you were doing some Prime95 or similar testing where the VID and actual voltage were fairly consistent, you probably could come up with some sort of correction factor.

At a given frequency, dynamic power consumption in a CPU is roughly proportional to voltage squared. If the VID voltage was 1.20 volts and the actual voltage was 10% higher at 1.32 volts, then the correction factor would be:

(1.32 / 1.20)^2 = 1.10 x 1.10 = 1.21

In that case, actual power consumption would be about 21% higher than what RealTemp is showing.
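Here is a minimal Python sketch of that correction, assuming you supply the reported (VID-based) power and the two voltages yourself. The function name and parameters are made up for illustration, not anything RealTemp exposes:

```python
# Sketch of the VID-to-actual-voltage power correction described above.
# Names (corrected_power, vid_voltage, etc.) are hypothetical.

def corrected_power(reported_power_w, vid_voltage, actual_voltage):
    # Dynamic power scales roughly with voltage squared, so scale the
    # VID-based reading by (actual voltage / VID voltage)^2.
    correction = (actual_voltage / vid_voltage) ** 2
    return reported_power_w * correction

# Example from the numbers above: VID = 1.20 V, actual = 1.32 V.
# A reported 100 W would really be about 121 W.
print(corrected_power(100.0, 1.20, 1.32))  # ~121.0
```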

Does any software show actual power consumption based on actual voltage? Any monitoring software that uses the Intel method and reads the Intel-recommended power consumption register will have the same accuracy problem.