i thought that power draw scales with the square of voltage, or at least i was told so a while ago on these forums...
so if it's 68W @ 1.1V, it should be (1.25/1.1)^2 * 68 ≈ 88W, or about 20W more at 1.25V
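just to make the arithmetic explicit, here is a quick Python sketch of that square-law estimate (the 68W @ 1.1V baseline is just the example figure from this thread):

```python
# Rough rule of thumb: dynamic CPU power scales with the square of core voltage.
# Baseline: 68 W measured at 1.1 V (the example figure from this thread).
v_stock, v_oc = 1.10, 1.25
p_stock = 68.0  # watts at v_stock

p_oc = p_stock * (v_oc / v_stock) ** 2
print(f"estimated power at {v_oc} V: {p_oc:.1f} W (+{p_oc - p_stock:.1f} W)")
# prints: estimated power at 1.25 V: 87.8 W (+19.8 W)
```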
when you factor in C/W values, the temperature rise above ambient scales with power... so if you are ~30C above ambient at stock, raising the Vcore from 1.1V to 1.25V should add very roughly 8.8C to your load temp.
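a minimal sketch of that C/W step, assuming the rise above ambient is proportional to power (same example numbers as above, so it lands near the rough 8.8C figure):

```python
# With a fixed cooler C/W value, temperature rise above ambient is
# roughly proportional to power draw, which scales with voltage squared.
delta_t_stock = 30.0        # C above ambient at stock (example from this thread)
v_stock, v_oc = 1.10, 1.25

delta_t_oc = delta_t_stock * (v_oc / v_stock) ** 2
print(f"extra load temp: ~{delta_t_oc - delta_t_stock:.1f} C")
# prints: extra load temp: ~8.7 C
```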
again, these are very rough equations, but i found they are pretty close to real-world data. even if your cpu uses fewer watts than its TDP (and it actually does use less), they are correct to within a certain margin of error, because the difference between actual power draw and TDP ends up on both sides of the equation.
I experimented with them on my system to "guess" the core temp at a certain speed/voltage, and found they were within +/- 1C of the real values.
if something is unclear in my post, don't shoot me... i tried my best with my english lol
edit: Cronos, you were 3min late lol
Okay, so do I understand you right that there should be a measurable difference when I raise the Vcore from 1.1V to 1.2V?
Note that the CPU is working at its limit when I measure the wattage.
All eight cores are at full load with QMC workunits, and on my other systems I can see a difference even when I change the voltage by only 0.05V.