Gah XS timed out and lost my huge post. I'm not going to write it all again.
Basically, as you increase DDR frequency, MCH load increases and the voltage jitter margins shrink at the same time. So say you increase Vddr to 2.4v to get 1200MHz, while using 1.55v Vmch to run 475MHz FSB at PL6 and 1.40v Vcc to run 4.2GHz. You might push the Vreg electronics to the point where 95% of the circuit operates correctly at, say, 90C, but the other 5% can't guarantee the same consistent operational behaviour as the rest. That shows up as instability, inconsistency and errors.
Let's say 100% of the circuit meets operational spec at 85C while delivering 90% of its maximum output, and your circuit is running below 85C at 75% output load, so everything is fine. But once the output load climbs past 75%, Vreg temps in some areas begin to shoot up to 95C, and it just happens those areas contain the, say, 5% of components which fail to meet spec at 95C and >75% output, while the other 95% show no sign of faltering. The healthy components can over-compensate and claw back maybe 1-2% of the loss, but that still leaves ~3% which cannot be compensated even with the rest working at their limits.
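To make the scenario above concrete, here's a toy sketch of that threshold behaviour. All the numbers (85C/90% rated envelope, 75% load knee, 95C hotspots, 5% marginal components, 1-2% compensation) are just the illustrative figures from this post, not real VRM specs, and the hotspot curve is a made-up step function:

```python
def in_spec(temp_c, load_pct, rated_temp=85.0, rated_load=90.0):
    """A component only guarantees correct behaviour inside its rated envelope."""
    return temp_c <= rated_temp and load_pct <= rated_load

def hotspot_temp(load_pct, base_temp=80.0):
    """Assumed hotspot curve: local temps jump once load passes 75% (toy model)."""
    return base_temp if load_pct <= 75.0 else 95.0

def effective_output(load_pct, frac_marginal=0.05, max_compensation=0.02):
    """Fraction of requested output actually delivered.

    frac_marginal: the ~5% of components that drop out of spec at 95C.
    max_compensation: how much the healthy components can over-deliver (~2%).
    """
    t = hotspot_temp(load_pct)
    if in_spec(t, load_pct):
        return 1.0
    shortfall = frac_marginal                      # 5% stop meeting spec
    recovered = min(shortfall, max_compensation)   # rest claw back 1-2%
    return 1.0 - (shortfall - recovered)           # ~3% uncompensated deficit

print(effective_output(70.0))  # inside the envelope: full output
print(effective_output(80.0))  # past the 75% knee: ~3% shortfall, i.e. instability
```

The point of the sketch is that nothing degrades gradually: output is perfect right up to the load knee, then a fixed deficit appears all at once, which is why the failure looks like sudden random instability rather than a slow performance drop.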
Nothing is perfect, and sometimes you need to sacrifice voltages to guarantee stability for the rest. Vddr and then Vmch are usually the first ones that need slight reductions; if that means you lose some headroom and performance, you have no choice, as the gains from PL/DDR frequency are much smaller than those from FSB/CPU frequency. More importantly, the Vtt/Vcc filtering circuitry is designed to be much more robust, and broad enough to cope with scenarios that may occur in only 0.0001% of operation but are bad enough to throw a stable system into a BSOD or corruption. It's more likely that CPU voltage will be increased rather than DDR, and the GTL+ bus design is more sensitive to this kind of random occurrence than the DDR bus is.




