OK, I've now finalised my testing. I'm not particularly pleased with the results, but here goes:
Maximum overclocks (stock voltage --> v-modded voltage):
Core: 713 --> 771 MHz (+8.13%)
Shader: 1458 --> 1566 MHz (+7.41%)
Memory: 1377 MHz (no v-mem mod, so no change)
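For anyone checking the numbers, a quick Python sanity check of those percentages (clock values copied from the list above):

def gain(stock, modded):
    # percentage increase from stock-voltage max to v-modded max
    return (modded - stock) / stock * 100

print(f"core:   +{gain(713, 771):.2f}%")    # +8.13%
print(f"shader: +{gain(1458, 1566):.2f}%")  # +7.41%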
I couldn't get the shader 100% stable at 1620 no matter what. 1566 is stable at 1.25 V, and I went all the way up to 1.42 V trying to get 1620 stable, still no go. In fact, the more I increased the voltage, the earlier the tearing/artifacting seemed to occur (load temps in the mid-to-high 60s). Another note: I never hit any core-overclock instability; since I was setting the clocks with RivaTuner, the linked clock domains wouldn't let me set the core any higher given the limited shader headroom. If flashing the BIOS gets around the core/shader 1/2 limitation, I'll give it a shot.
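To make that limitation concrete: with linked clock domains, the shader ceiling indirectly caps the core. A minimal sketch, assuming a strict 1:2 core:shader link (the exact divider RivaTuner/the BIOS enforces is my guess, and PLL granularity can pull the real cap a bit lower, which would fit the 771 MHz I ended up at):

SHADER_MAX = 1566               # highest shader clock I could stabilise (MHz)
core_ceiling = SHADER_MAX // 2  # strict 1:2 link -> 783 MHz
print(f"core capped around {core_ceiling} MHz by the shader clock")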
Yields look like a real problem for the GT200 chips. G92s can do 2.0 GHz shader at 1.1-1.15 V, while I need 1.25 V just to stabilise this one at 1.56 GHz. It has far more SPs, but that works out to no more than ~40% more shader performance in practice because of the clock deficit.
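Back-of-the-envelope check on that figure. Theoretical shader throughput scales with SP count times shader clock; the SP counts below are assumptions (240 for a GTX 280/275-class GT200, 216 for a 216-SP GTX 260) against a 128-SP G92, so adjust for the actual cards:

for sps in (216, 240):                   # plausible GT200 SP counts
    ratio = (sps * 1.566) / (128 * 2.0)  # vs a 128-SP G92 at 2.0 GHz shader
    print(f"{sps} SPs: +{(ratio - 1) * 100:.0f}% theoretical throughput")

That prints +32% and +47%, bracketing the ~40% I see in practice: the raw 69-88% SP advantage gets mostly eaten by the clock deficit.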
** update **
It seems that at 1.30 V the card artifacts once the core temperature exceeds 75 degrees Celsius (this is not the case at stock voltage). In fact, I wonder where the temperature sensor is located. If the sensor doesn't read the real die temperature, then I have to assume that for a given temperature reading, the actual die temperature is a lot higher when higher voltages are used. Maybe 1620 MHz was unattainable because of cooling issues after all. I'm having a hard time controlling the temperature of this thing with the weather here at 35-45 degrees Celsius; I'll only know in the winter how much of a bottleneck temperature really is.
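A rough illustration of that worry: dynamic power scales roughly with f*V^2, so if the sensor sits some distance from the hotspot, the gap between the reading and the real die temperature should grow with power. Everything in this sketch is an assumption for illustration only: the 1.15 V stock voltage, the 10-degree baseline offset, and the linear offset-vs-power model:

def hotspot_estimate(sensor_c, f_mhz, volts,
                     f0=1458.0, v0=1.15, base_offset_c=10.0):
    # dynamic power ratio vs stock, assuming P ~ f * V^2
    power_ratio = (f_mhz / f0) * (volts / v0) ** 2
    # hotspot-to-sensor offset assumed proportional to power
    return sensor_c + base_offset_c * power_ratio

print(f"{hotspot_estimate(75, 1458, 1.15):.0f} C")  # stock: ~85 C die at a 75 C reading
print(f"{hotspot_estimate(75, 1566, 1.30):.0f} C")  # 1.30 V: ~89 C at the same reading

On that model the same 75-degree reading hides a die that is several degrees hotter once the voltage goes up, which would fit the earlier artifacting.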