Thanks for the help, I got it to work by running /wi3,70,17,3C :)
justageek95, how did you get it to work?
I read the article and tried the things you have tried, but nothing happened.
DISCLAIMER: I TAKE NO RESPONSIBILITY FOR ANY DAMAGE THAT MAY HAPPEN IF YOU DO THIS!
1. Find the I2C bus of your card by running the following CLI commands:
RivaTuner.exe /ri0,70,1A
RivaTuner.exe /ri1,70,1A
RivaTuner.exe /ri2,70,1A
RivaTuner.exe /ri3,70,1A
Three of these four commands will return "invalid"; take note of the one that doesn't (for me it was /ri3,70,1A)
This will find the I2C bus (highlighted in red in my example)
2. Get your voltage register values
Using the I2C bus number found above, (0-3, highlighted in red) run the following CLI commands, replacing "#" with the I2C bus number.
RivaTuner.exe /ri#,70,15
RivaTuner.exe /ri#,70,16
RivaTuner.exe /ri#,70,17
RivaTuner.exe /ri#,70,18
Take note of the return value for each.
3. Convert voltage register values to actual voltage
For each of the values returned in step 2 do the following:
A. Convert the value to decimal format (the returned values are in hexadecimal)
B. Calculate the actual voltage with the formula: voltage = (VID * 0.0125) + 0.45
C. Compare the 4 resulting actual voltages to the voltage reported in 3D mode in RivaTuner hardware monitoring.
D. The closest value should be your 3D voltage (ex: for me RivaTuner showed 1.13v, I got 1.125v)
E. Take note of the register that is associated with that value. (highlighted in red in step 2)
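If you'd rather not do the conversion by hand, here is a minimal Python sketch of steps 3A-3D (the register readings and the 1.13v monitored value are just the example numbers from this guide; substitute your own):

# Convert the four VRM register readings (hex) to voltages and find
# the one closest to the 3D voltage reported by RivaTuner monitoring.
readings = {0x15: "3B", 0x16: "31", 0x17: "36", 0x18: "2F"}  # your /ri#,70,15..18 results
reported_3d = 1.13  # voltage shown by RivaTuner hardware monitoring in 3D mode

def to_voltage(hex_value):
    vid = int(hex_value, 16)          # step 3A: hex -> decimal
    return vid * 0.0125 + 0.45        # step 3B: VID formula

for reg, value in readings.items():
    print(f"register {reg:02X}: {value} -> {to_voltage(value):.4f}v")

# Steps 3C/3D: the register with the closest voltage is your 3D register
best = min(readings, key=lambda r: abs(to_voltage(readings[r]) - reported_3d))
print(f"3D voltage register is most likely {best:02X}")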
4. Calculating the voltage to use
A. Decide what voltage you want to set.
B. Find the VID for that voltage using the formula VID = (voltage - 0.450) / 0.0125
C. Convert the VID to hexadecimal
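The same arithmetic in Python, as a quick sanity check (1.2v is just an example target):

target = 1.2                            # step 4A: desired voltage
vid = round((target - 0.45) / 0.0125)   # step 4B: VID formula
print(f"VID = {vid} = {vid:02X} hex")   # step 4C: prints "VID = 60 = 3C hex"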
5. Setting a new voltage
You can set the voltage by writing the new VID in hexadecimal form to the register.
A. Run the CLI command: (replace # with the I2C bus number, and VID with the VID in hexadecimal form)
RivaTuner.exe /wi#,70,17,VID
The new voltage should now be set!
http://img530.imageshack.us/img530/8606/vmodhr3.th.jpg
Example: GTX 260, desired voltage = 1.2v
1. All the commands return "invalid" except RivaTuner.exe /ri3,70,1A which returns "0A"
2. I get the following values:
RivaTuner.exe /ri3,70,15 returns 3B
RivaTuner.exe /ri3,70,16 returns 31
RivaTuner.exe /ri3,70,17 returns 36
RivaTuner.exe /ri3,70,18 returns 2F
3. Calculating the voltages of each:
Hex    Decimal    Voltage
3B     59         1.1875v
31     49         1.0625v
36     54         1.1250v
2F     47         1.0375v
Rivatuner was reporting 1.13v in 3D mode so the third one is my 3D voltage register.
4. I wanted 1.2v so:
VID = (1.2 - 0.45) / 0.0125 = 60
60 = 3C in hexadecimal
5. I set the new voltage by running:
RivaTuner.exe" /wi3,70,17,3C
TX man!
works on my gtx 280
http://www.easycalculation.com/decimal-converter.php
http://www.maxi-pedia.com/hex+to+decimal+converter
hex to dec and dec to hex converters
edit: does anyone know how to get it running at rivatuner/pc startup?
It's working on my GTX280 too, but I have a question: the given CLI command only changes the VID for GPU0. How do I change the VID for GPU1, since I'm using SLI?
I tried checking the diagnostic report. The only difference between the two GPU is one is on
$0000000003 Location : bus 2, device 0, function 0
and the other one is on
$0000000003 Location : bus 3, device 0, function 0
My guess is that i need to change the deviceid, am I correct?
You should apply the changes to both GPU VRMs. The CLI works with the GPU selected in the main tab of RivaTuner. There is an /sd<device_index> or /selectdevice<device_index> command allowing you to change the device selection via the CLI. <device_index> is a 0-based logical display device index, i.e. if there are 4 virtual display devices (2 heads representing independent displays for the primary GPU and 2 more heads for the secondary) then you should select the first (0) and then the third (2) logical device when applying the changes. In this example, the command line for probing the VRMs of both GPUs can look like:
RivaTuner.exe /sd0 /ri3,70,1a /sd2 /ri3,70,1a
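And by extension (untested here; adjust the bus, register and VID to what you found on your own cards), writing the same 3D VID to both GPUs should look something like:
RivaTuner.exe /sd0 /wi3,70,17,3C /sd2 /wi3,70,17,3C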
I have little experience writing scripts; can someone help write up a script to adjust the vgpu (just for one VR) at Windows startup?
I've tried that, running cmd.exe and doing what you say, but it always says it's not recognized as an internal or external command.
RivaTuner's task scheduler module was specially designed for such tasks. Go to the <Scheduler> tab, click the <Add new task> button and type in a task name, e.g. "Voltage mod", select the <Launch application on specified schedule> task type, type in the %RT% macro as the path (RivaTuner will expand it into a fully qualified path to itself when executing) and the desired I2C write commands in the command line field, then select the desired schedule type, e.g. <Run task at RivaTuner startup>.
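So for the GTX 260 example earlier in the thread, the task fields would look something like this (the /wi switch values come from that example and are not universal):
Task name: Voltage mod
Task type: Launch application on specified schedule
Path: %RT%
Command line: /wi3,70,17,3C
Schedule: Run task at RivaTuner startup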
Holy smokes, here is the answer to the ultimate VGPU BIOS mod everyone's looking for.
Does this work on G92 cards?
Well I see where you're coming from. But there are a bunch of things you could do to make sure you got the message across. For example, the first few times you go to the voltage adjustment page, it could pop up a very explicit warning of the danger and potential damage that may result, and this message could be programmed to display for 3 minutes, with no way to bypass it (the first few times at least).
Surely, that would alert any potential user to the damage they may cause, no?
And about asking: OK, is it possible to adjust the memory voltage, and if so, how? Any other major voltages that could be modified? :)
That depends on the PWM controller used for the memory.
If I recall correctly the memory is "powered" by another PWM controller on the VGA, so you have to check which controller is being used on your card (I'm not 100% sure, but I think I've seen some GTX 280s with different PWM controllers for the memory) and whether it supports VID setting via the I2C bus.
Very interesting :)
Now to blow up my 280 :D
Actually, no.
A soldered mod WILL be handicapped by OVP because there is no OVP mod; we don't even know how the VT1165 does OVP, as the datasheet is under NDA. A software VID change (via RivaTuner, e.g.) just "bypasses OVP".
Reference G92 boards use the Primarion PX3544, of which Primarion says the following:
->Quote:
The PX354x incorporates an industry standard I2C-bus serial interface for control and monitoring. Through the serial interface, the power supply designer can quickly optimize designs and monitor system performance. The interface allows the PX354x to provide digitized information for real time system monitoring and control.
http://www.primarion.com/PrimarionMa...wer_PX3540.htm
:yepp:
Great job, guys! :up:
Maybe some mod should split up this thread in two and put the last part in the modding section.
...Unfortunately my GTX 285 has another brand/type of controller on board (obviously the cause for delays with those 55nm cards):
http://img528.imageshack.us/img528/1...x285mt6.th.jpg
Here's the data sheet for the Intersil chip on your GTX 285: http://www.intersil.com/data/fn/FN9276.pdf
Maybe different companies use different chips on these too.
Unless OVP/OCP kicks in early I'd still prefer to hard mod.
*EDIT - OVP or OCP apparently kicks in at 1.30v, so hard mod will be next to useless without OVP/OCP mod also.
Hi!
Thanks Justageek, very good info! :clap:
Check the guide, part 3B. The correct formula for voltage should be:
Voltage = (VID * 0.0125) + 0.45
I'm trying this with my HD4870. Just to check if it applies.
rivatuner.exe /ri1,70,1a returned I2C 01 70 1a : 0a, the other three were "invalid".
The problem is that every register (15,16,17,18) returns 41. Which in fact is my actual voltage.
v = (65 * 0.0125) + 0.45
v = 1.2625 :yepp:
And I think it is correct, because I edited the BIOS so there is no energy-saving mode; it's always at full speed.
But the other registers don't return a different value until I get to... 45, for example, which returns 5E... and that would set 1.625V.
Question is, which are the valid registers? Can I set whatever register I want that returns a valid voltage value?
Sorry for my English!
You really think so?
#1 that is a small increase in voltage
#2 you could say the same thing about motherboards that offer voltage that could potentially kill hardware (see: all enthusiast mobo's)
I think people that tweak will use this and people that don't won't even know they have the feature. Just as it has always been with pc hardware.
I put a thread here in Xtreme BIOS, but I think it needs to be moved to Xtreme Mods:
http://www.xtremesystems.org/forums/...d.php?t=215521
I just tried it on my PX3544-chipped G92 GTS but I can't seem to find an I2C bus (0-99) that doesn't slap me with "invalid".
It only works on cards with Volterra VT11xx digital voltage regulators: NV GTX 260 65nm, GTX 280 65nm, ATI HD 4870, 4850(?) and a few other cards.
It doesn't work on any G92 cards from what I can recall, because none of them were built with Volterra regulators.
If you aren't even sure what voltage regulator is on your graphics card then you probably shouldn't be playing with I2C bus commands, unless you know what you are doing.
If you do, make sure it's a 65nm chip before playing with voltage. 55nm chips got budgeted and slapped with a cheaper 4-phase VR. Cutting costs on the shrink, you know how it is ;(
G200 chips don't respond all that well to voltage anyhow; well, they respond with heat, not so much groundbreaking performance. Even with increased voltage to around 1.25v they still become shader limited early. Most cards won't do more than 1674MHz shader and 780MHz core, and even then keeping the card cool is a feat in itself. If you can keep the temps near zero you might have more luck with lots of voltage ;)
well if it has writable registers for selecting voltage and you can figure out which ones they are then awesome ;)
Not all voltage regulators support selecting voltage through register writes. You need to check the datasheet for the VR and hope it has some kind of useful register descriptions!
But the guide posted above on writing only applies to Volterra VT11xx registers; by this I mean the register address mapping, i.e. 0x1A.
Just wanted to add that the 285s don't have the same Volterra IC as the 280/260. They have an Intersil ISL6327, and here is the data sheet........
http://www.intersil.com/data/fn/FN9276.pdf
Don't know what to look for, so not sure if it supports software voltage control.
http://pctuning.tyden.cz/ilustrace3/...9/voltage1.PNG
Working! GREAT!
1.6V... kind of a big jump in voltage if normal is 1.2V. 33.3%
what were your clocks at 1.2V vs 1.6V?
this really made my day! now we need a thread with Vcore/OCgain chart or something.
Well, you can have my Vcore/OC-gain first. It's a big fat zero. Like some of the knowledgeable folks on this forum have said, the temp matters more than the Vcore on this architecture.
Anticipating an increase in Vcore, I moved my whole system out of my ghetto case for better airflow/temps. And the Vcore did nothing, zilch, nada. The default Vcore on my card is 1.19v. The cards are GTX280 Zotac AMP and MSI Super OC. Both are already highly OC'd at 702/1402/1150. I even went to 1.35v but that did nothing. Tried everything in between too. The same, nothing. But the temp improvement from the open-air system did improve the OC a little, from 712/1456/1332 to 720/1512/1332 at default Vcore.
Just ran FurMark and my VRMs hit 95C at 80% fan speed! Is that normal? I don't think that bumping up my voltage will do much for me if my VRMs are running hot as is.
I ran 740/1566/2600 on 1.21v just then, only started artifacting in Furmark at 70% and its pretty damn warm here this morning in AU. I'll hit up some air conditioned testing later on this morning.
I can normally run 740/1512/2700 no problems at 1.19v, and on a cold day 730/1566/2700.
I'm running on chilled water so the temps are manageable. BTW, Unwinder's advice on how to get it to work at RivaTuner startup worked like a charm. P.S. guys, for how long do you run ATITool artifact scanning (or is there a better utility?) before you know the OC is good? Is 10 minutes enough? Quote:
RivaTuner's task scheduler module was specially designed for such tasks. Go to the <Scheduler> tab, click the <Add new task> button and type in a task name, e.g. "Voltage mod", select the <Launch application on specified schedule> task type, type in the %RT% macro as the path (RivaTuner will expand it into a fully qualified path to itself when executing) and the desired I2C write commands in the command line field, then select the desired schedule type, e.g. <Run task at RivaTuner startup>.
I'm still testing: 1.3V 758 core, 33C gpu.
next i'll try 780, then 800
edit: 758 stable for 10 min, 783 stable for 10 min, 810 artifacts from the beginning. Should I try more voltage, or is over 1.3 risky?
edit2: tried 810 with 1.3, 1.33, 1.38, still artifacts...
so 783/1566/1152 for the moment.
Anyone think unlinking the shaders might help?
edit3: I've been looking at the ATITool fur cube artifact scanner for an hour now, looking for artifacts, and it is messing with my head; will continue tomorrow.
over 1.3 is useless. You have reached the chip's limits, it seems, mate. I can't remember how high k1ngp1n managed to clock a 280 on liquid nitrogen, but even with that it wouldn't have held out for continuous use.
what shader have you hit? That's the critical factor. 1674 is the highest most can push.
I want a 24/7 OC so I might as well stick to 783. Should I clock shaders higher? Any benefit in that, or is it better to keep them linked?
um depends what shader clock you are using linked heh :D I never link shaders so i'm not familiar with what shader you are using at 783!
shaders are double the core in linked mode, so 1566
oh then hell yes ;) Try 1620, and if that is OK, go for the big daddy 1674; hell, if by some fluke that passes too, jump to 1728 :D Lower the core a little while testing 1620 and 1674. Shader clocking on the 280 is more beneficial than core; core doesn't give much improvement in Vantage and other benchmarks, while shader makes more of a difference from one step to the next. If you don't go too crazy on core clock, you may be alright for stability. 760-770 would be as far as I'd go on core; get the shader as high as you can.
finally I can't resist kicking in:
verified this on my GTX295 and it works as expected! The default voltage is 1.04V. Guess NVIDIA reduced the voltage here to keep the cards a bit cooler, so increasing vcore here might help a bit more. I have not yet tested overclocking the card, just verified that this works here, and it does.
cooling is an issue though, as I have already reprogrammed the fan controller and temps are hitting 84C during hours of Crysis Wars. Fan duty at about 70%, still some headroom here!
wow 1.04v is pretty low even for a 295! expected at least 1.1-1.11v not bad at all for a shrink. you'll have stability issues with too much shader freq at temps like that but give it a try anyway interesting to see results none the less!
is it possible to change the voltage on GTX260 55nm Cards?
1.04 is idle voltage, I think.
Also, to people who say increasing voltage does not help:
I tried 810, 1620 shader, and with 1.3 got lots of artifacts, 1.4 much less, 1.41 almost gone, 1.42 stable for 5 min.
Also, unlinking the shaders got me artifacts...
edit:
I tried again to unlink shaders, and this time I left them at my stable 1566 and got to 810 core (at 1.3V!) with no artifacts for 5 minutes. When I tried 810 and 1620 shaders I got artifacts. Apparently my GPU is shader limited.
obviously not on the 295! It stays 1.04 even when it switches to 3D mode (clocks increase from 300MHz up to 576MHz)!
As I have not yet found the vcore measuring points, I'm relying on the readings I get from software. I will verify this with a voltmeter later, as soon as I know where the measuring points are.
It's probably almost useless to be turning up the voltage if you're using air cooling, as the cards get pretty hot at stock voltage. Any more voltage is just going to create more heat.
Anyone burn out their cards yet? lol
atitool and furmark
Has anyone that has tried this out noticed higher VRM temps? I'm wondering how much I should give my chip (if any). I would like to break a 750MHz core. I can run 725 stable now.
I hope they get this working for the GTX 285.
I was trying to decide between tri-SLI 285s or quad-SLI 295s and ended up going for the 285 (one for now, two more later on).
But this tool looks like it could push the 2 295s to the top over the three 285s unless the 285s can use this tool. For the lower price, the 295 quad-SLI becomes much more attractive with this tool.
Just curious, what was your stable OC at stock 1.2V?
No amount of OC is going to give the GTX 295 Quad SLI setup 1GB of framebuffer - that is always going to limit it at high resolutions.
GTX 285 Tri-SLI is going to be much better overall than GTX 295 Quad SLI for 2560x1600 gamers. Certainly with AA enabled.
Nope, unless you got the Volterra-Controller. But I highly doubt that. Have a look at the end of your card and check what's written on the IC. My 285 has an Intersil IC:
http://img528.imageshack.us/img528/1...x285mt6.th.jpg
http://www.xtremesystems.org/forums/...&postcount=106
working great for both my cards, haven't tried in SLI though. using this method to under-volt the crunchers and it has worked beautifully.
Doh, missed the post.
But now I'm running into a very strange issue. I can successfully change the 2D voltages for both cards. But no matter what I do, I can't change the 3D voltages. I have tried writing to all 4 I2C bus numbers and still nothing.
EDIT: Nvr mind, figured it out
It won't be the I2C bus numbers; the registers are for idle and load. Of 15, 16, 17, 18, one will be idle and one will be load. For each card in the one rig with two cards crunching, they each had different registers for idle and load. Perhaps that's what you meant, but hope that helps.
there should be 2 different I2C bus addresses for each card's VR if I'm not wrong. Unwinder is the man to talk to about this; he could do it with his eyes closed, standing on one hand, while drinking a beer ;)
nope, the I2C bus can be the same, you just have to specify the different devices. For instance, this is the line in my undervolt config for two GTX 260s not in SLI for crunching...
/sd0 /wi3,70,16,2d /sd2 /wi3,70,18,2d
device 1 is /sd0, the other is /sd2... both on the fourth I2C bus, one using register 16 for load, the other card using register 18 for load voltage.
hmmm. I wonder, can I use all this for my 8800 GTX? Must see if it has Volterra ICs. Regardless, I ran
RivaTuner.exe /ri0,70,1A
RivaTuner.exe /ri1,70,1A
RivaTuner.exe /ri2,70,1A
RivaTuner.exe /ri3,70,1A.
All invalid, for kicks I ran RivaTuner.exe /ri4,70,1A. Screen went white. :D Restart and it was fine.... *sigh*
Rivatuner voltage adjustment works perfectly with my reference HD4870 512MB. Both overvolting and undervolting is fully functional. Awesome.
I got it working, but when both GPUs are set at 100% fan and 1.2v, I get hard locks when overclocking to speeds that are known good at stock volts. The strange thing is that if I only raise the voltage of one GPU at a time, I'm fine. I'm wondering if I'm starting to run into power issues - I'm using a Zippy 700w with only 45A on the 12v line, and have 2 GTX 260 Core 216s, an E6850 @ 3.8GHz 1.45v, 4GB RAM, 3 hard drives, 9 fans, etc. The 12v line measured with a multimeter is 12.09 idle and 12.08 under load, but still I'm starting to wonder.
I guess it doesn't matter since I have 2x GTX 285's coming soon via step up, but if I'm having power issues with the GTX 260's, I'm certainly going to have them with the 285's. The psu is running at basically ambient temperature due to a dual fan mod on it, but still...
I figured being a zippy I would be okay, but I guess not.
My gosh:shocked:, are you serious? How much does it help you overclock your 4870 over stock volts overclocking? How much are you able to overvolt it?
I'm so tempted to buy it right now just because of this... (at least, if it can reach 850-900 MHz without hardware v-modding) Is there another thread on this RT voltage adjustment tweak?
I've just been messing around with the under/overvolting. Down to 1.0v and up to 1.4v, so far, with confirmed lower and higher power draw from the wall. Don't have a DMM. Still seeing what the limits are though.
Haven't done any overclocking with the new voltages yet.
So any news when we can get our hands on the EVGA Beta or Official version of this program. I'm not a fan of Riva Tuner and I have a EVGA card anyways.
N.A.S.A....The wheel has just been re-invented
yes, Uni's (;)) RT program has had this feature for about a decade for some ATI and NV cards :D
now guys, type with your head, not over it.
when I use this method I like to melt solder at the same time, just for the ambiance of a hard mod
They said soon, so no release date, but knowing EVGA, when they say soon they get it done. Maybe a month? Don't know, just guessing. It looks like they have a beta going there because they already tested it on a card. That's why I'm guessing a month or less. I really have no idea though. Just a guess.
Well, I have changed the voltage on my card, but now the card always runs in 2D performance mode :p:
Mine can do 850 out of the box at 1.263v, but it seems more voltage does nothing... Tried up to 1.45v; a bit more and I get a monitor signal out of range. I can't confirm with a DMM, but it draws more power from my GB power supply too. So no 850+MHz love for me, but at least this sucker now draws the same power as my old 3870 at idle (with 0.90v and 500/500; memory clock is key here), which is absolutely nice given its horrible stock idle consumption. I've made a simple .cmd file that loads automatically at startup and it's working nicely. Also be aware: if you get a driver restart, the new values will return to default.
Thanks for all that contributed to this thread and especially to Unwinder for giving us RT to play with. :clap:
Working great with a GTX295 here (just raised it from 1.04 to 1.1 on both to test. I notice that GPU1's VRMs get a whopping 10° hotter than GPU0's; is that normal? Bad contact maybe?).
Out of interest, what are standard volts for gtx285 type cards?
1.15V iirc.
Thank you very much, so it seems I'm still in the safe zone voltage-wise. :)
I have problems getting the second gpu perform as good as the first one. ATM I'm just testing the shaders (with which gpu1 has problems from 1441 on, while gpu0 is still stable at 1476) for folding stability. It needs few hours of torture to be sure, so I'll write tomorrow in the morning if that bump did something. I'm returning this card anyway and go for an xfx one so I don't really care, but it's nice to know that I _could_ stabilize it that way (within safe limits of course).
I did the mod to 1.2v on my 260 and I didn't get much of a boost at all; 3DMark locked up at 750 core. Shaders holding me back of course. Dunno what to do to get that OC better ;(
Tried the first 4 CLI commands, ri0,70,1A through ri3,70,1A, and they all came up as invalid. Using a G92 8800GS. Any ideas?
I'm doing this on my 4870 right now too. These are the ones I got on a stock 4870.
ri1,70,1A = Valid
ri1,70,15 = 41 = idle
ri1,70,16 = 41 = ?
ri1,70,17 = 41 = 3d load
ri1,70,18 = 41 = ?
Still on a stock cooler on the thing though, so I haven't dared to take it too far up in the volts. Idle works great at 0.9v though, power usage way down.
The only thing I could report tomorrow before I refund her is whether the raise in voltage helped the shaders to at least stay stable at 1440. But once I've got the XFX, I definitely will report what that one can do. What I can say now is that so far it helped, because there's no error yet. With the little bump of 0.06v it's now drawing a whopping 50 watts more from the wall. :p:
Oh, and temps with this are no problem even on air; with the same fan setting it raised the card's temp by just 10°C, which is still manageable.
Alright, after a bit of testing. I bumped the volts up to 1.3 on my 4870, hits 840/1100 stable just fine and lives for 15 minutes in furmark. Problem being, it gets a VPU crash within seconds when I try to play FEAR. 820/1050 is the max stable for that. This is at the same 1.3v. The temps are quite a bit lower than the furmark benching too. At stock volts both furmark and FEAR fail at the same settings, so I don't know what's up. Any ideas?
Some games just don't like high OC's. Did you try lowering the OC to see if it cleared up?
Yeah, at 820/1050 it cleared right up. Any higher and it would induce an instant crash. I'll wait till my HR-03 GT gets here and then see how much higher I can take it, voltage-wise and OC-wise. For now I'll go back to stock volts and my lower OC, but still take advantage of being able to set lower volts for my idle clock. 0.9v is nice for idle, lol. :p:
I'll probably take my 8800 out too, just for good measure. Nothing I can really use physx in these days anyway.
newls1, there was no error through the night with that problematic second GPU at 1476 (I know that's not much, but FAH is pretty demanding), and I have a feeling that there's more in it even with that little voltage raise. I might say that at least my current 295 seems to benefit from extra volts. Maybe some watercooling would help her out some more.
Can you imagine the power requirements that it would take to run two volt-modded 4870X2's. I can barely run my two 4870 X2's as is.
A few tips and tricks:
1) Once you've determined the index of the I2C bus containing the VRM on some display adapter (e.g. I2C bus 3 on GTX 200 series), the same index can be safely used on the same display adapter model by others. Display adapters have a few I2C buses assigned to different functions (e.g. for connecting DDC and for external devices like VRMs, thermal sensors, fan PWM controllers and so on); the VRM's I2C bus is defined by the PCB design, so it is fixed for the same display adapter family.
2) Don't try to scan more I2C buses than the GPU actually has (there was a post attempting to scan buses 0-99 in hope of finding the VRM on G92). Each GPU architecture supports a fixed number of I2C buses, e.g. G80 and newer GPUs have only 4 I2C buses, pre-G80 supports 3 buses, pre-GF4 supports just 2 buses and so on.
3) I see that many users have started to enable the VT1103 plugin now. Please pay attention to the following info from RivaTuner's release notes and always remember it when using this plugin:
"Please take a note that Volterra voltage regulators are rather sensitive to frequent polling and may return false data under heavy load, so it is not recommended to use VRM monitoring in daily monitoring sessions"
4) There were some questions about finalizing these new VRM settings in the NVIDIA VGA BIOS. You cannot use Nibitor for that because the tool knows nothing about VRMs and works with BIOS voltage tables only; it only allows you to change associations between performance levels (i.e. 2D/3D modes) and the 4 fixed voltages stored in VRM registers 15-18 by default. However, you can easily edit your BIOS with any hex editor to reconfigure the initialization scripts writing these 4 fixed voltages to the VRM during POST. It is a rather simple task; taking my 65nm EVGA GeForce GTX 260 as an example, the following script command in the VGA BIOS configures the VT1165:
4D 80 E0 06 15 3B 16 31 17 36 18 2F 1D 55 19 01
The command uniquely identifies an I2C serial byte write operation, encodes the target I2C device address (E0 is the 8-bit encoding of the VT1165's 7-bit address 70, including the read/write flag in the first bit), tells the script processor how many bytes have to be written (06) and finally defines the register addresses and the data to be written to each register (register 15 -> 3B, register 16 -> 31 and so on).
The voltages can be different for different VGA BIOS images, so the easiest way to locate this command in any GTX 200 BIOS image is to search for the 4D 80 E0 byte chain.
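For those more comfortable scripting than hex editing, here is a minimal Python sketch of the search Unwinder describes. It only reads and prints; interpreting the count byte (06) as the number of register/data pairs is my own reading of the example above, and flashing a modified image is of course entirely at your own risk:

# Locate the VT1165 init command in a GTX 200 BIOS dump and list its
# register/data pairs. "bios.rom" is a hypothetical dump filename.
data = open("bios.rom", "rb").read()

pos = data.find(bytes.fromhex("4D80E0"))   # I2C write command + device address E0
if pos < 0:
    raise SystemExit("init command not found in this image")

count = data[pos + 3]                      # assumed: number of register/data pairs
for i in range(count):
    reg = data[pos + 4 + i * 2]
    val = data[pos + 5 + i * 2]
    # Only registers 15-18 are VIDs, so only translate those to voltages
    note = f" ({val * 0.0125 + 0.45:.4f}v)" if 0x15 <= reg <= 0x18 else ""
    print(f"register {reg:02X} -> {val:02X}{note}")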