It's all in the readme.
Great utility! thanks for posting.
For people using the GTX 295: you will need to use device /sd0 for the first GPU and /sd2 for the second. For this tool, that means choosing the SLI/CF mode option rather than multi-GPU (since multi-GPU assumes /sd0 and /sd1).
This can of course be confirmed using RivaTuner's monitoring graph.
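The device mapping described above could be sketched as follows. This is only an illustration of the post's point, not the tool's actual API; the /sdN indices come from the post, and the function name is made up:

```python
# Sketch of the device-index mapping described in the post.
# A GTX 295 enumerates its two GPUs as /sd0 and /sd2, so the
# multi-GPU option (which assumes /sd0 and /sd1) picks the wrong device.
MODE_DEVICES = {
    "multi-gpu": ["/sd0", "/sd1"],  # what the multi-GPU option assumes
    "sli/cf":    ["/sd0", "/sd2"],  # what a GTX 295 actually uses
}

def devices_for(card: str) -> list[str]:
    """Pick the mode whose device list matches the card's enumeration."""
    mode = "sli/cf" if card == "GTX 295" else "multi-gpu"
    return MODE_DEVICES[mode]

print(devices_for("GTX 295"))  # ['/sd0', '/sd2']
```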
OK, here is probably my last post on this:
I just uploaded a RivaTuner plugin which allows changing voltages from RivaTuner:
see my thread here
Good job, mate. Hope the 285 and 260 (55nm) will get support soon too.
I feel like crap now that I sold my GTX 280 and put in the new GTX 285, which is not supported. :(
It's not supported yet, but it possibly will be later. I doubt EVGA would tell their users there's going to be a version with support for 55nm cards later if they knew they could never program the Intersil!
On the GTX 285 (ISL6327CRZ), NVIDIA most probably connected some GPIO pins to the VID input pins, but this will give you only 3 or 4 selectable voltages in a rather small range. It will definitely not be what you can get with the I2C-based chips.
I just bought a 65nm GTX 260. With this voltmod, it has a slight advantage, IMO:
1. larger process, handles more volts than 55nm
2. higher quality stock HSF and shroud
3. Volterra VRM that is programmable for the softmod
4. $190 new
This thing, even with 192 shaders, might be able to perform somewhere between a GTX 280 and 285 once I get the core > 800MHz.
That makes a lot of sense, Wiz!
Something along those lines would be done by controlling, say, 3 or 4 dedicated GPIO pins, which would assert/deassert the VID pins based on the required voltage-table bit mask to obtain 3 or 4 preset performance-level VIDs? I.e. use different open GPIO pin combinations to get a 4-value bit-mask range (00, 01, 10, 11), each of which has a preset VID entry connecting the VID0-VID7 pins, based on what voltage-table mask is needed to output each performance-level voltage.
I'm not all that knowledgeable about GPIO functionality, so I've just taken a stab at how I think it may be done; feel free to correct me if I've missed the target heh. :D
I've been using RivaTuner to adjust voltage on my GTX 260. Running clocks at 706/1501/1306 on 1.15V (up from 1.13V stock), in COD: WaW I'm getting a solid 60 fps with V-sync enabled, all eye candy on, at 19x12 resolution. Other games behave similarly. I also force triple buffering with RivaTuner. How much more could I get with a GTX 280 or 285?
Just being able to adjust voltage to stabilize clocks helps these 65nm cards render at the same level as the higher cards. I really don't think you'll see all that much of an improvement on the higher-level cards compared to the 260s. With this voltage tuner, EVGA just extended the life of the 65nm 260s.
Finally released:
http://www.evga.com/articles/00462/
Correct. But normally the number of such GPIO-controllable pins is reduced to the required minimum during PCB design (the rest of the VRM's VID pins are hardwired), so normally there is just 1 GPIO pin connected to the VRM if 2 different voltages are needed, 2 pins if up to 4 voltages are needed, etc. So the maximum you can expect is a limited set of fixed voltages. And the maximum one is normally already in use.
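The GPIO-to-VID idea in the posts above can be sketched in a few lines. This is a toy model, not real driver code: the voltage values in the table are made up for illustration, and the only real point is that n wired GPIO pins can select at most 2**n preset voltages:

```python
# Toy model of GPIO-selected VID voltages, as described in the thread:
# n GPIO pins wired to a VRM's VID inputs give at most 2**n presets.
GPIO_PINS = 2  # 2 pins -> up to 4 selectable voltages

# Hypothetical VID table: GPIO bit mask -> preset core voltage (volts).
vid_table = {
    0b00: 1.03,
    0b01: 1.06,
    0b10: 1.12,
    0b11: 1.18,  # the maximum entry is typically already used for 3D clocks
}

def select_voltage(mask: int) -> float:
    """Return the preset voltage for a given GPIO bit mask."""
    if mask >= 2 ** GPIO_PINS:
        raise ValueError("mask exceeds what the wired GPIO pins can encode")
    return vid_table[mask]

print(select_voltage(0b10))  # 1.12
```

This is why a GPIO-based mod tops out at a handful of fixed steps, while an I2C-programmable VRM (like the Volterra on the 65nm cards) exposes a much finer range.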
For some reason or another, I can't log into EVGA. It seems their server or some sh*t is down. If I click "Member Login" I get a "PAGE CANNOT BE DISPLAYED" error every time. Damn it, I've been waiting so long for this app. EVGA, FIX YOUR WEBSITE.
I'm on the site now and it is fine.
Just downloaded it; will be testing it on my GTX 260 216 (65nm), hopefully getting some higher clocks.
Stock voltage at 2D settings is 1.11V according to RT. I also have an EVGA Superclocked card, which accounts for the higher 2D stock voltage. The 3D stock voltage is 1.13V. I don't know how high the GPU core will go for max clocks, but I have noticed that with a slight adjustment, well below max clocks, the performance increase is very noticeable.
Edit: EVGA Voltage Tuner works like a charm. RT is better because it makes you think about what the hell you're actually doing!
Here is 820MHz core, stock cooled, from August '08 at techpowerup:
http://img.techpowerup.org/080801/bios.jpg
@Sailindawg: how high does your 260's core go?
Finally have been able to get 720 core / 1570 shader / 1170 memory stable using only 1.15V on my GTX 295. :up: Awesome job, EVGA!
Wow, that post from TechPowerUp was pretty sweet. And it's stock cooled like mine. Don't know what the max clocks are; I ought to figure it out!
Update: I've been playing around all afternoon. The most stable clocks I was able to get were 774/1548/1306 @ 1.165V, per EVGA Voltage Tuner. The shaders did not like going much beyond 1548 or so. This was Vantage-stable. Higher volts had no impact on stability; they only helped things crash faster. Max GPU temps were 58C with the fan at 96% duty cycle. I ran this using the Windows 7 64-bit beta and 185.20 beta drivers. YMMV with XP 32-bit and a different driver. Still, this 260's got a lot of unlocked power.