comon evga.. release the damn thing already
I've tried to get this working for a long while and it's a no-go for me at all. I followed the instructions, but the voltage control doesn't do anything for me. I hope something simpler will come out, although my card is a BFG, so it's a no-go for me when this comes out anyway. :(
If you cannot do the mod, or use Diverge's slider, try this voltage app by Pederrs:
http://www.xtremesystems.org/forums/...&postcount=220
Many, many pages ago I tried to warn you guys. It can be done, but it's not easy. For those with some experience in coding (which is NOT my forte) it can be troublesome, but it can be done. I used to mess with stuff like this years ago, but I started cutting back because it was consuming all my time, and a lot of it was wasted. With the EVGA vTuner coming, I knew right then to wait it out before all this RT stuff even got rolling. Command-line and C coding is fairly advanced stuff. It can be powerful, but without knowing all the particulars and the differences between cards ahead of time, it could go around in a vicious circle: this thread could keep going into the hundreds of pages, and there would be more problems than successes.
I'm not really speaking directly to you, dan; just speaking in general terms, as your case is not unusual.
Not true, this mod does not take any programming knowledge. It's simpler than day one of a C class, lol. What it takes is patience and the ability to follow directions.
Maybe 2nd week :rofl:
Depends on how you approach it though.
I would personally use a config file that lets you set params like BUS_NUM, BUS_OFFSET, etc., and then read it into the app. Reading configs is a bit more difficult than day one, heh, but yeah, you're right, it's not hard at all.
Edited my BIOS after some heavy testing: reduced the stock 3D voltage from 1.19v to 1.13v (everything still works perfectly) and saw a reduction of about 4-5 amps in current draw, as well as over 7 degrees Celsius lower temps while gaming/benchmarking. Yay!
Mind if I ask what clocks you have set in the BIOS for the Performance 3D clock table?
If you keep below 670 core / 1404 shader, then 1.13v will help you drop temps a bit; you may be able to go slightly lower too.
2D voltage for 200 core, 400 shader only needs around 1.03-1.05v
3D low for 300 core, 600 shader only needs around 1.06-1.08v
3D perf for 602 core, 1296 shader should only need 1.10-1.12v.
My original reference 280 BIOS was using 1.03v, 1.06v, 1.11v, I believe.
All the later EVGA BIOSes (versions ending greater than 80h) seem to have 1.06v, 1.11v, 1.19v set.
If you guys want any 260/280/etc BIOS modding done, let me know in a PM and I'll give you a hand or modify it myself. I can help you set up Automatic Linear Fan Control through the BIOS if you are a bit confused; the default Dynamic fan ratio method is hardly acceptable for anybody overclocking on the reference cooler.
I'll post up a NiBiTor screenshot of my BIOS later on too, as a quick reference for the values.
To give an idea of what I mean: my fan duty is set to 50% minimum; Tmin, where fan duty begins to ramp with temperature, is around 43-44c; by 56-58c fan duty is at 100%, and it begins to drop once the temp falls. I've also set the ramp rate to be gradual, to make it next to unnoticeable.
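For anyone who wants to see the math behind a linear ramp like that, here's a toy C sketch using the rough numbers from this post (50% minimum duty, Tmin around 44c, 100% by about 57c). It only illustrates the interpolation; the real curve is configured in the BIOS with NiBiTor, not in code.
Code:
#include <stdio.h>

/* Toy model of the linear fan ramp described above. Values are taken
 * loosely from the post (50% minimum duty, ramp starting near Tmin = 44C,
 * 100% duty by about 57C); the real curve lives in the BIOS. */
static int fan_duty(double temp_c)
{
    const double t_min = 44.0, t_full = 57.0; /* ramp start / 100% point */
    const int duty_min = 50, duty_max = 100;

    if (temp_c <= t_min)  return duty_min;
    if (temp_c >= t_full) return duty_max;
    /* Linear interpolation between the two points. */
    return duty_min + (int)((temp_c - t_min) / (t_full - t_min)
                            * (duty_max - duty_min) + 0.5);
}

int main(void)
{
    for (int t = 40; t <= 60; t += 4)
        printf("%dC -> %d%% fan duty\n", t, fan_duty(t));
    return 0;
}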
The clocks I'm using for 3D are 702/1235/1525 and it works perfectly with only 1.1250v. I made a video, but it didn't come out well; somehow the order of the scenes got reversed, but it might give an idea:
Youtube Video
EDIT: The card works fine in Windows with just 0.8v, but I didn't get any power savings there by dropping from the default 1.03v, nor a lower idle temperature (about 30°C watercooled).
"12C" is I2C, I'd say. I2C is a low-pin-count serial bus that uses 7-bit addresses.
-12C? Is this the I2C access switch for the program?
[ 03h 70h 15h ] is BUS_ADDR_OFFSET, I2C_BUS_ADDR, VR_REGISTER_ADDR
So basically, register 15h is located at offset 03h on I2C bus address 70h, which is what Volterra uses in most cases so as not to conflict with other electronics that also use I2C to read/write the registers controlling assert/deassert of pins.
70h = the unique I2C bus address for Volterra VT1165/1140 VR serial access
00-03h = bus offsets on I2C bus 70h (0h, 1h, 2h, 3h): 4 offsets which can hold register addresses.
Registers 15-18h hold the VID values; the one which holds the largest hexadecimal value will be the 3D Performance VID.
You need to convert the value you read to decimal with a hex/bin/dec calculator.
ie. 10h = 16 decimal
20h = 32 decimal
30h = 48 decimal
40h = 64 decimal, and so on.
In full: 00h = 0, 01h = 1, 02h = 2 ... 09h = 9, 0Ah = 10, 0Bh = 11, 0Ch = 12, 0Dh = 13, 0Eh = 14, 0Fh = 15,
10h = 16 ... 1Fh = 31, 20h = 32 ... 2Fh = 47, 30h = 48 ... 3Fh = 63.
A VID is just a hexadecimal value which is converted to a floating-point voltage, and the conversion is reversed to write a new VID register value.
15h = VID Register 0
16h = VID Register 1
17h = VID Register 2
18h = VID Register 3
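Putting the addressing and register map above together, here's a minimal C sketch of scanning the four VID registers. The i2c_read_byte() helper is a made-up stand-in (stubbed with canned values so the sketch runs on its own); in practice the read would go through whatever I2C access your tool provides, e.g. RivaTuner's command-line interface.
Code:
#include <stdio.h>
#include <stdint.h>

#define VT_I2C_BUS_ADDR 0x70 /* Volterra VT1165/1140 unique bus address */
#define VID_REG_BASE    0x15 /* registers 15h-18h hold the four VIDs    */

/* Stand-in for a real I2C register read; returns canned values so this
 * compiles and runs stand-alone. Not a real library call. */
static uint8_t i2c_read_byte(uint8_t bus_addr, uint8_t reg)
{
    static const uint8_t fake_vids[4] = { 0x2E, 0x31, 0x35, 0x3E };
    (void)bus_addr;
    return fake_vids[reg - VID_REG_BASE];
}

int main(void)
{
    /* Scan VID registers 0-3; the largest value is the 3D Performance VID. */
    for (int i = 0; i < 4; i++) {
        uint8_t reg = (uint8_t)(VID_REG_BASE + i);
        uint8_t vid = i2c_read_byte(VT_I2C_BUS_ADDR, reg);
        printf("VID register %d (reg %02Xh) = %02Xh (%u decimal)\n",
               i, (unsigned)reg, (unsigned)vid, (unsigned)vid);
    }
    return 0;
}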
If you can't understand what I've just written, try reading over the thread again, particularly the posts by Unwinder and others about how I2C reads/writes are done. If you still can't grasp the concept, I'd say it might be a bit over your head, and it'd be better to wait for someone to release a dumb slider application to do it for you.
If you are really stuck, here is a great little hex/bin/programmable calculator:
http://ccalc.shanebweb.com/CCalc_setup.exe
You can create functions like these to automate the conversion, and as long as you set the base (ie. hex, bin, dec) it'll give the right answers:
VIDtoV(x)=(x*0.0125)+0.45
VtoVID(x)=(x-0.45)/0.0125
ie.
> VIDtoV(0x3e)
ans = 1.225
> VtoVID(1.20)
ans = 0x003C
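The same two functions translate directly into C; the constants are the ones from the formulas above (one VID step = 12.5 mV above a 0.45 V base):
Code:
#include <stdio.h>

/* VID <-> voltage, straight from the formulas above:
 * V = VID * 0.0125 + 0.45 */
static double vid_to_volts(unsigned vid)
{
    return vid * 0.0125 + 0.45;
}

static unsigned volts_to_vid(double volts)
{
    return (unsigned)((volts - 0.45) / 0.0125 + 0.5); /* round to nearest */
}

int main(void)
{
    printf("VID 3Eh -> %.4f V\n", vid_to_volts(0x3E));   /* 1.2250 V */
    printf("1.20 V -> VID %02Xh\n", volts_to_vid(1.20)); /* 3Ch */
    return 0;
}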
HTF do they restrict it to "registered" users? I understand if it only works on EVGA cards, but you can't keep it from working on an unregistered EVGA card.
A nice little tool from my German colleagues at Awardfabrik.de:
Voltage Factory V1.0
http://rapidshare.com/files/19383349...dfabrik.de.rar
No more need for the EVGA Voltage Tuner tool ;)
Supports all cards with VT11XX ICs (GX2, 260-65nm, 280, 295, 4850, 4870)
Well guys, I'm writing a back end for my own voltage editor, using the RT API for access. I'm writing it in C as that's my preferred language, so if there is anybody who is good with C++ and ATL/MFC for UI work, I'd be happy to hand the back end over for them to write a UI. UI isn't my cup of tea. Might take a day or two to finish writing; I'm gonna be busy with work.
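For anyone curious what the split between back end and UI might look like, here's a rough, hypothetical header for such a back end. None of these names come from the RT API or any actual tool; they just illustrate the kind of C interface that could be handed to a C++/MFC UI writer.
Code:
/* voltbackend.h -- HYPOTHETICAL back-end interface for illustration only;
 * these names are made up and are not part of the RT API. */
#ifndef VOLTBACKEND_H
#define VOLTBACKEND_H

#ifdef __cplusplus
extern "C" {
#endif

/* Open the I2C path to the VRM on a given display device (/sd0, /sd1, ...).
 * Returns 0 on success, nonzero on failure. */
int  vb_open(int device_index);
void vb_close(void);

/* Read or write one of the VID registers (15h-18h);
 * voltage = VID * 0.0125 + 0.45 V. */
int vb_get_vid(int perf_level, unsigned char *vid_out);
int vb_set_vid(int perf_level, unsigned char vid);

#ifdef __cplusplus
}
#endif

#endif /* VOLTBACKEND_H */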
Hope they adapt it soon for the 260 (55nm) and 285 cards... looks pretty easy to use, good job.
...
Post #416 :)
Well, we went from the EVGA Voltage Tuner which was the subject of this thread onto Riva Tuner, then onto RT Advanced Command Line stuff, then into all the different chip versions, then into coding and recoding, then into the issues with RT not working for some and why, then into C programming and GUI stuff, then into Sliders, and then into many branches of different people making different programs.
I should've dug up that trainwreck pic I had on another 'puter, where a track switch went wrong and one train went in two different directions at the same time. :D
It's all cool though. I was just being a smart butt trying to be funny.
Great utility! thanks for posting.
For people using the GTX 295: you will need to use device /sd0 for the first GPU and /sd2 for the second. For this tool, that means choosing the SLI/CF mode option rather than multi-GPU (as multi-GPU assumes /sd0 and /sd1).
This can of course be confirmed using RivaTuner's monitoring graph.
OK, here is probably my last post on this:
Just uploaded a RivaTuner plugin which allows changing voltages from RivaTuner:
see my thread here
Good job mate, hope the 285 and 260 (55nm) will get support soon too.
I feel like crap now that I've sold my GTX280 and put in the new GTX285, which is not supported. :(
It's not supported yet, but it possibly will be later. I doubt EVGA would tell their users there's gonna be a version with support for 55nm cards later if they knew they could never program the Intersil!
On the GTX 285 (ISL6327CRZ), NVIDIA most probably connected some GPIO pins to the VID input pins, but this will give you maybe 3 or 4 selectable voltages in a rather small range. It will definitely not be what you can get with the I2C-based chips.
I just bought a 65nm GTX 260. With this voltmod, it has a slight advantage, IMO.
1. larger process, handles more volts than 55nm
2. higher quality stock HSF and shroud
3. Volterra VRM that is programmable for the softmod
4. $190 new
this thing, even with 192 shaders, might be able to perform somewhere between a GTX280 and a GTX285 once I get the core > 800MHz.
That makes a lot of sense Wiz!
Something along those lines would be executed by controlling, say, 3 or 4 dedicated GPIO pins, used to assert/deassert the VID pins according to the required voltage-table bitmask, to obtain 3 or 4 preset performance-level VIDs? I.e. use different open GPIO pin combinations to get a four-value mask range (00, 01, 10, 11), each of which has a preset VID entry connecting the VID0-VID7 pins based on whatever voltage-table mask is needed to output each performance level's voltage.
I'm not all that knowledgeable about GPIO functionality, so I've just taken a stab at how I think it may be done; feel free to correct me if I've missed the target, heh. :D
I've been using RivaTuner to adjust voltage on my GTX 260. Running clocks at 706/1501/1306 on 1.15V (up from 1.13V stock), I'm getting a solid 60 fps in COD:WAW with V-sync enabled, all eye candy on, at 19x12 resolution. Other games behave similarly. I also force triple buffering with RivaTuner. How much more could I get with a GTX 280 or 285?
Just being able to adjust voltage to stabilize clocks helps these 65nm cards render at the same level as the higher cards. I really don't think you'll see all that much of an improvement on the higher-level cards compared to the 260s. With this voltage tuner, EVGA just extended the life of the 65nm 260s.
Finally released:
http://www.evga.com/articles/00462/
Correct. But normally the number of such GPIO-controllable pins is reduced to the required minimum during PCB design (the rest of the VRM's VID pins are hardwired), so normally there is just 1 GPIO pin connected to the VRM if 2 different voltages are needed, 2 pins if up to 4 voltages are needed, etc. So the maximum you can expect is a limited set of fixed voltages. And the maximum one is normally already in use.
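To make that limitation concrete, here's a tiny C illustration of the scheme being described: with only 2 GPIO pins wired to the VID inputs, the 2-bit mask can select at most 4 preset voltages. The table values here are invented for the example, not read from any card.
Code:
#include <stdio.h>

/* With 2 GPIO pins wired to the VRM's VID inputs (the rest hardwired),
 * a 2-bit mask selects among at most 4 preset voltages.
 * These values are made up purely for illustration. */
static const double preset_volts[4] = {
    1.0625, /* mask 00 -> e.g. 2D             */
    1.1125, /* mask 01 -> e.g. 3D low         */
    1.1500, /* mask 10 -> e.g. 3D performance */
    1.1875, /* mask 11 -> e.g. maximum        */
};

int main(void)
{
    for (unsigned mask = 0; mask < 4; mask++)
        printf("GPIO mask %u%u -> %.4f V\n",
               (mask >> 1) & 1u, mask & 1u, preset_volts[mask]);
    return 0;
}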
For some reason or another, I can't log into EVGA. It seems their server or some sh*t is down. If I click "Member Login" I get a "PAGE CANNOT BE DISPLAYED" error every time. Damn it, I've been waiting so long for this app. EVGA, FIX YOUR WEBSITE.
I'm on the site now and it is fine
Just downloaded it; will be testing it on my GTX260 216 (65nm), hopefully for some higher clocks.
Stock voltage at 2D settings is 1.11V per RT. I also have an EVGA Superclocked card, which accounts for the higher 2D stock voltage. The 3D stock voltage is 1.13V. Don't know how high the GPU core will go for max clocks, but I have noticed that with a slight adjustment, well below max clocks, the performance increase is very noticeable.
Edit: EVGA Voltage Tuner works like a charm. RT is better because it makes you think about what the hell you're actually doing!
Here is 820MHz core on stock cooling, from August '08 at techpowerup:
http://img.techpowerup.org/080801/bios.jpg
@Sailindawg: how high does your 260's core go?
I've finally been able to get 720 core / 1570 shader / 1170 stable using only 1.15v on my GTX295. :up: Awesome job, EVGA.
Wow, that post from TechPowerUp was pretty sweet. And it's stock cooled like mine. Don't know what my max clocks are; I ought to figure it out!
Update: Been playing around all afternoon. The most stable clocks I was able to get were 774/1548/1306 @ 1.165V via EVGA Voltage Tuner. The shaders did not like going much beyond 1548 or so. This was Vantage stable. Higher volts had no impact on stability; they only helped things crash faster. Max GPU temps were 58C with the fan at 96% duty cycle. I ran this using the Windows 7 64-bit beta and 185.20 beta drivers. YMMV with XP 32-bit and a different driver. Still, this 260's got a lot of unlocked power.
A friend and I were able to remove the brand/card-model restriction and the voltage limit. If you guys allow a patch to be posted, I will post it (I won't post the original program, just a patcher file). However, I was unable to test it, as I don't have an NVIDIA card anymore.
Here's a screenshot of it running with the patch :)
http://i42.tinypic.com/2s8qkoh.jpg
By "limit removed", I mean set the max to 2.00v ;) I can make it higher, but I don't think that's necessary.
I can't download it because I don't have an EVGA card. AgentGOD, can you post the original program and patch please?
Post the patch please!
Can you PM it to me? What would be the point of your original post if you're not going to share?
ahem...torrent? :wasntme:
Because that would be against EVGA's policy for the program. It is their program, for their cards ONLY!!
Exclusive to EVGA Members!
You must be logged in and have one of the products below registered on your account.
He made a patch that he will share with people who have the actual program.
and.. ?
Maybe instead of crying about it you guys should simply learn to use the rivatuner plugin.
I don't know when this forum became so whiny...
amen Sly
Well, at least it works; RivaTuner flawlessly detects the adjusted voltage for all 3 of my cards (yes, I have Tri-SLI, and I'm an idiot because of it :p: lol)
This EVGA thingy is extremely easy to use, though; that's why people want to see it patched for all brands and higher voltages... makes sense to me ;)
Very Nice! I downloaded it and will be playing with it later. I already have some high clocks and still have not found my max yet, so this will be quite an interesting and long evening. I guess it's time to break out the Jack Daniel's again. We're gonna wring it out full on Xtreme style! :yepp:
EVGA FTW!!! :up:
This thing hasn't been out but for a few hours and we're already seeing st00pid voltage. A serious OC'er does things in small increments, tests the results, and only gives what is needed. Voltage used wisely is a good thing and can increase performance, but too much won't do anything but raise heat, set off OVP, and potentially cause degradation.
Don't think he actually has the voltage set to 2v; he's just showing how high it can go with his patch.
This program is very easy to use. Right now I'm running my GTX 295 @ 732/1635/1200 with 1150mV, ATITool stable. I increased my Vantage score from 23200 to 24730. Two thumbs up, EVGA.
EVGA GPU Voltage Tuner 1.0
I'm using an MSI GTX280 flashed to EVGA GTX280 firmware :D
http://i306.photobucket.com/albums/n...T280-1188v.png
http://i306.photobucket.com/albums/n...T280-1350v.png
Why risk a BIOS flash when you can use my patched version :)
Are you a Congressman???... LMFAO!
Why do it your way when you can do it my way... Oh wait, you can't do it my way because I can't let you... IMHO you should never have even mentioned you hacked the program until you had all your ducks in a row.
Out of respect for EVGA co. I am removing this link. I know it won't stop people from getting it, but it's the least we can do for our great friends at EVGA - runmc
My card's overclockability doesn't increase when I raise the voltage from 1.125 to 1.2; 702/1404/2600 is still the max I can reach, even with that significant voltage bump. :\
The original 192-shader / full-copper / 14-layer-PCB / Volterra-VRM / 4-phase-power / 65nm GTX260s are obviously the most well-built and robust, and at the moment they're the cheapest:
http://en.expreview.com/2009/02/07/g...-cut-cost.html
Quote:
At the end of last year, the GeForce GTX 260 became the first of NVIDIA's GTX 200 series graphics cards to use 55nm process technology. It featured the P654 reference design, which cost less than the earlier P651: the number of PCB layers was reduced from 14 to 10, and the expensive Volterra chip was abandoned to cut cost. Quite soon, a third-generation GTX260 design, codenamed "P897/D10U-20", will also surface.
You might be interested in reading: The First Review of 55nm GeForce GTX260
The schedule drawing of P897 PCB design
According to the P897 design plan that NVIDIA sent its partners, it uses a 4/6-phase NVVDD power solution, the ADP4100. It changes the FBVDDQ power solution from 2-phase to single-phase, and the MOSFET package has been changed from LFPAK to DPAK to save cost. Another noticeable change is that the PCB layer count decreases from 10 to 8. The length of the PCB is unchanged, while its height is reduced by 1.5cm. To cut cost further, they will change the DVI connector and crystal, and probably change the BIOS ROM from 1M to 512K.
But apart from the GT200 and NVIO2 chips, you might mistake a P897 GeForce GTX260 for a GeForce 9800GTX+ (they look that similar). Compared with the P654 design, a P897 GeForce GTX260 is expected to save $10 to $15 in cost, which will undoubtedly improve its competitiveness. According to our source, this product will be available in the third week of this month.
We also managed to get some pictures of the new GTX260 from the Chinese manufacturer Colorful. Coming from their iGame series, this GeForce GTX260 is based on the P897 PCB design. They have just changed the TV-out connector to HDMI and added a set of overclocking jumpers.
If you can still find one.
OOOPS....Get them while they're hot
Tried this with the patch on my 9600GT. GVT loaded up with the slider bar all the way to the right. Didn't work......
Currently still bumping my clocks after installing my watercooling last week. I still haven't bumped the voltage, but I may be getting into that range now. I'm at 730/1566/1250 on stock voltage on my EVGA GTX280 SSC. It has a DTek GFX2 waterblock and Unisink. GPU temps are running 38-39C currently, and GPU-Z is showing VRM temps ranging 54-63C with a GPU2 Folding client folding a 511-point WU.
Cooling is what does it on these cards. Before, my temps at idle were higher than this thing runs loaded and OC'd as high as it is now; it used to load in the 60s. An OC like I have now would've been impossible without watercooling. It just wouldn't do it. I tried, especially on the shaders. Shader power rules all with these cards. :yepp:
Would someone mind emailing me Agent GOD's Evga voltage tuner patch? They're pulling links left & right. jaredpaceatgmaildotcom
thank you!
Might wanna put some spaces in that e-mail addy or the bots are gonna have a field day with it.
Ah thanks. And thanks I've received it now. hehe - quick!
SSCs are sort of binned GPUs and mostly run at slightly higher volts than vanillas. I had 3 vanilla cards (2 EVGA and one POV) and all capped out around the same shader speed for benching (1480 region), GPU around 745, and RAM at 1250-1300 max.
Very nice clocks T-Flight, count yaself lucky mate; hope the tuner brings out even more!!
I did have my fan at 100%. Stock cooling will not do it. As soon as you put a load on them they go up into the 50s and can get into the 60s. This thing loaded never gets above the low 40s. It makes a HUGE difference. I couldn't get above 675/1462 before no matter what I did. It just wouldn't do it. They like to run cool. The watercooling is what did it for me.
That surprises me, I would have figured that in the 50's and 60's stability wouldn't really be an issue yet. But as you pointed out, sounds like these cards like to be cool.
With all this voltage tuning talk and whatnot I want to get a waterblock myself. :D
Just hope that an MCR-320 could handle it + a Q6600. At the moment... I'm not that optimistic. :shrug:
It should. Look in my sig for my setup. I'm running a single loop. Those Kazes are running at 2000 rpm via a controller. The HW Labs rads are a little more efficient with the faster fans, but I still think you should be good, because I have an i7 and you have a Q6600. I'm running at 4.2 with HT also.
I do believe it's the cooling that's doing it. It seems like the stock watercooled EVGA cards all do better, and the ones with custom watercooling do also. That's not saying they all will; you know it's YMMV when it comes to OC'ing, but I really believe it has something to do with the reduced temps.
Another thing I noticed with the Unisink is that it brought down VRM temps, which might add to stability, but that's only a theory of mine. I have to do more research on that. Unwinder said in a thread on the EVGA forums that these VRMs can handle up to 150C. I wouldn't ever get near that, but it's nice to know there is some wiggle room in case something goes wrong. It might be that when they run cooler they give better electrical properties, though.
Aw, and guys, if you read this, please don't push those VRMs. They have that design spec, but what we don't know is the time-vs-temp ratio; in other words, how long will they handle it? It might be seconds or it might be minutes. I wouldn't push them to find out. hehe :)
Could someone PM me the patch link? ty
They removed the link for a reason. Why don't you be respectful of that and stop linking it?
Yea, I'd imagine that there's no way my Q6600 is going to produce more heat than an i7. Those things are beastly. :up:
Good points. I think I'm gonna get the MCW-60 + Unisink as well. It just seems like the all-around best setup in terms of price/performance/future compatibility. :cool:
Quote:
Another thing I noticed with the Unisink is that it brought down VRM temps, which might add to stability, but that's only a theory of mine. I have to do more research on that. Unwinder said in a thread on the EVGA forums that these VRMs can handle up to 150C. I wouldn't ever get near that, but it's nice to know there is some wiggle room in case something goes wrong. It might be that when they run cooler they give better electrical properties, though.
@ AgentGOD/and your friend - awesome work guys ! :up: :up:
I got both patches, and thanks to them, especially the ~2V one, I'll be getting a GTX280 to play with after my 2x GTX285s.
I hope you guys will put out similar patches once a voltage tuner is released for the ISL6327.
I also have a fan blowing directly at the face of the card, more towards the right-hand side where the power section is. It too is an Ultra Kaze 3000, running at 2000 rpm. It made a 4C difference in VRM temps. When I install everything in the case, I'm gonna mount it directly to the bottom of the case over an air hole I'm gonna cut. Airflow does help with the Unisinks. It is much more efficient than the stock cooler, by far.
It's there in most of these cards. The higher binned GPU's do seem to do a little better also. I really am amazed at the clocks these things will do when they run cool.
Indeed, lower temps made a huge difference with the GTX295! Just installed a DD-GTX295 block and the load temps never go above 45C (they used to hover upwards of 85-90C when gaming, even with the fan @ 100%). Increased voltage now allows significantly higher stable clocks than before. (I realize this seems obvious, lower temps = more stable clocks, but it might help the people who are disappointed that increased voltage isn't offering much headroom over stock clocks on stock cooling.)