This is going to be Sweet!
*First version of this utility will support all EVGA GTX 260, 280 and 295 cards.
*Will work ONLY on EVGA graphics cards, and ONLY for EVGA registered users
http://www.evga.com/forums/tm.asp?m=...BF%BD%EF%BF%BD
Flash to an EVGA BIOS and anyone can use it?
That's real nice. I wonder if they'll backport it to support other cards?
Good to see this does indeed support 65nm.
Looking forward to this app!
OMG, I think I just bought an EVGA NVIDIA GTX 295, anyone?
sorry ATI
Why hasn't someone made this possible before now?
Unwinder needs to reverse engineer it just like Evga did to RT.
This is so SEX... they just need to add Mem voltage changing and it becomes ORGY
There must be a way to flash this BIOS to another card and use this? I'm gonna touch myself in a minute :ROTF:
Standard Voltage (1038mV)
Increased Voltage (1263mV)
Quote:
Check out the below example, a small VCore boost increases my clockspeed 100MHz!
Yeah, super small boost :ROTF:
Man I wish ATI had this, my 4870 needs a bump in voltage. Any word on why there is no support for G92 cards?
All watercooled cards are severely lacking in voltage to OC any further.
lol by a long shot
Yeah, AFAIK Unwinder worked on Precision for eVGA.
And this thing looks like his "art" work :)
Dang, nice... shame my 280's a PNY :(. Wonder if this utility will support the 285, or later on, whatever GT212/300 are.
EDIT: Says only works on registered eVGA cards, maybe some kind of online activation in it? *shrug* They said in the thread that 285 support isn't going to be in the initial release though.
Quote: "Sorry guys, unfortunately in Rev1 the GTX 285 is not supported, there are technical reasons for this..."
Does not look promising for the 285.....
Soon isn't soon enough!
Gigabyte's Gamer HUD has been offering the same feature for about 6 months now.
Grrr...technical reasons my ass. They just wanna sell their recently released SSC version.
Hmmm, I wish they would just warranty volt mods for everyone... volt mods are pretty effing easy once you figure them out, I did my 4870X2 in 10 minutes.
Volt mods physically modify the cards though. Not to mention not everybody can vmod properly or knows how to tune to the proper resistance.
I don't care about warranty if they would make nice pads or holes to solder resistors directly into. That was nice on some older cards.
oh my god evga
i love you so much
TBH I think a warranty on non-PCB-damaging volt mods is better; some people shouldn't be raising the voltage on their cards, and subsequently these are the people who are incapable of volt modding on their own. EVGA is digging a deep grave of RMAs by releasing this kind of functionality to the general public (like the guy who says 1.038 to 1.263 is only a small amount of voltage :rolleyes:)
Oh ----- My ----- God..............
EVGA You just owned all others with that move...Dayyyyyyyyyam!!!!!!! That rules man. That seriously rules. We got a full OC'ing suite for these things now. It's time to go kick some butt! Screw kicking butts...some heads may roll after this move.
EVGA! YOU ROCK!!! :yepp:
i'm sure they have rules for this :P they wouldn't blindly release this without thinking about that.
People like me however will not be stupid, and this will benefit me amazingly. I don't have the time or patience to learn how to volt mod. If I knew how, I'd gladly do it. However, I will be cooling and watching the volts and heat as if I did the volt mods myself. Ta da!
I'm really liking this utility, something that should be possible to use for any card. NvTools any1? :p That's my first thought when seeing AtiTools in the screenshot next to that app. :p:
They have capped the voltage at a reasonably safe level of 1.35v so that nubs who don't understand anything about volt modding can't fry their cards. There are so many people out there who don't know a mosfet from a capacitor asking for help volt modding, when they do not possess the skills/knowledge to do so, and many attempts consequently end in disaster. At least this software method is a little more 'nub proof' and may actually decrease RMAs due to the aforementioned kinds of people breaking cards.
Yep, it's not gonna allow silly voltage. At least I hope not. Surely they are not gonna allow that. For those that want to go higher, they'll do the regular volt mods. He showed an Xtreme jump in shader clocks with a 295, which is pretty impressive considering the GPUs are stacked right on top of each other, and he did it on air.
I'm gonna try it on my GTX 280, but I've heard it doesn't do much for the shaders. Maybe I'll get a bit out of them though, and will get some out of the memory and the GPU when I get my water on there next week.
Now excuse me, because I feel the need to go blow some stuff up. I think I'm gonna choose Fallout 3 tonight. :yepp:
Looks great. Makes voltage mods useless.
I am sure they are going to limit it to voltages that are safe. Most cards have an OCP or OVP limit as well, which will help to protect them.
So when is this going to be available? I'd really like to get my hands on it sooner rather than later. :)
Does anyone know if increasing the voltage will have an effect on the shader clocks? My card is not limited on the core at all, but on the shaders. My card is extremely weak on the shaders and can only do 1350mhz max before I artifact in anything really stressful. This limits my max core speed to 675 as you cannot set the core speed to anything over half of what the shaders are clocked at. Increased vcore is useless to me if my shaders still won't budge.
Oh yeah i think it should help you push the shader clock higher. From my experience, it's also very important to keep the temperatures as low as possible so you could overclock it more.
For example, on my 8800GTX with stock cooler that usually reached 85-87 degrees max, it could only overclock to 612 MHz core, but with Accelero Xtreme 8800 that keeps it under 65 degrees all the time, I can push it to 648 MHz core (and shaders higher too). Those large chips like it cool, and at higher densities (65nm, then 55nm) it's even more important to keep it cool to reduce power leakage, so that you also save electricity.
Wow, I'm surprised that eVGA has the balls to do this!! I mean, stock cooling on GTX 280s usually reached 85 degrees, and some cards had cooling issues, reaching 105 C. I'm guessing that this issue only affected the earliest batch of cards during the first few weeks, perhaps because of faulty heatpipes or something.
I loved this feature with my X1900XTX, and was disappointed to find this software voltage tweak disabled on 3870 and 4870 cards that certainly had greater headroom for voltage increases with their stock coolers. The cooler on the X1900XTX could barely keep it cool enough at default voltage--now it just doesn't make sense how ATI isn't allowing 3870/4870 owners to increase voltage..... :confused:
The 285 is now looking that much more tempting. Would it be possible to fit my Accelero Xtreme 8800 on the GT200 cards?
Yea I know, pathetic. I could probably game with it higher than 1350, but when testing I want absolute, 100% stability. At 1350/675/1242 I am 100% game, furmark and ATitool stable, clean and artifact free. Any higher and I start to notice things. If this new software allows me to reach the same level of stability, and artifact free testing then I will be a very happy man. Anyone know if we can flash these new voltage settings to our card to keep them permanent? I never did like having to run software in the background to get my video cards to run the way I like.
I sure hope so. I OC with EVGA Precision and close it out and the OC's stick. I would hope those voltages would as well.
These EVGA cards are special. They really have a winner. This is just another thing that sets them apart now. They really have their stuff together at EVGA. Their support is awesome. They have great offers and upgrade plans. They give special offers to EVGA customers. The cards are just gorgeous in the looks department. They run super. They have all kinds of OC versions. They even have watercooled versions. They have EVGA Precision, and now they have this. This is the total package when you get one of these things.
I'm really happy with mine.
I have never reached 80's ever. I've only reached 70's with stock cooling a couple times when I left the fan at default 40% when it was folding (forgot to slide the fan up). 71 actually. My card folds 24/7. I keep my fan at 80% at night, and 100% the rest of the time.
It's running now at 57 degrees folding with the GPU2 client. That's an SSC card and it's OC'd to 670/1451/1180. The people that had those issues had bad heat sinks or bad mounts that had somehow got by. I would not have a card that ran that hot...I'd send it back. There were probably a few in there that had the things packed into cases that had no airflow also. That happens.
I know they say these things are OK into the 80's to 90's, but I'll never run one at that ever. Rule #1 of the Overclocker has always been to control the heat. I'm putting mine on water to further lower the temps. I should get my stuff next week, and will have to see how long it's gonna take me to get that all tested out and setup. I bet the clocks go up. Talonman had great results with his EVGA GTX 280 after putting it on water. His bench scores went up, performance went up, temps went down, and clocks went way up.
I think you are missing the point. The original post was mine, and in it I say, with irony, that 0.2v is by no means a small increase as the EVGA representative says. Then some people missed the joke and started making things up. Just read the thread from the beginning :shakes:
Did anyone notice that he gets better results (he says on air) than people did with a hard mod on air?
Really nice utility. Still, we can't change memory voltage, right?
No, he does not: http://forums.guru3d.com/showpost.ph...60&postcount=9
I dunno about you guys, but my 55nm GTX 260 runs hot overclocked as is. Adding more voltage is a great idea if it were watercooled, but with the stock air cooling in a case it is probably going to result in overheating. Granted, I'm in a small dorm room where I have zero control over the heat, so my heat issues might only apply to me. :2cents:
Does this mean that I could flash a higher voltage directly onto the card? Because that would make things much easier. I was told a while back that you can't do any volt mods in the BIOS of a GTX 260.
Actually you can.
But not with the tools that are publicly available.
The same way the eVGA Voltage Tuner does it, just implemented in the firmware. In simple terms, they use the same 'switch'; the Voltage Tuner will more than likely be an 'apply each time you enter Windows (automatically or not)' thing, while the firmware (BIOS) way is permanent.
Wish there were tools for all cards as I did not get an EVGA (availability) but an XFX instead. :(
/edit. ^^ But who's gonna be the brave man to develop such a tool with the help of the given material (RivaTuner...)?
If I gotta waste my time with a stupid cmd line, I'll just vMod the card. He talks about that like it's something an enthusiast would waste their time with. Who in the hell wants to screw with cmd line interfaces? This guy so matter-of-factly talks like "Welcome back to 1985".
If I had the time and knowledge I'd be jumping all over this. I'm sure you'll ask "but why, that is so 1985!" and the simple response is "no dead cards from soldering".
Right now I'm using an Asus Matrix HD4870 with their iTracker(Apple product?) which gives me up to 1.6vGPU and 1.8vGDDR5, more than sufficient for anything but LN2/DI. So much easier to just punch in voltages than screw around with adjustable resistors :up:
Jeez, that was easy... took me all of 15 minutes to figure out how to modify my GTX 280 with RivaTuner to run at 1.19v (1.20 input) as read by the monitor, versus the 1.04 stock. Thanks for reposting that post, I never thought of looking in the source there.
EDIT: Testing instructions fully, I made an error in here and will repost shortly.
Since all GTX 280s have the same IC allowing software to change core voltage, this tool will eventually be available for all cards, not just EVGA... i.e. via RivaTuner.
Where did you get that plugin?
It should write to the VT1140 I2C address space and modify the register(s) responsible for setting the main VR output voltage. If you can read the voltage through I2C, and it's not a read-only register, you can definitely write it, no problem.
Goldentiger,
Sounds like you wrote up a simple little C++/ASM function to handle the voltage adjustments? If so cool mate :up: Does changing the register affect all Performance Level voltages or just 3D Perf? Had a quick look over the VT1103 plugin source and looks like it wouldn't be hard at all to write a little tool to do it. I think I got the gist of the registers, 0x19 looks like it holds performance level value, 0x1c looks like the money man :)
So when will this proggy be available ?
Heya,
I am using rivatuner's command-line functions to do it... just trying to get everything nailed down before I actually release instructions. I'm a beginning programmer, so I'm not really capable of writing such a thing myself yet ;).
All: I will release instructions once I have everything nailed down perfectly... I'm still fiddling a little but am getting close. I don't want people to fry their cards if I make an error.
EDIT: I need to learn to take better notes. In case you're wondering, the reason it's taking so long is that I calculated everything in my head, it worked, and now that I was going to document it I'm not making the calculations the same way I was doing them before somehow, resulting in the wrong registers :mad:. I've been trying to retrace my steps but haven't had a heaping bunch of luck so far. I'll keep at it, hopefully early tomorrow...
Thanks GT, looking forward to your guide.. :up:
OK, I feel like an idiot now... stock GTX 280 voltage is 1.2v, correct? If that's the case, I didn't end up actually changing anything :(. Still, the basic procedure should be on-target, I just lack the actual address info and haven't had luck figuring it out yet. I will keep trying but here's what I've learned so others can try to dig through it too. Hopefully this results in something later either from me or someone else, and I apologize for the false hope offhand :).
A few notes... 0x0a appears to be the address for the VT1165, which is what RivaTuner says the chip is on my 280. You can activate the voltage monitor under the plugins inside RivaTuner's monitoring section (open the plugin for VT1165 voltage monitor and add it to your window). The source code for the plugin is under the SDK folder in your rivatuner folder, which is where I was getting values from to test.
Other key values to look at are 0x19 and 0x1c as noted by Mikeyakame.
The source code for the plugin is at: C:\Program Files (x86)\RivaTuner v2.22\SDK\Samples\Plugins\Monitoring\VT1103\
(Your program path may vary) Source file is the VT1103.cpp, which includes functions for VT1103, VT1105, and VT1165. You'll need a basic understanding of C++ to get what it's doing.
You can dump the register data with: rivatuner.exe /i2cd
You can read with: rivatuner.exe /ri<I2C_bus>,<I2C_device>,<reg>
Example: Bus 2, deviceID 37, reg 0x0a:
rivatuner.exe /ri2,37,0x0a
To write, substitute the "r" for a "w", and add one more comma, then the data to write (hex, 2 digits). Example: rivatuner.exe /wi2,37,0x0a,7D would write data 7D to bus 2, device 37, register 0x0a.
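For anyone scripting this, the read/write syntax described above can be sketched in a few lines of Python. This is just an illustration: the /ri and /wi switches come from this post, but the helper names are made up, and nothing here actually talks to the card.

```python
# Sketch: build the RivaTuner I2C command strings described above.
# The /ri (read) and /wi (write) switches are from this post; the
# helper names are hypothetical, for illustration only.

def read_cmd(bus, device, reg):
    # e.g. read_cmd(2, 37, 0x0a) builds the read example from this post
    return f"rivatuner.exe /ri{bus},{device},0x{reg:02x}"

def write_cmd(bus, device, reg, data):
    # Same syntax with "w" plus a trailing 2-digit hex data byte.
    return f"rivatuner.exe /wi{bus},{device},0x{reg:02x},{data:02X}"

print(read_cmd(2, 37, 0x0a))        # matches the bus 2, device 37 example
print(write_cmd(2, 37, 0x0a, 0x7D))
```

You would still paste the resulting command into a shell yourself; the point is only that the syntax is mechanical enough to generate.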
Here's what the guide was looking like... note this is the correct procedure, but it doesn't have the right values...
Quote:
Originally Posted by GoldenTiger
Isn't this a bit like Gamer HUD?
GoldenTiger, you are doing a great job! If you had the correct value in your head before, I'm sure it will "click" eventually :)
Err... if this is all possible with RivaTuner, then would it not be possible to also modify the memory voltage? Are there any other significant voltages that can be changed?
Oh, and I have to say I am a bit disappointed with unwinder. Not with the program, of course, it's free, and you don't look a gift horse in the mouth. But his attitude toward voltage adjustment. He could probably give us all the answers we need very easily, and even add voltage adjustment to the RT gui. The whole argument of "I don't want to be responsible for people killing their cards".
Err... this is XS. All you need is a disclaimer that you are not liable for any damages caused by the program, and you're good to go. And if you really feel bad about cards being killed ... just add explicit warning boxes or something that indicate the danger and possible damages that can occur with voltage adjustment. Obviously, if someone will still do it after seeing all that, then they clearly want to do it and understand the risks involved. I mean there are so many people that would love to see voltage adjustment in RivaTuner. Maybe even have it as a separate download or something, with explicit warnings.
I personally don't see how not wanting to be the cause of dead cards is a very strong argument... people will do what they want with their hardware, and you just have to live and let live. Hmm. Well I sure would donate money for RT if gui voltage adjustment was added, and I'm sure others would as well. That alone should indicate just how much people want it. :)
Sorry for the rambling post... I'm tired.
...and now everybody knows why I quit messing with that stuff years ago. The amount of time you will spend trying to figure it out, you could've already had the mod done and working with just a basic understanding of electronics and soldering.
Then add the fact that with one mistake in a cmd line you might accidentally... well, you could do anything to your card, and you might not be able to *undo* it. That's the Achilles heel of a cmd line interface.
Then, let's assume you do manage to figure this all out and then find that the voltage bump did absolutely nothing for your clocks. Oh Joy! Now you get to undo it, and pick up all the hair you pulled out because it got you fire breathing mad. heh :D
I think I'll wait on EVGA and keep my card under warranty and step up, and continue to steer well clear of cmd lines. Bring it on EVGA we're waiting. Heck go ahead and send me a beta! I'm ready.
Before judging, please try to support the software with rather big community yourself.
Yes, this is XS, but RT's user base is much, much bigger than XS, and not all users have a good (or even minimal!) technical knowledge level. During the last 5 years RT was downloaded from the primary mirror almost 10,000,000 times, and 5,000-15,000 users join the RT community every day. And unfortunately 90% of these users never read any warnings/disclaimers, don't use the built-in help, and often tend to "tune" the system by enabling everything and dragging all sliders to the maximum. I really enjoyed supporting RT in the very beginning in 2000, because it was used mostly by enthusiasts and there were interesting technical discussions in the support forums. But with the current user base size, support has transformed into real hell; most questions were answered 100 times before or contain something like "I enabled something, now my system is working improperly, fix your crap". Due to the domination of such questions I had to shut down support via ICQ completely and also reduced support on the Guru3D forums to a minimum.
So I'm sorry, but I definitely wouldn't like to make RT support more troublesome than it currently is. Even now I have to deal with users configuring the tool blindly and then spreading "RT is crashing systems, beware" in different forums. And I won't give a new sharp axe like _easy_ voltage control to these users, sorry. That's my responsibility and my choice.
Only those who think and ask get the answers ;) I'm always glad to assist those who try to go beyond standard ideas and approaches. For example, a few months ago one Russian user was analyzing RivaTuner's VT1103 plugin code and got the idea of using it for voltmodding his AMD 4870. He got full assistance and even published a small article about it:
http://people.overclockers.ru/CoolCmd/record1
Also, while discussing the article in the forums I hinted that the GTX 200 series uses absolutely the same VRM, so these _public_ instructions can be used for voltmodding it. But it looks like there were no smart GTX 200 owners there.
Hey, hey, hey...
Let's be more positive here, please.. I know that you've been "burned" by all these forum noobs whining and flaming ya...
Well, I trust that you know that Rivatuner is probably by far the most popular "add-on" video card driver tool. It continues to rise in popularity despite your declining concern or support for answering these noobish complaints.
I mean, if I were you, I would not worry at all about including voltage adjustment stuff. It would only further boost Rivatuner's popularity, if anything. If Evga has the balls to do it, you probably have the balls too, right? God's law: Those balls shalt not whine!
After reading this article I understand everything except how to find the address of the VRM registers. I think they can be found from the VT1103 source code, but I am not good with C++. Would anyone happen to know the address of the VRM registers for a GTX 260? Is there a way to find them besides looking at the C++ code?
Volterra VRMs are standardized I2C devices, so device/register addresses don't depend on the display adapter model and are the same as in the article. The only difference between NV and AMD display adapters is that the VRM can be located on a different I2C bus. But you can easily find the desired bus by probing the VRM address on all available buses, e.g.
RivaTuner.exe /ri0,70,1A /ri1,70,1A /ri2,70,1A /ri3,70,1A
It should give you a valid reading on only one bus. That's exactly what the plugin does: it scans all buses and tries to locate the VRM on one of them.
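The probe one-liner above is just the same read command repeated once per bus, so it can be generated mechanically. A small sketch (the helper name is made up; only the command string itself comes from Unwinder's example):

```python
# Sketch: build the one-line bus-probe command shown above, which reads
# the VRM register 1A at device address 70 on every candidate bus.
# probe_cmd() is a hypothetical helper name, for illustration only.

def probe_cmd(buses=4, device=70, reg="1A"):
    probes = " ".join(f"/ri{bus},{device},{reg}" for bus in range(buses))
    return "RivaTuner.exe " + probes

print(probe_cmd())
# Only one of the four reads should return a valid value; that bus is
# the one the VRM sits on.
```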
/ri3,70,1A returned a value of 0a, the other buses returned invalid. If I understand correctly this should be the hexadecimal value for VID. This confuses me because if I use this formula:
VID = (voltage - 0.45) / 0.0125
I get an actual voltage of 0.575v
To test, I tried changing the value from 0a to 0b with this command:
/wi3,70,1A,0b
But after running that if I read the value again it still shows 0a
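For what it's worth, the arithmetic in that post checks out against the quoted formula. A quick sketch (hypothetical helper names; the 0.45 V offset and 0.0125 V step are taken from the formula above, and note that register 1A later turns out to be a model-ID register, so applying the formula to its 0x0A value was never meaningful):

```python
# Sketch of the VID formula quoted above: VID = (voltage - 0.45) / 0.0125.
# Helper names are made up for illustration.

def vid_to_voltage(vid):
    return 0.45 + vid * 0.0125

def voltage_to_vid(voltage):
    return round((voltage - 0.45) / 0.0125)

print(vid_to_voltage(0x0A))  # ~0.575 V, matching the calculation in the post
print(voltage_to_vid(1.2))   # the 1.2 V stock GTX 280 voltage mentioned earlier
```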
Why do you try to write to register 1A instead of those mentioned in the article? Register 1A is a hardwired, read-only VRM model identification register. Value 0A, which you read from this register, is the VT1165 model ID; you cannot (and are not supposed to) re-program it.