Any news on when we can get our hands on the EVGA beta or official version of this program? I'm not a fan of RivaTuner, and I have an EVGA card anyway.
Main Rig
- Intel Core i7 4790K CPU Stock @ 4.4Ghz
- Asus Maximus VI Extreme Motherboard
- 32GB GSKILL Trident X 2400MHZ RAM
- EVGA GTX 980 Superclocked 4GB GDDR5
- Corsair TX850W v2 TX Power Supply 70A 12V Rail
- Swiftech Apex Ultima w/ Apogee Drive II & Dual 120 RAD w/integrated res
- 2X Seagate 333AS 1TB 7,200 32MB HD's in RAID 0
- 2X Samsung 830's 128GB in RAID 0
- Windows 8.1 Pro x64
- Coolermaster HAF-XB
- Dual Asus ProArt PA248Q 24" IPS LED Monitors
- Samsung 46" 5600 Series Smart HDTV
- iPhone 6 Plus 64GB AT&T & Xbox One
UNOFFICIAL Rampage II Extreme Thread
N.A.S.A....The wheel has just been re-invented
Yes, Unwinder's RivaTuner program has had this feature for about a decade for some ATI and NV cards.
Now, guys, type with your head, not over it.
When I use this method I like to melt solder at the same time, just for the ambience of a hard mod.
Last edited by slapmehard; 01-25-2009 at 06:00 AM.
They said soon, so no release date, but knowing EVGA, when they say soon they get it done. Maybe a month? I don't know, just guessing. It looks like they have a beta going already, because they've already tested it on a card. That's why I'm guessing a month or less. I really have no idea though, just a guess.
Well, I have changed the voltage on my card, but now the card always runs in 2D performance mode.
Mine can do 850 out of the box at 1.263 V, but more voltage seems to do nothing... I tried up to 1.45 V; any more and I get a monitor signal-out-of-range error. I can't confirm with a DMM, but it also draws more power from my GB power supply. So no 850+ MHz love for me, but at least this sucker now draws the same power at idle as my old 3870 (at 0.90 V and 500/500; the memory clock is key here), which is absolutely nice given its horrible stock idle consumption. I've made a simple .cmd file that loads automatically at startup, and it's working nicely. Also be aware that if you get a driver restart, the new values will revert to default.
Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)
Thanks for all that contributed to this thread and especially to Unwinder for giving us RT to play with.
Working great with a GTX 295 here (just raised it from 1.04 to 1.1 on both to test). I notice that GPU1's VRMs get a whopping 10° hotter than GPU0's. Is that normal? Bad contact, maybe?
Out of interest, what is the standard voltage for GTX 285-type cards?
Last edited by p2501; 01-25-2009 at 09:18 AM.
1.15V iirc.
Thank you very much; it seems I'm still in the safe zone voltage-wise.
I have problems getting the second GPU to perform as well as the first one. At the moment I'm just testing the shaders for folding stability (GPU1 has problems from 1441 on, while GPU0 is still stable at 1476). It takes a few hours of torture to be sure, so I'll post tomorrow morning whether that bump did something. I'm returning this card anyway and going for an XFX one, so I don't really care, but it's nice to know that I _could_ stabilize it that way (within safe limits, of course).
I did the mod to 1.2 V on my 260 and didn't get much of a boost at all; 3DMark locked up at 750 core. The shaders are holding me back, of course. Dunno what to do to get that OC better ;(
Intel Core i7 920 #3841A437 @ 3.8ghz 1.26v HT off
Thermalright True-120 Extreme
Gigabyte ex58-UD3R @ 190x20
6gb (3x2gb) G.Skill Pi @ 1520mhz 7-7-7-21-1T
PNY 8800gtx @ 640/1000
abs (tagan) 700w
Antec Nine Hundred
Seagate 320gb
LG 25.5" LCD
Logitech G11 + mx518, Logitech x530 5.1 + plantronics DSP500
Tried the first 4 CLI commands, ri0,70,1A through ri3,70,1A, and they all came up as invalid. Using a G92 8800 GS. Any ideas?
I'm doing this on my 4870 right now too. These are the ones I got on a stock 4870.
ri1,70,1A = Valid
ri1,70,15 = 41 = idle
ri1,70,16 = 41 = ?
ri1,70,17 = 41 = 3d load
ri1,70,18 = 41 = ?
Still on the stock cooler, though, so I haven't dared to take the volts too far up. Idle works great at 0.9 V; power usage is way down.
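For anyone scripting these probes, the ri commands above follow a simple pattern. Here's an illustrative Python sketch of a parser; the field layout (bus index, then hex device address, then hex register) is my reading of the commands posted in this thread, and the function itself is just a hypothetical helper, not part of RivaTuner:

```python
# Parse a RivaTuner-style I2C read command such as "ri1,70,1A":
# "ri" + bus index, then the device address and register, both in hex.
def parse_ri(command: str) -> dict:
    if not command.startswith("ri"):
        raise ValueError("expected an I2C read command starting with 'ri'")
    bus, device, register = command[2:].split(",")
    return {
        "bus": int(bus),               # I2C bus index (only 0-3 exist on G80+)
        "device": int(device, 16),     # device address, e.g. 0x70 for the VRM
        "register": int(register, 16), # VRM register, e.g. 0x15-0x18 hold voltages
    }

print(parse_ri("ri1,70,1A"))
# {'bus': 1, 'device': 112, 'register': 26}
```

Handy if you want to log which bus/register combinations came back valid on your card.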
Intel i7-2700k@ 4.7ghz (46x102)
Asus P8Z68 Deluxe GEN3
G.Skill 2x4gb RipjawX 2133 11-11-11-30
GTX 680 1220/7000
Corsair TX750W
Razer Lachesis w/ Razer Pro|Pad
1x160gb Seagate HDD, 2x1tb Seagate HDD
LG 22" 226WTQ & BenQ G2400WD
Windows 7 Ultimate x64 SP1
The only thing I can report tomorrow, before I return this card, is whether the voltage raise helped the shaders at least stay stable at 1440. But once I've got the XFX, I'll definitely report what that one can do. What I can say now is that so far it has helped, because there's no error yet. With the little bump of 0.06 V, it's now drawing a whopping 50 watts more at the wall.
Oh, and temps with this are no problem even on air; with the same fan setting it raised the card's temp by just 10°C, which is still manageable.
Alright, after a bit of testing: I bumped the volts up to 1.3 on my 4870; it hits 840/1100 stable just fine and survives 15 minutes in FurMark. Problem is, it gets a VPU crash within seconds when I try to play FEAR; 820/1050 is the max stable there, at the same 1.3 V, and the temps are quite a bit lower than in the FurMark benching too. At stock volts both FurMark and FEAR fail at the same settings, so I don't know what's up. Any ideas?
Some games just don't like high OCs. Did you try lowering the OC to see if it cleared up?
Yeah, at 820/1050 it cleared right up. Any higher and it would cause an instant crash. I'll wait till my HR-03 GT gets here and then see how much higher I can take it, voltage-wise and OC-wise. For now I'll go back to stock volts and my lower OC, but still take advantage of being able to set lower volts for my idle clock. 0.9 V is nice for idle, lol.
I'll probably take my 8800 out too, just for good measure. Nothing I can really use physx in these days anyway.
newls1, there were no errors through the night with that problematic second GPU at 1476 (I know that's not much, but FAH is pretty demanding), and I have a feeling there's more in it even with that little voltage raise. I can say that at least my current 295 seems to benefit from extra volts. Maybe some watercooling would help it some more.
Can you imagine the power requirements it would take to run two volt-modded 4870 X2s? I can barely run my two 4870 X2s as is.
A few tips and tricks:
1) Once you've determined the index of the I2C bus containing the VRM on some display adapter (e.g. I2C bus 3 on the GTX 200 series), the same index can safely be used on the same display adapter model by others. Display adapters have a few I2C buses assigned to different functions (e.g. for DDC and for external devices like VRMs, thermal sensors, fan PWM controllers and so on); the VRM's I2C bus is defined by the PCB design, so it is fixed within the same display adapter family.
2) Don't try to scan more I2C buses than the GPU actually has (there was a post attempting to scan buses 0-99 in the hope of finding the VRM on a G92). Each GPU architecture supports a fixed number of I2C buses: e.g. G80 and newer GPUs have only 4 I2C buses, pre-G80 supports 3 buses, pre-GF4 supports just 2 buses, and so on.
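That constraint is easy to encode up front. A minimal Python sketch, using only the bus counts listed above (the generation labels are mine, just for illustration):

```python
# Maximum I2C bus count per GPU generation, per the tip above.
I2C_BUS_COUNT = {
    "G80+": 4,     # G80 and newer: 4 I2C buses
    "pre-G80": 3,  # pre-G80: 3 buses
    "pre-GF4": 2,  # pre-GeForce4: 2 buses
}

def buses_to_scan(generation: str) -> list:
    # Only probe bus indices the GPU actually has; scanning 0-99 is pointless.
    return list(range(I2C_BUS_COUNT[generation]))

print(buses_to_scan("G80+"))  # [0, 1, 2, 3]
```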
3) I see that many users have started to enable the VT1103 plugin now. Please pay attention to the following info from RivaTuner's release notes and always keep it in mind when using this plugin:
"Please take a note that Volterra voltage regulators are rather sensitive to frequent polling and may return false data under heavy load, so it is not recommended to use VRM monitoring in daily monitoring sessions"
4) There were some questions about finalizing these new VRM settings in the NVIDIA VGA BIOS. You cannot use NiBiTor for that, because the tool knows nothing about VRMs and works with BIOS voltage tables only; it only lets you change the associations between performance levels (i.e. 2D/3D modes) and the 4 fixed voltages stored in VRM registers 15-18 by default. However, you can easily edit your BIOS with any hex editor to reconfigure the initialization scripts that write these 4 fixed voltages to the VRM during POST. It is a rather simple task; taking my 65nm EVGA GeForce GTX 260 as an example, the following script command in the VGA BIOS configures the VT1165:
4D 80 E0 06 15 3B 16 31 17 36 18 2F 1D 55 19 01
The command uniquely identifies an I2C serial byte-write operation, encodes the target I2C device address (E0 is the 8-bit encoding of the VT1165's 7-bit address 70, including the read/write flag in the first bit), tells the script processor how many bytes have to be written (06), and finally defines the register addresses and the data to be written to each register (register 15 -> 3B, register 16 -> 31 and so on).
The voltages can differ between VGA BIOS images, so the easiest way to locate this command in any GTX 200 BIOS image is to search for the 4D 80 E0 byte chain.
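As a sanity check before hex-editing anything, the script command can be decoded mechanically. This Python sketch just implements the field layout described above (4D 80 header, device address, write count, then register/value pairs); it's an illustrative helper, not a tool from the thread:

```python
def decode_i2c_write(script: bytes):
    # Layout per the description above: 4D 80 identifies an I2C serial byte
    # write, then the 8-bit device address, a count of register writes, then
    # (register, value) pairs.
    if script[0:2] != b"\x4D\x80":
        raise ValueError("not an I2C serial byte-write command")
    device = script[2]  # e.g. 0xE0 = VT1165's 7-bit address 0x70 plus R/W flag
    count = script[3]   # number of register writes (0x06 in the GTX 260 example)
    pairs = script[4:4 + 2 * count]
    regs = {pairs[i]: pairs[i + 1] for i in range(0, len(pairs), 2)}
    return device, regs

# The 65nm EVGA GTX 260 example command from the post:
cmd = bytes.fromhex("4D80E006153B16311736182F1D551901")
device, regs = decode_i2c_write(cmd)
print(hex(device))  # 0xe0
print({hex(r): hex(v) for r, v in regs.items()})
# registers 0x15-0x18 carry the four fixed voltages; the post doesn't explain
# what 0x1D and 0x19 configure, so those are left as opaque writes here
```

The same loop works in reverse: patch the value bytes in `regs`, re-emit the pairs, and you have the modified command to write back at the 4D 80 E0 offset.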
Last edited by Unwinder; 01-25-2009 at 11:30 PM.