Desktop
[Asus Rampage III Gene] [i7 920 D0] [12GB OCZ3B2000C9LV6GK] [HIS HD 5970] [SeaSonic X750 Gold ] [Windows 7 (64bit)] [OCZ Vertex 30GB x3 Raid0] [Koolance CPU 360] [XSPC Razer 5970] [TFC 360 rad, D5 w/ Koolance RP-450X2]
HTPC
[Origen AE S10V] [MSI H57M-ED65] [ i5-661 w/ Scythe Big Shuriken] [Kingston HyperX LoVo 4GB ] [ SeaSonic X650 Gold ] [ OCZ Vertex 30GB SSD ] [ SAMSUNG Spinpoint 640GB 7200 RPM 2.5"][Panasonic UJ-225 Blu-ray Slot Burner] [ Ceton InfiniTV4]
Great utility! Thanks for posting.
For people using the GTX 295: you will need to use device /sd0 for the first GPU and /sd2 for the second. For this tool, that means choosing the SLI/CF mode option rather than Multi-GPU (as Multi-GPU assumes /sd0 and /sd1).
This can of course be confirmed using RivaTuner's monitoring graph.
2600K H2O
MSI 7970 OC
Asus P67 Pro
G.Skill 2133
OK, here is probably my last post on this:
Just uploaded a RivaTuner plugin which allows changing voltages from RivaTuner:
see my thread here
Processor: Intel Core i7 990X
Motherboard: ASUS Rampage III Extreme
Memory: Corsair CMT6GX3M3A2000C8
Video Card: MSI N680GTX Lightning
Power Supply: Seasonic S12 650W
Case: Chieftec BH-01B-B-B
Good job mate, hope the 285 and the 260 (55nm) will get support soon too
Question : Why do some overclockers switch into d*ckmode when money is involved
Remark : They call me Pro Asus Saaya yupp, I agree
I feel like crap now that I sold my GTX 280 and put in the new GTX 285, which is not supported.
|ASUS Sabertooth Z77|Intel Core I7-2700K|32GB G.Skill TridentX F3-2400C10Q-32GTX|Corsair AX1200W|
|ASUS GTX TITAN + Zotac GTX680 for PhysX|Samsung S27A950D|Corsair Obsidian 800D|Corsair Hydro Series H100|
|ASUS Crosshair IV Formula|AMD Phenom II X6 1090T BE|16GB G.Skill RipjawsX F3-12800CL7D-8GBXH|
|2X AMD Radeon HD6970 Crossfire|TT Kandalf - MOD|Corsair HX1000W|Corsair Hydro Series H70|
It's not supported yet, but it possibly will be later. I doubt EVGA would tell their users there's going to be a version with support for 55nm cards later if they knew they could never program the Intersil!
On the GTX 285 (ISL6327CRZ), NVIDIA most probably connected some GPIO pins to the VID input pins, but this will give you only 3 or 4 selectable voltages in a rather small range. It will definitely not be what you can get with the I2C-based chips.
I just bought a 65nm GTX 260. With this voltmod, it has a slight advantage, IMO:
1. larger process, handles more volts than 55nm
2. higher quality stock HSF and shroud
3. Volterra VRM that is programmable for the softmod
4. $190 new
This thing, even with 192 shaders, might be able to perform somewhere between a GTX 280 and a 285 once I get the core above 800 MHz.
Bring... bring the amber lamps.
That makes a lot of sense Wiz!
Something along those lines would be executed by controlling, say, 3 or 4 dedicated GPIO pins, which would assert/deassert the VID pins based on the required voltage-table bit mask to obtain 3 or 4 preset performance-level VIDs. I.e., use different open GPIO pin combinations to get a four-value mask range (00, 01, 10, 11), each of which has a preset VID entry connecting the VID0-VID7 pins based on what voltage-table mask is needed to output each performance level's voltage.
I'm not all that knowledgeable on the GPIO functionality, so I've just taken a stab at how I think it may be done; feel free to correct me if I've missed the target, heh.
Last edited by mikeyakame; 02-07-2009 at 07:24 AM.
DFI LT-X48-T2R UT CDC24 Bios | Q9550 E0 | G.Skill DDR2-1066 PK 2x2GB |
Geforce GTX 280 729/1566/2698 | Corsair HX1000 | Stacker 832 | Dell 3008WFP
I've been using RivaTuner to adjust voltage on my GTX 260. Running clocks at 706/1501/1306 on 1.15V (up from 1.13V stock), in COD:WAW I'm getting a solid 60 fps with V-sync enabled, all eye candy enabled, at 19x12 resolution. Other games behave similarly. I also force triple buffering with RivaTuner. How much more could I get with a GTX 280 or 285?
Just being able to adjust voltage to stabilize clocks helps these 65nm cards render at the same level as the higher cards. I really don't think you'll see all that much of an improvement on the higher-level cards as compared to the 260s. With this voltage tuner, EVGA just extended the life of the 65nm 260s.
Last edited by Sailindawg; 02-07-2009 at 09:26 AM.
Asrock Extreme 4 | 2600K | 16G G Skill | Powercolor LCS 7970 | PCP&C 910 Silencer | Crucial C300 + Intel X-25V + Raptors | D-Tek Fusion v2 | Swiftech MCR Drive 360 | HP LP2475
Finally released:
http://www.evga.com/articles/00462/
Gaming/Rendering rig:
eVGA X58 Tri-SLI
Intel i7-970 w/ Corsair H100
24gigs Corsair 2000s
eVGA GTX580 3GB
Too many HDD's
LG Blu-ray player
Corsair hx1050 psu
Corsair 800D case
Correct. But normally the number of such GPIO-controllable pins is reduced to the required minimum during PCB design (the rest of the VRM's VID pins are hardwired), so normally there is just 1 GPIO pin connected to the VRM if 2 different voltages are needed, 2 pins if up to 4 voltages are needed, etc. So the most you can expect is a limited set of fixed voltages. And the maximum one is normally already in use.
For some reason or another, I can't log into EVGA? It seems their server or some sh*t is down. If I click "Member Login" I get a "PAGE CANNOT BE DISPLAYED" error every time. Damn it, I've been waiting for so long for this app. EVGA, FIX YOUR WEBSITE.
Nothing anymore
I'm on the site now and it is fine
Stock voltage at 2D settings is 1.11V per RT. I also have an EVGA Superclocked card; that accounts for the higher 2D stock voltage. The 3D stock voltage is 1.13V. Don't know how high the GPU core will go for max clocks, but I have noticed that with a slight adjustment, well below max clocks, the performance increase is very noticeable.
Edit: EVGA Voltage Tuner works like a charm. RT is better because it makes you think about what the hell you're actually doing!
Last edited by Sailindawg; 02-07-2009 at 10:51 AM.
Here is an 820 MHz core, stock cooled, from August '08 @ TechPowerUp:
http://img.techpowerup.org/080801/bios.jpg
@Sailindawg: how high does your 260's core go?
Finally have been able to get 720 core / 1570 shader / 1170 stable using only 1.15V on my GTX 295. Awesome job, EVGA.
Wow, that post from TechPowerUp was pretty sweet. And it's stock cooled like mine. Don't know what max clocks are; I ought to figure it out!
Update: Been playing around all afternoon. The most stable clocks I was able to get were 774/1548/1306 @ 1.165V per EVGA Voltage Tuner. The shaders did not like getting much beyond 1548 or so. This was Vantage stable. Higher volts had no impact on stability; they only helped things crash faster. Max GPU temps were 58C with the fan at 96% duty cycle. I ran this using Windows 7 64-bit beta and 185.20 beta drivers. Mileage may vary with XP 32-bit and a different driver. Still, this 260's got a lot of unlocked power.
Last edited by Sailindawg; 02-07-2009 at 03:04 PM.