If you do, make sure it's a 65nm chip for playing with voltage. The 55nm chips got budgeted and slapped with a cheaper 4-phase VR. Cutting costs on the shrink, you know how it is ;(
G200 chips don't respond all that well to voltage anyhow; well, they respond with heat, not so much with groundbreaking performance. Even with voltage increased to around 1.25V they still become shader limited early. Most cards won't do more than 1674MHz shader and 780MHz core, and even then keeping the card cool is a feat in itself. If you can keep the temps near zero you might have more luck with lots of voltage.
Last edited by mikeyakame; 01-24-2009 at 10:34 AM.
DFI LT-X48-T2R UT CDC24 Bios | Q9550 E0 | G.Skill DDR2-1066 PK 2x2GB |
Geforce GTX 280 729/1566/2698 | Corsair HX1000 | Stacker 832 | Dell 3008WFP
well if it has writable registers for selecting voltage and you can figure out which ones they are then awesome
Not all voltage regulators support selecting voltage through register writes. You need to check the datasheet for the VR and hope it has some kind of useful register descriptions!
But the guide posted on register writing only applies to Volterra VT11xx registers; by this I mean the register address mapping, i.e. 0x1A.
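For anyone curious what such a register write actually amounts to on the wire, here is a minimal sketch. Only the 0x1A register index comes from the guide mentioned above; the I2C slave address and the data byte are made-up placeholders, and the real VID encoding has to come from the VT11xx datasheet.

```python
# Minimal model of an SMBus byte-write, which is all a software voltage
# change boils down to: one data byte sent to the regulator's
# voltage-select register. Register index 0x1A is the one named in the
# VT11xx guide; the slave address (0x70) and data byte (0x2A) are
# made-up placeholders, NOT real VT11xx values.

VOLTAGE_SELECT_REG = 0x1A  # from the VT11xx register mapping in the guide
DEVICE_ADDR = 0x70         # hypothetical 7-bit I2C slave address

def build_write(device_addr, register, data):
    """Return the byte sequence an SMBus byte-write puts on the bus:
    the 7-bit slave address shifted left with the R/W bit clear (write),
    then the register index, then the data byte."""
    if not 0 <= device_addr <= 0x7F:
        raise ValueError("7-bit I2C address expected")
    for b in (register, data):
        if not 0 <= b <= 0xFF:
            raise ValueError("register and data must fit in one byte")
    return bytes([(device_addr << 1) | 0, register, data])

frame = build_write(DEVICE_ADDR, VOLTAGE_SELECT_REG, 0x2A)
print(frame.hex())  # e01a2a
```

On real hardware this frame is what the I2C master sends; writing the wrong VID byte to a real regulator can kill a card, so double-check the datasheet before trying anything.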
DFI LT-X48-T2R UT CDC24 Bios | Q9550 E0 | G.Skill DDR2-1066 PK 2x2GB |
Geforce GTX 280 729/1566/2698 | Corsair HX1000 | Stacker 832 | Dell 3008WFP
Just wanted to add that the 285s don't have the same Volterra IC as the 280/260. They have an Intersil ISL6327, and here is the datasheet:
http://www.intersil.com/data/fn/FN9276.pdf
Don't know what to look for, so I'm not sure if it supports software voltage control.
Lian-Li V1000B || i5 2500k || Gigabyte P67-UD4 || 4GB Dominator 1600s || eVGA GTX 570 || 60GB OCZ Vertex || Corsair HX850
Working! GREAT!
1.6V... kind of a big jump in voltage if stock is 1.2V. That's a 33.3% increase.
what were your clocks at 1.2V vs 1.6V?
Desktop
[Asus Rampage III Gene] [i7 920 D0] [12GB OCZ3B2000C9LV6GK] [HIS HD 5970] [SeaSonic X750 Gold ] [Windows 7 (64bit)] [OCZ Vertex 30GB x3 Raid0] [Koolance CPU 360] [XSPC Razer 5970] [TFC 360 rad, D5 w/ Koolance RP-450X2]
HTPC
[Origen AE S10V] [MSI H57M-ED65] [ i5-661 w/ Scythe Big Shuriken] [Kingston HyperX LoVo 4GB ] [ SeaSonic X650 Gold ] [ OCZ Vertex 30GB SSD ] [ SAMSUNG Spinpoint 640GB 7200 RPM 2.5"][Panasonic UJ-225 Blu-ray Slot Burner] [ Ceton InfiniTV4]
this really made my day! now we need a thread with Vcore/OCgain chart or something.
Well, you can have my Vcore/OC-gain figure first. It's a big fat zero. Like some of the knowledgeable folks on this forum have said, temps matter more than Vcore on this architecture.
Anticipating an increase in Vcore, I pulled my whole system out of my ghetto case for better airflow/temps. And the extra Vcore did nothing, zilch, nada. The default Vcore on my cards is 1.19V. The cards are a GTX280 Zotac AMP and an MSI Super OC, both already highly OC'd at 702/1402/1150. I went as high as 1.35V but that did nothing; tried everything in between too, same result. But the temp improvement from the open-air setup did improve the OC a little, from 712/1456/1332 to 720/1512/1332 at default Vcore.
Last edited by auchkoenig; 01-24-2009 at 01:11 PM.
Just ran FurMark and my VRMs hit 95°C at 80% fan speed! Is that normal? I don't think bumping up my voltage will do much for me if my VRMs are running this hot as is.
I ran 740/1566/2600 on 1.21V just then; it only started artifacting in FurMark at 70%, and it's pretty damn warm here this morning in AU. I'll hit up some air-conditioned testing later this morning.
I can normally run 740/1512/2700 no problems at 1.19V, and on a cold day 730/1566/2700.
DFI LT-X48-T2R UT CDC24 Bios | Q9550 E0 | G.Skill DDR2-1066 PK 2x2GB |
Geforce GTX 280 729/1566/2698 | Corsair HX1000 | Stacker 832 | Dell 3008WFP
I'm running on chilled water so the temps are manageable. BTW, Unwinder's advice on how to get it to work at RivaTuner startup worked like a charm.

P.S. Guys, how long do you run ATITool artifact scanning (or is there a better utility?) before you know the OC is good? Is 10 minutes enough?

Quote:
RivaTuner's task scheduler module was designed specifically for such tasks. Go to the <Scheduler> tab, click the <Add new task> button and type in a task name, e.g. "Voltage mod". Select the <Launch application on specified schedule> task type, type in the %RT% macro as the path (RivaTuner will expand it into a fully qualified path to itself when executing) and the desired I2C write commands in the command line field, then select the desired schedule type, e.g. <Run task at RivaTuner startup>.
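Assuming the guide's write commands use RivaTuner's I2C write switch, the scheduler task fields end up looking something like this. This is a sketch only: the bus number, device address and data byte below are placeholders, and the real values for your card come from the VT11xx guide (only register 0x1A is from the guide itself):

```
Path:          %RT%
Command line:  /wi3,70,1A,2A
```

RivaTuner then re-issues that write every time it starts, so the voltage survives a reboot without you having to remember to reapply it.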
I'm still testing: 1.3V, 758 core, 33°C GPU.
Next I'll try 780, then 800.
Edit: 758 stable for 10 min, 783 stable for 10 min, 810 artifacts from the beginning. Should I try more voltage, or is going over 1.3 risky?
Edit 2: tried 810 with 1.3, 1.33, 1.38, still artifacts...
So 783/1566/1152 for the moment.
Anyone think unlinking the shaders might help?
Edit 3: I've been staring at the ATITool fur cube artifact scanner for an hour now looking for artifacts, and it's messing with my head. Will continue tomorrow.
Last edited by drifter; 01-24-2009 at 02:01 PM.
Over 1.3 is useless; you've reached the chip's limits it seems, mate. I can't remember how high k1ngp1n managed to clock a 280 on liquid nitrogen, but even with that it wouldn't have held out for continuous use.
What shader clock have you hit? That's the critical factor; 1674 is the highest most can push.
DFI LT-X48-T2R UT CDC24 Bios | Q9550 E0 | G.Skill DDR2-1066 PK 2x2GB |
Geforce GTX 280 729/1566/2698 | Corsair HX1000 | Stacker 832 | Dell 3008WFP
I want a 24/7 OC so I might as well stick to 783. Should I clock the shaders higher? Any benefit in that, or is it better to keep them linked?
Um, depends what shader clock you're using linked, heh. I never link shaders, so I'm not familiar with what shader clock you're at with 783 core!
DFI LT-X48-T2R UT CDC24 Bios | Q9550 E0 | G.Skill DDR2-1066 PK 2x2GB |
Geforce GTX 280 729/1566/2698 | Corsair HX1000 | Stacker 832 | Dell 3008WFP
Shaders run at double the core in linked mode, so 1566.
Oh, then hell yes, try 1620. If that's OK, go for the big daddy 1674, and if by some fluke that passes too, jump to 1728. Lower the core a little while testing 1620 and 1674; shader clocking on the 280 is more beneficial than core. Core doesn't give much improvement in Vantage and other benchmarks, while shader makes more of a difference from one step to the next. If you don't go too crazy on the core clock you may be alright for stability. 760-770 is as far as I'd go on core; get the shader as high as you can.
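Putting numbers on that advice: linked mode runs the shader at exactly twice the core, and the targets quoted in this thread (1512, 1566, 1620, 1674, 1728) are 54MHz apart, so the ladder of clocks to test is easy to generate. A quick sketch (the 54MHz step size is just inferred from the clocks quoted in this thread):

```python
# Shader clock arithmetic for the GTX 280 as discussed above:
# linked mode means shader = 2 x core, and the shader targets in this
# thread step up in 54MHz increments.

def linked_shader(core_mhz):
    """Shader clock when shaders are linked to the core clock."""
    return core_mhz * 2

def shader_ladder(start_mhz, steps, step_mhz=54):
    """Successive shader targets to test, one step at a time."""
    return [start_mhz + i * step_mhz for i in range(steps)]

print(linked_shader(783))      # 1566, i.e. 783 core linked
print(shader_ladder(1512, 5))  # [1512, 1566, 1620, 1674, 1728]
```

Testing one rung at a time, as suggested above, makes it obvious which domain (core or shader) is the one that gives out first.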
DFI LT-X48-T2R UT CDC24 Bios | Q9550 E0 | G.Skill DDR2-1066 PK 2x2GB |
Geforce GTX 280 729/1566/2698 | Corsair HX1000 | Stacker 832 | Dell 3008WFP
Finally, I can't resist kicking in:
Verified this on my GTX295 and it works as expected! The default voltage is 1.04V. I guess NVIDIA reduced the voltage here to keep the cards a bit cooler, so increasing Vcore might help a bit more. I haven't yet tested overclocking the card, just verified that this works here, and it does.
Cooling is an issue though, as I have already reprogrammed the fan controller and temps are hitting 84°C during hours of Crysis Wars. Fan duty is at about 70%, so there's still some headroom!
Processor: Intel Core i7 990X
Motherboard: ASUS Rampage III Extreme
Memory: Corsair CMT6GX3M3A2000C8
Video Card: MSI N680GTX Lightning
Power Supply: Seasonic S12 650W
Case: Chieftec BH-01B-B-B
Wow, 1.04V is pretty low even for a 295! I expected at least 1.10-1.11V; not bad at all for a shrink. You'll have stability issues with too much shader frequency at temps like that, but give it a try anyway, interesting to see the results nonetheless!
DFI LT-X48-T2R UT CDC24 Bios | Q9550 E0 | G.Skill DDR2-1066 PK 2x2GB |
Geforce GTX 280 729/1566/2698 | Corsair HX1000 | Stacker 832 | Dell 3008WFP
Is it possible to change the voltage on GTX260 55nm cards?
1.04 is the idle voltage, I think.
Also, to the people who say increasing voltage does not help:
I tried 810 core, 1620 shader, and with 1.3 got lots of artifacts, 1.4 much less, 1.41 almost gone, 1.42 stable for 5 min.
Also, unlinking the shaders got me artifacts...
Edit:
I tried again to unlink the shaders, and this time I left them at my stable 1566 and got to 810 core (at 1.3V!) with no artifacts for 5 minutes. When I tried 810 core and 1620 shaders I got artifacts. Apparently my GPU is shader limited.
Last edited by drifter; 01-24-2009 at 02:37 PM.
Obviously not on the 295! It remains 1.04 even when it switches to 3D mode (clocks increase from 300MHz up to 576MHz)!
As I have not yet found the Vcore measuring points, I'm relying on the readings I get from software. I'll verify this with a voltmeter later, as soon as I know where the measuring points are.
Processor: Intel Core i7 990X
Motherboard: ASUS Rampage III Extreme
Memory: Corsair CMT6GX3M3A2000C8
Video Card: MSI N680GTX Lightning
Power Supply: Seasonic S12 650W
Case: Chieftec BH-01B-B-B
It's probably almost useless to be turning up the voltage if you're using air cooling, as the cards get pretty hot at stock voltage. Any more voltage is just going to create more heat.
Anyone burn out their cards yet? lol
Desktop
[Asus Rampage III Gene] [i7 920 D0] [12GB OCZ3B2000C9LV6GK] [HIS HD 5970] [SeaSonic X750 Gold ] [Windows 7 (64bit)] [OCZ Vertex 30GB x3 Raid0] [Koolance CPU 360] [XSPC Razer 5970] [TFC 360 rad, D5 w/ Koolance RP-450X2]
HTPC
[Origen AE S10V] [MSI H57M-ED65] [ i5-661 w/ Scythe Big Shuriken] [Kingston HyperX LoVo 4GB ] [ SeaSonic X650 Gold ] [ OCZ Vertex 30GB SSD ] [ SAMSUNG Spinpoint 640GB 7200 RPM 2.5"][Panasonic UJ-225 Blu-ray Slot Burner] [ Ceton InfiniTV4]