
Thread: GTX 280 voltmod thread

  1. #26
    Registered User
    Join Date
    Nov 2006
    Posts
    77
    Sorry for the delay guys, way too much going on around here.

    The mod went fine, but the results really were a mixed bag. The first thing is, it's a hot chip and it responds really well to lower temps, but not so much to increased voltage.

    Here is what the card looked like with the EVGA block removed: lots of white silicone, with pretty bad contact on the memory chips. I re-seated the block with AS Ceramique to check contact, and it was better with a little more paste and more torque on the block. Nylon spacers are used between the block and the PCB.



    The finish on the block was so-so; the machine work was a little rough, but all that buffing costs money I guess. I pulled the block apart to clean the leftovers from tapping the inlet/outlet holes for fittings.






    I'm sure you've all seen EVGA's take on cooling; the logo is pretty neat and it seems to work pretty well, but seeing as I was already in there I figured I'd break out the dremel and create a little more surface area. I didn't go that deep, as it is not that thick to start with. Flame on if you want, but I couldn't think of how this would be detrimental; it should at least create more turbulence through the "pin" array.



    The mod was refreshingly simple, with nothing really to burn up. I used to mount the pot's legs directly to the solder point, but if the card is going to be moved around a lot, I have found the small wires give a little more without damaging the board.



    I actually forgot to install a vGPU check line before bolting the block up, so I had to do this while the block was in place; otherwise I would have used a point closer to the outer edge of the board.





    While I had the block off I polished all the visible surfaces, and hit the ram and gpu seating surfaces lightly, which took off some edges you could feel with your fingernail. It's pretty sharp all polished.





    Results:

    Using chilled water (water temp is maintained around 60F with a 5,000 BTU Daewoo AC), the card responded well. Idle temps were 20C while loaded temps were 28-31C.

    I found there is an issue with clocking the card past the high 700s into the 800s range using Riva/Precision. You can get past this by using Nibitor, but your core clock will be reported as 100MHz in Riva/Precision. Also, as I have never loved Nvidia's dynamic clock switching (anyone remember the 7900 not switching?), I set all the clocks and VIDs the same in the bios. This caused a rather nasty corrupt screen when loading Windows, so I flashed it back; clock switching will stay, I guess.

    I was able to run 826 core, 1620 shader and a pretty impressive 2754 on the memory. I could run 2800 on the memory for benchmarks, but it showed some errors in ATITool. 1674 shader was not doable at any voltage. Also, forcing different shader clocks in the bios had mixed results, as did forcing odd clock frequencies: setting the core to 794 and shader to 1512 would cause an instant hard-lock in ATITool, while setting the core to 794 and shader to 1566 was stable. Setting the core to 795 would instantly hard-lock. All clock changes had to be made in the bios, as Riva/ATITool would not set clocks above 783 core.

    As far as voltage goes, anything over 1.350v caused it to pop pixels in ATITool constantly, even with loaded temps in the low 30s. The card was bench-stable/game-stable at these speeds with the chilled water, but was not stable once loaded temps passed 45-47C (with the chiller off). Voltage was 1.338v for the tests.

    The card belongs to a friend who was visiting, so we didn't have a ton of time to test it. It's going in a system with a QX9650 and I'll post up some benchmarks once he gets it running in that system.

    Here is a 3DMark06 run on my Q6600:
    [Attached thumbnail: 3Dmark06-21585.gif]

  2. #27
    Xtreme Member JaD
    Join Date
    Oct 2002
    Posts
    257
    What you are missing is that core and shader clocks are related even in unlinked mode.
    You can't set the core over certain frequencies if you don't clock the shaders higher as well: when the shaders are at 1458, the max core frequency will be 712; at 1512, the core will top out at 748, etc.
    That is caused by the different core/shader ratio the GTX series has, which is different from the 8800's.
    I've heard there are people trying to change that ratio through the bios, but I don't know the details.
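    For illustration, here's a quick back-of-the-envelope check of the shader-to-core ratio implied by those two data points; the ~2.02 floor and the 800 MHz target below are assumptions for the sketch, not a confirmed hardware formula.

    Code:
    # Shader/core pairs reported above: (shader clock, max core clock) in MHz
    pairs = [(1458, 712), (1512, 748)]
    for shader, core in pairs:
        print(f"shader {shader} -> max core {core}: implied ratio {shader / core:.3f}")

    # Rough rule of thumb from those numbers (assumption, not a confirmed formula):
    # the shaders must run at least ~2.02x the desired core clock.
    target_core = 800
    print(f"core {target_core} likely needs shader >= ~{target_core * 2.02:.0f}")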

    As for software voltage adjustments, considering the same Volterra chip is being used, I really hope W1zzard can sort it out like he did with ATITool for the 2900s.
    Looking forward to the ATITool substitute he's been talking about.

  3. #28
    Xtreme Member
    Join Date
    Apr 2006
    Location
    NY
    Posts
    137
    Quote Originally Posted by VETDRMS View Post
    Nice read my friend!

  4. #29
    Registered User
    Join Date
    Nov 2006
    Posts
    77
    Quote Originally Posted by JaD View Post
    Thanks JaD, I am able to change the core using Nibitor, hopefully the ratio will be directly editable at some time in the future.

  5. #30
    Xtreme Member
    Join Date
    Apr 2008
    Location
    Poland
    Posts
    240

    GTX 280 v-mod

    Hey! What is the standard GPU voltage you have?
    i7 2600K @ 4.5GHz 1.28v 24/7 with offset mode + Venomous X black - ASUS Maximus IV Extreme R3.0 - EVGA GTX 570 SC - Corsair Dominator GT 2x2GB 2000MHz CL8 @ 2133 Cl9 + AIrFlow PRO- Corsair AX850W - Silverstone Fortress 2 - LC back soon My Gaming account on >> YouTube

  6. #31
    Xtreme Member
    Join Date
    Feb 2007
    Location
    Kentucky
    Posts
    407
    Has anyone flashed a GTX 280 yet? I was thinking of trying to find an FTW edition bios and flashing to it. Or can someone check the bios voltage settings from theirs? Mine currently has Extra set at 1.18V, 3D set at 1.06V, and 2D set at 1.03V.

    Oh Seba84, would you attach your bios please?

    Asus Maximus Formula (Rampage Conversion Bios 0403)
    Q6600 G0 stepping@4.05 Ghz
    8GB G. Skill 1066 @ 1081 Mhz
    EVGA 280 GTX
    Auzentech Prelude
    2X 74 GB WD Raptors in Raid 0 Windows 7 Ultimate 64
    2X 1 TB WD Caviar Blacks in Raid 0 Vista Ultimate x64
    Thermaltake Toughpower 1200W
    DTek Fuzion V2 in a custom CPU only water loop
    Coolermaster Stacker 830
    Hanns-G 28" widescreen
    Klipsh Pro-Media 2.1
    G15 Keyboard and G9 Laser Mouse

  7. #32
    Xtreme Member
    Join Date
    Jan 2005
    Posts
    435
    Nice 06 scores, wow, over 20k on a single GPU, ATI can't touch that. Currently I'm waiting for a Koolance waterblock along with an Exos 2-LS to see if I can push my card any more for stable gaming use. I have a DD Tieton on a Koolance Exos 1 running 1/4 in tubing, so I imagine the results will improve with the new block (thinking the backplate and actual block quality will help) and the Exos 2-LS. Right now load temps hit 63c and I still manage clocks of 756/1512/2650 rock solid stable. I'm hoping to be able to raise the shaders higher, as 1600 is stable till it heats up past 50c like you mentioned. This would allow me to go for 800/1600/2700, however I'm not sure, especially with no voltage adjustment.

    About the voltage adjustments: did you set them all with a bios flash or a hard mod? Does anyone know if voltage adjustment will be available through the driver level like ATI's? I mean, Nvidia's own performance tab shows GPU and memory voltage with drop down tabs for adjustments, yet they are greyed out.

    Any additional info would be great
    I7 920@4ghz
    EVGA Classified X58
    6GB Corsair 1600
    2X GTX 285@756/1550/2700
    Corsair 1000WT
    ASUS 25.5 LCD/58"Plasma TH-58PZ800U
    Custom H2O cooling
    Oh, that's a pic of my other rig.

  8. #33
    Xtreme Member
    Join Date
    Apr 2008
    Location
    Poland
    Posts
    240
    Quote Originally Posted by speedfreak86 View Post
    The bios from the FTW version doesn't differ at all; it also has Extra at 1.18v, 3D at 1.06v and 2D at 1.03v.

    Write to me on priv (PM) and I'll give you a link to the FTW bios.

    I have flashed the bios from the HC version.

    My PC: GTX 280 with the Aquacomputer GTX 280 water block










    Last edited by seba84_2005; 07-09-2008 at 05:58 PM.

  9. #34
    Xtreme Member
    Join Date
    Jan 2005
    Posts
    435
    Quote Originally Posted by seba84_2005 View Post

    Say what, lol. Are you saying the bios from the FTW has increased voltages? The engineers from EVGA said that they do not add voltage to any cards they sell.

    What are you using to flash the bios? Nibitor?

  10. #35
    Xtreme Member
    Join Date
    Jun 2007
    Location
    North Rose, NY
    Posts
    280
    No, he said the voltages are the same. And that if you want the FTW BIOS, send him a PM.

    Now the pics of his rig are just flat out sex. VERY clean, I like it very much.

    Now, will the GTX 280 mods work on the 260?
    Intel i9 7900X @ 4.6GHz @ 1.126v
    ASUS X299 TUF MARK 1
    32GB G.Skill DDR4 2800
    2x ZOTAC AMP EXTREME Core GTX1080Ti's in SLi
    Loop 1: Apogee GTZ CPU block, MCR360 rad w/ Scythe fans, Micro-res, Aquaextreme 50Z pump
    U2-UFO case

  11. #36
    Registered User
    Join Date
    Jun 2005
    Location
    Thessaloniki
    Posts
    72
    Finished my GTX 280 v-mod yesterday, but my water pump is malfunctioning and the water flow is very poor, so I'm unable to test the overclock until I change it. At least I was happy to find a VGPU measure point on the top of the card, because I couldn't use the ones shown at vr-zone, as I had already mounted the waterblock.

    BTW VETDRMS, I'm starting to worry about power consumption. In practice the stock GTX 280 eats approximately 200W under load, according to reviews. At 1.338V and 826 MHz core it should require about 75% more, hitting 350W, while its power inputs are rated for 300W. I hope my Enermax 1KW Galaxy can feed a lot more than the rated 75W and 150W through the 6 and 8 pin inputs.
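    For reference, that ~75% figure matches the usual P ~ f * V^2 dynamic power scaling. A minimal sketch, assuming a 602 MHz / 1.18 V / ~200 W stock baseline (1.18 V being the "Extra" bios voltage mentioned earlier; the baseline values are assumptions, not measurements from this card):

    Code:
    # Back-of-the-envelope dynamic power scaling: P scales with f * V^2.
    P_stock, f_stock, V_stock = 200.0, 602.0, 1.18   # W, MHz, V (assumed baseline)
    f_new, V_new = 826.0, 1.338                      # v-modded clocks/voltage

    P_new = P_stock * (f_new / f_stock) * (V_new / V_stock) ** 2
    print(f"Estimated load power: {P_new:.0f} W")    # ~353 W, about 76% over stock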
    Last edited by Orion24; 07-17-2008 at 01:58 PM.
    CPU: Core i7 920 D0 @4.2 GHz 21x200, 3.8 GHz uncore, 1.41875 Vcore, 1,56V QPI/VTT
    Cooling: Zalman CPNS9900 with AS5
    Mainboard: Giga-Byte X58A-UD7
    RAM: 12 GB Corsair 1600 CMD12GX3M6A1600C8
    Video Card: EVGA GeForce GTX 280 713/1428/1350 @ stock voltages
    Video Card cooling: Thermalright HR-03 GTX, heatspreader removed, AS5
    PSU: Enermax 1KW Galaxy
    Storage: Intel X25-M G2 160GB, 2x300GB VRaptors RAID-0, 3x1TB Samsung SpinPoint F3 RAID-0

  12. #37
    Champion
    Join Date
    Jun 2003
    Location
    cape town
    Posts
    1,172
    anyone seen a vmem mod for these cards?

  13. #38
    Registered User
    Join Date
    Aug 2006
    Posts
    91
    Saw the 4.2GHz Q6600 and was like WoW... then saw the nearly 1.7v Vcore and was like... omg nvm lol.

    The GTX 280 is a good card, and yes, it is the most powerful single card out there, but price vs performance is where it's at IMO. I grabbed two 4850's for $150 each and am stuck at 19,400 in 3DMark06 right now, being held back by my CPU because I can't break 3.7GHz on my quad, but I'm all air cooled and have no volt mods :P

    So I'm curious, what would your score be with your Q6600 @ 3.8GHz? I wonder if it would be equal to mine. I assume slightly better, since dual cards should use more CPU overhead.
    Specs: Q6600 @ 3.65ghz 1.42v / Asus P5Q Deluxe / 2x2gb Mushkin DDR2@810 / PC P&C 750w / 2x ATI 4850 Crossfire @ 710/1050

  14. #39
    Registered User
    Join Date
    Jun 2005
    Location
    Thessaloniki
    Posts
    72
    OK, now I finalised my testing and I am not particularly pleased with the results, but here goes:

    Maximum overclock: (stock voltage/v-modded voltage)

    Core: 713 --> 771 (+8.13%)
    Shader: 1458 --> 1566 (+7.4%)
    Memory: 1377 (no v-mem mod)
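    (The percentages above are just the clock ratios; a trivial check:)

    Code:
    # Verify the overclock gains as simple clock ratios.
    for name, stock, modded in [("core", 713, 771), ("shader", 1458, 1566)]:
        print(f"{name}: {stock} -> {modded} = +{(modded / stock - 1) * 100:.2f}%")
    # core: +8.13%, shader: +7.41%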

    I couldn't get the shader 100% stable at 1620 no matter what. 1566 is stable at 1.25V, and I went all the way up to 1.42V as an attempt to get 1620 stable, and still no go. In fact, the more I kept increasing the voltage, the earlier the tearing/artifacting seemed to occur (load temps mid to high 60s). Another note: I never hit a core overclock instability, since when setting the clocks with RivaTuner I was unable to set higher core clocks given the low shader clock headroom. If flashing the BIOS gets around the core/shader 1/2 limitation, I'll give it a shot.

    Looks like yields are a real problem for the GT200 chips. G92s can do 2.0 GHz shader at 1.1-1.15V, and I need 1.25V to stabilize this one at 1.56 GHz. It has so many more SPs, but no more than ~40% more shader performance in practice due to these yields.

    ** update **

    It seems that at 1.30V the card artifacts when the core temperature exceeds 75 degrees Celsius (this is not the case at stock voltage). In fact, I wonder where the temperature sensor is located. If the sensor doesn't read the real die temperature, then I must assume that for a given temperature reading, the actual die temperature is a lot higher when higher voltages are used. Maybe 1620 MHz was unattainable because of cooling issues after all. I'm having a hard time controlling the temperature of this thing with the weather here at 35-45 degrees Celsius; I'll only know in the winter how much of a bottleneck the temperature is.
    Last edited by Orion24; 07-27-2008 at 10:43 AM.

  15. #40
    Xtreme Member
    Join Date
    Jan 2005
    Posts
    435
    Does anyone know if Nibitor can set voltage and have it stick on the G200? I would like to play around with some voltage adjustments and OCing but do not want to hard mod the card. For one, I can't solder for crap, and two, it does not seem like volt mods help at all unless you have an extreme cooling solution. That was how Fugger managed such high OCs on no volt mod. It seems like it's best to put your money into better cooling and not volt mods.

    I still would like to be able to set and test out some volts and OCs myself though

  16. #41
    Xtreme Member
    Join Date
    Jun 2007
    Location
    North Rose, NY
    Posts
    280
    BIOS volt modding is not an option with most current nvidia cards.

  17. #42
    Xtreme Member
    Join Date
    Aug 2004
    Location
    rutgers
    Posts
    465
    Is the vmod the same for the GTX 260? They appear to share the same PCB, but I'm wondering if anything such as the trimmer resistance should be changed.

  18. #43
    Xtreme Member
    Join Date
    Nov 2002
    Location
    Netherlands
    Posts
    232
    Quote Originally Posted by Orion24 View Post
    What cooling do you use?
    The stock air-cooler ?
    QX9650 @ 4450 mhz
    Asus Striker II Extreme 790i
    4 GB Corsair Dominator 1800 @ 1870 8-8-8-20 1T 1.8v
    3 x Asus GTX280 TRI-SLI 810/1647/1296
    Asus Xonar DX
    Thermaltake Toughpower 1500 W
    Dell 30" 3007WFP
    Logitech G7, G15, G25
    Watercooling Hailea 1500 chiller, EK, Swiftech, D-Tek
    Dual Prometeia site

  19. #44
    Registered User
    Join Date
    Jun 2005
    Location
    Thessaloniki
    Posts
    72
    Quote Originally Posted by Justifire View Post
    Watercooling with an EK waterblock and my old crappy Thermaltake BigWater 745 installation (changed the water pump to a more powerful 500l/h one). The video card is cooled third (CPU --> NB --> GPU).

    The water temperature is about 45 degrees Celsius by the time it enters the EK waterblock. I can't do much about it in the middle of the summer. We do have an A/C in the house but it is too far away from the PC.

  20. #45
    Xtreme Member
    Join Date
    Nov 2002
    Location
    Netherlands
    Posts
    232
    Quote Originally Posted by Orion24 View Post
    Get yourself a waterchiller

  21. #46
    Xtreme Guru
    Join Date
    Jan 2005
    Location
    Tre, Suomi Finland
    Posts
    3,858
    G200 overclocking & modding isn't too popular on XS? There's just a couple thousand views and only a few people with the cards discussing them.
    :\
    You were not supposed to see this.

  22. #47
    Xtreme Member
    Join Date
    Jan 2005
    Posts
    435
    Quote Originally Posted by largon View Post
    It's because volt mods do not help and for the most part can hinder performance. The best thing to do for a G200 is get it as cool as you can with a quality water block like the DD Tieton, and use a killer water cooling setup along with it. Basically, the cooler the core, the higher they clock.

  23. #48
    Xtreme Member
    Join Date
    Aug 2004
    Location
    rutgers
    Posts
    465
    Quote Originally Posted by rodman View Post
    Yeah, I did the vmod on a GTX 260 and it seemed to make things worse rather than help... I unsoldered it a few days later.

    This was on water too btw.

  24. #49
    Xtreme Member
    Join Date
    Dec 2003
    Location
    Cape Girardeau, MO
    Posts
    216
    Will it at least help on the memory?
    Q6600 @ 3.4ghz 1.33v
    Evga 680i
    8800GTS 640
    2gb Adata Extreme
    OCZ Gamextreme 600w

  25. #50
    Xtreme Addict
    Join Date
    May 2008
    Location
    Land of Koalas and Wombats
    Posts
    1,058
    Changing VID tags in bios does nothing unless you actually change the physical voltage; then it's a question of whether there is mapping for those voltage ranges or not.

    I've managed to get my memory to 2698MHz on air. I hit a wall at 2664MHz, which I got around by loosening timings on Timing Set 0 in the bios. I can't remember what the original memory timings were (you have to compare against an unmodified bios), but from what I can tell the default timing set used is Timing0. Be wary though: this set is also used for boot clocks. It can be changed, but I don't know if all the other timing sets are used and, if so, for what exactly. If there is a spare unused timing set, it could be changed to the original values of Timing Set 0 and set as the default timing set for boot clock timings. That would be the safest way to go when trying to set timing values tighter than default, or when loosening them too much. I only changed tRC, tRAS and tRP; tRC as a result of changing tRAS and tRP (tRAS + tRP = tRC).
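    To make the tRC relationship concrete, here's a tiny sketch; the timing values are hypothetical placeholders, not dumps from a real GTX 280 bios.

    Code:
    # Keep tRC = tRAS + tRP consistent when loosening timings in the bios.
    # All values are hypothetical, for illustration only.
    tRAS, tRP = 21, 11            # assumed original Timing Set 0 values
    tRAS_new, tRP_new = 23, 12    # loosened to get past a memory clock wall

    tRC_new = tRAS_new + tRP_new  # tRC must follow from the other two
    print(f"tRC: {tRAS + tRP} -> {tRC_new}")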

    Memory timing values could presumably be changed on the fly via the I2C interface, by flipping bits in the GPU registers in realtime. I don't know enough about the registers to give any advice on this. This is really W1zzard's or Unwinder's playground, and they may be able to shed some light on which registers hold which values and which bits set which timings.

    Here's a screenshot of the bios I modified. It's an XFX XXX bios, which had the most recent bios version I could find. Nibitor 4.4 also permits changes to the fan controller registers, so I configured them properly, as software fan control via Rivatuner seems a little unpredictable and gives inconsistent results from boot to boot. Most likely something Nvidia has broken, but I don't have enough knowledge there to say for certain why. Setting the values in bios does work as implemented, though, and mine works very well now: 100% fan kicks in around 57-58c, and it follows the angle of tSlope from 50 -> 100 and 100 -> 50. It doesn't sound nearly as loud as default since it ramps up progressively and not all of a sudden.
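    As a rough model of that fan behaviour (a sketch only; the breakpoints are assumptions, not the actual Nibitor register values):

    Code:
    # Linear fan curve like the tSlope behaviour described above:
    # duty ramps from 50% toward 100%, hitting 100% around 57-58C.
    def fan_duty(temp_c, t_min=45.0, t_max=57.5, d_min=50.0, d_max=100.0):
        if temp_c <= t_min:
            return d_min
        if temp_c >= t_max:
            return d_max
        slope = (d_max - d_min) / (t_max - t_min)  # the "tSlope"
        return d_min + slope * (temp_c - t_min)

    for t in (40, 50, 55, 58):
        print(f"{t}C -> {fan_duty(t):.0f}% duty")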

    Last edited by mikeyakame; 08-14-2008 at 03:44 AM.

    DFI LT-X48-T2R UT CDC24 Bios | Q9550 E0 | G.Skill DDR2-1066 PK 2x2GB |
    Geforce GTX 280 729/1566/2698 | Corsair HX1000 | Stacker 832 | Dell 3008WFP

