Low-res pics attached; high-res front and back pics available from TPU.
EDIT: unmodded pics until the vmod is done & confirmed working.
Do we need to see a vmod or what?
I see nothing on those pics.
The idea is that as soon as they are on the web, they will be posted in this topic, I guess.
Yeah, I guess that's the reason this thread was opened.
PS: it looks so much like the 8800GTX/Ultra.
Wow that chip seems HUGE to me, or am I going crazy?
Top pic: can you please identify the square chip on the bottom right near the capacitor?
Also: the 8-pin chip in the middle of the triangle between the SLI connector and the left + top RAM banks, as well as all the ICs under the I/O chip?
Is the space for an 8-pin chip beside the 6-pin power plug populated?
Cheers,
K
Thanks for the heads up. Forgot to mention that my first post contains pics of unmodded cards. Edited.
Zooming in as far as I could, that voltage regulator chip on the lower right corner of the front side looks like a Volterra VT138***.
Both Zotac and POV cards reviewed on TPU have the 8-pin chip beside the 6-pin power plug unpopulated.
We need better pics
Found a high-res pic of the Volterra VT1165MF voltage controller, the same one used for the ATI R600.
Source: Quote:
Originally Posted by techPowerUp
http://img.techpowerup.org/080625/d2...b8f6ba7b03.jpg
vmod for Memory?
I knew voltage adjustments would be available a while ago, as Nvidia's own system monitor has voltage adjustments for core and memory right next to the GPU/shader and memory sliders. I think Nvidia will drop the 'bomb' in a future driver release shortly :up:
I sincerely hope that software control will be made available. That would be a major selling point for me.
So, any OC'ing results for GTX 260 & 280?
How do they respond to voltage?
I have spent most of the day modding an EVGA 280 GTX HydroCopper. I will be posting results with pics, etc. later this weekend. It's a hot chip, no doubt.
It does not respond to voltage over 1.350, and Riva/EVGA Precision cannot clock it over 800 MHz. The only way to get higher is by using Nibitor and flashing a BIOS.
:up:
Bump for results!
The core can only clock to half the shader clock at most. For example, if you have the shader clock set to 1500, then 750 core is as high as you can go. So 1450 shader would be 725 core, and so on. Linked or unlinked, the rule stands.
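To make the rule concrete, here's a minimal Python sketch of that cap, assuming the strict 1:2 core-to-shader ratio described in this post (a later post in the thread reports a slightly different divider, e.g. 712 max core at 1458 shader, so take the exact ratio with a grain of salt):

def max_core_for_shader(shader_mhz):
    # assumption: core tops out at exactly half the shader clock
    return shader_mhz // 2

for shader in (1450, 1500, 1566):
    print(shader, "shader ->", max_core_for_shader(shader), "max core")
# 1450 -> 725, 1500 -> 750, 1566 -> 783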
Why is it taking so long for driver-level voltage adjustments? I have asked Unwinder to look into supporting this in RivaTuner; he has yet to get back to me on this :rolleyes:
Sorry for the delay guys, way too much going on around here.
The mod went fine, but the results really were a mixed bag. First thing: it's a hot chip, and it responds really well to lower temps, but not so much to increased voltage.
Here is what the card looked like with the EVGA block removed: lots of white silicone, with pretty bad contact on the memory chips. I re-seated the block with AS Ceramique to check contact, and it was better with a little more paste and more torque on the block. Nylon spacers are used between the block and the PCB.
http://img68.imageshack.us/img68/635...asteps1.th.jpg
The finish on the block was so-so; the machine work was a little rough, but all that buffing costs money, I guess. I pulled the block apart to clean the leftovers from tapping the inlet/outlet holes for fittings.
http://img68.imageshack.us/img68/540...hinele9.th.jpg
http://img68.imageshack.us/img68/135...basejw6.th.jpg
I'm sure you've all seen EVGA's take on cooling; the logo is pretty neat and it seems to work pretty well, but seeing how I was already in there, I figured I'd break out the dremel and create a little more surface area. I didn't go that deep, as it is not that thick to start with. Flame on if you want, but I couldn't think of how this would be detrimental; it should at least create more turbulence through the "pin" array.
http://img68.imageshack.us/img68/955...sideta6.th.jpg
The mod was refreshingly simple, with nothing really to burn up. I used to mount the pot's legs directly to the solder point, but if the card is going to be moved around a lot, I have found the small wires give a little more without damaging the board.
http://img68.imageshack.us/img68/4286/vgpumodyr8.th.jpg
I actually forgot to install a vGPU check line before bolting the block up, so I had to do this while the block was in place; otherwise I would have used a point closer to the outer edge of the board.
http://img68.imageshack.us/img68/988...suresg9.th.jpg
http://img68.imageshack.us/img68/268...ure2qk5.th.jpg
While I had the block off I polished all the visible surfaces, and hit the RAM and GPU seating surfaces lightly, which took off some edges you could feel with your fingernail. It's pretty sharp all polished.
http://img68.imageshack.us/img68/847...shedgc0.th.jpg
Results:
Using chilled water (water temp is maintained around 60F with a 5000 BTU Daewoo AC), the card responded well. Idle temps were 20C while loaded temps were 28-31C.
I found there is an issue with clocking the card past the high-700s/800s range using Riva/Precision. You can get past this by using Nibitor, but your core clock will be reported as 100 MHz in Riva/Precision. Also, as I have never loved Nvidia's dynamic clock switching (anyone remember the 7900 not switching?), I set all the clocks and VIDs the same in the BIOS. This caused a rather nasty corrupt screen when loading Windows, so I flashed it back; clock switching will stay, I guess.
I was able to run 826 core / 1620 shader and a pretty impressive 2754 on the memory. I could run 2800 on memory for benchmarks, but it showed some errors in ATITool. 1674 shader was not doable at any voltage. Also, forcing different shader clocks in the BIOS has mixed results, as does forcing odd clock frequencies. Setting the core to 794 and shader to 1512 would cause an instant hard-lock in ATITool; setting the core to 794 and shader to 1566 was stable. Setting the core to 795 would instantly hard-lock. All clock changes had to be made in the BIOS, as Riva/ATITool would not set clocks above 783 core.
As far as voltage goes, anything over 1.350 V caused it to pop pixels in ATITool constantly, even with loaded temps in the low 30s. The card was bench-stable/game-stable at these speeds with the chilled water, but was not stable once loaded temps passed 45-47C (with the chiller off). Voltage was at 1.338 for the tests.
The card belongs to a friend who was visiting, so we didn't have a ton of time to test it. It's going in a system with a QX9650, and I'll post up some benchmarks once he gets it running in that system.
Here is a 3DMark06 run on my Q6600:
What you are missing is that the core and shader clocks are related even in unlinked mode.
You can't set the core over certain frequencies if you don't clock the shaders higher as well: when the shaders are at 1458, the max core frequency will be 712; at 1512, the core will top out at 748, etc.
That is caused by the different core/shader ratio the GTX series has, which is different from the 8800's.
I've heard there are people trying to change that ratio through the BIOS, but I don't know the details.
About software voltage adjustments: considering the same Volterra chip is being used, I really hope W1zzard can sort it out like he did with ATITool for the 2900s.
Looking forward to the ATITool substitute he's been talking about :)
Hey! What standard GPU voltage do you have? :)
Has anyone flashed a GTX 280 yet? I was thinking of trying to find an FTW edition BIOS and flash to it. Or can someone check the BIOS voltage settings from theirs? Mine currently has Extra set at 1.18V, the 3D set at 1.06V, and the 2D set to 1.03V.
Oh Seba84, would you attach your bios please?
Nice 06 scores, wow, over 20k on a single GPU, ATI can't touch that ;) Currently I'm waiting for a Koolance waterblock along with an Exos 2-LS to see if I can push my card any more for stable gaming use. I have a DD Tieton on a Koolance Exos 1 running 1/4in tubing, so I imagine the results will improve with the new block (thinking the backplate and overall block quality will help) and the Exos 2-LS. Right now load temps hit 63C and I still manage clocks of 756/1512/2650, rock solid stable. I'm hoping to be able to raise the shaders higher, as 1600 is stable until it heats up past 50C like you mentioned. That would allow me to go for 800/1600/2700, but I'm not sure, especially with no voltage adjustment.
About the voltage adjustments: did you set them all with a BIOS flash or a hard mod? Does anyone know if voltage adjustment will be available at the driver level like ATI's? I mean, Nvidia's own performance tab shows GPU and memory voltage and drop-down tabs for adjustments, yet they are greyed out.
Any additional info would be great :up:
The FTW BIOS doesn't differ at all; it also has Extra 1.18 V, 3D 1.06 V and 2D 1.03 V.
Write to me by PM and I'll give you the link to the FTW BIOS.
I have flashed the BIOS from the HC version :)
My PC: GTX 280 with an Aqua Computer GTX 280 water block :)
http://images26.fotosik.pl/244/17b59c7a65c24d5bm.jpg
http://images29.fotosik.pl/243/adb06c5860d9f08em.jpg
http://images31.fotosik.pl/312/fb91c9a2de2c51b6m.jpg
http://images33.fotosik.pl/313/f92d279a45b42b86m.jpg
http://images33.fotosik.pl/313/c947cf02690ea66fm.jpg
http://images26.fotosik.pl/244/cd735bd5fb0b0658m.jpg
Quote: Originally Posted by seba84_2005
The FTW BIOS doesn't differ at all; it also has Extra 1.18 V, 3D 1.06 V and 2D 1.03 V. Write to me by PM and I'll give you the link to the FTW BIOS.
I have flashed the BIOS from the HC version :)
Say what, lol. Are you saying the bios from the FTW has increased voltages? The engineers from EVGA said that they do not add voltage to any cards they sell.
What are you using to flash the BIOS? Nibitor?
No, he said the voltages are the same. And that if you want the FTW BIOS, send him a PM.
Now the pics of his rig are just flat out sex. VERY clean, I like very much.
Now, will the GTX 280 mods work on the 260?
Finished my GTX 280 v-mod yesterday, but my water pump is malfunctioning and the water flow is very poor, so I'm unable to test the overclock before I change it. At least I was happy to find a vGPU measure point on the top of the card, because I couldn't use the ones shown at VR-Zone as I had already mounted the waterblock.
BTW VETDRMS, I'm starting to worry about power consumption. In practice the stock GTX 280 eats approximately 200 W under load according to review data. At 1.338 V and 826 MHz core it should require about 75% more, hitting 350 W, while its power inputs are rated at 300 W. I hope my Enermax 1kW Galaxy can feed a lot more than the rated 75 & 150 W through the 6- and 8-pin inputs.
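For anyone wanting to check that 75% figure, here's a rough sketch of the math in Python, assuming dynamic power scales with f * V^2 (a first-order CMOS estimate, not a measurement) and a ~602 MHz / ~1.19 V / ~200 W stock operating point:

stock_f, stock_v, stock_w = 602.0, 1.19, 200.0   # assumed stock GTX 280 figures
mod_f, mod_v = 826.0, 1.338                      # the overclock quoted above
scale = (mod_f / stock_f) * (mod_v / stock_v) ** 2
print(f"{scale:.2f}x -> ~{stock_w * scale:.0f} W")
# ~1.73x -> ~347 W, right around the 350 W estimate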
anyone seen a vmem mod for these cards?
Saw the 4.2 GHz Q6600 and was like wow... then saw the nearly 1.7 V vcore and was like... omg, nvm, lol.
The GTX 280 is a good card, and yes, it is the most powerful single card out there, but price vs. performance is where it's at, IMO. I grabbed two 4850s for $150 each and am stuck at 19,400 in 3DMark06 right now, held back by my CPU because I can't break 3.7 GHz on my quad, but I'm all air-cooled with no volt mods :P
So, curious: what would your score be with your Q6600 @ 3.8 GHz? I wonder if it would be equal to mine. I assume slightly better, since dual cards should use more CPU overhead.
OK, now I finalised my testing and I am not particularly pleased with the results, but here goes:
Maximum overclock: (stock voltage/v-modded voltage)
Core: 713 --> 771 (+8.13%)
Shader: 1458 --> 1566 (+7.4%)
Memory: 1377 (no v-mem mod)
I couldn't get the shader 100% stable at 1620 no matter what. 1566 is stable at 1.25 V, and I went all the way up to 1.42 trying to get 1620 stable, and still no go. In fact, the more I kept increasing the voltage, the earlier the tearing/artifacting seemed to occur (load temps mid to high 60s). Another note is that I never hit core overclock instability, since when setting the clocks with RivaTuner I was unable to set higher core clocks given the low shader overclockability. If flashing the BIOS gets around the core/shader 1/2 limitation, I'll give it a shot.
Looks like yield problems are significant for the GT200 chips. G92s can do 2.0 GHz shader with 1.1-1.15 V, and I need 1.25 V to stabilize this one at 1.56 GHz. It has so many more SPs, but delivers no more than ~40% more shader performance in practice due to these yields.
** update **
It seems that at 1.30 V the card artifacts when the core temperature exceeds 75 degrees Celsius (this is not the case at stock voltage). In fact, I wonder where the temperature sensor is located. If the sensor doesn't read the real die temperature, then I must assume that for a given temperature reading, the actual die temperature is a lot higher when higher voltages are used. Maybe 1620 MHz was unreachable because of cooling issues after all. I'm having a hard time controlling the temperature of this thing with the weather here at 35-45 degrees Celsius; I'll only know in the winter how much of a bottleneck the temperature is.
Does anyone know if Nibitor can set voltage and have it stick on the G200? I would like to play around with some voltage adjustments and OCing, but I do not want to hard-mod the card. For one, I can't solder for crap, and two, it does not seem like volt mods help at all unless you have an extreme cooling solution. That is how Fugger managed such high OCs with no volt mod. It seems like it's best to put your money into better cooling, not volt mods.
I still would like to be able to set and test out some volts and OCs myself, though ;)
BIOS volt modding is not an option with most current nvidia cards.
Is the vmod the same for the GTX 260? They appear to use the same PCB, but I'm wondering if anything, such as the trimmer resistance, should be changed.
Watercooling with an EK waterblock and my old crappy Thermaltake BigWater 745 installation (I changed the water pump to a more powerful 500 l/h one). The video card is cooled third (CPU --> NB --> GPU).
The water temperature is about 45 degrees Celsius by the time it enters the EK waterblock. I can't do much about it in the middle of the summer. We do have an A/C in the house, but it is too far away from the PC.
G200 overclocking & modding isn't too popular on XS?
There are just a couple thousand views and few people with the cards discussing.
:\
It's because volt mods do not help, and for the most part can hinder performance. The best thing to do for a G200 is to get it as cool as you can with a quality water block like the DD Tieton and use a killer water cooling setup along with it. Basically, the cooler the core, the higher they clock ;)
Will it at least help on the memory?
Changing the VID tags in the BIOS does nothing unless you actually change the physical voltage, and then it's a question of whether there is mapping for those voltage ranges or not. I've managed to get my memory to 2698 MHz on air. I hit a wall at 2664 MHz, which I got around by loosening timings on Timing Set 0 in the BIOS.
I can't remember what the original memory timings were (you have to compare against an unmodified BIOS), but from what I can tell the default timing set used is Timing0. Be wary, though: this set is also used for boot clocks. It can be changed, but I don't know whether all the other timing sets are used and, if so, for what exactly. If there is a spare unused timing set, it could be given the original Timing Set 0 values and set as the default for boot-clock timings. That would be the safest way to go when changing timing values tighter than default, or loosening them too much. I only changed tRC, tRAS and tRP; tRC follows from changing tRAS and tRP (tRAS + tRP = tRC).
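A tiny Python sketch of the consistency rule at the end of that post (tRAS + tRP = tRC); the timing values below are placeholders for illustration, not dumped from a real GTX 280 BIOS:

def check_timings(tras, trp, trc):
    # flag a modded timing set where tRC wasn't bumped to match tRAS + tRP
    ok = (tras + trp == trc)
    print(f"tRAS={tras} tRP={trp} tRC={trc}:", "consistent" if ok else "fix tRC")

check_timings(tras=21, trp=11, trc=32)   # loosened set, tRC updated: fine
check_timings(tras=21, trp=11, trc=30)   # forgot to raise tRC: flagged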
Memory timing values could presumably be changed actively via the I2C interface by flipping bits in the GPU registers in real time. I don't know enough about the registers to give any advice on this. This is really W1zzard's or Unwinder's playground, and they may be able to shed some light on which registers hold which values, and which bits set which timings.
Here's a screenshot of the BIOS I modified. It's an XFX XXX BIOS, which was the most recent BIOS version I could find. Nibitor 4.4 also permits changes to the fan controller registers, so I configured them properly, as software control via RivaTuner seems a little unpredictable and gave inconsistent results from boot to boot. Most likely something Nvidia has broken, but I don't have enough knowledge there to say for certain why. Setting the values in the BIOS does work as implemented, though, and mine works very well now. 100% fan kicks in around 57-58C and follows the angle of tSlope from 50 -> 100 and 100 -> 50. It doesn't sound nearly as loud as default, since it rises progressively and not all of a sudden. A rough sketch of that kind of curve follows the screenshot below.
http://hosting01.imagecross.com/imag...memtimings.jpg
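Here's a minimal Python sketch of a linear fan curve like the one described above; the floor duty, full-speed temperature and slope are assumptions for illustration, not values read out of the XFX BIOS:

def fan_duty(temp_c, t_min=40.0, t_full=57.0, duty_min=50.0):
    # below t_min run at the floor, above t_full run flat out,
    # in between rise linearly (the "tSlope" angle)
    if temp_c <= t_min:
        return duty_min
    if temp_c >= t_full:
        return 100.0
    slope = (100.0 - duty_min) / (t_full - t_min)
    return duty_min + slope * (temp_c - t_min)

for t in (35, 45, 50, 57, 65):
    print(t, "C ->", round(fan_duty(t)), "% duty")
# 35 -> 50, 45 -> 65, 50 -> 79, 57 -> 100, 65 -> 100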
Are there other places to solder for vGPU (not measure)? Thanks!
anyone going with the GTX260 ?
Well, this is a bummer. I have a 4870X2 and was going to dump it for a vmodded 280, but the results are pretty poor, what a shame. My last vmod effort was a G92 GTS that clocked to 910 MHz on water; some DICE/LN2 would have been good on that one.
:up:
Did anybody here do a mod on a GTX 280, and how are the results?
I've done my two, and the shader clock seems very unresponsive to voltage, and it is the main performance driver. I only went up to about a 10% overvolt though, or about 1.3 V. One of my cards could hit 1566 shader no problem; the other is stuck at 1458. The very next tick introduces instant artifacts that can't be "dialed out" with more voltage, as was possible on older cards.
Also, I noticed that if you pushed the shaders really hard, the failure mechanism was very strange. Instead of just dropping vertices or pixels, it would sometimes lose portions of the frame buffer altogether. When I say lost, I literally mean it pointed to the wrong address in local memory. You could get it to show a flat 2D source texture as the entire screen, as if you were browsing through the game's assets. It was very interesting, particularly because the game (Crysis/BioShock/any shader-heavy game) wouldn't be locked up. You could still escape out and get back to the desktop.
Based on that, I think the 512-bit architecture's complexity is pushing the setup and hold time requirements for the bus to their absolute limit, particularly in keeping the shaders fed properly at increased clock rates. This would also explain why they are much more responsive to temperature than to voltage and why 260s clock further, and it looks to me like the typical limit of this current part. It certainly isn't starved for DC power.
Going to a smaller process will certainly help, as would reducing the bus width to relax the timing requirements. The only benefit vcore gives you is to match the core clock to half the shader clock without issue, but the card's performance is largely driven by the shader clock.
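To illustrate the setup/hold argument in code: max clock is bounded by the slowest path, so shaving even a little path delay (which cooling does, while extra voltage barely helps once bus timing dominates) buys clock headroom. All delay numbers here are invented for the example:

t_clk_to_q = 0.10   # ns, driver flop output delay (assumed)
t_setup    = 0.08   # ns, receiver setup time (assumed)
for t_path in (0.48, 0.46, 0.44):   # ns, slowest data-path delay
    f_max = 1.0 / (t_clk_to_q + t_path + t_setup)   # GHz, since delays are in ns
    print(f"path {t_path} ns -> fmax {f_max:.2f} GHz")
# 0.48 -> 1.52, 0.46 -> 1.56, 0.44 -> 1.61: ~0.02 ns per ~50 MHz near 1.5 GHz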
Yes, the shader doesn't really go higher with more voltage.
Without the mod 1620, with the mod 1620 too, but I'll test 1674 when I find some time ;)
The core is stable at 810 MHz; I can't test more, because you'd need more than 1620 shader.
Here is my writeup about the mod, translated from German.
==================================================
vGPU Mod GTX260/280
So, now that there's almost nothing left to test, I naturally had to get to work on the vmod for the GTX 280.
Right up front: a vmod has probably never been as easy on any graphics card as it is on the GTX 260/280. The solder points are all very easy to reach, and there are no neighboring parts in the immediate vicinity that you could accidentally damage.
From the factory, the BIOS sets the GPU voltage to 1.18 V, which on my card comes out to a real 1.196 V when measured at the measure points. It can be raised slowly and very precisely by soldering in a 100 ohm pot (see the sketch after this post).
But before I write a novel here, I'll simply post a few pictures that provide the necessary info. As everyone knows, pictures say more than a thousand words.
Still, feel free to ask questions if anything is unclear or if you want to apply the mod yourself.
Back side, vmod
http://www.partypicssaarland.de/vers...GTX280/80s.jpg
Front side, measure points
http://www.partypicssaarland.de/vers...GTX280/81s.jpg
Result
http://www.partypicssaarland.de/vers...GTX280/82s.jpg
(click to enlarge)
==================================================
Here is the full review of the GTX 280, in German ...
http://www.forumdeluxx.de/forum/showthread.php?t=504260
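For the curious, here's the sketch referenced above of why a trimmer raises vGPU slowly and precisely. It assumes a plain feedback divider (Vout = Vref * (1 + R1/R2)) with the pot wired in parallel with the lower resistor; the real Volterra circuit may differ, and every value below is a placeholder chosen to land near the card's 1.2 V stock voltage, not a measured one:

def vout(vref, r1, r2):
    # output of a regulator with feedback divider R1 (top) / R2 (bottom)
    return vref * (1 + r1 / r2)

vref, r1, r2 = 0.6, 10.0, 10.0         # placeholder values giving 1.20 V stock
for pot in (100.0, 50.0, 25.0):        # 100 ohm trimmer, dialed down
    r2_eff = r2 * pot / (r2 + pot)     # pot in parallel with R2 lowers it
    print(f"pot {pot:.0f} ohm -> {vout(vref, r1, r2_eff):.2f} V")
# 100 -> 1.26 V, 50 -> 1.32 V, 25 -> 1.44 V: small turns, small voltage steps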
It looks like the heatspreader is easy to remove too... I can see the gap between the IHS and the substrate.
http://www.nordichardware.com/news,8065.html
The die looks just large enough to be covered by a water block, but if I take the IHS off, I'm condemned to water cooling it forever.
Yes, you can cut 5.5 mm, ...
Pics say more than 1000 words ;)
http://www.partypicssaarland.de/incl...ic.php&pid=541
http://www.partypicssaarland.de/imag..._GT200_DIE.jpg
:eek:
Is that with watercooling + removed IHS?
If so, what block/wc gear do you use?
Why don't you run unlinked?
It also looks like you started with a highly OCed card, like an EVGA FTW. A good purchase, as it seems the G200s are binned very heavily and have little OCing margin. Additionally, we have no way of reasonably increasing the shader performance without extreme cooling; core voltage mods do nothing. Maybe if we had some kind of mod for the memory I/O interface of the part, similar to CPUs, it would help some.
Ah, then you must have flashed the BIOS, as it shows default clocks of 702/1458/1215.
Has anyone else had good results with voltmodding a 280? Or is it not worth it?
:up:
Not sure if this is 100% accurate, but apparently Fugger modded the GTX 280 and reached 1000+ MHz on the core, with LN2 of course. I'm pretty sure that's when he got the top scores in Vantage.
1200 MHz core is possible with LN2 and vmods.
Anyone here with a good BIOS for the GTX 280? 702 MHz+++?
Has anyone posted an OCP mod for the 280?
thanks!
Hi, is there any alternative soldering point on the GTX 260 for the GPU vmod?
Yes, but probably on one of the pins of the Volterra chip.