Pandora's Box, you tried flashing the 1.4v BIOS and your temperatures rose, indicating the BIOS did indeed raise the voltage ? Just trying to clarify, thank you!
Yes, I did flash with that modified BIOS, and my idle temps rose about 10 degrees.
Quick thought:
In that modified BIOS I see you have the core clocked at 500MHz with the delta set at 50.
If I tried overclocking the card after flashing with that BIOS, say to 600MHz core, that's 650MHz effective, right?
Because when I tried that, the system rebooted. Then when I ran the coolbits auto-detect, it detected 560MHz; with that delta, that would be 610MHz effective, right?
Pandora's, that is how it worked with the 7800GTs, so as an educated guess, I would say you are correct on all counts.
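For anyone following along, the delta arithmetic discussed above boils down to one addition; here is a trivial sketch (the function name is mine, not from any flashing tool):

```python
def effective_clock(base_mhz: int, delta_mhz: int) -> int:
    """Real 3D core clock: the BIOS applies base + delta when entering 3D mode."""
    return base_mhz + delta_mhz

# Pandora's two cases: 600 base with a 50 delta, and the 560 coolbits detected.
print(effective_clock(600, 50))  # 650
print(effective_clock(560, 50))  # 610
```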
Thanks for the response regarding the BIOS. I'm going to try it on my card and probably order an aftermarket cooler along with it to help the temps! What utility did you use to flash the BIOS and extract the original one?
EDIT: The 7900GT has the same mounting holes as the 7800GT, so a VF700-Cu or VF900-Cu would work, right?
Last edited by GoldenTiger; 03-11-2006 at 09:48 PM.
Well, I reflashed to that modded BIOS with 1.4V in 3D mode, but I took out the delta.
3DMark03 is currently running at 600MHz core / 1800MHz mem. I am seeing tiny artifacts in the Nature test, so it looks like aftermarket cooling is definitely going to be needed for 1.4V on a 7900GT. The test passed at 580MHz and I scored 19997 in 3DMark03. Annoyed that I didn't get to 20k, I cranked the clock to 600 and we shall see what it scores...
Has to be the first 7900GT to get to 20000 in 3dmark03
Originally Posted by Pandora's Box
Wow, super nice!
By the way, I have never edited a BIOS. Can someone either A) tell me how to edit the delta out of the 1.4v modded one, or B) upload a copy of the 1.4v modded one without the delta? I want to use it on my soon-to-be-ordered 7900GT with an aftermarket cooler.
Before you install the VR, carefully measure the stock volts for both 7900GTs (running 3D like 3DMark2006, of course), and please post them here.
Originally Posted by cronic
Also, it would be cool if you could give the rough resistance on the variable resistor needed to get 1.36V (stock for the GTX, according to VR-Zone).
Thanks
24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
1 GB OCZ Gold (='.'=) 240 2-2-2-5
Giga-byte NF3 (")_(") K8NSC-939
XFX 6800 16/6 NV5 @420/936, 1.33V
http://www.guru3d.com/article/Videocards/326/15/
Originally Posted by mascaras
my thread - all the 3DMark2005 scores you ever need:
http://www.xtremesystems.org/forums/...54#post1328154
Re Zalman:
Originally Posted by Nazu
I think if you endeavour to overclock the 7900GT (and especially if you voltmod, software or hardware), an aftermarket cooler is a must. The stock heatsink is tiny.
Re cherry-picked
Now this is all speculative, but based on a typical normal distribution. There are two interesting things about the 7900: no model with pipelines disabled (good yields), and a big range of clocks in three big groups: 650-700 (higher voltage), 520-560, and 450-475. Let's call these groups A, B, and C. I think it would be too time-consuming and complex to bin for the exact clockspeed of each shipped model, so here's what I imagine.
Possible binning process:
Get a chip out of the big magic bag. Test it at 720MHz with the higher voltage; if it passes, put it on the "A" pile. If it fails, lower to the 7900GT nominal voltage and try 580MHz. If it passes, put it on the "B" pile. Finally, try 500MHz, because not all GPUs are superstars.
Since relatively few ~700MHz models will be sold, if there is a shortage of chips you can use dies from the "A" pile for "B" (retested at lower voltage!), and from the "B" pile for "C". So if you're getting a 7900GT, I think it's good to get an "OC" model.
Worst case it's only a few bucks more. BUT, if you get the plain 450MHz version and can only OC to 470MHz, you will be really angry at yourself for not spending those few dollars more.
"normal distribution"
http://www.fao.org/docrep/W7295E/w7295e04.jpg
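The speculative binning flow above can be written out as a small decision procedure. This is purely illustrative: the thresholds, pile names, and the stand-in stability test are the ones I made up above, not NVIDIA's actual process:

```python
def bin_chip(passes_at) -> str:
    """Sort a die into a speed pile, per the speculative flow above.

    passes_at(mhz, high_voltage) is a stand-in for the real validation
    test; it should return True if the die is stable at that clock.
    """
    if passes_at(720, high_voltage=True):
        return "A"   # ~650-700 MHz parts, shipped at higher voltage
    if passes_at(580, high_voltage=False):
        return "B"   # ~520-560 MHz parts (the 7900GT "OC" models)
    return "C"       # ~450-475 MHz parts; not every GPU is a superstar

# A die that fails at 720MHz but holds 580MHz lands on the "B" pile:
print(bin_chip(lambda mhz, high_voltage: mhz <= 580))  # B
```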
EDIT:
Pandora's Box, set your delta to whatever gives you the best results: try 50, 20, and 0. Whichever gives the highest stable clocks and the highest 3DMark score, go with that one.
Last edited by ***Deimos***; 03-11-2006 at 11:05 PM.
Your score is low: 20159 @ 530/1440
Originally Posted by Pandora's Box
Nice mem clock btw, XFX Extreme??
Stock cooling? OMG, those are insane clocks there, nice job!
Originally Posted by Gorod
Current machine is my trusty Dell 8200 lappy while I make a decision
I have one question...
Do the XXX and SUPERCLOCKED cards (XFX, eVGA) have higher overclocking potential?
Can I buy the eVGA for $299 and overclock it to the SUPERCLOCKED frequency and higher? There are no differences between the normal and SUPERCLOCKED cards, right?
DelaMaris:
No, I don't think so. Usually higher-grade cards have higher OC potential. Of course not always, but very often. Same with CPUs, memory and other hardware.
...
I think this time round they might be more aggressively speed-binned, given the numerous offerings at different rated default clocks, so the higher-rated ones will probably clock better.
But we don't know the conditions under which these video cards were rated, either.
---
I apologise, you're right, that memory clock was not taking. I will remove the picture as it's misleading.
Originally Posted by bachus_anonym
Nice
Originally Posted by Pandora's Box
So did the BIOS actually improve your maximum overclock?
Here are two new BIOSes with a 20MHz delta clock so you can really determine whether you can overclock higher than before. The first is a 1.3v BIOS and the second is a 1.4v BIOS. With the stock cooler, 1.3v might be the better choice.
I'd like you to test the difference in temperature at the same core clock with the 1.2v stock BIOS, the 1.3v BIOS and the 1.4v BIOS.
1.3v bios 20mhz delta:
http://rapidshare.de/files/15329733/...delta.rom.html
1.4v bios 20mhz delta:
http://rapidshare.de/files/15330116/...delta.rom.html
Good luck with it
If it really works, then it might also be possible to get 1.5v on a 7800GTX 256MB.
Blue Dolphin Reviews & Guides
Blue Reviews:
Gigabyte G-Power PRO CPU cooler
Vantec Nexstar 3.5" external HDD enclosure
Gigabyte Poseidon 310 case
Blue Guides:
Fixing a GFX BIOS checksum yourself
98% of the internet population has a Myspace. If you're part of the 2% that isn't an emo bastard, copy and paste this into your sig.
Real Overclocker
XFX Geforce 7900GT 256MB Basic model Partcode: PV-T71G-UDFR (£207 delivered)
Defaults are 470 / 685
BIOSTAR TForce6100-939 (£60 M/Board)
I will be off now and carry on crunching Rosetta
UAAAUU 1800 on the mem's !!!
No mods and stock cooling ?
My System
» ASUS P6T Deluxe V2 (CF w/0006 bios) | Core i7 930 3002A879 @ 4.20GHz (21x200) | Corsair Dominator DDR3 6GB TR3X6G1600C8D @ 1603MHz 8-8-8-21-1N
» ASUS ENGTX460 TOP 1GB @ 900/4000MHz | SB X-Fi Titanium PCI-E | C:\Intel SSD X25-M G2 80GB | D:\Samsung F3 HD103SJ 1TB | E:\Plextor PX-716SA
» Corsair HX620W Modular | CoolerMaster Stacker custom "All Black" | Samsung LED XL2370 23" Full HD
» WC 3/8": Apogee GTZ + BI GT Xtreme 240 + Koolance Nozzles + Station 600 + Silver KillCoil
All stock, but I was trying
Originally Posted by jVIDIA
Thank you very much for contributing.
Originally Posted by alexio
I hope we can get lots of folks to contribute voltage scaling results (e.g. max GPU clock at 1.2V, at 1.3V, at 1.4V).
I just realized something obvious: 1.4V for the 7900GTX on 90nm. Didn't the 6800 Ultra on 130nm also use 1.4V? Whatever happened to voltage scaling? (Aren't the X1900's at like 1.1 or 1.2V?)
I noticed that too, but let's not forget that there was also an Ultra Extreme out that had 1.5v at stock, and the X1900's run very hot. But I have to admit that Nvidia is really pushing it with the vcore they use on the 7900GTX, because they don't run much cooler than the 6800U's.
Originally Posted by ***Deimos***
It could be true that Nvidia has yield problems with the G71, and because these chips run cooler than the 7800 cards they decided to push it a little.
But don't be happy about this, as the RMA rate is part of the price you pay for the cards. This could also explain the big price difference between the 7900GT's and 7900GTX's.
ATI warned us about this in one of their slides regarding the 1.5v setting on the 7800GTX 512MB's, but they should keep quiet as they only offer a one-year warranty themselves.
And I just thought of something else: the 7900GTX is only 1.36v, at least so says Shamino. This means the voltage is not THAT high compared to the 7800 cards.
Would the most accurate way to test voltage scaling be to determine the max overclock at 1.2v, 1.3v, and 1.4v, all with a 20MHz delta in the BIOS? And what's the stock BIOS's delta?
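If folks do post voltage scaling results, a throwaway script like this could collect them into a quick table. The MHz numbers below are my own placeholders to show the format, NOT measured results:

```python
# vcore (V) -> max stable core clock (MHz), filled in as results come in
results = {}

def record(volts: float, max_mhz: int) -> None:
    """Keep the highest stable core clock reported at a given vcore."""
    results[volts] = max(max_mhz, results.get(volts, 0))

record(1.2, 520)   # placeholder, not a real result
record(1.3, 580)   # placeholder, not a real result
record(1.4, 620)   # placeholder, not a real result

for volts in sorted(results):
    print(f"{volts:.1f} V -> {results[volts]} MHz max stable core")
```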