Ya, but did you try benching with the GTS bios?
EDIT: oh. Nevermind :/
and can the extra stream processors be activated??
So I tried a GT -> GTS flash, and I can't test anything as all my driver installations are failing.
Which drivers are meant to be used for a G92 GTS? I've been looking around and can't seem to find any..
okay nvm. I used giorgios's bios and went from 700/1780/960 to 735/1895/970 mem.
this is an RMA'd 8800GT man, I wish I had my old one that did 770/1850/1130 on the stock bios!
i wonder how it would do after this bios? Thanks for the bios man :toast2:
Well, I got some drivers to install for the "G92 8800GTs", and it didn't work. Rivatuner still showed 112SP, and ATi Tool practically crashed from artifacts at 720/1800 speeds which were previously stable on the default GT bios.
Well, was worth a shot.
Voltage was 1.13V under load measured by my DMM for those interested.
so did u get more voltage with the gts bios or same as gt?
My BFG GTS v2 has kicked my XFX GT's butt ^^
Does anyone know if it's possible to modify the voltage with a bios mod like on the GT...?
I flashed my Asus 8800GT to a Gigabyte 8800GTS bios. The flash is OK and the gpu voltage is OK, but the driver installations are failing.
169.21, help me please..:shrug:
right click and extract the drivers to a folder, then you gotta edit the nv4_disp.inf file and change the device ID for the GT or GTS to 610 instead of 600 or 611.. something like that.. just look for the lines that have G92 and you will see it... save the inf and then run setup =p
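For reference, the IDs to look for: the G92 8800 GT shows up in the inf as PCI\VEN_10DE&DEV_0611 and the G92 8800 GTS 512 as PCI\VEN_10DE&DEV_0600, so you'd be changing that DEV_ value to 0610 (presumably whatever the flashed card now reports in Device Manager) on the matching line, and if I remember right there's a matching entry in the [Strings] section to update too. The exact layout differs between driver versions, so treat this as a rough guide only.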
I tried it as well, with a volt-modded BFG.. no go.. suxors.. got artifacting after loading the drivers.. I'm sure someone will have to hex edit the GT and GTS bioses together, if it's even possible
The original nv4_disp: http://img172.imageshack.us/img172/7...shot087qb2.jpg
With the changed code, which doesn't work: http://img49.imageshack.us/img49/699...hot087hyj0.jpg
wow you've been lurking since 10-26-07 hmm :D
so the GTS has higher voltage selectable in nibtor, interesting..
any way we can manipulate it to use on a GT bios? that extra .5 would help a lot.
I wish I had a DMM to measure it with the GTS bios
Well the GTS should be pumping more voltage in, but I think it has an extra power regulator onboard as well. It would be great if I could flash my GT to a GTS bios just for the extra voltage; I couldn't care less about the extra SP units, just gimme more voltage without a hardmod :)
.05. I thought the GTSes had 1.2-1.25V stock on their cores. Now that's what I'd want: .05 can already be done, but the bigger voltage increase is what I'd be interested in, as long as it's under 1.3V, since I guess the GT power circuitry can't handle more.
As for cooling, even a nice air setup like the HR-03GT keeps my GPU temps under 40C, which is more than fine with me when highly OC'ed. Gimme more volts :)
Still have to get my dang thumb drive to load DOS properly so I can flash the bios on my video card, then I'll be rockin'.
Dumb question guys:
I'm Vista 32-bit only now and my BFG OC 8800GT is coming in 1-2 days. I'm currently on an overclocked 725/1075 MSI 8600GTS that was ATITool stable. However, after upgrading to the newer nvidia drivers (169.25) I find that ATITool 0.27 beta 4 still doesn't work. What are you guys using to test for max core/mem, if not ATITool?
I use atitool 0.26.....no problems.
Did you press F8 during VISTA boot and select "Disable Driver Signing"?
Just another data point, i've got an EVGA SC and flashed using 1.2 in bios.
Load volts are 1.165, seems to have given me a bit more headroom.
Never found the max before though.
Has anyone tried modifying the voltage in the bios on the new 8800GTS G92? I wonder how high 8800GTS voltage can really go.
Is it worth paying an extra 60 Euros for a GTS G92, or should I get a GT? I know the performance gain won't be too much, but the cooling is much better, and maybe there's more overclocking headroom...
As far as I know, 8800GTs are limited to 1.1V; can the 8800GTS G92 work at 1.2V or maybe more?
id guess about the same, the regulation circuitry looks similar if not identical to the GT's. if I've read correctly the GTS starts at a slightly higher voltage, but nothing that can't be gained on the GT.
Looks like the only way to get good volts is the hardmod, something my shaking hands haven't managed yet.
I think the GTS is worth the extra cost. You can't increase the vGPU in the BIOS, but it comes higher out-of-the-box. It also has an extra phase for the voltage, and of course the extra SP's and better cooling. So far the RAM on the GTS is clocking much better for me as well, and it also has tighter timings than the GT model I had stock.
W/o doing any hardmods I had these results:
8800GT-512 741/1728/951 (1.1v BIOS mod & 1.4ns timings) 14,879-3DMark06
8800GTS-512 771/1944/1062 (stock) 16,190-3DMark06
My GT also had RAM issues, and I ended up having to RMA it. I had to reduce RAM clocks to 885 to pass 8 loops of Battle of Proxycon (3DMark03).
I have problems in Battle of Proxycon too on my Gigabyte 8800GT. Since I'm using the stock heatsink and fan I didn't go over 700/1750/2000, but I edited my original bios so I have 1.1V now.
At Battle of Proxycon I get lots of red lines for a few seconds and then it freezes. I saw that it works with the memory at 950, but I didn't try 8 loops. I will now.
I edited my Gigabyte bios again and I think I got it working at 1.15V. I tested it and now my card is artifact-free at 720/1800/2000 as long as it doesn't go over 85 degrees, so I have to keep the cooler at over 50%.
With the 1.1V I got artifacts at 720/1800 very fast, no matter the temperature.
I also made a 1.2V bios, but I haven't tested it yet because I'm using the standard cooling, so I recommend it for those who have changed their cooling.
715.zip = 1.15V 700/1750/950
720.zip = 1.2V 700/1750/950
I set these frequencies low to make sure they will also work for guys with bad memory. The timings are the defaults from my original Gigabyte bios. Let me know how high your cards will go. I'll be buying an Accelero S1 one of these days.
YM (Yahoo Messenger): blackiice2007 if you have any questions
Will it ever be possible to have 1.2V via the BIOS?
So I concocted a custom bios with 1.1v for my XFX 8800GTS 256MB. It's great, it's 100% stable in both ATITool and Ozone3D's Fur benchmark at 756/1836/1050. But as soon as I load up something like Crysis or 3DMark06, it freezes within seconds. Why?
What does the screen look like when it freezes? Is it like checkerboard?
I don't!
The GTS has on average an 8% advantage over the GT! It doesn't help that it has more SPs on it; the 256-bit memory bus interface chokes the GTS just as it does the GT!
Now if you compare the prices it's clear to see the GTS just isn't worth the extra money... not even close...
Of course, if you're lucky and can get your hands on a GTS for the price of a GT or lower, get it. Otherwise, don't.
Hello,
I have this strange problem - it seems that my overclocking doesn't bring any benefit in real usage scenarios. For example at 600/1500/1800 I'm getting 38 average FPS with High settings at 1280x1024 in Crysis, and at 730/1750/1900 I'm only getting 40 fps! What could be the problem here? Same in 3DMark06 - it's 5100 vs 5200 (SM2).. why is the gain so small?
Nope, the core just freezes. You know - sound looping and all.
I reflashed the bios, and everything seems fixed. Artifacts are appearing as normal (before there were NONE), and the squealing has more or less stopped. Final clocks are 729/1782/950 artifact-free and 729/1836/1050 for benchmarking.
what increase in 3DMark06 do you get with overclocking (SM2/SM3 scores alone)?
Well guys, I've been playing with modifying my 512MB Inno3D 8800GT OC bios clock speeds, as I've been having problems overclocking both cards in SLI mode. Only one card in SLI gets overclocked, resulting in SLI scores similar to single 8800GT scores. Flashing my bios from the 650/1500/950 default to 742/1836/1000 fixed the issue, as you can see at http://i4memory.com/showthread.php?p=82839#post82839
Then I decided to play with that Exact Mode Extra VID everyone has been discussing. I tried 1.1v and there was no change in max clock speed in SLI mode, both cards in SLI still max out at 742/1836/1000. Then I tried 1.22v and guess what, I can at least do 3dmark06 sometimes at 756/1836/1000 now!
Still have issues oc'ing both 8800GTs in SLI mode. Ntune 6 with 169.09 beta drivers overclocks only the 2nd card, while Rivatuner 2.06 only overclocks the 1st card. But I can still pass 3dmark06 @ 756/1836/1000 now :D
I'll share my bioses, all modded from the base 512MB Inno3D 8800GT OC edition bios which defaults to 650/1500/950.
Bios files can be downloaded here. Use 7zip or winrar to extract them.
Included is instructions for flashing:
FYI, I used the 742/1836/1000 non-voltage-change bios.
All custom bioses are based on the original 512MB Inno3D 8800GT OC 62.92.16.00.05 bios.
- Original 8800GT OC 650/1500/950 bios
- 650/1500/950 1.1v Exact Mode Extra bios
- 650/1500/950 1.22v Exact Mode Extra bios
- 700/1700/1000 1.1v Exact Mode Extra bios
- 742/1836/1000 non voltage change bios
- 742/1782/1000 1.1v Exact Mode Extra bios
- 742/1836/1000 1.1v Exact Mode Extra bios
- 742/1836/1000 1.22v Exact Mode Extra bios
#####################
Flashing Instructions
#####################
1. Take a *.rom bios you want to use from bioses folder and rename it to update.rom
2. Create bootable floppy disk and place contents of nvflash folder on floppy disk
3. Place update.rom on bootable floppy disk
4. Reboot system with bootable floppy disk inserted
5. At the DOS prompt, type one of the words listed below in quotes (type it without the quotes). Always back up your original video card bios before flashing!!!!
"backup"
this will back up your video card's default bios
"update"
this will flash your video card to the modded bios named update.rom
"restore"
this will flash your video card with your original video card bios you backed up.
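For what it's worth, those words are just small batch files in the nvflash folder that wrap nvflash. I'd guess they boil down to roughly the lines below; the exact switches depend on the nvflash version included, and original.rom is just a placeholder name for the backup file, so double-check against nvflash's own help screen before relying on this sketch:
backup  ->  nvflash -b original.rom (save the card's current bios to a file)
update  ->  nvflash -4 -5 -6 -y update.rom (flash update.rom, skipping the ID-mismatch prompts)
restore ->  nvflash -4 -5 -6 -y original.rom (flash the saved original back)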
Disclaimer: I am not responsible for any damage done to your video cards. Do so at your own risk!
Enjoy :)
Before and After
Before: stock-voltage bios, but modded to 742/1836/1000 clocks
3dmark06 = 18,842 http://service.futuremark.com/compare?3dm06=4397853
click image for full screenshot
http://fileshosts.com/intel/Asus/680...0_18842_tn.png
After bios modded to Exact Mode Extra VID = 1.22v with 742/1836/1000 clocks. Then overclocked to 756/1836/1000 using Rivatuner 2.06 for 2nd card and Ntune 6.x beta for 1st card.
3dmark06 = 18,944 http://service.futuremark.com/compare?3dm06=4435069
click image for full screenshot
http://fileshosts.com/intel/Asus/680...0_18944_tn.png
Seems only SM3/HDR score benefited from the GPU clock bump from 742mhz to 756mhz :)
Very interesting Eva, I have the Inno3D 8800GT non oc edition and I can't get 1836 shaders stable with the 1.1V BIOS Mod. The max stable I can get is 742/1782/972 (If I set the RAM to 1000 sometimes games will hard lock).
Are you able to check the vgpu voltage exactly using a multimeter when using the 1.22V BIOS Mod?
can someone please post 3DMark06 scores BEFORE and AFTER overclocking, for me i get very very small gain.. (and not combined scores, but SM2/SM3 ones)
You only changed the label of voltage table entry 4 (that's the "extra" option in Nibitor) so no magic here eva :(
Maybe the card ran hotter in previous tests due to hotter case or ambient temperature :shrug: I don't use a computer case by the way :p:
Can you measure the temperature with both bioses at the same clock and fan setting in idle and load? There should be a big leap in core temperature with the dinky heatsink ;)
It's a real pity we can't overvolt more via the bios, there is so much room in the 8800GT... Does anyone have a link for a voltmod? Thanks
don't use case either.. temps around same 57-61C load @742-756mhz gpu
Hmmm...I get lockups in games if I set my RAM above 950MHz on my Inno3D, would loosening the RAM timings help? I'm not sure how exactly to do so in NiBiTor.
Thanks!
i get lockups too. i have the BFG 8800GTs in SLI; when i bench them separately, one card can run 756/1044/1912 and the other 756/1000/1900. when i put them together i can bench 756/1000/1900, but in games i can't pass 675/960 without lockups, what gives? i can still return them till Jan 21. But the volt-mod bios worked for me :)
I did the 1.1v mod and i can't oc the core any higher than i could without it (max 700) but the shaders could go 40mhz higher.
i can oc to 771/1939/1015
should i bother with the bios mod?
this is with an XFX 8800GT Alpha Dog edition (600MHz) with an Accelero S1 with 2x 80mm fans on it.
thanks.
is it really worth it?
Where have you tested it?
I think NVIDIA had a good reason to leave these Qimonda chips at 900MHz stock with tighter timings.
Here's what I did to my EVGA 8800GT SC (650/950/1625) with a HR-03 (non-GT edition) on top of it.
I first flashed my card to 1.1V plus I also changed the timings to Qimonda's reference design (see the OP's BIOS for the numbers). After a reboot I started upping the core frequency. At stock core voltage, I ended up at 730MHz, so I started from there. I got all the way up to 770MHz, that's 40MHz above what I previously got and 120MHz over the stock 8800GT SC clocks. Not too bad. I then lowered the core frequency to 760MHz (just to be completely safe) and started on the shader overclocking.
Before the 1.1 overvolt, I got to 1710MHz. This wasn't the max on stock voltage, I just didn't bother to see what it would max out. I started at that frequency and got all the way up to 1860MHz. Again, a pretty nice overclock.
Now, about the memory.
At this moment I was at 760MHz/Stock(950MHz)/1850MHz (lowered both the core and shader a little bit). No artifacts and it kept warm at 48C load after a 10 minute ATITool run (again, I have HR-03 on top of my card).
Before the voltage mod I got the memory to just 970MHz, anything above would give me artifacts (with the core at 730 and shader at 1710). So I started at 970MHz and kept going up by 10MHz every 5 minutes of ATITool running. I got to 1050MHz (!) but as I went up the temperature of the card and the case climbed steadily.
The voltage BIOS mod and changing the shader+core clock resulted in an insignificant increase in temperature (about 1C or so). But... Once I got to 1050MHz with the loosened up timings the temperature of the GPU rose up to 54C (that's 6C more than just with a shader+core overclock!). The temperature of the case also went up by about the same amount (!) (to about 42C (HOT!)).
This memory could be DDR2400 but it gets hot really, really fast. I'm sure with better airflow through the case and some copper heatsinks (cheap aluminum right now) I could get to over 1100MHz, and probably so could everyone else with these timings.
I'm thinking your card is already at 1.1V since your clocks seem absolutely crazy for stock voltage (if they're at all stable, of course).
Check the voltage. If it turns out that you're at 1.1V you don't need to reflash the card. From what I've read this is the maximum these cards can do with a BIOS flash (according to ViperJohn @ ocforums)
Interesting krogen, what case do you have by the way? How exactly do I set relaxed ram timings using nibitor? Can I test the timings without flashing the BIOS to my card?
I have NZXT Apollo. A fairly decent case but it doesn't have a good airflow. Comes with one 120MM fan on the back (no more than 25CFM) and one 120MM on the window (pushes about the same amount of air). I have a 92MM Panaflo 45CFM on the HR-03 and underneath it there's a 40CFM slot cooler. Both push quite a lot of hot air outside of the case but I still never saw less than 30C in it on idle.
40C is just terrible and probably too hot for those Qimonda chips to operate properly.
Once you open a BIOS in Nibitor go to Timings tab and select Detailed Timings. There is an option right underneath it called "Test Timings" but I'm not exactly sure how well it works.Quote:
How exactly do I set relaxed ram timings using nibitor? Can I test the timings without flashing the BIOS to my card?
I'm sorry, how do I check my voltage? I read most of this entire thread but I have forgotten.. (sister's wedding and all) so do I just save my bios with Nibitor? I've seen other ppl with the same card as mine get like 720 on the core.. why would mine be so much higher? I'll check my bios after you guys tell me how :P
The OP is wrong about the video RAM on the reference boards with HYB18H512321BF-10 chips. They're not 2400. They're actually 2000. I think the OP looked at HYB18H512321BF-8.
Take a look:
http://www.qimonda.com/promopages/in...AM%2FGDDR3%3A1
Are you sure your clocks are stable? They may appear stable when playing but that isn't necessarily the case. Do you use ATITool to check the stability?
You might either have a really good card or it might have came with 1.1V out of the factory.
Has anyone done some benches to see what effect changing the timings has?
Yes, I used ATITool, no artifacts. Stable for like 2 hours; it sits at the max temp (like 45) for about 1:58 of that lol. So I'm pretty sure it's stable. Yes, all stock volts.
How do i test if its 1.1v already?
gosh, has anyone who's pushing the card to the limits actually tried to see if it gives any benefit whatsoever?
Change the timings under "detailed timings" to this:
http://img.photobucket.com/albums/v3...1Jan050333.jpg
After this save the bios and it's ready for use ;)
I believe alexio's bios settings are what I loaded up (no access to that PC right now) and I was able to improve my overclock from a fairly respectable 720MHz core to 792MHz core, 1998MHz shader, 2016DDR mem. 799MHz core or above 2K shader locked up on loading any 3D app.
3Dmark06 link - 14,755
http://service.futuremark.com/compare?3dm06=4333775
Starting point was around 11K at stock CPU and Vid card speeds (it is a 8800GT SC, which is what, 650/1600/1800 or something)
I've not tried to tweak the BIOS further, as my 680i died the day before Xmas.
They show the same timings in HEX code. The sets are predefined. You can choose to read the current timingset with the "autoselect timingset" button but I'm not sure how accurate this is.
It's best to manually edit the timings in the fashion the picture shows ;)
here's my POV 8800GT EXO..stock cooling no vmod
it's already 1.1v on BIOS
http://img214.imageshack.us/img214/7...6ocall3mx0.jpg
BIOS
http://img170.imageshack.us/img170/7941/nibitortq7.jpg
Checked with DMM
vgpu @ idle 1.13v
vgpu @ load 1.16v
wanna try to loosen timing :)
I edited my BIOS with the looser timings, thanks alexio. I can now bench with the RAM at 1100MHz, before it would freeze.
so, how's MHz vs timings performance?
did you gain anything?
http://img184.imageshack.us/img184/3...igh8800mt1.jpg
47.825
nvidia 8800gt 512mb @ 756/1100/1836
Windows XP 32bit
intel e6600 @ 3600mhz
did you notice better results with 1100mhz and looser timings?
Ok, my RAM is not stable at 1100MHz, it crashed in the CS:S video stress test. I'll test if 1053MHz is stable. I can run the Crysis bench at 1100MHz, though it gives me very little increase in fps.
I can confirm, before loosening the timings, I could not get the RAM stable above 950MHz, now it is stable at 1053MHz. :D
it's probably already been mentioned, but i'm such a lazy ass. at the same clocks, how much do you lose by loosening timings? i imagine the added clocks from loosening timings are still better than the max possible at the tighter timings?
I didn't notice any difference. I'm just glad my RAM is much more stable now at higher frequencies.
There was a time when overclocking had a meaning :P
I did gain 10fps in the CSS video stress test by overclocking the RAM from 950 to 1053.
Hey, for all you people that wanted to know my stock volts (please tell me what it is):
http://img177.imageshack.us/img177/4...voltpj3.th.jpg
On this I get 771/1935/1015 completely stable.
thanks
Click on the 'Exact Mode' tab and then show us the screenshot.
umm there's nothing under it. all fields are blank.
It's 1.1v stock ;)
lol ok, thanks. That explains the high clocks and why it was a bit hot on the stock cooler. So is it possible to get 1.15? I think I was reading this thread and someone got 1.15 or 1.2? I would much rather not solder something to my card.
The XFX Alpha Dog is reference design so that's not possible. Maybe some cards in the future will be able to have higher vcore than 1.1v. I think one Gigabyte card (don't know if it's released yet) may have the hardware onboard for higher than 1.1v. It comes with software to change vGPU, so who knows...?
I applied your memory timings to my XFX Alpha Dog edition bios and was pleasantly surprised with a bit more overclocking. Earlier my max was 736/1836/1022; after the bios mod my best overclock is 756/1900/1050, and it did bring improvements in my 3DMark scores, though I haven't tried it in any games.
i have the same card as you, but they say i already have 1.1v in the bios?
I think a lot of the Alpha Dog cards are coming w/ 1.1v stock, but you could always try changing the "Extra" block under the 'Exact Mode' tab to 1.1v. Some cards (mine included) also had all fields blank, but changing it to 1.1v helped me a lot.
Just make sure to save a copy of your original BIOS in case you want to go back to stock later.
There is currently no way to get more than 1.1v through a BIOS modification on reference design boards. Believe me, I've tried!
ok. is there any way of flashing from windows? or will there ever be? Also, if I do flash from DOS I will have to use a flash drive (did this before on my Xbox 360) but I can't remember how to make a bootable flash drive. Then again I don't have a PCI video card in case I screw up :( and I also don't have onboard video..
/cry
I used this to make my USB stick bootable and used it to flash my 8800GT BIOS.
http://www.bay-wolf.com/usbmemstick.htm
Here's a tutorial to make a bootable DOS USB stick: http://www.bay-wolf.com/usbmemstick.htm
There is a way to do a "blind" restore from the thumb drive. Read the NVIDIA flash guides here: http://www.mvktech.net/content/category/4/67/37/
Test it first, since the options they give for the blind restore didn't work for me. I think it says to use "a:nvflash -4 -5 -6 -a -y file.rom [Enter]", while I think I had to remove the "-4" option to make it work (I'm not 100% sure about that; test the restore yourself with the original bios before flashing another one).
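If you can't see anything at all when restoring, one approach (untested on my side) is to put that restore line in an autoexec.bat on the boot disk so it runs by itself once DOS loads, roughly:
@echo off
nvflash -5 -6 -a -y original.rom
where original.rom stands for whatever you named your backed-up bios. Again, check which switches your copy of nvflash accepts while the card still displays.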
gah, I can't even find my flash drive now. So to get this right (when I do find it): I just load my bios, save it (backup), then change it with Nibitor, then save that and flash the changed one to the card. Is the Extra value the only thing I change? and am I changing it to 1.1? thanks.
Anyone have a modified or stock BFG 8800GT OC2 (preferably) bios?
My HDD fried and I never had a second backup. The updater doesn't work for me (and i'm too lazy to change the hex bits so it updates).
Thanks :)
Mvktech has a database of bioses. Maybe you'll find yours: http://www.mvktech.net/
I heard that there are new bioses out from evga to speed up the fan at lower temperatures.
Is there any way to copy those settings and apply them to the older bioses without this feature?
what fan/temperature settings do the new bioses use?
After opening your bios in Nibitor, go to the tab "temperatures", then click on Fanspeed IC.
There, if you select "Automatic Speed", you can play with Tmin, TRange slope and min duty cycle to have the fan speed increase according to the temperature of the core. The mechanism / thresholds are well described in nibitor itself.
IIRC for my 8800GT, I have set Tmin to 60*C, TRange slope to 20*C and min duty cycle to 30%.
As a result the fan speed will slowly increase when the core reaches about 64-65 degrees. The temperatures will stabilize at about 75-80*C depending on the load, with the fan spinning at about 60-65% at these temps, although in nibitor it says that at 80*C the fan should spin at 100%.
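If I understand the mechanism right, the duty cycle just scales linearly from the minimum at Tmin up to 100% at Tmin + TRange, roughly: duty(T) = min duty + (T - Tmin) / TRange x (100% - min duty). With my settings (Tmin 60, TRange 20, min 30%) that works out to 30% at 60*C, about 65% at 70*C and 100% at 80*C, which is where nibitor's prediction of 100% at 80*C comes from, even though my card never actually spins that fast under load.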