Has anyone tried running a GT on a second PSU? And if so, are there any issues like the ones with the fried X800s?
Can somebody help me? Please :)
I can't flash the BIOS on my 6800U (a PoV 6800U)!
First I tried nvflash v5.08 and typed the command
a:\nvflash -p -u -f mybios.rom, but when I hit Enter it just kept telling me to hit Enter to continue (it was giving me different commands to type in) :confused: :confused:
Then I tried nvflash v4.41... and still problems!!
Now the error is: No NVIDIA display adapters found
I don't understand what the problem is :confused:
Here are two extreme methods for flashing an unflashable card.
1. Use the "nvflash -e" command.
This erases the EEPROM completely. Don't reboot afterwards; flash your BIOS right then.
If this doesn't work, erase the EEPROM again and reflash back to your original BIOS.
2. Use the "nvflash -p -u -2 biosname.rom" command.
It will ask: are you sure you want to do this?
Type YES in capital letters, and it will flash your BIOS completely, with no regard for the existing BIOS or anything.
As always, if something gets screwed up, just reflash to your backup copy of the original BIOS.
Also, try 4.42; that's the version that works best for me.
Is there any bios editor for ATI that tunes voltages?
Yes! With nvflash v4.42 I have flashed my BIOS ;)
Quote:
Originally posted by Kunaak
also, try 4.42, thats the one that works for me the best.
But I'm not convinced :rolleyes:
In my BIOS I only changed the 3D voltage from 1.4v to 1.5v and the 3D frequencies from 425/550 to 450/600.
Before modding my BIOS, DETECT OPTIMAL FREQUENCIES (Coolbits) gave 469/1180.
Now, with my modded BIOS, the result is 459/1210 :confused:
With more voltage I get a lower core frequency? :confused:
GPU temperature is under control... watercooling...
Oh damn, I'm such a newb :rolleyes: I just had to turn off InternalClockTest in RivaTuner and now I'm getting 415/950 easily as well... this card rocks! :banana:
Quote:
Originally posted by gafa
I'm only getting to 385/816!! Anything above that, "testing settings" just returns an error in the NVIDIA control panel with Coolbits.
This program does not work for changing the memory clocks on my FX 5900XT; I had to use X-BIOS Editor for that. It's quite simple though, FX 5900 users, so don't be scared off by all the binary and hex.
Having 3D and 2D at 1.4v bumped me from 420 to 456 as well; having 1.5v on 3D only was worse than my 6800GT at stock 1.3v :( . I confirmed the vGPU with a DMM: 1.3v measured 1.322, 1.4v measured 1.420, and 1.5v measured 1.520.
Quote:
Originally posted by Blind_GI
It looks like heat has become an issue for me, seeing as I passed the Coolbits test at 456/1.16 with 1.4v 3D/2D (I need the 1.4v on 2D to get any more out of her), but BF:V just doesn't like to run at any speed with those volts. I didn't do too much testing (I was on my way out when I did it), but I will test it a little more today to see if it really is a heat issue.
And the reason upping the 2D voltage helps is that while you are overclocking you are in 2D mode, not 3D, so just adding extra volts in 3D won't help the overclock in 2D mode (regular Windows). Well, that's my theory at least; I don't know how true it is, but it's my idea nonetheless.
It sort of makes sense to keep the card running at a single voltage and a single clock at all times; then it's just the current load that varies. No switching of voltages and frequencies, which seems to be what causes these freezes/pauses when the GPU is at its limit.
That's awesome! So does this make vmods obsolete? :banana:
Quote:
Originally posted by Nico
Having 3D and 2D at 1.4v bumped me from 420 to 456 as well; having 1.5v on 3D only was worse than my 6800GT at stock 1.3v :( . I confirmed the vGPU with a DMM: 1.3v measured 1.322, 1.4v measured 1.420, and 1.5v measured 1.520.
It sort of makes sense to keep the card running at a single voltage and a single clock at all times; then it's just the current load that varies. No switching of voltages and frequencies, which seems to be what causes these freezes/pauses when the GPU is at its limit.
As for soldering on an extra molex connector, I don't think it'd work. Why keep the extra power circuitry on there if it's not going to be used? I do hope I'm wrong about this, though :)
Edit: I just compared an Ultra Extreme board against a GT; the Ultra has 3 more caps, about 5 more resistors, and 2-3 more MOSFETs.
The BIOS mod is an easy way to vmod, but my card doesn't like 1.5v, so the 2D = 3D vGPU probably isn't as beneficial as I first thought. Still, having the clocks and voltages the same can't hurt.
Quote:
Originally posted by b0bd0le
That's awesome! So does this make vmods obsolete? :banana:
As for soldering on an extra molex connector, I don't think it'd work. Why keep the extra power circuitry on there if it's not going to be used? I do hope I'm wrong about this, though :)
Edit: I just compared an Ultra Extreme board against a GT; the Ultra has 3 more caps, about 5 more resistors, and 2-3 more MOSFETs.
Nope, 1.5v is the max using BIOS adjustments! To reach the chip's max clock speeds you need about 1.8v, meaning volt mods are still a must if you want to hang with the top spots (of course you will need better cooling than the sucky stock cooler!). The way it works is that the voltage-regulator IC uses a DAC, or digital-to-analog converter. On the Intersil chips, like on the 5700s and 5900s, it was a 5-bit DAC, using 1s and 0s to make up the different voltages, e.g. 01101 = 1.3 and 11101 = 1.4 and so on, programmable up to 1.5v.
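To make the 5-bit VID/DAC idea concrete, here is a minimal sketch in Python. Only the two code-to-voltage pairs quoted in the post are filled in; the real Intersil table has many more entries and is not reproduced here, and `vid_to_voltage` is just a hypothetical helper name for illustration.

```python
# Sketch of the 5-bit VID (voltage ID) idea described above.
# Only the two code/voltage pairs from the post are included;
# the actual Intersil VID table is not reproduced here.

VID_TABLE = {
    0b01101: 1.3,  # "01101 = 1.3" from the post
    0b11101: 1.4,  # "11101 = 1.4" from the post
}

def vid_to_voltage(code: int) -> float:
    """Return the core voltage a 5-bit VID code would request."""
    if not 0 <= code <= 0b11111:
        raise ValueError("VID code must fit in 5 bits")
    if code not in VID_TABLE:
        raise KeyError(f"code {code:05b} not in this illustrative table")
    return VID_TABLE[code]

print(vid_to_voltage(0b01101))  # -> 1.3
```

A BIOS voltage mod, in this picture, just changes which VID code the card programs into the regulator.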
These DAC voltages are used for the over-current protection on the card! The card is actually a smart card and will raise and lower the voltage; the DAC is used as a measuring point as well as a voltage setting, and the card has an internal voltage-monitoring chip. If the voltage going through the chip doesn't match the DAC voltage, then in some cases it will default the card's core voltage, but in most cases it causes the card to go into a hiccup mode: the card tries to reset itself to the proper voltage, and if the proper voltage is not achieved, the controller IC will latch off and go into soft-start mode at a low voltage. That's how you get the downclocking! But remember, this is for the 5900 and 5700.
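The latch-off/soft-start behavior described above can be sketched as a toy loop. This is only an illustration of the idea, not the actual controller logic; the retry count, tolerance, and soft-start voltage are made-up numbers.

```python
# Toy model of the hiccup/soft-start protection described above.
# Illustration only: retry count, tolerance, and soft-start voltage
# are assumptions, not values from any datasheet.

def regulate(setpoint, sense, retries=3, tol=0.05, softstart_v=0.9):
    """Return the voltage the card ends up running at.

    sense(setpoint) models the internal voltage monitor reading the rail.
    """
    for _ in range(retries):
        measured = sense(setpoint)
        if abs(measured - setpoint) <= tol:
            return setpoint    # rail matches the DAC setpoint: run normally
    return softstart_v         # latch off and soft-start at a low voltage

# A healthy rail holds the requested 1.3v:
print(regulate(1.3, lambda v: v))        # -> 1.3
# A rail that sags 0.2v under load never matches 1.5v, so it "hiccups":
print(regulate(1.5, lambda v: v - 0.2))  # -> 0.9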
With that said, the 6800's regulator IC is made by a different company, Volterra, but it most likely uses similar tech. I think the Volterra chips are using a 4-bit DAC instead of 5, and I don't know about the over-voltage protection yet. It seems Volterra doesn't have open access to their tech pages! I have signed up for their secure site but haven't received any reply yet; I would like to see the datasheets for it myself. My Albatron 6800GT arrives Friday, and I intend to do some serious modding myself! :D
As for using one clock speed and one voltage to achieve the highest overclocks: yes, this has been known since the first FX 5600 was released! It's how I got my 5600 Ultra to 530MHz and my 5700 Ultra to 800MHz, and it let me take top 3DMark 2003 scores on each card! :banana: The FX cards don't like the heavy power switching!
Good luck, and I'll see you all on the ORB in a few more days! :stick:
Walldow- OC 3DMark Team
I wonder if I should set my 2D voltage the same as my thr and 3D voltage on my FX 5900XT.
Quote:
Originally posted by r3b0rN
I wonder if I should set my 2D voltage the same as my thr and 3D voltage on my FX 5900XT.
Try this, using RivaTuner: disable the separate 2D and 3D clocks so you are running one clock speed at all times. By doing this, the card will set a lower voltage (take a measurement and make sure!), meaning you'll have to turn your pot up higher (or lower the resistance) to reach the voltage you were using before. This is OK until you decide to go back to 2D/3D mode: you MUST turn the pot back down before doing that, or you will overvolt the card! So be very careful and document all voltages when doing this. This keeps the card at a steady voltage, giving a lot more stability. Also, do away with the clock-testing option in RivaTuner, so when you set the clock speed it is set, with no testing. Now crank her up and tell me what she does! ;)
But you've got to ask yourself: do you really want to be juicing your card like that 24/7? I would definitely not want to run my card at 1.5v in 2D mode and in Windows all the time. This just seems like unnecessary stress; it would be like running a game every time you used your computer. In the long run, that can't be good for the card. Just my 2 cents, though.
I sold my FX 5900XT; I'm converting into an ATI boy.
Quote:
Originally posted by drcrawn
But you've got to ask yourself: do you really want to be juicing your card like that 24/7? I would definitely not want to run my card at 1.5v in 2D mode and in Windows all the time. This just seems like unnecessary stress; it would be like running a game every time you used your computer. In the long run, that can't be good for the card. Just my 2 cents, though.
No, that's not true! 1.5v is completely safe for these cards! The only reason they did the 2D/3D mode thing in the first place was power consumption; plus, on the older 5600s and 5800s there was also a heat problem (which doesn't seem to be the case on the 5700 and later cards). They figured: why draw unnecessary power from the power supply when it's not needed in 2D mode? That would probably prolong the life of your card, but the margin of this prolongation is slim.
Don't get me wrong, I'm not saying volt mod the card and run 1.9v 24/7, for we all know what extremely high voltages do over an extended period of time! But 1.5 volts is completely safe!
My FX 5700 Ultra has been running at 1.75v on watercooling ever since I got the card, and it still clocks like the day I bought it.
Is there any BIOS modding tool like this for Radeon cards, specifically the X800? I would love to mod the vGPU without any soldering.
I don't think ATI does it the same way as NVIDIA; their voltage is not controlled through software as far as I know.
Quote:
Originally posted by =[PULSAR]=
Is there any BIOS modding tool like this for Radeon cards, specifically the X800? I would love to mod the vGPU without any soldering.
Anyone got a working link for v1.3 or v1.4?
How about v1.5?
Quote:
Originally posted by DEVIL_DK
Anyone got a working link for v1.3 or v1.4?
OmniExtremeEdit v1.5
Thanks :)
Quote:
Originally posted by DEVIL_DK
Anyone got a working link for v1.3 or v1.4?
Another success here! I have an NVIDIA engineering-sample 6800GT that auto-detected 395/1090 and had a max overclock of 410/1200 before the flash. I modified the original BIOS to 1.4v vGPU, and it now auto-detects 429/1150 and the max overclock is 435/1220 (so far). My Shuttle/Barton 3DMark03 score went from 12,062 to 12,593. Idle and load temps at 400/1100 have not changed much, perhaps 1-2C at most. I'm happy with the results! Right now I'm using the stock reference-design cooler that came on the card. Perhaps with an NV5 I could go even higher.
My PNY 6800GT uses eVGA BIOS version 5.40.02.15.01. The 2D vGPU is 1.1v, 2D/3D is 1.3v, and 3D is 1.4v. I don't see a need to change to the Ultra BIOS here, so I'm keeping the old BIOS. I'm overclocking using Coolbits and already getting 430/1.14 out of my PNY 6800GT! Thanks for the info, though.
First of all, Unwinder (the programmer and designer of RivaTuner) is the man :D RivaTuner RC15.2 is out, and now I have my Sparkle 6800NU running with 16 pipelines and 6 vertex processors with no worries whatsoever :D
Second, my overclocking has gone down a little with all the pipes turned on; I guess the card must be drawing more wattage. The problem is, I thought "OK, let's raise the vGPU, that ought to do it." I used nvflash to get the original BIOS, then went to NiBiTor 1.6 (formerly OmniExtremeFX) to change the vGPU... but guess what, it reports back 1.4v 3D, the default!! What does this mean?!?
Gafa, which version is the best 6800GT BIOS to flash?