Thank you Wiz! You know what we are all waiting for now ;)
mattmartineau88:
Use the latest RivaTuner to get higher clocks
Also, GPU clocks work in blocks. For instance, my 8800 Ultra increases in blocks of 27 MHz, so whether I'm running 648 or 655 on the core, it doesn't matter, because the core will only run at 648. I have to increase to 675 to see the next jump.
So add another 13 MHz and you should be at a new 880+ value.
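The block behaviour above can be sketched in a few lines (the 27 MHz step is from the 8800 Ultra example; the round-down behaviour is my assumption of how the driver picks the effective clock):

```python
def effective_clock(requested_mhz: int, step_mhz: int = 27) -> int:
    """Snap a requested core clock down to the card's clock-block grid.
    The 27 MHz step is from the 8800 Ultra example above; rounding down
    is an assumption about how the driver picks the effective clock."""
    return (requested_mhz // step_mhz) * step_mhz

print(effective_clock(648))  # 648
print(effective_clock(655))  # 648 - same block as 648
print(effective_clock(675))  # 675 - next step up
```

This is why nudging the slider by a few MHz often does nothing: only crossing the next multiple of the step changes the real clock.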
ATITool doesn't support RV670; I tried it yesterday.
RivaTuner 2.06 does though, so please try that, since you might be able to go above 885, which seems to be the Overdrive limit.
Now it needs volts :D
Time for more voltage it looks like =]
Thanks for sharing your results. Hopefully we'll get some updates from W1z on ATITool, or maybe a BIOS mod to up the 3D voltage.
Are those memory values stock? And how's the heat at 902 core?
Yeah, I just retested at 905 with 100% fan speed. For the 30 seconds it was on Proxycon it showed 53C, but I can't say that's definitely the figure, as it crashed.
If it crashes very early, whatever temperature you see is not the load temperature. It's possibly a good deal higher, because it takes a longer time before you can really tell what your load temperature is.
Sorry, but do you guys use WinFlash or ATIFlash to flash the new BIOS?
And if I'm on CrossFire, I will be able to select which card to flash, right?
Apologies, as I have never flashed a VGA BIOS before.
I think they used WinTool? Not sure, maybe martin can confirm
I used ATIFlash; I removed all current display drivers and Catalyst.
Used a floppy to boot to DOS and saved the old BIOS to the floppy disk: "a:\atiflash -s 0 oldHD3870bios.bin".
Went back into Windows and transferred the BIOS to another computer for a hard backup.
Then I booted back into DOS and ran "a:\atiflash -p 0 -f HD3870new.bin" (I had renamed the BIOS from TechPU to make it easier in DOS).
I then shut off the power and did a hard reset by removing the main power lead and then replacing it.
This worked perfectly for me, anyway; I personally don't like flashing stuff through Windows, as it's a bit volatile and not very easy to rectify.
It works perfectly!
I flashed both cards simultaneously using ATIFlash.
Here is my TOP (hotter) card running in single mode just to find max clocks....
Attachment 67600
Fan set @55% (reasonably quiet, like 36% for HD2900), room temps around 22C-25C...
This seems to be max on my first card though!
Now I need to try my BOTTOM (colder) card :)
Here it is:
Attachment 67602
Lets find max GPU clock on this one :D
Thanks for the results. From the few other results I've seen so far, it seems like everyone else is also hitting 877.
So the question begs to be asked: is it still a weird divider, or are these things really that starved for volts? I ask because, with the exact same clocks from all the users I've seen so far, we're obviously seeing another wall caused by neither temps nor architecture.
If it's not the divider (again), then I hope to see some v-mod results after our resident BIOS and software voltage-changing experts figure it out. :)
lol.....15mhz????
Guys, I can't get the latest version of ATIFlash to work on this card. What exactly do you type to get it to flash?
This will mess up your theory:
Attachment 67603
:p:
On the TOP card it was 5-10s before lockup...
Yeah, I have the link and the new BIOS, but I can't get it to flash.
I am booting to DOS but don't seem to be getting the command correct.
Could someone type this command out so I know what to do?
Thanks
Here's another that will mess up your theory...
http://www.rit.edu/~mkw1084/miwo/3870/NewBIOS.jpg
Lightman we need more volts!!!! :clap: :clap:
My temps are slightly up, but still under control IMO
As I said, actually reading the thread might help. Directions were given there: http://www.xtremesystems.org/forums/...&postcount=121
Just thought I'd put a post out in case someone hadn't noticed: the new 7.11 drivers are out today :)
more volts and an LN2 pot.
Who will be first to reach a 1000 MHz core overclock? Haha.
Can somebody post some 3870 CrossFire 3DMark scores? I'm still looking for overclock results but can't find any on the web.
Thanks in advance!
Was that stable? If so, congrats. :clap:
OK, so the divider seems fixed if there is some variation in top stable speeds and in where we see the hard reset happening (for most it seemed to be 890, same as Lightman). Theory debunked with that 900 MHz shot. :p:
Guess it's time to wait for some voltage doodads, and when that happens, I imagine it will be mildly entertaining, especially since the 3800 products finally have a sink over the digital VRM (unlike the X1950 Pro). It's like it's waiting to happen.
Thanks again for the results, fellas. :up:
Edit: Out of curiosity, has anyone checked the VRM this time around? Is it of better quality than the one found on the RV570?
Stable in the fur benchmark, but then I start dropping frames like crazy in 3DMark. Definitely needs more voltage... temps are fine. The highest stable/artifact-free I have tested so far is 877/1314, which is pretty much the same as Lightman. FYI, setting 885 MHz in Overdrive actually gives 877 MHz, even though it "reads" 885 MHz in OD. Oddly enough, the auto-tune bumped my memory to 1379 MHz, but I am definitely getting memory artifacts at that speed. Dumb auto utils, lol :)
Not too shabby with stock volts, I'd say. To match a fully v-modded 8800GT, you'd probably need 1 GHz+, which may very well be obtainable once someone comes up with a way to adjust voltage via a hardmod, tweaking app, or modded BIOS.
Accelero S1 + 140mm Yate Loon
http://www.xtremesystems.org/forums/...d.php?t=166432
No crossfire results from me, sorry :p
That's what I figured, thanks for the update. :)
Not shabby at all, especially since it seems relatively easy for most everyone to obtain... They are definitely holding back potential (through voltage) because of the cooler, it seems. I'm sure someone will find a way to fix that soon, but I personally think you're going to need to see way more than 1 GHz (1100-1200?) to match those tricked-out GTs (perhaps not stock vs. stock, but certainly with both using aftermarket cooling)... Whether that's possible remains to be seen, IMHO. Still, whatever the case ends up being, it's always great to see more value squeezed out of the cheaper (and easily obtainable) option; that's certainly why I myself mod/overclock (to save a buck here or there). The closer it gets to its competition by any (reasonable) means necessary, the sweeter it becomes. :D
Not the highest overclock on the CPU, but 850/2500 on the GPU. Running Vista with ATI hot fix drivers (stock).
http://i7.photobucket.com/albums/y25...703DMARK06.jpg
Weak.........IMO
No problem! But my score will be skewed because of the CPU. Still, I can run the full suite and then you can compare detailed results :)
Canyon Flight is scaling very well on my machine, but the other tests are CPU-bound to the point that the GPUs are utilized only 50-70%...
Results after my work :up:
I'm just trying to decide what kit I need to upgrade to CrossFire 3870s. Do you reckon I should go for:
MSI K9A Platinum Crossfire (Socket AM2) PCI-Express DDR2 Motherboard
http://www.overclockers.co.uk/showpr...d=5&subcat=808
or...
Asus M2R32-MVP Crossfire (Socket AM2) PCI-Express DDR2 Motherboard
http://www.overclockers.co.uk/showpr...d=5&subcat=808
or....
MSI K9A2 CF AMD 790X (Socket AM2) PCI-Express DDR2 Motherboard
http://www.overclockers.co.uk/showpr...d=5&subcat=808
- AMD® Phenom/Athlon/Sempron CPU.
- HyperTransport 3.0 supporting speed up to 2600MHz
- AM2 CPU supports HyperTransport 1.0
- AM2+ CPU supports HyperTransport 3.0
- AMD® 790X and SB600 Chipset
- Supports Dual DDR II 533/667/800
- 1x PCI Express x16 slot with x16 operation
- 1x PCI Express x16 slot with x8 operation
Also, does anyone know about the operation speeds, i.e. would the first two motherboards run at 16x16 or 8x8? I've read everywhere but can't seem to figure it out :S
Thanks in advance
I would take a look at the Gigabyte GA-MA790FX-DQ6 AMD 790FX (Socket AM2) PCI-Express DDR2 motherboard: http://www.overclockers.co.uk/showpr...d=5&subcat=808 Yeah, it might be a bit on the pricey side, but it has 2x16 and 2x8 for CFX. The Asus M2R32-MVP has 2x16, and the MSI also has 2x16, as far as the info I'm seeing about those boards goes. (Hope that helped.)
Does that equate to 16x16 operation with two cards in CrossFire on the board? The Gigabyte one is for single slots, unfortunately, and if I was gonna spend that much I'd probably just go to 775 :D
Ahhh, brill, cheers mate :) and nice one with the RAM ;)
Guys, it appears that the BIOS file has a .103 file extension.
I notice that most ATI BIOSes are either .bin or .rom; is there a difference?
So I just make my thumbdrive bootable, put ATIFlash on it along with the BIOS from the first page, and flash it in DOS?
Cheers
Jeremy
About that volt mod.
Has anyone noticed that in the HD 3870 BIOS (like in the HD 2900 BIOS), at 0ADC0h, you can find the 10 power-state presets with core, memory, and voltage values?
Core is set to 777 (3D), 700 (3D?) and 300 MHz (2D).
Memory is 1126 MHz.
Voltage is 1.327 (3D) and 1.241 (2D) volts.
Praying someone works out how to soft-mod the volts :)
I'd rather not get out the soldering iron just yet :)
I'm running a PowerColor @ 865 and 2600.
Just curious: where did you find the BIOS?
You said the BIOS was fixed by the AMD graphics department, but I can't seem to locate it on the official website...
I would appreciate it if you could give a link to the official support site. :cool:
You should get 1.377 Vcore with this BIOS image.
(I don't have a 3870, so I can't verify.)
Is anyone brave enough to try it out?
Use at your own risk.
(Oh, and of course the checksum is no longer valid.)
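For anyone wanting to fix that invalid checksum by hand, here's a sketch of the usual legacy option-ROM rule (all bytes of the image must sum to 0 mod 256; which pad byte you sacrifice varies per BIOS, so both the function name and the offset here are illustrative assumptions):

```python
def fix_rom_checksum(rom: bytearray, pad_offset: int) -> bytearray:
    """Set the byte at pad_offset so the whole image sums to 0 mod 256,
    the legacy PCI option-ROM checksum rule. The offset of the byte you
    sacrifice varies per BIOS image, so this is only a sketch."""
    rom[pad_offset] = 0                  # clear the old pad byte first
    rom[pad_offset] = (-sum(rom)) % 256  # make the total wrap to zero
    return rom

# toy example on a 16-byte dummy "ROM" image
rom = bytearray(range(16))
fix_rom_checksum(rom, pad_offset=15)
assert sum(rom) % 256 == 0
```

ATIFlash's `-f` switch skips the check anyway, which is presumably why the posted image works despite the bad checksum.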
Dude, for some reason lately, with a few brands, we seem to be getting stuff way earlier than the rest of the world. The 8800GT was available 10 days before the launch; the HD 3870 was 5 days before.
Yup, mainly in Lowyat, but usually I deal direct with the distro.
The 8800GT shortages have been picked up by Zotac and Galaxy cards; most of the guys here reported that by flashing the Galaxy cards to the SSC BIOS they managed to get higher clocks.
Oh yeah, the shops are running some political thingy: a lot of them have it, but it's reserved for their frequent customers. And some distros have been harassing a few shops. Two incidents come to mind:
Buildtech, the distro for Asus GPUs/mobos, threatened a few shops a month ago that those who did not take the 2900 Pro stock from them would not get first preference.
Kaira, the distro for XFX, is now saying that if you want the 8800GT ADE you need to take ten 8500GTs...
Hmm strange, I'm running Gigabyte X38T-DQ6 and Radeon HD 3870 and GPU-Z reports: PCI-E 2.0 x16 @ x8 2.0, I wonder why?
Hey guys,
Does the 3850 have a clock limit, and is there a BIOS to use on those cards?
Got a bench session on Sunday; might as well try to take advantage of that, since we'll have acetone/dry ice, a water loop, CrossFire and all :)
What application will let users adjust volts for these ATI cards?
Well, a simple hex editor will do it if you know where to modify.
AFAIK there is no automated tool available for the HD 3000 series yet, which is why there's no checksum recalculation.
The hex-edit method worked well for the HD 2900 volt and clock mod.
You can find info about it somewhere here:
http://www.xtremesystems.org/forums/...=165389&page=3
835/1321 MHz seems to be the limit for my Radeon HD 3870 sample from AMD (non-qualification sample).
I did the BIOS mod using ATIFlash; to be specific: x:\atiflash -p 0 -f hd3870.bin
Used RivaTuner v2.06 to overclock...
Without the mod I was able to run up to somewhere around 850/1200, but now that I've done the mod I can use around 860-865. If I set it to 870 and run the 3DMark demo (Return to Proxycon), the system shows artifacts and the video stops...
Using autotune, it configures the core clock up to 872 and the memory speed to 1376, which turned out to be useless...
Overclocking the memory seems to be the hard part and gives little performance boost.
My system:
M/B: AW9D-MAX (with beta BIOS 1.6)
RAM: Super Talent
PSU: 450 W (no-name)
Oh! The cooling: the stock cooler... T.T
Does the type of mainboard matter when overclocking graphics cards?
I can't seem to boost any further...
What kind of tools do you use to check whether overclocked settings are stable or not?
And another question: does ATITool 0.26 work right on the 3870? Because mine doesn't.
And another: does installing multiple overclocking utilities like RivaTuner and ATITool cause a problem?
I can't seem to sort out what the problem is. Maybe it could be the power or the heat...
Why don't we connect the fan's power supply to a location other than the video card? If I run a test that uses the GPU aggressively, I can hear the difference in RPM... I think the fan is consuming a portion of the wattage that the GPU needs...
mem is at 1332, look closer.
Right :)
And yes, it works ;)
Wow, this problem has been fixed quickly! Can't wait to see results on water. :slobber:
Keep it up! :up:
Couple of quick questions:
Is RivaTuner the only utility supporting the 3870 atm? I tried the latest ATITool beta I could find, but no luck.
Is there any way to change the increment RivaTuner uses for the core? Right now it seems to be 13.5 MHz. I can bench 891 on the core, but 904.5 is a no-go; would be nice to set it somewhere in the middle. Thanks
Yep, same here; I tried the other two also. Tray Tools just locked my system when trying to set the clocks. Either way, not too bad with the BIOS change and a Thermalright V2: benched 891/1400. Should be interesting when there's full support and some voltage control. Thanks for replying.
Weird! My X2 6000+ @ 3500MHz runs 61100 :)
Here is detailed info:
PS. That run was @ 860/2250, both cards... Quote:
3DMark Score 61108 3DMarks
GT1 - Wings of Fury 628.3 FPS
GT2 - Battle of Proxycon 513.4 FPS
GT3 - Troll's Lair 413.4 FPS
GT4 - Mother Nature 466.6 FPS
CPU Score 0 CPUMarks
CPU Test 1 0.0 FPS
CPU Test 2 0.0 FPS
Fill Rate (Single-Texturing) 12738.9 MTexels/s
Fill Rate (Multi-Texturing) 25752.1 MTexels/s
Vertex Shader 222.1 FPS
Pixel Shader 2.0 563.2 FPS
Ragtroll 233.7 FPS
Think these cards will allow software voltage control in the future?
05 is nothing like 03. The only test in 05 that even significantly changes between dual card and single is GT3. 03 is completely GPU bound. 05 is a system test and 03 is a GPU test. If it were drivers, your score would be low in everything, but the only difference is the nature of the bench.
I didn't see that you had a 4x card, that will clearly limit things...
I apologize if I'm asking a question that's already been answered, but does this BIOS fix work on the 3850?
So what's the average OC you guys are getting on the 3850s, core/memory?
Anyone have a good way of controlling the fans for both cards in CrossFire? Setting up a fan-control scheme with RivaTuner was easy enough, but it only covers the card that you're currently monitoring. The BIOS auto-fan settings seem a bit loose to me (60C->30%, 65C->40%, 70C->50%, 75C->60%, 80C->70%, 85C->80%, 90C->89%, 98C->100%). Since the fans are fairly unnoticeable at 50/55% and under (in my case, at least), it'd be nice to kick them in earlier than 60C.
On a side note, my 3870 CrossFire 3DMark '06 scores were virtually the same between Vista Home Premium x64 and XP x64 with the 7.11 drivers when tested at the same core speeds; the Crysis hotfix drivers were hard-resetting my machine, so no idea on them. XP does seem to allow me to bump my cores up higher.
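The BIOS auto-fan table above behaves like a simple threshold lookup. A sketch of it (the temp/duty pairs are the ones listed; the 25% floor below the first threshold is my assumption, not from the BIOS):

```python
# (temp_C, fan_%) pairs copied from the BIOS auto-fan table quoted above
FAN_TABLE = [(60, 30), (65, 40), (70, 50), (75, 60),
             (80, 70), (85, 80), (90, 89), (98, 100)]

def fan_speed(temp_c: float, idle_percent: int = 25) -> int:
    """Return the duty for the highest threshold at or below temp_c.
    The 25% idle floor below 60C is an assumption, not from the BIOS."""
    speed = idle_percent
    for threshold, duty in FAN_TABLE:
        if temp_c >= threshold:
            speed = duty
    return speed

print(fan_speed(58))  # 25 (assumed idle floor)
print(fan_speed(72))  # 50
print(fan_speed(98))  # 100
```

Kicking the fans in earlier would just mean shifting the thresholds left in that table, which is exactly what a custom RivaTuner scheme does per card.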
Same horrible fan speeds as the X1950 XTX had...
tiger 8312 - what CPU do you have? Sounds like you've got that 450w power supply running pretty hard.
Has anyone tried the voltage-modded BIOS that Pitya posted? I am thinking it may work, but I can't test it until I get back from Thanksgiving.
Basically, in the range ADC0-AE50, he changed "2F 05" to "61 05".
If you invert the bytes, the original reads 0x052F. Punch that into a calculator and do HEX->DEC and you get 1327, i.e. the readout for the stock 3D voltage of 1.327 V. Changing the value from "2F 05" to "61 05" gives 0x0561, or 1377: 1.377 V.
Even though the checksum is going to be wrong, it might still work, because this BIOS voltmod technique was successful on the 2900 cards by simply modifying these values.
http://www.xtremesystems.org/forums/...1&d=1195827761
In the event that you do brick your card, I am fairly sure that you can still "blind flash" it back. Video cards have always been like that, right? I remember doing that to reflash my old 9700 Pro way back, when I thought I had killed it through a bad flash.
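The byte arithmetic described above checks out: the word is stored little-endian and reads as millivolts. A quick sketch of the decode/encode (the helper names are hypothetical, not from any tool):

```python
def decode_vcore(raw: bytes) -> float:
    """Decode the little-endian BIOS voltage word to volts."""
    # b"\x2F\x05" -> 0x052F -> 1327 -> 1.327 V (stock 3D voltage)
    return int.from_bytes(raw, "little") / 1000.0

def encode_vcore(volts: float) -> bytes:
    """Inverse: 1.377 V -> 1377 -> 0x0561 -> bytes 61 05."""
    return round(volts * 1000).to_bytes(2, "little")

print(decode_vcore(b"\x2F\x05"))   # 1.327
print(encode_vcore(1.377).hex())   # '6105' - the modded "61 05" value
```

So the hex edit is just swapping one millivolt word for another; the only catch is the stale checksum noted earlier.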
The manual says to use a 500 W or higher PSU, though I suspect that spec is way too high for typical systems... just a way to avoid the blame when the graphics card gets burned along with the PSU...
My CPU is an E6300 @ 2.7GHz and the RAM is running at 482MHz...
I couldn't get my RAM to run at a different ratio, so I used FSB:DRAM = 3:4.
CPU = 386*7 = 2.7GHz
Sucks...
I roughly calculated the total energy consumption of my system, and it should be around 270 Watts even with the graphics card.
Do you think the PSU is holding me back from overclocking the video card further?
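A back-of-the-envelope budget along those lines might look like this (every wattage figure here is an illustrative assumption, not a measurement, chosen only to land near the post's ~270 W estimate):

```python
# rough per-component draw estimates; every wattage here is an
# illustrative assumption, not a measurement
estimated_draw_w = {
    "E6300 @ 2.7GHz (overclocked)": 75,
    "HD 3870 under load": 105,
    "motherboard + RAM": 50,
    "drives + fans + misc": 40,
}

total = sum(estimated_draw_w.values())
print(f"estimated load: {total} W")  # 270 W, in line with the estimate above

# a no-name 450 W unit rarely delivers its label continuously;
# assuming ~70% usable capacity gives the realistic headroom
print(f"usable at 70%: {450 * 0.70:.0f} W")  # 315 W
```

On numbers like these the PSU has headroom on paper, but a no-name unit can still sag on the 12 V rail under transient GPU load, which is what would hurt an overclock.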
What about the temperature?
Spuggi says he/she is using the stock cooler and, with the help of the BIOS, overclocked up to GPU @ 891MHz and MEM @ 1332MHz...
I could do that too, but my system either crashes or gives a VPU recovery session.
I have stabilized the GPU @ 840; going further causes VPU recovery or crashes after extended use...
Besides that, how do you know whether a clock is stable or not?
I used ATITool 0.27 B2, turned on the 3D renderer (Cubic Fur), and scanned for artifacts.
Is there a better way to see if your clock is stable, aside from the above-mentioned method?