No, the file name you type is ATI290~1.BIN
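For anyone stuck at that step, this is roughly what the DOS flashing session looks like - just a sketch, assuming the card is adapter 0 and the XT BIOS really did land on the boot disk as ATI290~1.BIN (the 8.3 short name can differ, so check it with DIR first):
atiflash -i
(lists the adapters so you can confirm the card's index)
atiflash -s 0 probak.bin
(saves the current Pro BIOS as a backup first)
atiflash -p 0 ATI290~1.BIN
(programs the new BIOS onto adapter 0)
If it refuses because the subsystem IDs don't match, atiflash has a -f switch to force the flash, but only use it if you're sure the file is right for your card.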
OK, I'll report back in a few.
WOOT!!!
I've done it! Thanks NickS!!!
Pic: http://i117.photobucket.com/albums/o...hwhey_14/2.jpg
:cool:
Here's a pic of my card with 2900PRO BIOS:
http://i117.photobucket.com/albums/o...ey_14/GPUZ.jpg
Hi everyone, last night I flashed my Pro 1GB with the XT BIOS, but the result wasn't better than before. The OC is the same 911/2466, but the 3DMark06 score was 90 3DMarks lower with the same settings. I tried the next step of 928 on the GPU with 1.288v but it crashed.
Now I'm back on the good old Pro BIOS again :)
Hey guys, does ATITool 0.27 allow you to fully control the voltages via software (vcore, vram etc.) on Win XP?
It didn't work in Vista, and I'm thinking of rolling back to XP.
There's an issue regarding this card and 0.27 - you can alter VDDC (vcore), but it will not stick. Once you hit 'set clock' or launch a 3D app it's back to 1.20V or whatever the default is. Vram sticks (MVDDC). I think this issue is mentioned earlier in this thread.
Can anybody with a crossfire setup verify that the cards indeed switch to 3D clocks? I put my second card in and I think my drivers are screwing with me again. I have the RivaTuner hardware monitor up, and once I launch a 3D app (3DMark06) it doesn't switch to 3D clocks. I'm only getting 13k right now, which is pretty disappointing, but I'm sure it's because the cards aren't being switched to 3D clocks. I just wanted to confirm that the cards do switch to 3D clocks in crossfire mode.
In RivaTuner it just shows 500/500 and never changes.
I run in crossfire and get 14,300 with the cards @ Pro and Q6600 @ 3.0 on an X38. You aren't still using RivaTuner to adjust the clock speeds, are you? Only one card overclocks with it.
With them both @ XT BIOS and Q6600 @ 3.5 I get 17,500.
BTW, the x16/x4 slots are meant to give it a 10% hit.
ATITool on a single card with the 2900PRO 512MB BIOS - it doesn't remember the settings after a reboot :mad:
I got to this result: 1.20V to the GPU = 857 MHz / 958 MHz on air.
Q6600 G0 @ 3.9GHz
That's interesting... so you think my scores are about normal for my specs then? The 13k score I got was with the XT BIOS.
I was only using RivaTuner for the hardware monitoring, to check whether it was switching out of 2D clocks to 3D clocks.
I'm not going to try overclocking till I know my machine is working correctly. I think it might be a good time to reformat and put a clean OS on it.
Curious if anyone with the 8-pin chooses to use Overdrive for oc instead of other utils?
Just finished a re-install and now get 18,205. That's on Vista x64.
http://service.futuremark.com/compare?3dm06=3418483
My card from eWiz should be coming in this week hopefully. Can't wait to try this!
They're out of stock at Newegg, again. Ewiz doesn't have any 2900PRO, nor does any other e-tailer with the exception of NCIX, although they have the 512 for like $290+.
The Sapphire card I got from Newegg yesterday was flashed with a Pro BIOS. I flashed it to the XT. Both cards are able to run in crossfire now. I just need to resolve my driver issues. At least I hope that's the problem.
First run of 3DMark06 on a fresh and clean WinXP install with Cat 7.10 and stock card clocks:
single card: 8860
crossfire: 11691
Not that bad for an AMD-only rig ;)
Now I'm waiting to buy some waterblocks next month and OC those babies.
Nicks , when you have time , could you put up the 2900 Series Heatsink Removal Guide on your website nickfire.com ? That is if you took pics during your own removal of course :)
Your temps look quite awesome to me. Are you still running the Thermalright cooler, or did you mount the standard cooler again?
Ok is that where Overdrive itself tops out, or just for your card? You can go higher with other utils? Guess I'm just wondering if there is some advantage or better results somehow using Overdrive, tho my hunch says no. If there is I'd buy the 6 to 8 adapter or do the mod.
I have no idea what this 'grafik' is about, but the first thing I can think of is a CPU-bound system... :P
High memory speed by itself doesn't mean jack :banana::banana::banana::banana:, I agree.
Quote:
As you can see the 512-bit bus is not really effective, and at high memory speeds you can't see performance improve.
However, when you need bandwidth (strong pp, e.g. AA) then it is very beneficial.
The bus most certainly won't be - the card, altogether, could be.
Quote:
A 256-bit bus will probably give about the same performance as the 512-bit bus, but will be much cheaper to produce.
However, if RV670 turns out to be a simple lower-process respin of the current R600, and you take a look at these great 2900 Pro OC results here, then I cannot really imagine any HC gamer opting for the card with half or 2/3rds of the bandwidth for the same price...
Errr, how so? If it's indeed the final spec then I'm sure it isn't more powerful - see these 860/2000-2100MHz results? If the upcoming card is the same except for a 200-400MHz higher clocked but half-wide membus, then how would it be more powerful? :)
Quote:
If the final specs of the 2950 are 825 core, 2400MHz mem, then I'm sure the 2950 will be more powerful.
What I'm trying to say is that we don't know anything about the GPU and that's the key here. :cool:
Of course, halved databus doesn't help, that's for sure. :p:
I tried overclocking my cards like shown in the link on NickS's page. It didn't really work... FPS in 3DMark06 were lower than without OCing through AMD GPU Clock Tool.
Now I've got the "problem" that I have to use the other DVI port on my cards; it's like they switched when I enabled and disabled the secondary display and extended my desktop to it. Is there a way to get back to the standard DVI port (the one on the outside, farther away from the mobo)?
:devil:
RivaTuner + fan at 100%: 2300MHz on the mem is OK for 3DMark01/03, but not for 05/06 - it needs more volts to the mem.
No ATITool.
3DMark01/03 at 836/2300MHz, and in 05/06 it is 836/975MHz.
OK... I fixed it... stupid problem. It's because I didn't have enough power - I was only using one 6-pin on each card, so now I'm using 6+6 plugs on each card. These seem like more realistic numbers.
Pro flashed to XT X-fire:
http://www.turborocco.com/2900pro/XTcrossfire15131.jpg
Sorry, I was a bit quick on that one. After setting VDDC in ATITool it will stick when launching a 3D app (at least 3DMark). It will not stick if you try to alter the core clock via ATITool.
So the way to go is to set VDDC in ATITool and clock with other software.
Or, set the core clock first in ATITool and then alter VDDC.
BTW, I did try PowerStrip; sometimes it worked, sometimes it didn't.
OK, I'm going to format D
The best way is to first set the memory speed with ATITool, then set the volts.
The next step is to start AMD GPU Clock Tool and raise the clock speed.
But my experience is that using ATITool and AMD GPU Clock Tool together is not always stable. Sometimes it's stable to use, and sometimes I get a reboot.
We can only wait for ATITool 27b3 for some pwnage action.
Other than running xfire, is there any benefit from flashing up to XT? I'm currently running 831/996, and temps average 77C under full load with max peaks at 80C.
How can you tell your PSU isn't up to the job?
I have an OCZ 520W ModStream with 28A on the 12v rail, so it's a little under spec even running XT speeds.
It does run at XT speeds fine, but when I take it to, say, 810/950 it runs for a while with NO artifacts, then the PC will just hard lock without a sign of an artifact.
Is this just a poor R600, or a lack of amps?
It could be your PSU. My CM RealPower 550W couldn't handle the R600 :P.
My 12v rail collapsed to 11.6v when I had the R600 connected. The best way is to measure the 12v rails under full load with a DMM.
I bought a Fortron Booster X3 300W VGA PSU; they are good and cheap, only 15 euro :p: It handles the R600 with no problem.
http://geizhals.at/img/pix/244247.jpg
Well after I've got my Antec P182, I'll be getting a Corsair 620W jobby. That comes with one of those 8 pin PCI-E plugs. So I guess I'll see then.
So guys, anyone have info about ATITool 0.27 b3 (release date etc.)?
My 2900pro is on the way. I've heard conflicting reports about its power requirements.
Does anyone have an OCZ PowerStream 520 or something similar? It does 33A on the 12v.
I've had this baby for 2.5 years now. I am hoping it will be enough!
I did the BIOS thing and it went very smoothly. I thank everyone for the instructions :clap:
A few questions though:
When I first used the ATI HD2900XT.bin version, GPU-Z showed that the default GPU was 743 and VRAM 823, but right above it the readings were 505/511.
Same info under AMD GPU Tool - 2D and 3D clock speeds. I guess it's normal :shrug:
So I tried the Sapphire HD2900XT.bin (newer version than the first one), reinstalled CCC 7.10, and now I've got 743/823 on both current and default. BTW, with the first BIOS, although it said that the default clock was set to 743/823, when I played with the settings under AMD GPU Tool, Everest reported that I had actually OC'd the card :confused:
Also, ATITool did not launch at all. But as stated above, once flashed with the Sapphire XT, everything seems to be normal now. Or am I just kidding myself :D
3DMark scores before and after the flash didn't change at all (with the first BIOS) - I couldn't get a score over 7000, stuck at 6980. I know that the overall performance of the PC affects the result, but still, it should've done something. Now I can only get around 7080. Oh, and before I forget, I can't seem to get ATI Overdrive to appear under CCC. I'm assuming it's because of the 6+6 pin connectors, but I made one myself last night and still no Overdrive. Any suggestions for this, other than buying a new PSU?
System specs:
Intel Pentium D930 3.0 no OC
Asus P5K-SE
Corsair XMS2 4-4-4-12 2T 677mhz
Sapphire 2900pro@XT
Silver power 480W ps
Now, how could i improve my 3D mark 06 scores?
I've been reading this thread for 2 days and my account just got activated. Hello to all of you, I am a newbie (just to this forum, not to this world).
Intel Pentium D930 3.0 with no OC? Buy a new CPU? It is too slow ;)
Just as I thought :D
Hello all, new to the forums here. And this thread is what made me make my final decision on what to buy.
It looks as if its now a deactivated item on Newegg. I know they had got a few in on Tues, which is when I ordered my HIS one and had it 2nd day.
My question would be dealing with the 3DMark scores people are showing. I've just recently upgraded my system after getting some OT in at work and stashing money away so the woman didn't see it.
What I've purchased was:
AMD 6400+ Black Edition
2GB Patriot PC2-6400 4-4-4-12 Mem Kit
ECS GeForce6100SM-M Board (I know, but it was bundled with the CPU and couldn't afford much more after)
HIS 2900Pro 512mb
630watt RaidMax Volcano PSU (has two 6pin's and 8pin but I guess the 8pin is for EPS)
I've got a new Vista OS installed and only scored around 9600. The Vid is OC'd to 740/2000 (effective). The CPU test is where the system REALLY bogged down during the benchmarking.
I've a feeling a lot of it has to do with the mobo itself, and also settings I've not made or tweaked yet, both BIOS and OS.
What I'd like to know is: are there any really good guides on making adjustments? Searching gives you so many different opinions that it actually gets confusing.
Thanks.
I found an easier way to OC in crossfire mode, if you run dual monitors. I have one monitor hooked up to the primary card, and the second monitor hooked up to the secondary card.
Disable crossfire.
Go into display settings and extend desktop to the 2nd monitor.
Launch rivatuner or whatever apps you use to OC. (I have only tried rivatuner)
Select your cards, and overclock them. Shouldn't matter which card you choose first.
Then enable crossfire. (this will disable the second monitor)
This still may work if you have one monitor. But you'll have to extend it to the second card. (haven’t tried)
I think I've reached about the max potential with my mobo/PSU. I was able to overclock in xfire mode to 820/970. Any higher and I got mixed results - I either couldn't enable crossfire mode, or the machine would crash/reboot.
So I hooked up a spare PSU I had lying around and connected it to the 6-pin plug. I didn't get much better OCs, as I still suspect the power supply isn't enough. I was able to get to 829/990 with the extra PSU.
I also was unable to overclock in xfire mode in Vista, but it was fine in XP.
At 829/990 I was only able to score ~16110. This is where I think the P5B and a C2D reach their limits. At stock XT speeds, I got 15131. With a 78MHz OC on the core and 142MHz on the memory, I got about 1000 points more. With the x4 PCI-E handicap, I should be happy that it was able to scale even this high. Not as high as I wanted, but that's because of the mobo/CPU.
definitely a slippery slope. I upgraded the cards, now I want to upgrade to a X38 and a more powerful psu.
I need help setting the mem volts in ATITool.
MVDDC is 2.200V and MVDDQ is 2.099V.
First of all welcome to XS. :)
That's not a *horrible* score, but yeah, it should be higher with your card at those clocks.
Are you sure it's not something simple/stupid like having FSAA forced on in CCC? Even putting Mipmap detail on High Performance will help a bit. I also quit all unneeded Windows Services etc. before running any bench. Read a Vista tweak guide for sure.
Also try something like 3DMark01, which is not so GPU-bound and is more sensitive to overall system speed (CPU and mem clocks and timings). How that scores will provide more info as to where the bottleneck is.
Can you overclock your CPU at all with that mobo?
FSAA is set to application preference and Mipmap is on highest setting.
Trying it with '01 I get a result of 30090.
It looks as if I can. I haven't messed with it, but the settings are in there to manually set voltage, multiplier, etc.
Hmm that seems a bit low too. I am not done tweaking/testing myself but my high so far in 01 is 57,800.
Good news on the mobo BIOS settings - see what you can tweak there, such as mem clocks and timings. Other than that you're just gonna have to run various other benchies and keep testing. Oh, and install RivaTuner and set up the hardware monitoring; it can make sure you're properly switching to 3D clocks, for one thing.
You have 33A on the 12v rail
http://www.amdzone.com/pics/powersup...rstream520.jpg
That's 5 more amps, which is a lot for a GFX card ;)
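(Back-of-the-envelope, just multiplying the rail ratings quoted above to put those 5 amps in perspective:
28A x 12V = 336W available on the 12v rail
33A x 12V = 396W available on the 12v rail
i.e. roughly 60W of extra headroom on that rail.)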
How long do you test with ATI Tool?
As I've just finished running the artifact tool for 1 hour at 800/1000 and it found no problems.
But what I will say is I've noticed the 12v drop to 11.58v at that clock. So maybe it is my PSU????
Tried that method a few minutes ago. I set stock XT clocks of 740/825 on both cards (fan @ 100% just to be safe) and verified the clock change with GPU-Z. The clocks were set and the pixel fillrate, texture fillrate and mem bandwidth increased on both cards, but the FPS in 3DMark06 dropped by 10. I didn't run the full benchmark because the first few seconds were enough to see that the OC wasn't good.
I really hope there will be a clocking tool out soon that fully supports this card.
So I had loads of trouble getting one of these cards, and I will now publicly thank NickS for all the help he provided, not just with figuring out the issue, but also in helping me to activate my membership on here.
I learned the hard way that these cards for some reason don't work using the DVI ports until the OS loads, at least on my Sceptre X20WC-Gamer. I almost returned the second card but I tried an old CRT and it worked! I feel bad for Newegg b/c I probably unnecessarily RMA'd a working card (I'll never know). They were a real class act about the whole thing. How was I to know that I wasn't supposed to see a display @ POST? I had previously been on a BFG Tech 7600GT that did not have this issue. This is my first ATI card since the original All-In-Wonder.
It's a great card, but I found Sapphire's documentation completely unacceptable. They don't mention the power connectors at all! :shakes:
I got things worked out now by hooking one DVI port on the card to the DVI port on the monitor, and the other uses a DVI-VGA adapter with a vga cable to the VGA port on the monitor. When windows loads, the monitor automatically selects the DVI input. This is less than ideal, because it means I potentially would be forced to switch to VGA whenever I needed to do something different in BIOS or change boot order to get into Vista or Ubuntu, IF I had 2 LCD's hooked up (each using the DVI).
I can only imagine the headache this issue is causing both end-users and distributors who are seeing lots of these cards seeming to be DOA.
It was really difficult to find information on this issue, but with enough googling I found that ppl have had this same issue with other ATI cards, dating to at least 2005. I really hope they make a BIOS update to fix this.
I'm not here to just complain. These cards are amazing! I can run every game I have so far at 1680x1050 Max EVERYTHING :D . This includes Company of Heroes, Oblivion, and Bioshock (which looks especially nice I might add). This was in XP, Vista might change things a bit. This card even beats out my Core 2 Duo for the price/performance crown! $350 for the performance of ~$600 card, now that's a deal! :clap:
Does anyone recommend using ATI Tray Tools? I have so far tried RT and CCC, and for me CCC is the best option as 850Mhz didn't work (so no need to exceed the limit in CCC), but I'm wondering if ATT might have some other features that I would want.
Also, does anyone think I'm shortening the life of the card if the core hits the mid 80's Celsius? What about 90 or 95? I live in SoCal, so my temps will probably be higher than most ppl at the same settings. Unless ATI is now using something other than SiO2, like Hafnium or something, it's hard for me to imagine that this core can operate safely up to 100C :shocked: , while CPUs have a much lower threshold. IIRC, the R600 has more transistors than the Core 2 Duo, so I would surmise (see Foolishly Assume) that the tolerance for overheating is even lower. I suppose the chip's internal layout also factors in. I would love to get that cleared up.
Thanks to all for all the info!
Just adding another owner to the thread... sys in sig..
Sapphire 2900pro 512 gddr3 846/900
both pwr connections/ ati overdrive oc
3dmark06
http://img.photobucket.com/albums/v4...Hudson/3-4.jpg
http://img135.imageshack.us/img135/3209/35650305jf6.jpg
I see this card disappearing from almost all shops around here.
What I am wondering is: what if a card becomes faulty and needs to be RMA'd? What card will we get back in case they need to replace it? Since they only had a limited amount of Pro chips, we might get an XT card with a Pro BIOS on it, or maybe even an XT card if they don't bother flashing it to Pro :up:
I am picking my card up this weekend; next week I will test. :yepp:
According to Newegg's return policy, if that happened, they would just turn it into a refund RMA instead. This is what happened to the first 2900 512MB I bought. I sent it back on Friday and by Monday they were sold out. I was so pissed. I had even called on Friday to make a special request that they hold one, but they apparently don't have that option, as customer service is completely separate from the warehouse. If I had been absolutely sure they would sell out before the RMA processed, I would have bought a second one on Friday, but I was afraid to be out double the money and have an extra card I couldn't use. I guess I could have eBayed it but :shrug: .
Some retail places would end up sending your card to the manufacturer and in that case it would probably come back refurbished (I think this applies to Ewiz.) Ditto if you RMA directly with the manufacturer, although I suppose it would be possible for them to take an XT and flash it like that. That would only be a situation where they couldn't refurb it I suppose, since they would be losing a big chunk of money based on the price difference.
I'm just really glad I was able to snag one of these before they were completely gone. Good luck with the new card!
Well, have some interesting news.
As I had previously reported, I was getting benchmark scores of around 9500-9600. This was on a newly installed Vista 32-bit.
My DVD arrived yesterday for the 64-bit version, so I did a re-install with that and installed all the previous stuff that was on it earlier, benchmarked, and it came out with 10325. So almost a 1000-point difference just from going to 64-bit Vista.
The reports that 3DMark gives me are good on the video side, but the CPU is only giving a score of 1700. The two tests that it does during the benchmark are so choppy it's ridiculous - 0-1fps. This is what is really dragging the score down.
I think the problem is RivaTuner. Although GPU-Z shows the clock speeds I set for both cards in RivaTuner the way I set them, RivaTuner sets the second card back to the standard 2D clocks the moment I enable crossfire. This clock change isn't shown by GPU-Z. I can't get the second card back to 740/825 again then. The funny thing is that RivaTuner reports the stock clocks of the secondary card as 740/825 when I disable crossfire and extend my desktop to that card again. GPU-Z, on the other hand, then reports the stock Pro clocks for that card again. RivaTuner is buggy as hell; that's the only reason I can think of why my FPS in 3DMark are that much lower than at stock Pro speed.
I so wish I had a working ATITool, like I had with my X1800s. Nothing better than clocking both cards at the same time with CF enabled. I hope they get it done soon.
Flashed 2900pro to 2900xt
However, I inadvertently deleted my backup of my original Sapphire 2900PRO 512 BIOS. Can't find it online. Anyone know where I can get it?
Thanks
Can somebody chime in on my temperatures? Running 875 core with 1.225V core (ATITool .27b). There are no ramsinks on the mosfets or the top set of memory chips. The memory chips closest to the mosfets are showing 60C; the ones towards the DVI ports are at 45C. The big gray box mosfets are nearing 70C, and the motherboard itself just past the card is at 60C from the radiated heat alone. I can sense the heat from the system when I get within 12 inches of the card. The GPU core is at 53C per ATITool (water). Am I going to damage something with these temperatures?
You won't damage anything but it may cause instability down the road if your ambient temperatures rise or something. I'd set a 120MM next to your card to blow across it. That'd solve your problems EZ.
Hey there.
First off, let me say to all the people talking about the CPU tests really bogging down: they're supposed to run at 0-5fps. The test renders the graphics on your processor and doesn't use your video card at all, from what I understand. That's normal. The card isn't going to help that. That's why it's a CPU test.
Just bought my first 2900 Pro and am quite happy with my purchase. For $279 you can't beat what I have basically upgraded to.
Here is my brief story.
Ok, I am running an MSI K-Neo 4 or something like that. I had to buy a replacement board for my 939 when my abit an8 ultra blew up ;)
I had an x800xl 256MB GPU and a X2 3800+ running at stock 2.0ghz.
My score on 3dmark06 was an abysmal 1863.
Bought my new parts and with the 2900pro stock and the 3800+ stock I jumped up to 7094.
I then overclocked the 3800+ to a measly 2.25GHz and upped my 2900pro to 833 core and 891 memory.
I then ran 3dmark06 and got
http://www.unholysyndicate.com/image...3dmark8246.jpg
http://www.unholysyndicate.com/image...llocscreen.jpg
and a gpu-z validation at
http://www.techpowerup.com/gpuz/w7uv8/
I don't know a whole bunch about overclocking, but I am learning. I have tried to use nTune to change my GPU but can't figure out how to use it, so I'm stuck with using ATI Tool.
If anyone has any good overclocking utilities that they can link me I'd greatly appreciate it.
I'm looking for a good program to use for the video card (and maybe how to use it lol) and some stresstesting utilities so that after I overclock the GPU I can run some tests to make sure it's stable.
I'll be talking to others in a different section to help me with the OC'ing of my 3800+. I have an original big typhoon and I'm sure I can reach a higher OC than 2.25 lol.
Thanks in advance.
Try RivaTuner.
Welcome, and glad to see another 1GB owner here :D Agreed on the price/performance crown, and as for ATT, overclocking and monitoring are not supported yet, which is really too bad - someone needs to score Ray Adams an R600 card.
As for your temps, I'm no expert here, but experience and common sense tell me you're fine unless it's consistently over like 85. This thing was engineered to not hit 100% fan speed until it's at 105C. And you may live in SoCal, but I'm in Chicago in sub-zero temps with a gf who likes the room at 80F :p:
I bet it's your memory that crashed FarCry. 999 seems a bit extreme (for my card at least).
You set the fan speed to 100%? Lol... I put it at 50% out of curiosity, and man, I immediately set it back to normal. That thing sounds like a jet engine!
1. When disassembling the stock heatsink, I discovered that only half of the RAM chips made proper contact with the thermal pads on the heatsink. (It's possible to see this because the thermal pads have a slight pattern; when proper contact is made this pattern disappears.)
2. After flashing a PRO to XT, I noticed MVDDC was raised to 2.20V. I found this to be a lot higher than necessary. The card can bench at 999 with only 1.875V (maybe even lower than this!)
Thus, try to lower MVDDC - you will have a cooler card.
Just got my card yesterday and I tried ATITool - voltage adjust is not working. Can anyone confirm this?
Tsaroth,
most people use RivaTuner and Ati Tool
Guru3d has just about everything for download
http://downloads.guru3d.com/download.php?id=13
haha.. Try running crossfire at 100% on both cards.. man I'm gonna need those safety ear muffs
http://www.rahq.com/images/Bilsom/707_Earmuff_sm.jpg
Hi,
I read that some Sapphire HD2900PRO 1GB cards have a 1.20v default VGPU. Could someone please upload that BIOS if they have it?
Thank you very much...
Hicksetto
Anyone with crossfire running Vista?
I just found out that 3DMark06 and the Steam hardware survey only see one card and no crossfire. Damn Vista... M$ should be sued for making us use this BS OS just to be able to have DX10...
I couldn't make heads or tails out of Rivatuner.
I also had to trick the ATI Tool into going over 733 and 833 or something like that. I was setting it up to a certain point and it wanted to go back to a lower ceiling it had when I hit apply clocks or something.
I'll try again.
It was odd, when I was running 833/891 it was 3dmark06 stable but when I tried to play UT3 it crashed and I don't know if it was just a random lockup or had something to do with those speeds.
Are those speeds too high?
Also, what are some stability tests for the GPU other than 3dmark06? Just want to know so I can set the clock then test stability so I'm not in the middle of a game and go POOF.
I also put the fan to 100% and good god that thing is loud.
Don't use ATI Tool to OC; hell, you don't really want to use it for anything at this point unless you need to up the voltage, as it's the only software option for doing so now. And a buggy one at that.
With RivaTuner, hit that little triangle on the right side of where it lists "512-bit R600 (320sp)". Otherwise use ATI GPU Tool. ATI Tool does have all the features/tweaks you need available, but the latest beta is just not great with our cards.
And those speeds are great for default voltage, tho not super high for what this card can really do under optimal conditions.
And don't put it at 100% - it's loud for a reason. It's not set to do that in the BIOS till it's at 105C, which is HOT.
Oh yeah, saw that post a while ago. And? At any rate, still no confirmation of any 1GB BIOS that gives 1.20 VGPU.
Lightman, did you come across something comparing those BIOSes?
ATITool 27b2 is too buggy for me to use atm; I get glitches, hangups and crashes. I'm now using Riva 2.05, but no voltage increases there (default 1.15). 2.06 will come out soon, I've read.
I am now at 860 core. 891 crashed my PC.
Hi everyone,
Here is my BIOS with 1.20v GPU. I've saved it with ATIWinflash. Good luck!
http://rapidshare.com/files/62576432/r600.rom.html
I think as usual I have got the worst Pro there is - I can't even get to 790MHz stable (there are artifacts in Whiteout).
:mad: