Sooo, is it POSSIBLE to get voltage control for X800s, or did ATI do something different with the X1800?
Sorry man, it's only for the X1800s.
Quote:
Originally Posted by DamienKC
that's not what I asked. I asked if it's POSSIBLE for this to be made ;]
Something must have been loosening the timings big time, because all my scores went up big time from using ATITool. My 05 score went from 7750 to 8092 with the exact same volts and clocks. That is a pretty big jump. 03 went from 14807 to 15024, and Aquamark went up some 300 points as well. I'm impressed.
Quote:
Originally Posted by deathman20
Ahman, I see you are scanning for artifacts using the new ATITool. Did you scan at the default setting, or did you choose the old (more compatible) method in the options before scanning? I ask because you don't have the yellow clusters of artifacts surrounding the 3D image like I get if I don't choose the old, more compatible method.
I answered your question, though. It's only for the X1800s, and maybe the X1300s and X1600s, but I haven't heard a thing on them yet.
Quote:
Originally Posted by DamienKC
The voltage is controlled internally, I guess by the GPU itself or another onboard IC, and it probably uses special voltage regulators that can also be adjusted via software.
I'm sure if you really wanted to, you could make a PCI or PCIe card that plugs into the mobo and wires up to the video card just like a vmod, then program it to adjust the voltages from software inside Windows.
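Just to make the idea concrete, here's a rough sketch of what the "adjust via software" part could look like if the card's regulator is I2C-programmable, like some mobo VRMs are. Everything below is hypothetical: the bus path, slave address, register and VID encoding are made up for illustration, not the X1800's real regulator map.
Code:
/*
 * Hypothetical sketch: poking a VID (voltage ID) value into an
 * I2C-programmable regulator from Linux userspace. The address 0x2A,
 * register 0x01 and the VID-to-volts mapping are all invented.
 */
#include <fcntl.h>
#include <linux/i2c-dev.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <unistd.h>

#define VRM_ADDR    0x2A  /* hypothetical I2C address of the regulator */
#define VRM_VID_REG 0x01  /* hypothetical VID register */

static int set_vgpu(const char *bus, unsigned char vid)
{
    int fd = open(bus, O_RDWR);
    if (fd < 0) { perror("open"); return -1; }

    /* Point the bus at the regulator's slave address. */
    if (ioctl(fd, I2C_SLAVE, VRM_ADDR) < 0) {
        perror("ioctl");
        close(fd);
        return -1;
    }

    /* Write [register, value]; one VID step = one voltage step. */
    unsigned char buf[2] = { VRM_VID_REG, vid };
    int ok = (write(fd, buf, 2) == 2) ? 0 : -1;
    close(fd);
    return ok;
}

int main(void)
{
    /* e.g. VID 0x14 -> 1.30 V in this made-up encoding */
    return set_vgpu("/dev/i2c-1", 0x14);
}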
Just out of interest, when you run 03 or 05, does it show the clocks correctly, and which does it agree with, ATITool or the OC Tool?
Regards
Andy
Name is Ahmad, thanks :)
Quote:
Originally Posted by cantankerous
No, I did not use the artifact scanning feature; I just did "Show 3D View". Artifact scanning does not work at all. I get clusters of deltas (yellow dots).
I think it's probably because the X1800 renders the image differently from what it's being compared against, hence the differences.
:mad: When I run 3DMark05 and use ATITool, I always get a black screen and the comp locks up, even at stock frequencies. :mad:
You, my friend, are a genius.
Quote:
Originally Posted by deathman20
http://www.xtremesystems.org/forums/...id=40209&stc=1
That's with 655/639MHz!! I did 922x with 655/783MHz using the OC Tool!
I guess, like it was suggested, the only explanation is the timings. But damn. A 144MHz difference AND 150 marks higher? This is unbelievable. Looks like W1zzard can write better code than ATI's engineers ;)
Actually mine ended up higher, at 9118 in 05 :) I was messing around and playing games before I tried the tool, which probably gave it slightly lower results, but 4 separate tests yielded my 9118 almost dead on each time.
Guys, something else I realized through testing this afternoon. It isn't just the OC Tool; CCC also changes the mem timings (if that is really what is happening). I didn't have the OC Tool loaded at all and decided to clock up to the CCC max of 575/550, and lo and behold, the scores were the same, minus margin of error, as with the OC Tool, meaning ATI's drivers do the same thing the OC Tool does. Running ATITool on its own, with CCC disabled and no OC Tool loaded, gets much better scores. Let's see if some more tweaking can gain us even more in the end.
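If anyone wants to pin down the "same minus margin of error" part more rigorously, something like this quick averaging of a few runs per config does the trick. The scores in it are placeholders, not my numbers.
Code:
/* Compare two configs by mean and spread over a few benchmark runs.
 * If the means differ by less than a stddev or two, call them equal. */
#include <math.h>
#include <stdio.h>

static void stats(const char *label, const double *s, int n)
{
    double mean = 0.0, var = 0.0;
    for (int i = 0; i < n; i++) mean += s[i];
    mean /= n;
    for (int i = 0; i < n; i++) var += (s[i] - mean) * (s[i] - mean);
    var /= (n - 1);                     /* sample variance */
    printf("%s: mean %.0f, stddev %.0f\n", label, mean, sqrt(var));
}

int main(void)
{
    double ccc[] = { 8090, 8102, 8085, 8095 };  /* placeholder runs */
    double oc[]  = { 8098, 8088, 8101, 8092 };  /* placeholder runs */
    stats("CCC max clocks", ccc, 4);
    stats("OC Tool       ", oc, 4);
    return 0;
}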
Interesting, I was going to try a similar thing to see what was going on, mainly downclocking my memory until it hit similar scores to my old test. But now, well, I'm too lazy to take a step back; I want to go forward :)
Quote:
Originally Posted by cantankerous
Impressive .. 650/1300 gets 10300 in the multitexturing test in 3DMark05 .. with memory at 1600 with the ATI Overclocker it was 9600.
What the :banana: :banana: :banana: :banana:
I was trying this OpenGL benchmark called GL Excess a few days ago (http://www.glexcess.com). I have run it at stock and at 650/783 with Overclocker, and now I ran it with the ATITool OC. Guess what?
http://www.xtremesystems.org/forums/...id=40211&stc=1
:stick:
:stick:
:stick:
:stick:
:stick:
EDIT: Looks like it will be ATITool for DirectX and Overclocker for OpenGL :banana:
Hmmmm, all lower? I found that 01 scores much lower for me with ATITool at the same clocks/volts; however, Aquamark, 03 and 05 are all much higher. No clue what is going on. Perhaps it still isn't perfected and there are some issues to look into further.
The only thing I can think of is your mem speed of 639 versus 783. If there is a timing issue, perhaps the higher 783 still outperforms a tighter-timed 639, even if only by a bit. Kinda like low-latency system DIMMs, where 2-2-2-5 at 250FSB would be faster than 2.5-3-3-3 at 275. The only other possibility is that the timings/clocks make a bigger difference in D3D, which is what most benches/tests use, compared to OpenGL, which is what that specific test above uses. Only speculation, but it has some basis (quick numbers below).
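To put quick numbers on that DIMM analogy (these are the analogy's numbers, nothing measured on the card): converting CAS cycles at a given clock into absolute nanoseconds shows why tighter timings can beat a higher clock on latency, even while giving up a little bandwidth.
Code:
/* latency_ns = cycles / MHz * 1000 */
#include <stdio.h>

int main(void)
{
    double a = 2.0 / 250.0 * 1000.0;   /* 2-2-2-5   @ 250 MHz ->  8.0 ns */
    double b = 2.5 / 275.0 * 1000.0;   /* 2.5-3-3-3 @ 275 MHz -> ~9.1 ns */
    printf("CAS 2   @ 250 MHz: %.1f ns\n", a);
    printf("CAS 2.5 @ 275 MHz: %.1f ns\n", b);
    /* The lower-clocked setup still has lower absolute access latency,
     * even though its raw bandwidth is about 10%% lower. */
    return 0;
}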
But that really wouldn't make much sense...
Quote:
Originally Posted by cantankerous
Unless... Hold on a sec. I'll be right back.
Well, I'll test 03 when I get home and see what it yields. But I mean, a 400-point increase for me in 05, multiple times with multiple reboots, seems like a pretty good test bed to me at least.
Ok, now I am confused :confused:
In ATITool I turned my clocks down to 500 for the memory and kept the core at 655. I scored 20k in GL Excess... that's lower than my stock score!!!
Then I rebooted (reset settings and everything), opened up Overclocker, clocked the core up to 655 and left the mem at default. I scored 23k. That's 3k higher than ATITool at identical settings.
But I think I have an idea. It seems that maybe the two tools clock different parts of the card, and somehow they both read back the same thing... W1zzard, where the heck are you?
Does anyone know if these cards throttle back if they get too hot? There might just be a trade-off between running higher speeds at higher temps and running at lower clock speeds. I was just curious and wasn't sure if I ever saw it mentioned anywhere. I could have sworn that when I was playing around after first getting my card, a successful higher-clocked run of 3DMark05 sometimes showed equal or barely better results than a previous run at a lower speed. Recently I began keeping track of my clocks, drivers used, CPU speed, GPU/mem speeds and 3DMark05 scores, but I haven't taken much data yet.
I do think this card has throttling. As a stability test I left 03 looping for 9 hours straight in a hot room with the door closed. In the morning everything was still running, but the playback was very jerky and the fps counter kept bouncing up and down. Scores were way off as well. I let the card cool down a bit and it did a run like normal, with no more drastic fps jumping or jerkiness visible on screen.
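Pure speculation on the mechanism, but if the BIOS or driver does throttle, the logic would presumably be something like the loop below. The thresholds, step size and the sensor/clock helpers are all hypothetical stubs, not anything ATI has documented.
Code:
/* Speculative throttle loop: poll the core temp, shed clock above a
 * threshold, restore it once the card cools (with hysteresis). */
#include <stdio.h>
#include <unistd.h>

#define T_THROTTLE 74   /* deg C: roughly where jerkiness is reported */
#define T_RESUME   68   /* lower resume point so it doesn't oscillate */
#define CLK_STEP   15   /* MHz shed per throttle step */

/* Stubs standing in for whatever the BIOS/driver really reads/writes. */
static int  read_temp(void)          { return 75; /* pretend it's hot */ }
static void set_core_clock(int mhz)  { printf("core -> %d MHz\n", mhz); }

int main(void)
{
    int target = 710, mhz = target;
    for (int i = 0; i < 10; i++) {     /* demo loop; a driver polls forever */
        int t = read_temp();
        if (t >= T_THROTTLE && mhz > CLK_STEP)
            mhz -= CLK_STEP;           /* too hot: back the core off */
        else if (t <= T_RESUME && mhz < target)
            mhz += CLK_STEP;           /* cooled off: restore the clock */
        set_core_clock(mhz);
        sleep(1);
    }
    return 0;
}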
Certainly something is happening when the GPU is hot (74C at 710MHz GPU) .. it gets jerky in Need for Speed, for example, and decreasing the clock to 680MHz fixes the jerkiness.
Yep, same thing I noticed. It is hard to get accurate temp readings because the minute you open a third-party application, the temps go all screwy in CCC and get stuck at 20C. ATITool now has monitoring, but the values are much lower than those in CCC. Not sure which is right.
cantankerous: well, I'm sure that 74C is right for 710MHz and 1.35V with the Zalman, it can't be higher! :D I hope.
I have had the same results with beta 9. With the Overclock Tool I can get 675/810 at 1.35/2.2 and hit 9160 in 3DMark05. With ATITool B9, so far I have only been able to hit 625/675 at 1.3/2 for 9150 in 3DMark05. With 625/750 on the Overclock Tool I was at 8625, a big difference from 9150 at lower-clocked memory. Kind of makes you wonder which results are valid. I get memory corruption with the memory set anywhere above 700. Strange; I would definitely like to understand this better. I will do some more extensive testing in the next couple of days.