Are you talking about memset?
All the same for both runs.
Massman, that's only one run; it can't prove or disprove anything, can it? Like I said: clean setup, get a base reference score at stock or whatever, then put a hairdryer instead of a fan on the CPU heatsink and see if you get a speed boost :confused:
Just quickly reading what has been said on this page, here are some possibilities (again recapping; credit to Massman and CCC for discovering):
- Could the ATI driver not switching to 3D clocks between test runs be causing the "bugged" runs? But sometimes it works, so you get a "real" clock. This theory becomes moot when you recall that the same thing happens on nVidia cards as well (unless it is the black-screen effect, which would make sense then).
- Could an unstable CPU cause 3DMark to error out, resulting in a bugged score (i.e. it may fail partway through a test, producing an unusually high score)? That would mean a stable setup would never experience such a bug, since 3DMark behaves as expected.
But that by itself doesn't make sense if you are actually seeing higher FPS; wouldn't that mean there is genuinely better performance?
Also, let's eliminate the black-screen issue. It is diluting the results because the black-screen issue is reproducible with anything. If we put that aside, we can hopefully focus just on "successful" 3DMark03 runs.
No idea what is causing that issue; MCHBAR is locked on this motherboard, so I can't even change a single setting in MemSet.
I just received this motherboard a few days ago for reviewing purposes, so I'll try to figure it out. It'll be solved with a BIOS update, I reckon.
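MCHBAR is the memory controller hub's MMIO window that tools like MemSet use to reach the timing registers, so if it is locked or disabled, nothing is writable. Purely as an illustration, here is a minimal Linux sketch (libpci) that dumps the host bridge's MCHBAR register; the 0x48 offset, the enable bit in bit 0, and the address mask are assumptions that fit some Intel desktop chipsets of that era (P35/X38/X48), not something confirmed for this particular board:

/* mchbar_dump.c - dump the MCHBAR register of the host bridge (00:00.0).
 * Assumptions (chipset-specific, check the datasheet): MCHBAR lives at
 * config offset 0x48, bit 0 is the enable bit, base is 16 KB aligned.
 * Build: gcc mchbar_dump.c -lpci -o mchbar_dump   (run as root) */
#include <stdio.h>
#include <pci/pci.h>

#define MCHBAR_OFFSET 0x48   /* assumed offset, varies by chipset */

int main(void)
{
    struct pci_access *pacc = pci_alloc();
    pci_init(pacc);

    /* Host bridge is normally domain 0, bus 0, device 0, function 0 */
    struct pci_dev *hb = pci_get_dev(pacc, 0, 0, 0, 0);

    /* Read the 64-bit register as two 32-bit halves */
    unsigned long long lo = pci_read_long(hb, MCHBAR_OFFSET);
    unsigned long long hi = pci_read_long(hb, MCHBAR_OFFSET + 4);
    unsigned long long mchbar = (hi << 32) | lo;

    printf("MCHBAR raw   : 0x%016llx\n", mchbar);
    printf("base address : 0x%016llx\n", mchbar & ~0x3fffULL);
    printf("enable bit   : %s\n", (mchbar & 1) ? "set" : "clear");

    pci_free_dev(hb);
    pci_cleanup(pacc);
    return 0;
}

If the enable bit reads as clear, or writes to the region silently bounce back, that would line up with the BIOS locking it down.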
The screenshot was to show that in my bugged runs, the 750/900 stock clocks are applied in each test. A non-bugged follow-up run and a temperature test will be done tonight.
Ran the follow-up: no noticeable downclock in RivaTuner. Is there a way to monitor the frequencies of both cores separately?
By the way, compare the GPU usage % in both runs: nearly identical, so the card gets loaded equally in the bugged and non-bugged runs.
http://i154.photobucket.com/albums/s...s/80902_wm.jpg
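If RivaTuner can't graph both cores at once, another option is to poll each core's engine clock in a loop during the run and flag anything that drops below the 3D clock. This is only a rough sketch; read_core_clock() is a hypothetical stand-in for whatever monitoring back-end is actually available (ADL, a RivaTuner plugin, ...), so the stub has to be replaced before it is useful:

/* clockwatch.c - poll both GPU cores once per second and flag any sample
 * that falls below the expected 750 MHz 3D clock. */
#include <stdio.h>
#include <unistd.h>

#define NUM_CORES        2
#define EXPECTED_3D_MHZ  750
#define SAMPLES          300   /* about 5 minutes at 1 Hz */

/* Hypothetical placeholder: should return the current engine clock of the
 * given core in MHz via the real monitoring API. */
static int read_core_clock(int core_index)
{
    (void)core_index;
    return EXPECTED_3D_MHZ;  /* dummy value so the sketch compiles */
}

int main(void)
{
    for (int s = 0; s < SAMPLES; s++) {
        for (int core = 0; core < NUM_CORES; core++) {
            int mhz = read_core_clock(core);
            printf("t=%3ds core%d: %4d MHz%s\n", s, core, mhz,
                   mhz < EXPECTED_3D_MHZ ? "  <-- below 3D clock!" : "");
        }
        sleep(1);
    }
    return 0;
}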
Just measure power consumption and reproduce a bugged and an unbugged run. That'll tell you quickly whether one of the GPUs is dropping out and rendering blank frames, artificially padding the GT2/GT3 scores.
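To make that comparison easy to read off, something like the following would do: it just averages two logs of wall-power readings (one watt value per line; the file names are only examples) taken during a bugged and a normal run:

/* power_compare.c - compare average wall power between two logged runs.
 * Assumes each file contains one watt reading per line.
 * Build: gcc power_compare.c -o power_compare
 * Usage: ./power_compare bugged.log normal.log */
#include <stdio.h>

static double average_watts(const char *path)
{
    FILE *f = fopen(path, "r");
    if (!f) { perror(path); return 0.0; }

    double watts, sum = 0.0;
    long n = 0;
    while (fscanf(f, "%lf", &watts) == 1) {
        sum += watts;
        n++;
    }
    fclose(f);
    return n ? sum / n : 0.0;
}

int main(int argc, char **argv)
{
    if (argc != 3) {
        fprintf(stderr, "usage: %s bugged.log normal.log\n", argv[0]);
        return 1;
    }
    double bugged = average_watts(argv[1]);
    double normal = average_watts(argv[2]);
    printf("bugged run : %.1f W average\n", bugged);
    printf("normal run : %.1f W average\n", normal);
    printf("difference : %.1f W\n", normal - bugged);
    /* A large gap would hint that one GPU sits idle (blank frames) during
     * the bugged run; near-identical draw argues against that theory. */
    return 0;
}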
Ran the tests to see whether CPU temperature makes a difference; the setup is shown below:
http://i154.photobucket.com/albums/s...mps/setup1.jpg
http://i154.photobucket.com/albums/s...mps/setup2.jpg
The first result is with a vcore of 1.104 V; the second is with 1.488 V running through the CPU and thus way hotter (idling in the BIOS at around 52°C).
http://i154.photobucket.com/albums/s...s/75525_wm.png
http://i154.photobucket.com/albums/s...s/75301_wm.png
Neither run is bugged.
Great idea :up:
Catalyst 8.9 beta is out http://www.xtremesystems.org/forums/...6&postcount=16
I'm going to sound like a jerk, sorry in advance, but if you test something, why not keep the screenshots universal? I'm missing RivaTuner in your last shots, Massman :up:
Universal as in? My only two screenshots that have RivaTuner in them were the ones to check whether the card downclocks at a certain point, so RivaTuner was absolutely necessary to prove my point. In all my other screenshots the actual frequencies the card is running at are of no relevance to the test. GPU-Z is included to show that I'm running 750/900.
And no, I don't consider you a jerk; it is in fact a good question ;).
Yeah, but it also proves that nothing changed between runs. Not that I don't believe you when you say so, but having all the info in one place, instead of having to think of something that isn't right there (but in the back of your head), sometimes makes spotting the troublemaker easier.
I'm not sure if RivaTuner can follow two GPU cores at once, but I think if you look at the VREG amp draw, it would show there too if a core dropped out?
What LOD are you guys running at? To compare scores directly, you need to know everything the guy did: RAM timings, PCI frequency, LOD, motherboard (some are waaay faster than others), you know. The worst choice of LOD and the worst board versus the best board with the best tweaks might make a 7k difference in some instances.
But I agree with everything that has been said here; something isn't right. The bugged run is so common it isn't a bugged run anymore, it's something else, but it is still an invalid score. Hell, a 4870X2 can't get near this score in '01 :P
Maybe nothing is bugged, lol. For some reason my scores are about 3000 points higher than DeDaL's and we're on basically the same board (I'm on a P5E64 WS Evolution, he's on a P5E3 Premium). The Rampage Extreme scores exactly the same for me too. LOD does nothing; I've tried up to 10 and it has no effect on the scores.
You're exactly right though, we need to know everything that everyone is doing to get a firm handle on it. I don't do anything at all besides set Catalyst AI to Standard (though I just found that Advanced scores the same), and I turn the mipmap detail to High Performance.
FWIW, I usually use Vista Home Premium or Home Basic instead of Ultimate... Ultimate seemed to score a little worse, but Kinc has been using it.
Service Pack 1 is huuuuugggge... a clean 3000 point boost! I feel like an idiot for not trying this earlier.
However, there is still a 4000-point gap between spl and me, and still about 1500 between myself and T_M...
http://www-personal.umich.edu/~gautamb/138.jpg
Link? I thought that was included in SP1??
Gautam, I use SP1 too, but I don't understand why my GT2/GT3 are slower than yours. :shrug:
But my Nature score is faster.
I had a strangely low GT2/GT3 once... when my fans were not at 100%. Is it possible that your cards are not fully stable and are throttling? PCI-E shouldn't make your score jump 2000 points, only ~500 at best, it seems to me.
My cards can do 860/1020.
I flashed the cards to 840/970 with 70% fan speed.
All of your system info is showing as N/A... do you have the Futuremark hotfix? I doubt it'd make a difference though.
I use the "-nosysteminfo" command, just for faster loading, but it doesn't change the result.
Oh, yeah. :shrug: The only driver setting I even change is mipmap detail in CCC. :S
Very strange indeed...
64 bit or 32 bit?? Which version of Vista??
I use Vista Ultimate 64-bit SP1, with CCC at defaults.