Quote Originally Posted by KTE
Hahaha
Re the sigma: I was using an experimental scientific tool written entirely in JS, coded by a friend. It was supposed to be stable and handle basic calcs fine. However, yesterday when running these values through it, it gave me the answers I had posted, repeatedly. Later on I had time to debug it and found it was a bug. I'm not sure how it happened, but the code for n-1 kept defaulting to n, so the variance was being divided by 6 values where it should divide by 5 before taking the square root. That led to the error in the end values; apologies for the extra hydra. The results you posted I hand-checked early this morning, and they are correct, including the new values.
Yep, you lose that degree of freedom ... I make that mistake myself. I have become very reliant on Excel and Statistica for doing routine statistics.
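
For reference, the n vs n-1 issue in the quote is Bessel's correction for the sample standard deviation. A minimal TypeScript/JS sketch of the two divisors (not KTE's actual tool, just an illustration with made-up run values):

    // Standard deviation with a selectable divisor: n (population) or n - 1 (sample).
    function stdDev(values: number[], sample = true): number {
      const n = values.length;
      const mean = values.reduce((sum, x) => sum + x, 0) / n;
      const sumSq = values.reduce((sum, x) => sum + (x - mean) ** 2, 0);
      const divisor = sample ? n - 1 : n; // the buggy tool effectively always divided by n
      return Math.sqrt(sumSq / divisor);
    }

    const runs = [101, 99, 103, 98, 102, 100]; // six hypothetical benchmark values
    console.log(stdDev(runs, true));  // sample sigma: sum of squares divided by 5
    console.log(stdDev(runs, false)); // population sigma: divided by 6, understates sigma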

Previously, I have benched UT3/Prey/Crysis/FarCry/FEAR on a Q6600 G0 and a Phenom 9600BE using a 3870, but only at generally played resolutions, mid (1680x1050) to high (1920x1200). It was a mixed scenario since the GPU was the bottleneck much of the time, but the CPU was fed enough work to still scale with 100MHz speed changes. Overall, at those resolutions UT3/Prey were very Intel-favoring, Crysis was nearly the same for both (it wasn't a CPU-dependent game, nor a quad-core optimized one, as this thread shows), but FEAR favored the Phenom. FarCry I don't really remember the output of, and I don't have the data drives with me here, but from cached instinct I would wager it showed Core2/Penryn favoritism clock for clock.
I am seeing similar results, though I have not benched at Q6600 speeds; I need to go back to that, because FEAR is behaving differently for me. I have no doubt this is true at Q6600 speeds, as I ran at Q6700 speeds and made the same observation.

Crysis is turning out to be one heck of a trick. Not sure how much time I want to spend on fleshing it out; right now I have UT3 pretty well understood.... but Crysis is proving to be 'weird' to say the least. I am not doing an Intel/AMD comparison, rather just a core scaling comparison.

If I get enough data to put together a good presentation I will post it, but if I do I would ask that you, or someone with similar HW, repeat what I show.... it is just bizarre. On that topic, the trap I think reviewers are falling into is simply clicking the 'set all' drop-down option in the advanced tab. If you are going to use Crysis as a CPU bench, the physics and particle quality should be set to high in order to put as much stress on the CPU as possible, while keeping the graphics-intensive options low to take the stress off the GPU. Of course these are not 'played at' settings, but I am looking at the game as a CPU test for the most part.
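
For illustration only, a rough sketch of what such a CPU-biased config might look like, assuming Crysis's sys_spec_* cvars (roughly 1 = low through 4 = very high); the exact cvar names and value ranges should be checked against your game version before benching:

    sys_spec_Physics = 3
    sys_spec_Particles = 3
    sys_spec_ObjectDetail = 1
    sys_spec_Shading = 1
    sys_spec_Shadows = 1
    sys_spec_Texture = 1
    sys_spec_Water = 1
    sys_spec_VolumetricEffects = 1
    sys_spec_PostProcessing = 1

Same idea as above: pile the per-frame work onto the physics and particle systems while keeping the pixel-side load minimal, so clock and core scaling show up in the frame rate instead of the GPU.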

Jack