Originally Posted by J-Mag
That's exactly what I wanted to see. However, something concerns me.
We all know that benching at lower resolutions shows off CPU performance better, right?
Here are the results copied directly from that thread for the FEAR test:
1024x768, 4xAA 16xAF, all max:
Opteron 165 @ 3GHz: Min 60 | Avg 142 | Max 370
Conroe @ 3GHz: Min 78 | Avg 162 | Max 485
1600x1200, 4xAA 16xAF, all max:
Opteron 165 @ 3GHz: Min 25 | Avg 43 | Max 172
Conroe @ 3GHz: Min 53 | Avg 86 | Max 310
Can someone tell me why the 1024x768 results show an average difference of only about 12% while the 1600x1200 results show a difference of about 50%? That's a little counterintuitive, don't ya think?
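For what it's worth, the 12% and 50% figures do check out if you read them as the Opteron's shortfall relative to the Conroe's average FPS. A quick sketch (the `gap` helper is my own, not from the thread):

```python
# Verify the percentage gaps quoted above, computed as the
# Opteron's deficit as a fraction of the Conroe's average FPS.
def gap(opteron_avg, conroe_avg):
    """Return how far the Opteron falls short of the Conroe, as a fraction."""
    return (conroe_avg - opteron_avg) / conroe_avg

low_res = gap(142, 162)   # 1024x768 averages
high_res = gap(43, 86)    # 1600x1200 averages

print(f"1024x768:  {low_res:.0%}")   # -> 12%
print(f"1600x1200: {high_res:.0%}")  # -> 50%
```

So the math in the post is consistent; the open question is why the gap widens at the higher, normally more GPU-bound resolution.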