
Originally Posted by zerazax
The question isn't how many games... it's what games.
With a relatively small sample size, if you let too many outliers in, you're going to skew the data.
IIRC, back when G80/R600 were the cards on the market, people would bench Call of Juarez and say "look, R600 isn't that slow!" etc.
Now imagine a review with 10 games, 3 of which were Call of Juarez-esque... instead of the realistic ~30% lead for G80 over R600, the average might drop to 20%, etc.
If you're going to take an overall comparison, with just 10 games, you'd want all 10 to be as close to neutral as possible.
I mean, look at the original big chart... you've got Batman: AA and Far Cry 2, well known to favor Nvidia. Aside from the obvious question of how AA was enabled on ATI's cards (again, settings issues), you have one game accounting for a good chunk of the performance increase. And do you factor in a result like Crysis Warhead at 2560x1600 8xAA/16xAF, where 17.2 fps vs. 4.7 fps is going to read as "OMGWTF 300% faster!!!" etc.?
PR slides at their best
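To put a rough number on the averaging point above, here's a quick Python sketch with made-up per-game FPS figures (the only pair taken from the quote is the 17.2 vs. 4.7 Crysis Warhead result): nine games where one card leads by roughly 10%, plus that single lopsided title, comparing an arithmetic mean of per-game ratios against a geometric mean.

```python
# Hypothetical per-game FPS for two cards. All numbers are invented for
# illustration except the 17.2 vs 4.7 pair, which mirrors the quoted
# Crysis Warhead 2560x1600 8xAA/16xAF result.
card_a = [55, 60, 48, 72, 66, 80, 52, 44, 90, 17.2]
card_b = [50, 55, 44, 65, 60, 73, 47, 40, 82, 4.7]

ratios = [a / b for a, b in zip(card_a, card_b)]

# Arithmetic mean of per-game ratios: the single ~3.7x outlier dominates.
arith = sum(ratios) / len(ratios)

# Same mean with the outlier dropped: the nine "neutral" games agree on ~10%.
neutral = sum(ratios[:-1]) / len(ratios[:-1])

# Geometric mean: each game contributes multiplicatively, so one extreme
# result can't swamp the other nine to the same degree.
geo = 1.0
for r in ratios:
    geo *= r
geo **= 1.0 / len(ratios)

print(f"arithmetic mean, all 10 games: {arith:.2f}x")    # ~1.35x -> "35% faster"
print(f"arithmetic mean, outlier out:  {neutral:.2f}x")  # ~1.10x -> "10% faster"
print(f"geometric mean, all 10 games:  {geo:.2f}x")      # ~1.24x
```

With these assumed numbers, one lopsided title moves the headline figure from about 10% to about 35%; a geometric mean only partially contains it. That's the sense in which game selection matters far more than game count.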