Quote Originally Posted by cegras View Post
Whoa, doesn't that mean HardOCP have been doing it right all along????
Like I said in the conclusion: there is no "right" way to go about it. Kyle's team has a unique perspective, and it sets them (and sometimes their conclusions) apart from the norm. They do use in-game sequences and seem to research things properly, which is a huge step in the right direction. However, I am not actually sure whether they include action sequences in their benchmarks or are just doing a run-through.

What the article is really meant to convey is that the vast majority of standard benchmarking methods (built-in benchmarks & stand-alone tools) are dead wrong. This is why readers should push for a transparent benchmarking process where there is disclosure of exactly which methods were used.

Quote Originally Posted by hurleybird View Post
The ironic thing here is that this article is perpetuating one of the most common benchmarking mistakes of today: providing minimum frame rates without qualifying them. A minimum FPS figure by itself is worthless, since for all you know it could be a single frame at the start of the level, or conversely the card might be sitting at that minimum frame rate most of the time. Another example: if one card hits a very low minimum frame rate once for a very short period, while another card hits a higher minimum frame rate but dips there more often, it's the first card with the lower minimum FPS that is providing the better gameplay experience. If you want to provide minimum frame rates, you MUST qualify them with a graph of FPS over time, or at the very least a description of the gameplay. Unfortunately this poor methodology is very widespread.
I can't speak for other sites, but the fact that we run every test three times and average the results eliminates any "zingers" when it comes to minimum framerates. The average FPS numbers are also there for a reason.
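To make the point about unqualified minimums concrete, here is a minimal sketch with made-up frame-time traces (the numbers are purely hypothetical, not from any actual review): one card dips to a very low FPS for a single frame, while another never dips as far but spends half its time near its minimum. Looking at the minimum FPS alone would rank them backwards.

```python
# Hypothetical frame-time traces in milliseconds per frame (illustrative only).
# Card A: smooth ~60 FPS with one 100 ms spike (a single 10 FPS frame).
# Card B: alternates between ~60 FPS and long 25 FPS stretches.
card_a = [16.7] * 95 + [100.0] + [16.7] * 4
card_b = [16.7] * 50 + [40.0] * 50

def fps_stats(frame_times_ms):
    """Return (minimum FPS, average FPS) for a list of frame times."""
    fps = [1000.0 / t for t in frame_times_ms]
    return min(fps), sum(fps) / len(fps)

min_a, avg_a = fps_stats(card_a)  # min 10 FPS, but smooth 99% of the run
min_b, avg_b = fps_stats(card_b)  # min 25 FPS, but stuttering half the run

# Card A's minimum (10 FPS) looks worse than Card B's (25 FPS) on a bar chart,
# yet Card A delivers the better experience -- which only a frame-time or
# FPS-over-time graph would reveal.
```

Averaging three runs, as described above, smooths out one-off spikes like Card A's, but it still cannot distinguish a single dip from a sustained one; that is why the FPS-over-time graph matters.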