Quote Originally Posted by STEvil
Is this in reference to using multiple benchmark runs to create an average result?
na. standard deviation expresses how spread out a sample is around its mean: counting a given number of standard deviations away from the mean covers a certain fraction of the data, and you change that fraction by changing how many standard deviations you count.

so if you were worried that the raw minimum framerate just captures some odd one-off hiccup, like some posters are, you could report the mean framerate minus three standard deviations instead. for a normal distribution roughly 99.9% of the samples sit above that cutoff, so it trims only the most extreme one-off dips while still reflecting nearly all of the data.
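here's a minimal sketch of that in python, assuming you already have a list of instantaneous fps samples (the frame_rates values and the factor of 3 are made-up placeholders, not anyone's actual benchmark data):

```python
import random
import statistics

def lower_bound_fps(frame_rates, k=3):
    """Mean frame rate minus k standard deviations.

    frame_rates: instantaneous fps samples, one per frame.
    k: how many standard deviations to subtract below the mean.
    """
    mean = statistics.fmean(frame_rates)
    sd = statistics.stdev(frame_rates)  # sample standard deviation
    return mean - k * sd

# made-up example: ~1000 frames hovering around 60 fps plus one nasty hitch
random.seed(0)
frame_rates = [random.gauss(60, 3) for _ in range(999)] + [12.0]

print(round(lower_bound_fps(frame_rates), 1))  # "effective minimum", barely moved by the hitch
print(min(frame_rates))                        # raw minimum, entirely set by the hitch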

you could even compute the deviation using only the values below the original mean (a downside or semi-deviation), which ignores all the high values you don't care about. keep in mind the tidy percentages attached to standard deviation assume a "normal distribution", and whatever shape a benchmark's framerates actually take may not give results we could understand or predict right now without looking at the real data
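a rough sketch of that one-sided version; the name downside_deviation is just mine, and definitions vary on whether you divide by the count of low samples or the total sample count (this one uses the low-side count):

```python
import random
import statistics

def downside_deviation(frame_rates):
    """Spread computed only from the samples below the overall mean
    (a semi-deviation); the high framerates are ignored entirely."""
    mean = statistics.fmean(frame_rates)
    below = [fps for fps in frame_rates if fps < mean]
    # average squared distance from the mean, low side only
    variance = sum((fps - mean) ** 2 for fps in below) / len(below)
    return variance ** 0.5

random.seed(0)
frame_rates = [random.gauss(60, 3) for _ in range(999)] + [12.0]

print(round(statistics.stdev(frame_rates), 2))    # two-sided spread
print(round(downside_deviation(frame_rates), 2))  # low-side-only spread
```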

just think about the requirement (expressing useful information to potential video card buyers) and pick a math tool from the massive library of math tools to shape your data set into something that helps. ideally start from instantaneous frame rates, like what you can get out of fraps per-frame logs, because working from pre-calculated fps averages is like cooking with flavored sardines
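for completeness, getting instantaneous frame rates out of per-frame times in milliseconds is just 1000 divided by each frame time. a sketch with made-up numbers (if your log records running timestamps rather than per-frame durations, diff consecutive entries first):

```python
def instantaneous_fps(frametimes_ms):
    """Convert per-frame render times in milliseconds to instantaneous fps."""
    return [1000.0 / ms for ms in frametimes_ms if ms > 0]

# made-up frame times in milliseconds: ~60 fps with one 83 ms hitch
frametimes_ms = [16.7, 16.9, 16.5, 83.0, 16.8]
print([round(fps, 1) for fps in instantaneous_fps(frametimes_ms)])
# -> [59.9, 59.2, 60.6, 12.0, 59.5]
```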