Quote Originally Posted by SKYMTL:
FurMark isn't a preferred benchmarking tool since it pushes GPUs above and beyond what they are specified to run at. You can't say that just because FurMark produces abnormally high power consumption, the GPUs are running out of spec. It is the PROGRAM that causes the GPUs to run above and beyond, NOT the GPUs themselves.

I've seen the way ATI tests for power consumption: they run every game you can think of at multiple resolutions to determine their max board power consumption, and I am sure Nvidia does the same thing. However, I would go so far as to say that FurMark is buggy in that it loads the GPU in ways it was not meant to be loaded.
So should we stop using Linpack and other tools that utilize CPUs to their full potential as well?

The GPUs were rated at a "working" power usage level, which is fine by me, but to say that we cannot measure their full peak power draw because they never attain it doesn't sit right with me. Who's to say that some GPGPU application might not use more of the GPU? What if a game comes out that uses the GPU more? Are they going to have to write "idle" routines into the next driver release so you don't go over the allotted power draw (i.e., the FurMark renaming situation)?
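To illustrate that point, here is a minimal sketch (hypothetical, not FurMark's or any vendor's actual code) of the kind of arithmetic-heavy GPGPU kernel that can keep a card's shaders saturated much like a synthetic stress test; the kernel name, grid size, and iteration counts are made up for illustration only.

[CODE]
// Hypothetical CUDA "power virus" sketch: a tight FMA loop that keeps
// every thread doing arithmetic, so ALU utilization (and power draw)
// sits near the hardware ceiling for as long as it runs.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void burn(float *out, int iters)
{
    float a = threadIdx.x * 0.001f + 1.0f;
    float b = blockIdx.x  * 0.002f + 1.0f;
    for (int i = 0; i < iters; ++i) {
        a = a * b + 0.5f;   // back-to-back fused multiply-adds
        b = b * a + 0.25f;
    }
    out[blockIdx.x * blockDim.x + threadIdx.x] = a + b;
}

int main()
{
    const int blocks = 1024, threads = 256;
    float *d_out = nullptr;
    cudaMalloc((void **)&d_out, blocks * threads * sizeof(float));

    // Launch repeatedly so the load is sustained long enough to read
    // power at the wall or from the card's own sensors.
    for (int pass = 0; pass < 100; ++pass)
        burn<<<blocks, threads>>>(d_out, 1 << 20);

    cudaDeviceSynchronize();
    cudaFree(d_out);
    printf("done\n");
    return 0;
}
[/CODE]

Nothing stops any GPGPU app from issuing a load like this, which is why "no real workload ever hits that" feels like a shaky basis for the power rating.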

I think two things should happen.

1: They should be rated at their peak.
2: More reviews like this should be done.