Quote Originally Posted by GoldenTiger View Post
I'm confused... the chart you link shows the GTX 680 as better perf/watt than the Radeon 7970.
I didn't mention the 7970 anywhere; I meant AMD cards in general.

Quote Originally Posted by GoldenTiger View Post
When a few frames amount to 15-20%, it shows the scalability will be better in SLI overall for playability. Hypothetical: BF3 at 24 fps on a Radeon 7970, 29 fps on a GTX 680. Add a second card and the numbers look more like 44 fps for 7970 CFX and 54 fps for GTX 680 SLI (assuming 85% SLI and CF scaling). The absolute gap widens even though the percentage stays the same. Otherwise you are correct... but in these edge cases those numbers are the difference between playable and not enjoyable. Hopefully this helps explain my thinking.
I understand your logic and agree to some extent. We can continue this dispute once we see SLI/CF numbers.
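For reference, here is a quick sketch of the scaling arithmetic from the quote above. It assumes the second card adds 85% of a single card's throughput, and the single-card FPS figures are GoldenTiger's hypothetical numbers, not measured benchmarks:

    # Hypothetical single-card BF3 FPS from the quote (not benchmarks).
    single_card_fps = {"Radeon 7970": 24, "GTX 680": 29}

    # Assumed multi-GPU scaling: second card adds 85% of one card's throughput.
    SCALING = 0.85

    for card, fps in single_card_fps.items():
        dual_fps = fps * (1 + SCALING)
        print(f"{card}: {fps} fps single -> {dual_fps:.0f} fps dual (CF/SLI)")

    # Radeon 7970: 24 fps single -> 44 fps dual (CF/SLI)
    # GTX 680: 29 fps single -> 54 fps dual (CF/SLI)

The relative gap stays about 21% in both cases, but the absolute gap grows from 5 fps to roughly 10 fps, which is the point about playability at the margin.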

Quote Originally Posted by Andrew LB View Post
...When you take into account ALL resolutions commonly used in games, yes... it is 16% better. I know you just want to cherry pick a few very high resolutions that 99.999% of gamers don't use because they don't run 3 monitors, but the numbers are quite clear.
There are two resolutions I commonly use: the native resolution of my 3x24" LCDs and the native resolution of my 30". I do not care about a million FPS at 1024x768. If you spend $500 on a GPU, you usually do not pair it with an $80 19" monitor. Anything under 19x12 (1920x1200) is not useful information for me or my customers.