Quote Originally Posted by BeyondSciFi
The more or less obvious answer is that gaming performance is not solely based on raw GFLOPS output. So if GFLOPS is not the number we should be looking at, then what? [...] Thankfully, there is a way of finding the differing (if different) output between GPUs which is not arbitrary or somewhat subjective. The way to do this is by comparing another group of numbers, namely the texture and pixel fillrates.

[...]

So, it seems the GTX 380 may be faster than the Radeon 5870 (with mature drivers, if not earlier). The GTX 380 (using the rumored specs) has 22% more texture fillrate and 14% more pixel fillrate. Given that the difference in performance is not too extreme, I suspect the GTX 380 will beat the Radeon 5870 in a majority of games and benchmarks, but not necessarily all.
I'm not sure anybody with a good head on their shoulders would subscribe to your logic. Up until the point you mentioned texture and pixel fillrates, you were spot on, and then... I don't know what happened. All that talk about core configurations and average game performance not being indicative of the overall performance picture, and then you go and compare fillrates? Like this is still the DX7 and DX8 era?

If fillrates were such an important factor, the 6800 Ultra wouldn't have kept up as well as it did, the 7900GTX would have done at least as well as the X1900XTX (if not better), and the 8800GTX would never have hit the 60%+ lead over the 2900XT that it often did.

Forget the fillrates; go back to what you were saying about games, drivers, and different core configurations. There are too many factors at play to really even speculate about what the performance will be, but the best guesses are the ones that barely even touch fillrates.
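For anyone following along, the fillrate figures being argued over come straight off the spec sheet: texture fillrate is just core clock times TMU count, and pixel fillrate is core clock times ROP count. Here's a quick Python sketch using only the HD 5870's published specs (850 MHz, 80 TMUs, 32 ROPs); the GTX 380 side depends entirely on which rumored specs you choose to believe, so plug those in yourself.

```python
# Texture fillrate = core clock x TMUs, pixel fillrate = core clock x ROPs.
# Only the HD 5870 numbers below are published specs; substitute whichever
# rumored GTX 380 specs you trust to reproduce the quoted percentages.

def fillrates(core_clock_mhz, tmus, rops):
    """Return (texture fillrate in GTexel/s, pixel fillrate in GPixel/s)."""
    texture = core_clock_mhz * tmus / 1000.0
    pixel = core_clock_mhz * rops / 1000.0
    return texture, pixel

tex, pix = fillrates(850, 80, 32)  # Radeon HD 5870
print(f"HD 5870: {tex:.1f} GTexel/s, {pix:.1f} GPixel/s")  # 68.0 GTexel/s, 27.2 GPixel/s
```

Which is sort of the whole problem: two multiplications of clock speed by unit count can't say anything about shader throughput, drivers, or the rest of the core configuration.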