I think power consumption is probably one of the best candidates for judging how well a game or benchmark loads a card. If Unreal Engine pushes NVIDIA GPUs to 90% of their FurMark-level power/temps but AMD cards only to 60%, that would suggest the AMD cards are heavily under-utilized in it. Of course, if AMD cards sit around 60% in every game, that's just how the architecture behaves; but if another game pushes the same card to 85%, it would suggest that engine just isn't AMD-friendly. Then a quick comparison of temps relative to framerates would show whether the engine and architecture pairing is delivering the results you'd expect.
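To make the comparison concrete, here's a minimal sketch of the arithmetic I mean. All the numbers and names (the FurMark baseline wattages, the per-game readings) are made up for illustration only; the point is the relative-load and watts-per-frame calculation:

```python
# Hypothetical power readings (watts) and framerates -- illustrative numbers only.
furmark_watts = {"nvidia_card": 280.0, "amd_card": 250.0}  # assumed FurMark ceilings

games = {
    "unreal_engine_title": {"nvidia_card": (252.0, 95), "amd_card": (150.0, 70)},
    "other_title":         {"nvidia_card": (240.0, 88), "amd_card": (212.0, 85)},
}

for game, readings in games.items():
    print(game)
    for card, (watts, fps) in readings.items():
        load_pct = 100.0 * watts / furmark_watts[card]  # draw relative to FurMark ceiling
        watts_per_frame = watts / fps                   # rough power-per-frame comparison
        print(f"  {card}: {load_pct:.0f}% of FurMark draw, {watts_per_frame:.2f} W per FPS")
```

A card stuck well below its FurMark draw in one engine but near it in another would stand out immediately in a table like that.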
Considering that 3DMark Vantage is now commonly used as the main way to test load temps and power consumption, it's clear people like how well it stresses cards.