Quote Originally Posted by JumpingJack View Post
Depends on the game... Gen-1 games, yeah I agree... but current DX10 games... even at 1280x1024 with high-fidelity settings they are GPU limited.

WIC at 1280x800, high settings (default), QX9650 @ 3.2 GHz
Min = 32, Avg = 58, Max = 154

WIC at 1280x800, high settings (default), QX9650 @ 2.66 GHz
Min = 28, Avg = 59, Max = 141

This is on a 4870X2 ... Lost Planet is doing the same thing (everything high) ... so this review ran GPU limited but tries to draw conclusions (as do most others in this thread) about the CPU ... this is incorrect. EDIT: Note I ran XP DX9; DX10 will be even more GPU limited ...

Does it make a difference? Nope ... why? Because we like to play at those settings ... however, I personally prefer not to buy a whole new system ... the CPU is the lowest common denominator -- it revs every 1-2 years, while GPUs rev every 6-9 months, so if I want to future-proof I prefer the fastest CPU and then incrementally upgrade the GPU as needed ... that's me ... which is why I want to see both the high-quality, high-res results and the low-res, lower-quality results to ascertain the viability of the CPU ...

This review did not do that ... whether Nehalem actually improves gaming is still an open question ... I do not expect a huge leap, and I suspect some games will actually underperform ... but the oddity of this data set is that all the CPUs compared bunched up to be roughly the same ... that is GPU limited.
I see your point, but who is buying a Nehalem system in order to play CPU-limited DX7 and DX8 games that already run in the hundreds of FPS?

The thing you *need* more performance for is current-day games, and if they're so GPU bound at 12x10 on average, I'd say it's probably not worth the price to someone who wants gaming performance. Just my opinion though.

Do you mean to imply that we don't know the CPU's gaming potential purely because they tested 12x10 with GPU-limited settings? If so, you might have problems in the future, because that trend is only going to continue, and the average resolution is going up, not down.

Should they have bumped the resolution down to settings that no one plays at to gauge the CPU's "gaming performance"? Sure, if you're benchmarking by running a game at 800x600 you can call it "gaming performance", but it's not real-world gaming performance.

I guess the fundamental question is... what is "gaming performance"?
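
For what it's worth, here's a rough way to put a number on the kind of scaling JumpingJack's figures show (a minimal sketch in Python; the FPS and clock values are the ones quoted above, and the "efficiency" interpretation is just my own rule of thumb, not anything from the review):

[CODE]
# Rough CPU-scaling check: if a clock bump barely moves the average FPS,
# the run is GPU-limited at those settings.

def cpu_scaling_efficiency(fps_low, fps_high, clock_low_ghz, clock_high_ghz):
    """Fraction of the clock-speed increase that shows up as an FPS increase.
    Close to 1.0 = CPU-limited, close to 0.0 (or negative noise) = GPU-limited."""
    fps_gain = (fps_high - fps_low) / fps_low
    clock_gain = (clock_high_ghz - clock_low_ghz) / clock_low_ghz
    return fps_gain / clock_gain

# Average FPS from the World in Conflict runs quoted above (QX9650 on a 4870X2)
eff = cpu_scaling_efficiency(fps_low=59, fps_high=58,
                             clock_low_ghz=2.66, clock_high_ghz=3.2)
print(f"CPU scaling efficiency: {eff:.2f}")  # ~ -0.08, i.e. within noise: GPU-limited
[/CODE]

A ~20% clock increase producing essentially zero change in average FPS is exactly what a GPU-limited run looks like, which is the point the quoted post is making.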