Quote Originally Posted by GoldenTiger
I'll stick with my S-IPS 2560x1600 30" LCD myself... call me when consoles are doing that, UrbanSmooth!
I really want one of those Dell 30" full-resolution LED-backlit displays; if they weren't so goddamn expensive I'd have upgraded from my 24" LCD years ago.

Quote Originally Posted by BababooeyHTJ
Most people don't see the blatant stutter in Fallout 3, Oblivion, and New Vegas that appears on any 60 Hz display. I can see it clear as day and didn't need anyone to tell me about it.

You post on [H]; do you remember those HP LP2465 refurbs that were popping up in the hot deals section for a while? The input lag was horrific on that display, yet most people didn't notice it.

When I played Crysis on a GTX 280 and then moved to a 4870X2, and despite the higher actual framerate the game felt no smoother, that's not difficult to notice. I noticed crap like that in a few games and ended up selling the card in large part because of it. I didn't even know it was microstutter at the time.

I've seen microstutter with 6950s in a couple of games. It really stood out in Metro in a lot of spots, and I saw it in Stalker: Clear Sky, clear as day. The choppy feel despite a decent framerate is very annoying at times and can show up at 40 or so fps. I've never seen noticeably uneven frametimes result in choppy performance at any framerate I'd consider playable on a single GPU, and I've never seen any research showing it to be a problem in real-world usage with a single GPU. I'm sorry, but unlike with SLI and CrossFire, this isn't a well-documented problem on a single GPU.

I'm not saying that microstutter will stop me from using SLI or CrossFire in the future, since I've found both solutions to work really well most of the time, but I wouldn't buy two mid-range cards in an attempt to get the performance of one high-end card. Microstutter was a massive problem with 4870 CrossFire, but it does appear to be less of an issue these days. That's not to say it's non-existent.
I wonder if that is more due to poor game coding than anything else. Fallout 3 in particular seemed to have a lot of glitches when I was playing it, if for no other reason than that the game was massive and they probably just couldn't get around to patching everything in a reasonable amount of time. I suppose by that logic we could just say that better drivers would entirely eliminate microstuttering, but realistically, whenever we're dealing with devices in parallel it'll be near impossible to get them perfectly aligned on the microsecond scale, short of having optical connections to remove all latency.
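
For anyone who hasn't seen it firsthand, here's a rough back-of-the-envelope sketch (Python, with made-up frame times, not measurements from any of the cards mentioned above) of why an fps counter can read 40 while AFR microstutter makes the game feel like it's running in the low 20s:

Code:
# Illustrative only: hypothetical frame intervals from two cards in alternate
# frame rendering that aren't handing off evenly (short gap, then long gap).
frame_times_ms = [8, 42] * 30

avg_ms = sum(frame_times_ms) / len(frame_times_ms)
avg_fps = 1000 / avg_ms            # what a typical fps counter reports
worst_ms = max(frame_times_ms)
perceived_fps = 1000 / worst_ms    # smoothness tracks the longest gaps

print(f"average frame time: {avg_ms:.1f} ms -> counter shows ~{avg_fps:.0f} fps")
print(f"longest frame time: {worst_ms} ms -> motion looks more like ~{perceived_fps:.0f} fps")

Nothing fancy, but it ties back to the parallel-devices point: unless the two cards present their frames almost exactly half a frame apart, every single frame, the average fps number stops meaning much.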