Microstutter actually DOES occur on all GPUs, single card or not. The term comes straight from how rendering works: it means uneven frame times within a second, i.e. the frames in "frames per second" are not spaced evenly. A single GPU does not render every frame of a given second at exactly the same rate, since some frames are more demanding than others. It's typically described as more visible with 2+ GPUs because mixing together two sets of uneven frame times exacerbates the effect. The reality is that even with dual-card setups it's basically unnoticeable to most people: a handful notice it if they go looking for it because they've read about it, and only a tiny number genuinely notice it (non-placebo) while actually playing. Most people confuse things like load hitching for microstutter.
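To make "uneven frame times" concrete, here's a quick toy sketch in Python (made-up numbers, not measurements from any real setup): both timelines below average out to roughly 60 FPS over a second, but one paces its frames evenly while the other alternates short and long gaps, and that frame-time spread is what people mean by microstutter.

even_frames   = [1000.0 / 60.0] * 60   # every frame ~16.7 ms apart
uneven_frames = [10.0, 23.3] * 30      # alternating short/long gaps, same total time

for name, frame_times in [("even", even_frames), ("uneven", uneven_frames)]:
    mean = sum(frame_times) / len(frame_times)
    fps = 1000.0 / mean
    # standard deviation of the frame times: a crude "unevenness" indicator
    variance = sum((t - mean) ** 2 for t in frame_times) / len(frame_times)
    print(f"{name}: ~{fps:.0f} FPS average, frame-time std dev {variance ** 0.5:.1f} ms")

Both runs report ~60 FPS, but the "uneven" one shows a ~6.7 ms frame-time spread while the "even" one shows none, which is why an FPS counter alone won't tell you whether pacing is smooth.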
I'll stick with my S-IPS 2560x1600 30" LCD myself... call me when consoles are doing that, UrbanSmooth!
In all seriousness, even 1080p is higher resolution than consoles render virtually any title at; only a tiny handful of titles even render natively at 720p, and most of the good-looking ones are sub-720p and upscaled. 2560x1600 is just about double the pixels of 1080p, so... yeah. Also, a bank of cruddy TN displays is not what most people with the cash for this hardware go for.
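For what it's worth, the pixel math behind "just about double" (a rough check, nothing more):

wqxga_pixels = 2560 * 1600   # 4,096,000 pixels
fhd_pixels   = 1920 * 1080   # 2,073,600 pixels
print(wqxga_pixels / fhd_pixels)   # ~1.98, i.e. just shy of double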