Quote Originally Posted by boxleitnerb View Post
First, you absolutely cannot compare gameplay with just displayed sequences. 48 fps is way too low for many games; think input lag. In some game engines, 100+ fps is required for truly direct control. Secondly, it is unimportant what the "majority" feels as long as there are still people with higher standards.
So did you actually experience SLI/CF first hand or did you not? Theory has its place, but real life can be something different altogether. This smells like sugarcoating the problems that CF has. I'm always highly suspicious when people, especially reviewers who should be objective and very careful with such claims, tell me what I should or shouldn't be able to feel.
What? Some engines require 100+ fps because the framerate is tied to the controls and other systems, so more fps means the game runs differently. This has nothing to do with visuals; for instance, in many Quake games the framerate has a great effect on how the player moves around the map.

48 fps, by the way, is not slow by any means. The problem is that frames do not arrive as an even stream: the more fps you have, the closer together the frames are and, thus, the smoother the experience. For instance, the Crysis titles are far more playable at 40-50 fps than most games out there, precisely because their engine paces frames differently and doesn't spike the way others do.
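To put some numbers on that, here's a toy Python sketch (the frame times are invented for illustration, not measurements): two captures that both average about 48 fps, but one delivers frames evenly while the other spikes.

```python
# Invented frame-time data in milliseconds (not real captures).
even_ms  = [20.8] * 48                 # steady pacing: ~48 fps
spiky_ms = [12.0] * 40 + [65.0] * 8    # same frame count, uneven pacing

def avg_fps(frame_times_ms):
    """Average fps over the capture: frame count divided by total seconds."""
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

print(round(avg_fps(even_ms)))   # 48
print(round(avg_fps(spiky_ms)))  # 48 as well, same average
print(max(even_ms))              # 20.8 ms worst frame: feels smooth
print(max(spiky_ms))             # 65.0 ms worst frame: a visible hitch
```

Both captures report the same average fps, yet the spiky one contains 65 ms frames that you'd feel as stutter. That's why an engine with even pacing can feel fine at 40-50 fps while a spiky one doesn't.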

Also... SKYMTL is part of hardwarecanucks (not sure if he's the owner, main editor, or just another crew member). So it's safe to assume he has tinkered with more SLI and CF setups than most of us.

Quote Originally Posted by cx-ray View Post
This depends greatly on the viewing source. Sit in front of a 50-60Hz CRT and you'll notice flicker immediately. If not, then for sure in direct comparison to an LCD or when you increase the refresh rate of the CRT to 75Hz for instance.
This has nothing to do with the issue. A CRT is physically unable to hold a static image: the phosphor fades between scans, hence the flicker. Flicker is simply part of how a CRT works; a higher refresh rate only mitigates it, but it's still there.


A refresh rate higher than 60 Hz isn't essential for minimizing ghosting on LCD screens. I use EIZO Colorgraphic monitors for video work, etc., and the ghosting on them is basically non-existent; no consumer-grade TN 120 Hz or 144 Hz panel comes even close in that respect. However, due to the sample-and-hold nature of LCD screens, there's a significant difference in motion perception. This is directly related to the human visual system; the display itself plays a comparatively minor role in the perceived motion blur.
The new 3D Vision 2 monitors have close to no crosstalk, which means their ghosting is effectively zero.


Quote Originally Posted by bhavv View Post
@box

100 FPS isn't going to do anything over 60 FPS on a normal 60 Hz screen. If FPS stays above 50 on a 60 Hz screen, I don't believe you or anyone else is going to notice any stutter.

The frame spiking with Crossfire in these tests, however, is a separate issue from FPS. You may very well notice stutter due to that happening, and I'm not sure how frame limiting works or whether it can be forced to 60 without dipping straight down to 30 like normal vsync does.

I've usually found that people complaining about SLI/Xfire and microstutter haven't even used the setup they complain about themselves, and when people do notice lag/stutter, in most cases it is caused by FPS spikes rapidly dropping down to a low number. This is easily measurable in benchmarks that record the minimum FPS. The most likely case is that if you are seeing sudden lag/stutter on any GPU setup, the FPS is dropping far below 30 at that moment.

You can't judge an SLI/Xfire setup either way until you've tried it. I've been through SLI GeForce 6800s, GTX 460s, and 560 Tis, and Xfire 3850s, 4850s, 4870s, and 5770s, and none of them were worse to me than a single-card setup. I noticed the exact same lag and smoothness at the same FPS points, and I have perfect vision.
Oh, it does. You're treating the framerate as a steady stream of frames, when it isn't. At 100 fps you have a far higher chance of never missing a refresh than at 60 fps on a 60 Hz screen, so the bigger the number, the better. And stuttering is not only related to multi-GPU setups; it exists on single-GPU systems too.
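A toy simulation makes the point. The model below is an assumption of mine, not real measurement: frames take a mean render time plus or minus uniform jitter, and we count how many 60 Hz refreshes go by without a fresh frame being ready. At ~100 fps even the worst frame gap fits inside one refresh interval; at an average of ~60 fps with jitter, gaps regularly exceed it.

```python
import random

REFRESH_MS = 1000.0 / 60.0  # one 60 Hz refresh interval, ~16.67 ms

def missed_refreshes(mean_frame_ms, jitter_ms, duration_ms=10000.0, seed=1):
    """Toy model: frame render time = mean +/- uniform jitter.
    Count 60 Hz refreshes for which no new frame finished in time."""
    rng = random.Random(seed)
    done, times = 0.0, []
    while done < duration_ms:
        done += mean_frame_ms + rng.uniform(-jitter_ms, jitter_ms)
        times.append(done)            # frame completion timestamps
    missed, j = 0, 0
    for i in range(1, int(duration_ms / REFRESH_MS) + 1):
        had_new_frame = False
        while j < len(times) and times[j] <= i * REFRESH_MS:
            had_new_frame = True      # a frame finished in this interval
            j += 1
        if not had_new_frame:
            missed += 1               # this refresh repeats the old frame
    return missed

# ~100 fps: worst gap is 15 ms, always inside one 16.67 ms refresh.
print(missed_refreshes(10.0, 5.0))          # 0
# ~60 fps average with jitter: gaps often exceed one refresh interval.
print(missed_refreshes(1000.0 / 60.0, 8.0)) # > 0: repeated frames = stutter
```

Every repeated frame is a visible hiccup, which is why averaging 60 fps on a 60 Hz screen is not the same as never missing a refresh, and why headroom above 60 fps still helps.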


Quote Originally Posted by vario View Post
Could you explain how this would work? I can't see any reason why a higher number of frames generated by the GPU would lower the input lag of a display when the synchronization remains at 60 Hz. It's interesting to me because I'm looking at monitors to buy at the moment, but I'm not seeing any high-resolution/high-quality displays with low input lag.
My understanding is that input lag is introduced when the display analyses and processes the frames it gets from the graphics card before displaying them. To be honest, I don't even see why input lag would get lower on a 120 Hz display at 120 fps. It's all about the monitor's electronics/display technology.
Only if you have vsync on. Tearing happens precisely because parts of different frames are drawn at once. It's a pain, but at the same time it means your latest inputs get drawn to the screen sooner. Food for thought...

About input lag: the higher the refresh rate, the lower the input lag (provided the panels are equally fast). Why? Whenever you input something into your computer, it takes time to show up on screen. Because a 120 Hz screen updates twice as fast, the time between input and picture will also be lower. Think about it this way: you are expecting a package, and you can choose between a courier that makes 1 delivery per day and one that makes 5 deliveries per day. Which one will get your package to you faster, given that you don't know when the package will reach the office? It's pretty simple, actually...
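The delivery analogy translates directly into numbers. Assuming the input arrives at a uniformly random moment within a refresh period (and everything else in the chain is identical), the average wait for the next refresh to start is half the period; this sketch covers only that one component of input lag.

```python
def avg_scanout_wait_ms(refresh_hz):
    """Average wait from a random input instant until the next refresh
    begins: half the refresh period, assuming uniform input timing."""
    return (1000.0 / refresh_hz) / 2.0

print(round(avg_scanout_wait_ms(60), 2))   # 8.33 ms at 60 Hz
print(round(avg_scanout_wait_ms(120), 2))  # 4.17 ms at 120 Hz
```

Doubling the refresh rate halves this component, exactly like going from one delivery per day to two: the processing time inside the monitor's electronics is a separate, additional term.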

Quote Originally Posted by gx-x View Post
You are all forgetting about the capabilities of the panel itself. Sure, you may have 6 ms on IPS or 2 ms on TN, but that is grey-to-grey; blue-to-blue is more like 30 ms, so BtB transitions can't even keep up with 50 fps, and at 120 Hz (120 fps) BtB will be lagging behind GtG anyway. My point is: LCD panels are not capable of true 120 Hz/fps anyway. Don't get too excited.

Edit: example: 3D crosstalk. Putting a 500 Hz refresh rate on a panel that is capable of displaying (all colour transitions, with all pixels switching fully on/off) some 80-90 fps will not yield 500 fps capability, nor will it give a better experience than the same panel used in a 120 Hz monitor/TV. But hey, it would sell, and placebo would rule the forums.
600 Hz plasma, anyone?
Oh, you are wrong. The new 3D panels have close to no crosstalk, meaning no ghosting whatsoever, and crosstalk is by far the hardest test of them all: the panel draws two completely different frames in alternation, so anything from one frame bleeding into the other would be immediately visible. By eliminating crosstalk, they eliminated the blur altogether.