Not true. With a single card you also have that variation, because each frame may have more or less work to do.
And there is frame-to-frame variation with a single card too, otherwise you wouldn't have min, avg and max FPS.
Doesn't look too bad...
There is a definition of what microstutter is. However, there is no clear threshold for how much frame-time variation it takes before you can call it microstutter for certain.
From Sampsa's graph, the 3870 X2 clearly suffers from microstutter. But does the 9800 GX2 suffer or not? Is the R700 X2 better than a normal CF setup of two 4870s?
It is not so clear in the end, especially with a single game and with one particular resolution / settings.
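For what it's worth, here is a rough sketch of one way to put a number on it, in Python, against a hypothetical per-frame render-time log (the kind of data a FRAPS frametime export gives you). The metric here, the average relative difference between consecutive frame times, is just one candidate definition, not an agreed-upon one:

[CODE]
# Rough sketch: score microstutter as the average relative difference
# between consecutive frame times. Input is a list of per-frame render
# times in milliseconds (hypothetical data, e.g. from a frametime log).

def microstutter_index(frame_times_ms):
    """0.0 = perfectly even pacing; higher = more frame-to-frame variation."""
    diffs = []
    for prev, cur in zip(frame_times_ms, frame_times_ms[1:]):
        pair_avg = (prev + cur) / 2.0
        if pair_avg > 0:
            diffs.append(abs(cur - prev) / pair_avg)
    return sum(diffs) / len(diffs) if diffs else 0.0

# A single card rendering evenly at ~25 ms vs. an AFR setup alternating
# 15 ms / 35 ms frames: same average FPS, very different pacing.
print(microstutter_index([25, 25, 25, 25]))      # 0.0
print(microstutter_index([15, 35, 15, 35, 15]))  # 0.8
[/CODE]

On a metric like this an even single card sits near 0, while a badly paced AFR setup scores high even though its average FPS is identical, which is exactly what the graphs are trying to show. Where to draw the line is still a judgement call.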
Single cards have variations, but not this much. When you turn around in a game it can drop from 100 fps to 30 fps, but this is the first 30 frames from the game start. Of course single cards will have variations, but if we test the same Grid start with a single card I am sure the gap will be 2 fps or less. So what I think is that ATI made it more acceptable, but not solved.
Can't wait to read the article <3 :D
Sampsa, ahh, now you're at your 1000th post... and it's a nice finishing touch to mark it by posting such useful info about microstuttering! :wiggle:
Is it allowed to post some of my own findings concerning microstutter in this game? It may be quite a long post with a lot of data :S
Looks better than the 9800 and the 3870. Once we see GTX comparisons we'll know for sure, but it looks good as it is.
Perkam
:clap: Good work Sampsa - appreciated
Hold on now, which version of Grid did you test? Grid v1.0 did not properly support multi-GPU (CF). V1.1 does support multi-GPU (fixed graphical problems, etc.), and v1.2 was just released late last week. The game IMHO still needs more work, and no one knows if another patch is forthcoming.
Microstuttering IS an issue and threads like these are welcome. Only if we make it known to a larger audience will the video card companies do anything about it. I do believe it will be solved, as the key is just a better algorithm that distributes the frames between the two GPU cores more evenly. I'm looking forward to reading these test results to see how much of it has been solved already.
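Just to illustrate what "distributing the frames better" could mean in principle, here is a toy Python sketch (not how any actual driver works, and all the numbers are made up): it delays presents so they land at roughly even intervals instead of whenever one of the two GPUs happens to finish.

[CODE]
# Toy frame-pacing idea: present each frame no earlier than the previous
# present plus the running average frame interval, so an uneven AFR
# pattern gets smoothed out. Purely illustrative, not a real driver algorithm.

def pace_presents(completion_times_ms):
    """Turn raw frame completion times into paced present times."""
    presented = [completion_times_ms[0]]
    for i, done in enumerate(completion_times_ms[1:], start=1):
        avg_interval = (done - completion_times_ms[0]) / i  # average interval so far
        target = presented[-1] + avg_interval
        presented.append(max(done, target))  # never present before the frame is ready
    return presented

# AFR pattern where the two GPUs finish 30 ms and 10 ms apart, alternating.
raw = [5, 35, 45, 75, 85, 115]
print([round(b - a) for a, b in zip(raw, raw[1:])])        # [30, 10, 30, 10, 30]
paced = pace_presents(raw)
print([round(b - a) for a, b in zip(paced, paced[1:])])    # [30, 20, 23, 20, 22]
[/CODE]

The catch is that any pacing like this adds a little latency and needs a decent estimate of the frame interval, which is presumably part of why it isn't trivial for the driver teams to get right.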
If you are getting less than 35 fps then it's time to turn your settings down! When I'm out for frags I couldn't care less what the IQ looks like; it's all about the K/D. When I'm playing an RPG, for example, I couldn't care less about input lag... it's all about the eye candy. Same goes for almost any single-player game... including shooters.
Thank you. I'll post my results along with a little perspective :)
Grid v1.0 does support multi-GPU.
SLI on: 51 fps
SLI off: 27 fps
Settings: 2560 x 1600, everything maxed except AA only at x4 MSAA, GC = on, and forced AF x16.
I haven't upgraded from 1.0 because I have heard of too many problems with the patches. Also, I don't need it, as the game runs fine and without errors.
Do you not know how vsync works? Without triple buffering, if your framerate ever falls below the refresh rate, even by 1 fps, it will immediately be cut to half the refresh rate. So people who play on 60 Hz LCDs and want to avoid the nasty tearing that comes with such a crappy refresh rate (the majority of PC gamers) will be stuck bouncing back and forth between 60 and 30 frames per second.

I don't know about you, but that drives me crazy, which is why in games where I can afford to use vsync (where input lag won't get me killed), I always force triple buffering with D3DOverrider to avoid the 60-30-60-30 bull:banana::banana::banana::banana: and allow the framerate to fluctuate freely. Last night I found out that AFR and triple buffering can't be used together (I have no clue why people have been buying SLI and Xfire setups all this time and giving up triple buffering), so until something can be done, multi-GPU is a no-no for me I guess.
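For anyone who hasn't run the numbers, here is a small Python model of that double-buffered vsync behaviour (simplified: it assumes a constant render time per frame and that the GPU waits for the next refresh after each frame):

[CODE]
# Simplified model of double-buffered vsync: the displayed rate snaps to
# refresh_hz / n for a whole number n, so anything under 60 fps on a
# 60 Hz panel shows at 30, then 20, and so on.
import math

def vsync_fps(render_fps, refresh_hz=60):
    """Effective displayed frame rate with double-buffered vsync."""
    frame_time = 1.0 / render_fps
    refresh_period = 1.0 / refresh_hz
    # each finished frame has to wait for the next refresh
    intervals = math.ceil(frame_time / refresh_period)
    return refresh_hz / intervals

for fps in (75, 60, 59, 45, 31, 29):
    print(fps, "->", vsync_fps(fps))
# 75 -> 60.0, 60 -> 60.0, 59 -> 30.0, 45 -> 30.0, 31 -> 30.0, 29 -> 20.0
[/CODE]

With triple buffering the framerate can fall freely (59, 45, whatever the card manages) instead of snapping to 60/30/20, which is exactly the difference described above.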
In CS: Source and similar MP games I sure don't care about IQ and go for 100% playability, but in Crysis I want eye candy, and even at a relatively low resolution, 8800 GT SLI couldn't keep frame rates above 35 FPS in a mixture of High and Very High settings.
You don't have to be getting 35 fps for vsync to ruin your gameplay experience; in fact, with vsync on you can't even get 35. Any framerate below 60 fps (assuming 60 Hz), even 59 fps, will be automatically reduced to 30 fps. This is why most console games run at 30 fps: they can't afford to maintain 60 all the time, so to avoid the fluctuation they cap the framerate at 30.
I was referring to CF users. Do you have those results?
Also, there was a well-known driver hack that would enable SLI for v1.0. I am referring to the Grid test results in this review without added driver tweaks.
That's why I suggested turning the settings down. I played Crysis when it came out and had the same problem... that's why I am waiting for a rig that can play it in all its glory. So far there is no such animal. Don't get me wrong... that's not the only reason for building a new rig :p:
A pair of R700's might do the trick...only time will tell. Even then the microstuttering debate will linger on.
In regards to Vsync, I have found my experience to be similar to what Seraphiel mentioned.