Good thing I didn't then.
You can calculate frame rate yourself: it's just 1000/frame_time (with the frame time in milliseconds). And if you want single-GPU control results, you can generate them yourself too if you have FRAPS and a single card.
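The arithmetic is simple enough to script. A minimal sketch in Python, assuming you have pulled per-frame render times in milliseconds out of a FRAPS frametimes log (the sample values here are made up for illustration):

```python
# Convert per-frame render times (ms), e.g. from a FRAPS frametimes
# log, into instantaneous frame rates. Sample data is hypothetical.
frame_times_ms = [16.7, 33.3, 10.0, 25.0]

# Instantaneous fps for each frame: 1000 / frame_time.
fps_per_frame = [1000.0 / t for t in frame_times_ms]

# Average fps over the run: frames rendered / total seconds elapsed.
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

print([round(f, 1) for f in fps_per_frame])
print(round(avg_fps, 1))
```

Note the average here is time-weighted (frames divided by elapsed time), which is what FRAPS reports, not the mean of the per-frame fps values.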
It's your call, but I'm not sure why you need someone to prove something to you that you can do yourself. I think there were some other reviews from a few months/years ago with single-GPU graphs too, so you might have some luck on Google.
This is why I do not use a MultiGPU system.
Until ATi/nVidia move away from the traditional AFR (alternate frame rendering) method for multi-GPU setups, this issue will remain a problem.
Setting Maximum Pre-rendered Frames to 0 in the nVidia control panel does help significantly.
What does annoy me is that with each ATi multi-GPU release there is always hype about a "true multi-GPU link" (aka multithreading/HT/true dual core), yet we are always let down, as ATi still uses the AFR technique.
Remember the hype re: the 4870 X2 launch?
We got the same hype re: 5970 too. I wish ATi would deliver the goods and show the world how MultiGPU is meant to be played ;)
John
Well, it is finally nice to see some numbers on a large variety of popular games and benches that seem to shed a little more light on the situation. I have used a few multi-GPU setups, and on some I have noticed microstuttering and on some I have not. For example, I built a system for a buddy of mine using a Core i7 920 at 4GHz with 4870 CrossFire, and the microstuttering was pretty bad. While his system beat my 5870 system in almost all the numbers when benching, my system still plays games much smoother overall, even at the same or lower FPS. I have also used 8800 GTX SLI and noticed no microstuttering at all. Hopefully ATI and Nvidia will make an effort to fix the problem, but I doubt it :(
I don't know if you can call this a "huge" problem since many people just don't see "microstutter" unless they are looking at one of those graphs....myself included. We all know the problem exists but it doesn't impact upon my gameplay experience in any way, shape or form. But that's just me....
I agree with the notion that AFR is often beneficial. As someone who has actually used XF for an extended period of time on 4K cards, it's obvious beyond belief that for some games it makes a marked improvement in not only framerate but also user perceived fluidity. This isn't always the case, but for most of the gaming I've done it has been.
For those graphs to matter whatsoever, they'd have to show what a single GPU would have done at that same exact moment, so a comparison of the differences in timing can be made. If you're going to compare to a single card in your conclusion (e.g. "XF is worse than a single GPU"), you MUST have data from a single card or the analysis is forfeit.
Last thought: if you don't like AFR, use SuperTiling. It may not always provide the best benefit, but you won't suffer from AFR frame-timing issues. You can select whichever mode you want now.
Who are the few that you are talking about, who you know are just sticking their heads in the sand to avoid acknowledging that their multi-GPU setups are pumping out useless numbers?
Quote:
Actually a fraps graph is far more useful evidence than people claiming to see / not see something. Microstuttering is a well understood phenomenon, people just stick their heads in the sand to avoid acknowledging that their multi-GPU setups are pumping out useless numbers.
To all: I don't know, I'm not blind, and I definitely saw no microstuttering - everything was smooth as a baby's a$$ with that 3-way SLI setup.
Macro stuttering - constant big fluctuations (the usual stuttering, in other words).
Microstuttering - small and frequent fluctuations (may happen randomly).
There was a lot of talk about microstuttering on the internet at the time of the GTX 295/285 launch - that microstuttering is a myth, because it is almost impossible for the human eye to catch. People tend to confuse it with other problems, like latency or the margin between min and max fps, for example...
But one thing for sure - one fast card is better than 2 or 3 cards alright.:cool:
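For what it's worth, the micro-vs-macro distinction above can be eyeballed from a frametimes log. A rough sketch - not an established metric, and the function name, threshold interpretation, and sample data are all my own invention - that compares the typical frame-to-frame swing against the average frame time:

```python
# Rough separation of "small and frequent" (micro) from smooth output
# in a frame-time trace (ms). Illustrative sketch only.
def stutter_score(frame_times_ms):
    # Absolute change between consecutive frame times.
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    mean_ft = sum(frame_times_ms) / len(frame_times_ms)
    mean_delta = sum(deltas) / len(deltas)
    # Ratio of typical frame-to-frame swing to average frame time:
    # near 0 = smooth; high despite a "good" average = microstutter-like.
    return mean_delta / mean_ft

smooth = [16.7] * 8            # steady ~60fps
micro = [12.0, 21.0] * 4       # alternating short/long frames (AFR-like)

print(round(stutter_score(smooth), 2))
print(round(stutter_score(micro), 2))
```

The point is that both traces average roughly 60fps, but the alternating one scores far higher - which is exactly why average-fps numbers can look fine while the output feels uneven.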
I knew it. I've been :banana::banana::banana::banana::banana:ing about CF being entirely worthless for a long time now.
I bought two 5870s as soon as they came out. Two weeks later I put one of my 5870s on the shelf because MS completely ruins it for me. I feel that a single card is smoother and provides better gameplay for me. Even in a blind test I'd choose a single 5870. My second 5870 is still on the shelf months later, waiting for Lucid Hydra to get better.
This microstuttering issue you speak of is not real.
Come on now, people...
NOTHING TO SEE HERE, MOVE ALONG!
http://1.bp.blogspot.com/_Vr8Xl0cbUZ...move+along.jpg
As long as the lowest fps during the stuttering stays above 40fps, you probably will not notice it.
I think a lot of training is needed to know how to set up a PC properly to get as much benefit out of multi-GPU setups as possible without noticing these kinds of issues.
For Crysis at 1920x1200, at its worst point it was bouncing between 70fps and 50, but stuck right around 60. So the user feels 50fps at the WORST - hardly noticeable, I'm sure.
For B:AA there is constant jumping between 15 and 20ms, and 1000/15 = 66fps while 1000/20 = 50fps. I'm starting to wonder if they left vsync on, given how close the average is to 60fps.
With DiRT 2 I have no idea what went wrong; I benched it with a 5850, everything maxed with 4xAA at the same resolution, and averaged better framerates than they show (I'm wondering if they were using the 9.12 Cats). I was expecting sub-10ms numbers, not an average of 40fps.
Unigine was a constant jump between 5 and 17ms, or roughly 200 to 60fps; I don't think you can notice that.
I wish I had a second GPU so I could do some of these tests myself. I'm really curious, because there are so many things to look at, and everyone's opinion of how noticeable it is will vary. I also don't think it's worth doing for 1-2 seconds' worth of content. And I can't read the article itself, so I was hoping there'd be a user opinion under each chart about what they saw when doing the test.
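The ms-to-fps conversions in the post above are easy to double-check (the poster's "66fps" and "60fps" figures are rounded):

```python
# Verify the frame-time-to-fps conversions quoted above.
for ms in (15, 20, 5, 17):
    print(f"{ms}ms -> {1000 / ms:.1f}fps")
```

15ms works out to about 66.7fps and 17ms to about 58.8fps, so the quoted 66 and 60 are rough roundings.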
After using a 4870X2 from launch until September-ish and then switching to a 5870, even though benches often show the X2 ahead, the 5870 has felt night-and-day smoother. Hell, when I spent a month with some 260 C216s, the single 260 felt just as good if not better than the X2... This is just my own long-term personal experience using these cards. I've tested a lot of other GPUs/configs, but I hadn't done more than bench them (e.g. I didn't get a chance to play even a fraction of the games I own). But I will mention that unless I had compared the two, I likely wouldn't have noticed it. Does that change the fact that this is a problem? Nope. It just makes me rethink any future move to a multi-GPU setup, given it is bad enough when you see roughly a 50% speedup in numbers (I'd consider 50% a fair scaling average for a multi-GPU system, given my experience) for 2x the cost; it's even worse when the aforementioned speedup isn't always relevant.
Perhaps some people can adjust their perception such that, over the course of time, they acclimate to MS?
Is that MS or another phenomenon altogether?
Quote:
After using a 4870x2 from launch until septemberish and switching to a 5870, even though benchs often show the X2 ahead, the 5870 has felt night and day smoother.
I have had much the same experience as you, but I think some observations, while accurate, have been wrongly classified as "microstutter".
My views:
1: Generally microstutter is not an issue. I have only had CrossFired 4830s and two 9800 GTXs in SLI. In most games I do not notice anything, but there are a few exceptions.
2: The majority of the "real" microstutter that people experience comes from situations where FPS is either borderline 30 or very high. I feel Vsync is definitely a contributing cause. I have noticed that playing Battlefield Vietnam with SLI feels very sluggish; I have no alternate explanation, and must assume it is "microstutter".
3: I think some of the "microstutter" people see is caused by CPU limitation as opposed to GPU limitation.
I agree, if it's fluctuating at the high end of the range then it should be very hard to detect. But that's just one part of the complaint. The other is purely academic, in that the average fps reported in reviews etc. is misleading. A series that fluctuates between 40 and 80 fps has an average of 60fps, but it's not the same as one that fluctuates between 55 and 65 and produces the same 60 average. The latter is far more desirable.
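The point about averages hiding fluctuation is easy to demonstrate. A sketch with made-up fps traces matching the 40-80 and 55-65 example:

```python
import statistics

# Two fps traces with the same arithmetic mean but very different
# fluctuation. Sample data is illustrative.
wide = [40, 80] * 50     # swings between 40 and 80 fps
narrow = [55, 65] * 50   # swings between 55 and 65 fps

print(statistics.mean(wide), statistics.mean(narrow))    # both average 60
print(round(statistics.stdev(wide), 1), round(statistics.stdev(narrow), 1))
```

Both traces report "60fps average", but the standard deviation of the wide trace is about four times larger - which is the fluctuation a bare average never shows.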
The human eye can't see more than 60FPS anyway ;)
:ROTF:
Out of some newbie curiosity, has anyone tested to see whether MS is mitigated when using SSDs? The idea being that if there was a bottleneck getting data from the hard drive to graphics memory, an SSD would improve the situation.