Just a bad article. The lack of a baseline should be the first clue. Not everyone has the exact same setup they ran.
Microstutter in action:
- Time: 1:12-1:16 on the 5970 screen, watch carefully as he rounds the corner around the pole, then rewind and watch how much smoother it looks on the 5870 (even though the 5870 has lower FPS).
- Time: 1:21-1:26 on the 5970 screen, watch the camouflage netting at the top of the FOV, then watch the screen underneath it and see how much smoother it is on the 5870.
http://www.youtube.com/watch?v=emG7ZNIsxw8&fmt=22
Single GPU rendering is less likely to show this type of stuttering compared to multi-GPU.
I edited my previous post to include the Supertiling technique of rendering on Crossfire. It appears that Supertiling, in the graph, varies less from frame to frame, but when viewing the actual run, it looks almost as if there is more screen tearing. So I don't know.
Perhaps someone else would care to test this out?
If you look closely, it almost appears as though there's a checkerboard-type effect on the 5970... which is what Supertiling is. If I had a video camera, I would show the difference between the different Crossfire techniques.
I'm about to start a "Give eRazorzEDGE a Poster-of-the-Year award" poll, and I expect to see everyone in here vote yes.
Thanks for the work; it simply proves that it's not always there, and gives confidence to people who have it and want to get rid of it. It would take a lot of tests with a lot of different hardware and games, and hundreds of graphs, to see if there is any kind of obvious cause of microstuttering, but no one should have to go through all that pain.
Just to back up what eRazorzEDGE has been saying about Crossfire being based on Supertiling since the X8xx series.
"Supertiling is the default rendering mode for Crossfire in all D3D applications that CATALYST A.I., the ATI mechanism for game profiling in their driver and one without any user modifiable settings or game attributes to affect Crossfire rendering mode, has no idea about. If ATI have performance profiled an application and deemed it unsuitable for Supertiling or more suited in a performance sense to Scissor or AFR, CATALYST A.I. is the mechanism to enable that. Therefore if a Crossfire user is to disable CATALYST A.I. completely, Supertiling will be used for everything across the board, regardless of whether its the best performance mode as decided by ATI or not."
Source: http://www.hexus.net/content/item.php?item=1651&page=9
I'm going to say that the problem likely isn't entirely attributable to the card; lots of things contribute to heavy FPS variability, but the primary culprit (IMO) is RAM and how the app handles RAM and the loading of textures.
To the poster that mentioned LoTRO: trying to fully disable SuperFetch/Prefetch might help your stuttering problem on Vista.
Thank you guys, it's much appreciated. And welcome Vortex, excellent first post ;)
Also, about the stuttering itself: I think it's more apparent due to the fact that two (or more) cards are throwing out more frames per second, so in a sense you're getting really bad screen tearing, except that on an ATI card it looks more like a checkerboard type of tearing. And since people can really only focus on one part of a screen at a time, they don't get the whole picture, so to speak. It just appears to be a stutter, but it's really screen tearing only at certain parts of the screen.
On a related note... I just recorded a video of the Crysis benchmark, but it only runs at around 24 FPS once I start recording, which is really low considering my hardware and previous benches. Also, when it's running at that low FPS, the images don't move from one part of the screen to the other as fast as they do when I'm getting, say, 60 FPS, so there's an illusion of no stuttering, when I can clearly see the "stuttering/tearing" at higher FPS with Supertiling on. Does anyone know if this can be increased, or do you just have to get a camera to record it properly?
While I agree that no end user should have to go through all that testing, I believe that Nvidia and ATI should bother with it, because after all, they are the ones who want our hard-earned money for their products.
@eRazorzEDGE
Impressive what you did there, good job:up:
There are just so many variables, I would question whether everyone is experiencing microstuttering (or to the same extent).
Valid testing to come to the conclusion that it's the VGA would be:
Tested across OSes
OS versions (32/64-bit)
driver versions
SuperFetch off and on (I would personally leave it fully disabled for testing)
indexing off and on
different hard disks tested
video settings, and so on
appropriate performance logs run (disk queue length, available physical memory, % processor, etc.)
Yup, so many things to consider; basically any part of the system that can add latency can be a culprit. I almost wonder if the motherboard has a greater impact than the video cards do.
I'm sure, and making sure the slot is running x16 PCIe 2.0. My P45 mobo, for example, drops both slots to x8 even if there's only a PCIe x1 card in the other slot (x16 mechanical, x8 electrical).
I was just playing Dirt 2 online, so I decided to capture a run, and this is what I got. Same system, 1920x1200, 8xAA, Ultra settings, with AFR, not the default Supertiling.
http://webpages.charter.net/darkdark...%20-%20AFR.jpg
From the OP, for comparison:
http://img710.imageshack.us/img710/8...21920x1200.jpg
Basically the same settings, except I'm using 8xAA instead of 4xAA, and I'm nowhere close to what they're showing. Except for that part about 2/3 into the run... dunno wtf that is :p:
that was probably when you hit the water puddles in the map?
Not sure, I played a few races after that, so I don't even remember what map this was :p:
Also keep in mind how many frames your chart shows; if you stretched out that one section, it would be almost as wide as their entire chart. So you're seeing a few spikes per second, while their chart shows 20-ish.
Yeah, I stretched it out in OpenOffice and it's about 120 frames, and given the FPS in that field, roughly a second's worth of larger-than-normal fluctuation. What's weird, though, is that their chart is fluctuating on the order of 15ms, while mine is only shifting at most 5ms on average.
I haven't seen anyone else do any comparisons, anyone else want to try this?
*EDIT*
@clip... I could only put in 100 frames like the guys at that forum did, but that doesn't show anything but what someone wants you to see.
You can try downclocking your chips so your framerate gets worse and see if that brings you into the range of a 20ms per frame average, or turn on Vsync and see what that does.
I'll try to underclock the cards, see what happens.
Lol, I'm no whiz, but I did pick it up in about an hour this morning :p:
It's simple really. I use OpenOffice Calc: just open the CSV file. It'll ask you how you want to format it; just select COMMA as the only thing separating the text. Then use this formula, =$B2-$B1, in cell C2. Now copy that cell, highlight every cell in column C from C3 down to whatever row your data ends on, and paste. Highlight column C, click on the Chart button, and voilà!
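If anyone wants to skip the spreadsheet, here's a quick Python sketch that does the same delta calculation, assuming a FRAPS frametimes CSV with cumulative times in the second column (like column B above); the filename is just an example:
Code:
import csv

def frame_deltas(path):
    """Per-frame times in ms from a FRAPS frametimes CSV (cumulative times in column B)."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    times = [float(row[1]) for row in rows[1:]]        # skip the header row
    return [b - a for a, b in zip(times, times[1:])]   # same as =$B2-$B1 filled down

deltas = frame_deltas("frametimes.csv")                # example filename
print(f"{len(deltas)} frames, avg {sum(deltas)/len(deltas):.1f} ms/frame, max {max(deltas):.1f} ms")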
I don't have an MVPUmode entry anywhere in the registry.
I'd hazard a guess that yours are from AgentGOD's app.
Great work eRazorzEDGE! :clap:
Sorry, I was just posting it so you could compare it to yours. If it's not there, add it, or try the app from AgentGOD and see if it adds it.
I don't think framerates need to be known, since those are available in dozens of reviews, and a quick Google search will tell you what framerates those cards get.
The point of the video is for you to watch AFR/Scissor/Supertiling in action, and afterwards, directly compare it to the fastest single gpu rendering that exact same scene. You already know a 5970 gets more FPS than a 5870. You then make an attempt to notice any visual differences. The difference in "smoothness" is painfully obvious to me. Especially when the character is turning on the horizontal yaw.
If you can't find that constructive, you're missing the point, or unaware of what to look for. :(
Whoever recorded that video is an idiot. The first cardinal sin is that they didn't use the same video for all four video cards. The second cardinal sin is that for the supposed microstutter, they purposely swept left and right on the Crossfire rig to expose rendering artifacts.
How can you even claim the ride around the pole was choppy? Almost all of that choppiness was entirely due to the fact that whoever was using the mouse had to lift it up and reposition it; he couldn't even manage to smoothly move his mouse in a circle.
I never noticed it on my SLI 260's either. But I can see it when you compare multi-GPU to single GPU, like in these YouTube vids. Yes, I agree that reviewer is purposely sweeping on the horizontal axis, left to right, to expose stuttering on multi-GPU, but that isn't any different from what a player would do in-game. I've seen MS in GRID, Clear Sky, & Vantage. PCGH does a lot of videos on it. Here is another one of Vantage playing with an irregular framerate on multi-GPU - http://www.youtube.com/watch?v=GbcGy...B2778B&index=4
I'm not sure that's completely MS in those Stalker videos... I think the X-Ray engine is goofy anytime you are near an environment with fire. It's been like that on every graphics card I've ever tested with Stalker.
Just saying, I would be interested in seeing more irrefutable MS evidence.
Honestly, that just looks like normal stutter to me. Nothing different from what I saw on my 280 while playing The Witcher. Maybe more like that weird Far Cry 2 stutter. It doesn't surprise me that a multi-GPU setup will see more irregularities than a single-GPU setup. I'm also not surprised by seeing "MS" in Vantage. We all know how much ATI cards love Vantage.
I tried the same experiment on my GTX 295... FRAPS on the Crysis benchmark, 3DMark06, CoD:MW. I don't see any bigger swings in FPS or in frametimes (plotted) with 2 GPUs enabled versus 1. In fact, comparing the same runs in 3DMark06, the GTX 295 with both GPUs enabled had smaller swings in frametimes than with just 1 enabled, but neither was very much.
http://img132.imageshack.us/img132/7...nglevsdual.jpg
I can tool around in a game quickly to rapidly change sceneries and record varying FPS, then disable SLI and watch it look exactly the same, or worse if the FPS gets too low.
I was actually impressed at the consistency of frametimes for both sli and single gpu.
Indeed. Sorry for this, guys, I never thought a single GPU system could have so much microstuttering. I did a test myself with Crysis and can confirm similar results. Should have done this BEFORE opening the thread.
To experience the awesomeness of microstuttering on a 260 SLI rig, fire up FC2, hop on a jungle boat and cruise around... it usually doesn't take long until you notice stuttering, especially when the river makes a turn. It takes some time until you notice it, especially if you don't know how fluid it looks and feels without microstuttering on a single 260, but once you've played FC2 on a single 260, going SLI is really annoying, and the stuttering and lag aren't worth the extra FPS... especially since it's really fast enough on a single 260, even at 1920x1200 with max details and 2xAA, or even 4xAA if your rig is OCed...
That is an issue with FC2. Do you want me to post a video of that exact same stuttering on my single 280? I have to cap my framerate to play that game with a single GPU; it's a common issue. Once again, that is not "microstuttering". Are you really surprised that you are more likely to see irregularities on an SLI/CF setup than with a single GPU?
Maybe it's the saturated FSB bottlenecking, not the graphics?
All that travelling to the chipset and back might be tiring. :p:
I've used:
7600gt SLI
8800gt SLI
GTX260 SLI
3850+3870 CF
3x4850 CF /3x4850+4890 CF
5770 CF
(No FSB)
Never seen a "stutter",even with the mixed cards.
I don't know why the Dirt 2 FPS is so jumpy on that graph; FRAPS shows a constant number for me (2x5770 / Win7 x64 / DFI M2RS / Phenom 710 @ 3.5 / 8GB DDR2), no fluctuations. Certain levels give much higher FPS, but it never jumps/stutters.
As for dual GPUs on a single PCIe x16 - who knows, maybe the street is too small...
Don't forget 2 cards get x16 lanes each, no sharing.
I didn't say it - you did :confused::p:
I've seen low FPS (SLI/CF) and negative scaling using CF (often with early drivers), or stuttering from low frame buffer/network issues (in MP games), but not this "famous" "micro stutter" - that's all.
These are all different issues, but for many it's a lot more convenient to blame MS rather than tweak/diagnose their systems.
It's always a waiting game with drivers, I couldn't agree more, but that's again a separate issue. :)
Also, to be fair - I only play a few select titles, thus reducing the chances of seeing MS or any other abnormalities present in one title or another (but not all).
I speak from personal, hands-on experience only, so what others have or have not seen/read about is up to them.
I know I still trust my eyesight more than some website out there, whatever they claim.
Thanks to eRazorzEDGE for running the tests! :up:
You might gain a few extra FPS by using the ganged/unganged memory mode, depending on the game. 64-bit should do better with ganged, as it uses 128-bit access (as opposed to two independent 64-bit channels when unganged).
The highlighted portion is really the gist of your reply. The problem with your reply (& with the gist of what MS is claimed to be in this thread) is that it's based on assumptions. You want me to assume something that I believe to be an illusion, based on the omission of frame rates. Because I am aware of other variables that can restrict frame rates and cause stuttering, etc., I cannot agree at this time.
It won't look any different if less compressed, at least not to me; you'd just see the same thing stretched out. :shrug: The previous run was slightly out of sync by 100 frames from my slight variance in hitting the benchmark key, but this one is in perfect sync. If I graphed a bunch more, you'd basically see similar variability in the frametimes of both 1 GPU and SLI, but overall SLI just had less in the frames I captured.
After hearing all the hoopla about microstuttering, I was against getting an SLI setup... but then I thought I would give it a try and at least see what all the hoopla was about. And I am still waiting. I guess instead of playing games I need to search the net for claims of MS, go buy the game, try to find the MS, then test it on 1 vs 2 GPUs, and maybe if I search enough MS claims I can finally see at least one for myself caused by SLI and can even graph it out... but that sounds like a lot of work.
And I have Far Cry 2; mine did stutter in places on my single 8800 Ultra. Maybe I'll get the urge to load it again on Win 7, try to find the MS, then graph 1 GPU vs 2, but that means I'll have to play it... now if it were the original Far Cry, I would be up for that. The few times I tried to duplicate claims of MS in games I have, it always looked the same or worse with only 1 GPU... but now I guess I can graph it as well.
http://img43.imageshack.us/img43/868...ual100frms.jpg
I'm glad everyone is trying this for themselves, comparing single GPUs vs. multiple GPUs. It's really great to see the graphs, guys. Great work. :)
+1000, best thread on microstutter so far. Spread the word to the lesser noob forums. :up:
Microstuttering = one of the most over-hyped, misunderstood, and carelessly dangerous terms ever
I'm glad people here see why there's a need for a scientific BASELINE... ironic, isn't it, when the single GPU is sometimes worse than the multi-GPU?
It's too bad that the term micro-stuttering is now being thrown around so carelessly
I have a question: what if this is a problem with the motherboards and not the video cards themselves? That would explain why for some people it works and for others it doesn't. Has anyone done a CF or SLI benchmark on multiple motherboards investigating fluctuations in the framerates?
If everyone who said they saw MS on their PC gave me $5, I might have enough money to quit my job, buy a bunch of new and old hardware, and spend a few months figuring it all out. Or I could just retire to a small island with that money.
Here's my result from a single 5870 @ stock at 1920x1200 with everything on Enthusiast.
http://i50.tinypic.com/52b1gj.png
Now that is really bad. Can you do the same test but drop the textures down one notch? (What game, btw?)
Oh, forgot that :D It's Crysis Warhead on the Ambush timedemo. Yeah, it's horrible. I recorded frametimes for approx. 1000 frames, and 500 of them were like this. The rest were quite nice.
What was your framerate? If it's low, I wouldn't be surprised that you'd notice stuttering ;)
You can calculate it from the data; I guess it was around 40 FPS. I can't say I noticed stuttering, but it was only a timedemo and I wasn't controlling the character.
With CF/SLI you have to render at least 1 frame ahead, IIRC.
How do you do that?
Doesn't this just prove that 'microstuttering' (oh god, I hate that word) simply occurs when the instantaneous load on the GPU is too much for it to handle?
These graphs don't really prove or disprove anything, except that the same thing happens on single GPUs just as it does on multiple GPUs.
What we need is someone who can...
A.) actually see the MS
B.) reproduce said MS
C.) replay same scenario without MS
D.) record both frame interval data and hi-def video (minimum 60 frames per second), and then see if we can't find a link somewhere.
None of the graphs we have seen in this thread show microstuttering; they're just the swings in framerate seen with 1 or 2 GPUs.
People who were complaining about microstuttering in Crysis were seeing movement, pause, movement, pause; it was obvious to all who viewed it, similar to watching the 3DMark06 CPU bench run at 1-4 FPS. The corresponding graphs showed a rhythmic drop in framerate from 30-40 FPS down to 10 FPS in certain segments of the game that corresponded to the start, pause, start, pause (again, it was obvious and all could see it). Then, turning SLI off, the rhythmic decrease in framerate improved and the graphs showed smaller swings; there was still some microstutter, but it was much better. See the link; it has graphs showing real microstutter.
http://www.overclockers.com/micro-st...and-crossfire/
To me, microstutter is when you get large swings in framerate that constantly dip far enough below the 30 FPS threshold that a normal viewer would see start, pause, start, pause when playing certain segments of the game. And if it's caused by SLI, the problem gets better, visually and graphically, when SLI is turned off. And microstutter is apparently fixed with updated drivers, settings, or code, just like any other bug.
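Just to put a rough number on that definition, here's a sketch in Python; the 30 FPS threshold and the sample values are only illustrative, not an established metric:
Code:
def stutter_stats(frametimes_ms):
    """Flag frames slower than 30 FPS (> 33.3 ms) and count fast/slow alternations."""
    limit = 1000.0 / 30                                 # 33.3 ms per frame
    slow = [ft for ft in frametimes_ms if ft > limit]
    flips = sum(1 for a, b in zip(frametimes_ms, frametimes_ms[1:])
                if (a > limit) != (b > limit))          # rhythmic start/pause signature
    return {"frames": len(frametimes_ms),
            "slow_frames": len(slow),
            "slow_time_share": sum(slow) / sum(frametimes_ms),
            "alternations": flips}

print(stutter_stats([12, 40, 11, 38, 13, 41, 12, 39])) # made-up rhythmic fast/slow pattern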
I played large segments of CoD:MW in the last day with FRAPS recording the entire time, with 1 GPU and then SLI, and saw no microstutter with 1 or 2 cards, visually or graphically. In fact, the variation was identical even when graphing 32,000 frames at a time (the max in Excel, btw); SLI just showed the same variation at a higher framerate.
I don't think MS exists in any of the 7 recent games I've played, and my guess is that now that people are looking for it, so are the developers, and like any other known bug, it is now very rare to see.
But there will always be those who are now hypertuned to looking for any glitch and say, "see that jitter, right there, didn't you see that MS?" Um, no. "Well, you're not looking hard enough... see, right there, that jitter."
And by the way, I ran with 0 prerendered frames, 1, 2, and the default 3, with SLI and 1 GPU. The default 3 was the best graphically, with the smallest swings in CoD:MW, though that may vary game to game. And 0 prerendered is bad; it kills a lot of performance with SLI. 1 and 2 simply had larger swings than the default 3.
If you want to see microstuttering, all you have to do is open Crysis and walk sideways left & right (strafe).
Ok, I've been playing Crysis for a little while today. Any stutter is from the hard drive being accessed. I know this because when I see it start to stutter, I can see the HDD light solidly lit. Maybe I'll get FRAPS to take a video for you.
In a scene that has been pre-rendered ?
Not happening, unless you think Crytek's engine is so dumb and cr*p that it dumps data that is in use :p:
It happens in any SLI/CF system here no matter what.
But it doesn't happen on a single card system.
If you still think it's the HDD, then set texture streaming to off in order to use texture precaching and check again.
For what it's worth, there are a dozen system executables that can be accessing the disk ( check the file i/o per executable @ task manager if you doubt it ;) )
All this talk about microstuttering sounds like jitter in the audiophile world (which I left years ago).
...
No, it doesn't happen in any SLI/CF system here no matter what. You can say that it's happening until your ass bleeds, it doesn't make it true.
Yes, I realize there are a dozen things running in the background that can access the disk. And I also realize that when the HDD is being accessed, it slows frame rendering. I'm not sure why, and I really don't want to postulate a theory right now, but go ahead and turn on a disk defragmenter while you're playing a game and tell me your FPS stays at its maximum.
Also, here is a video I uploaded to YouTube... http://www.youtube.com/watch?v=ETn7PNvclxY
It's about 1 minute long. The original was around 650MB and was smooth as butter, but every conversion process I've tried on the file makes it come out with far worse than original quality and gives the appearance of slight stuttering, when in fact there was none. So I uploaded the raw file to YouTube, but they converted it, so without sending everyone the direct file, you can't see what I see.
I tried strafing side to side in multiple places in Crysis, and looking at scenery in the distance there is a little jerking, not evident when walking straight, but I can't tell whether that is because the character bounces up and down when strafing. But it looked exactly the same visually with SLI vs 1 GPU, and there was no pausing or microstutter visually noted. I graphed strafing with SLI and walking straight with SLI. I then graphed strafing and walking straight with 1 GPU. It all looks the same to me, graphically and visually, except the FPS was lower with 1 GPU.
http://img707.imageshack.us/img707/3...ownslicrys.jpg
http://img697.imageshack.us/img697/2...own1gpucry.jpg
I just tried strafing while looking away from the sun, towards the harbor, with palm trees directly in front of me (3 at roughly 5- to 10-foot distances), and I could see some lag on my display... more like screen tearing. I could see the edges of the palm trees clearly being drawn in a new spot on the screen, from top to bottom. I was getting around 55-58 FPS at the time with two 5870's in Crossfire. I tried enabling Vsync and the "tearing" went away even though I was below 60 FPS.
I then went on to try it with my second 5870 disabled, and while only getting 30-32 FPS, I could still see the "tearing". So I tried Vsync again, and the problem went away. I recorded the frame times with FRAPS for both single and dual GPUs, with and without Vsync, and the graphs look very similar to RGE's above... a constant ~5ms fluctuation between frames.
I also tried to capture video with FRAPS, but the same image tearing doesn't show up in the video, and alas, I don't have a digital camcorder.
I also tried r_texturesstreaming=0 and e_precachelevel=1, to no avail. It still had the screen tearing, but there's no hint of "microstutter". However, it will randomly stutter on occasion, though I can't say this is "microstutter", as most games do this when something is reading/writing from/to the hard drive.
Well, I could never notice a difference back when I had a 3870X2. Today I still can't notice any difference with the so-called microstutter. I only believe it exists because people here claim they can see it.
To me the quality on my 5870 is about the same, just with higher framerates than my old 3870X2... ?
Crysis, strafing or not, runs as smooth as butter on my rigs with GTX 285's and 5870's as long as it's one GPU. Vsync disabled or enabled, doesn't matter: no tearing, no MS, nothing. And trust me, I'm very picky and my eyesight is, well... very, very good.
As soon as I add another VGA to the rig(s), I get MS in several parts of the game.
One way or another, that's my experience and what I can see using all the cards I have here, it's my own experience and realization of what the monitors are displaying, I'm not going to try to convince or force any of you to stand by me and my opinion :)
Wow, really? First you say it happens to any system, then after a couple of us tell you that's not what we see, you come back and say "oh, well that's my opinion".
Please don't state your opinions as factual, it will save a lot of time and confusion on your part. :up:
I can't see where the problem is.
Both quoted posts are talking about what I see and happens in my rigs ( keyword: HERE )
If you want to talk about factual we can take a dive into the technical aspects of multi-GPU solutions if you think you can follow ;)
You're just twisting it to make it sound like you want it to sound. If I were to say that my computer is the best computer around, how would you take that? I could easily come back and say that whenever I say "...around", I actually mean "... around here in my home".
Please stop trying to act smart by trying to question my intelligence. Arrogance is the key to defeat. A little bit of humble pie might serve you well.
xs trollfest 2010. really, guys? calm down.
For me, microstuttering was clearly evident in Doom 3 with X1900XTX Crossfire.
At 60 FPS it was smooth as silk, but at 55 FPS it felt like 25 FPS.
Such a small drop in FPS shouldn't result in such a constant slowdown; as long as it was under 60 FPS, it didn't feel smooth.
If it's just short random stutters, it isn't microstuttering; those are just stutters, normally due to memory swapping, which plenty of games do, especially large outdoor games.
Can someone test AMD vs. Intel on this...?
lol...
Any more fuel for the fire? It's absolutely CPU/GPU manufacturer independent ;)
As long as your FPS stays above your monitor's refresh rate, you cannot experience the microstutter phenomenon.
Games that run at FPS rates under the monitor's refresh rate are stuttering, no matter if you're on a multi-GPU or a single-GPU system.
Microstuttering only gets extreme if you're reaching very low FPS rates, around 30 FPS; then you can actually notice it. But who the hell plays first-person shooters at 30 FPS??
I once had an X1800XT Crossfire setup and never experienced microstuttering. Maybe I did... but then the game was really pushing my system and it was obviously stuttering, no matter if I had a single card or two ;)
To me, the whole microstutter stuff is just theoretical nonsense. My monitor displays 60 frames per second, so I don't care if one frame is rendered 0.002s or 0.004s after another, as long as there is always a different frame to display 60 times a second.
Microstutter has nothing to do with a drop in FPS. Your FPS stays the same; the only thing that happens is that frames aren't put out "almost" in sync like they would be on a single-GPU solution. And this is only noticeable in really low FPS environments, when you actually wouldn't play the game even on a single-GPU system.
No.
Quote: "games that run at FPS rates under the monitor's refresh rate are stuttering, no matter if you're on a multi-GPU or a single-GPU system."
No. WHAT? No.
Quote: "microstuttering only gets extreme if you're reaching very low FPS rates, around 30 FPS; then you can actually notice it."
No.
Quote: "my monitor displays 60 frames per second, so I don't care if one frame is rendered 0.002s or 0.004s after another, as long as there is always a different frame to display 60 times a second."
You are talking about playing with Vsync on. I cannot play with Vsync on, and neither can anyone who cares about snappy controls.
Quote: "this is only noticeable in really low FPS environments, when you actually wouldn't play the game even on a single-GPU system."
Lol, don't you think there's something wrong with this approach? You're supposed to throw in another GPU when you can't play a game on a single-GPU system. I don't understand the "even on a single-GPU system" approach.
Looks like ATi's and nVIDIA's marketing channels did a very good job promoting CrossFire & SLI...
Honestly, I would like to see what microstutter looks like in a video compared to what is seen graphically. At least then we could all agree on what it is and what it looks like.
It is pointless to spend time arguing over and over that it happens everywhere with SLI, and yet not spend 15 minutes posting a video of it with a corresponding graph, so those who have never seen it know what you're talking about. And if you don't have SLI, and haven't tested with any recent drivers or games, that might be the issue.
It took me only a few minutes to run a couple of benchmarks and graph what I saw, but there was no MS and no difference between 1 GPU and SLI, so there was no point in posting a video.
Getting a video of it is much more difficult than understanding it technically.
Why ?
Any screen-capturing utility slows down the system's performance significantly.
A handheld camera can't record frames at the high rate the VGA outputs them.
Does anyone here, other than BenchZowner, have a known case of microstuttering in a popular game where they can post a video (like some of the decent gaming videos posted on YouTube: http://www.youtube.com/watch?v=1aPmBQ-1x0w) and post a graph, or tell me where in the game it exists, so I can see whether it occurs on mine with 1 GPU vs. SLI? Then I will post a video and graph.
And BenchZowner, I have your opinion that it can't be videoed, or even seen unless you understand it technically... I hope it is OK with you if I still ask someone else to do it, so I can see for myself whether or not it can be videoed, and I can at least see exactly where they see it in a game and try to duplicate it myself, along with graphing it.
^^
I have two 5870s. One is on the shelf. I played Far Cry 2 and Crysis Warhead with CF on and CF off, and I preferred CF OFF, as I felt it was smoother and more predictable. In FC2, I would take 50 FPS on a single GPU over 80 FPS on CF. I have a very fast rig (4.65GHz QX9650, 4x X25-E in RAID 0 on an LSI 9211). I don't know about anyone else, but I immediately notice lag and inconsistencies when using CF. I wish I could fix it somehow so the $400 I spent on a second 5870 wasn't a waste. I ordered a 120Hz screen recently so I could turn Vsync on and see if that helps.
If we can't agree on whether the problem even exists, how substantial can it really be?
Seems Vsync-related to me. It should get rid of any extra stuttering.
Do you even know what Vsync is??
If you have an ordinary LCD monitor, you're limited to a refresh rate of 60 frames per second, so you can only see 60 frames per second; no matter if your GPU is pushing 80 or 230 FPS to the monitor, it will always display only 60 of them.
When you turn Vsync on, the GPU is limited to sending only 60 FPS to the monitor.
And Vsync isn't a cure for microstuttering.
Even the guys who first looked into the MS problem, by recording the time it took for each frame to be rendered, stated that it's only noticeable under 30 FPS. So if you think dropping FPS, or graphical anomalies like flashing or texture flickering, is microstuttering, you are wrong. You'd better check your driver installation or the clocks of your GPUs.
What part of "even on a single gpu system" don't you understand?
If you're getting 40 FPS on a single GPU, which is stuttering no matter how badly you want it not to be, and then you put in another GPU to get your FPS up to the point where it meets the refresh rate of your monitor, the game is actually playable.
As long as the time difference between rendered frames isn't higher than 0.016s (the refresh interval of a 60Hz LCD), you will not notice any stuttering.
And now don't come and tell me that you can see more than 60 FPS on a 60Hz LCD display.
I've had a 4870X2 for a year now and have never felt any microstuttering in any of the games I play.
Hey guys; in ATI driver settings, generally speaking, 31 00 = 1 (true) and 30 00 = 0 (false). I worked this out a long time ago for a benchmarking script I still use that sets ATI & NV driver settings depending on what the user desires. I can't remember the whole chain, but I know I worked this much out:
A hex value of 30 00 in a REG_BINARY value is considered "OFF" by ATI; it's the UTF-16 encoding of the character "0".
A hex value of 31 00 in a REG_BINARY value is considered "ON" by ATI; it's the UTF-16 encoding of the character "1".
To set these via decimal values using reg.write, use 48 and 49 respectively (the character codes of "0" and "1").
Some strings take a direct value, for instance "AnisoDegree", which is set to 31 00 36 00 in hex for 16. (That's actually the characters "1","6" as ATI stores it, not the number 16, but the effect is the same.)
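In other words, those REG_BINARY bytes are just UTF-16 text. A small Python sketch to make the encoding concrete (the value semantics are as described above; this doesn't touch the registry):
Code:
def ati_encode(value: str) -> bytes:
    """Encode a Catalyst-style setting string as UTF-16LE bytes."""
    return value.encode("utf-16-le")

def ati_decode(raw: bytes) -> str:
    return raw.decode("utf-16-le")

assert ati_encode("0") == b"\x30\x00"              # OFF
assert ati_encode("1") == b"\x31\x00"              # ON
assert ati_encode("16") == b"\x31\x00\x36\x00"     # AnisoDegree = 16
assert ati_decode(b"\x31\x00\x36\x00") == "16"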
Has anyone noticed screen tearing below 60 FPS while playing Crysis, and then noticed it go away when turning Vsync on?
BenchZowner mentioned the strafing technique, so I tried it while looking out at the harbor in the level Contact, with the sun behind me and a couple of palm trees directly in front of me at a distance of roughly 5 to 10 feet. While getting a constant framerate of around 57 FPS with Vsync off, I noticed screen tearing at the edges of the palm trees, but with Vsync on and the same 57 FPS, it went away. It was the same when I used only one GPU.
Anyone have an explanation, or can anyone verify what I'm seeing on my end?
Lol, if Vsync is off there are going to be uneven frames, and you ARE going to be able to tell the difference even above 60 FPS. I used to play CS: Source with a 60Hz monitor, and the difference between 100 FPS and 70 FPS was like night and day (I mean, it was obvious, lol). Why? Because the frames were uneven.
You are talking about "as long as there is a frame to show at every refresh of the monitor"; yes, you are right about that. But having more than 60 FPS doesn't guarantee that if you are not using Vsync. Consider the case evident in one of the graphs above.
FRAPS is showing you 60 FPS, which corresponds to a roughly 17ms average frame time. But when you look at the charts, you see that this is far from the case. Every even frame is rendered in 10ms and every odd frame in 25ms. To illustrate, the frame completion times would be:
10-35-45-70-80-105.
What does this give you? Yes, roughly 60 FPS. But let's look at the vertical refresh times of the monitor:
16.7-33.3-50-66.7-83.3-100.
At 16.7ms, the monitor displays the first frame. But at 33.3ms, the monitor doesn't have a new frame to display, so it continues displaying the old one. At 50ms, it displays the third frame; at 66.7ms, it displays the third frame again; at 83.3ms, it displays the fifth frame; and at 100ms, it displays the fifth frame again.
So in 100 milliseconds, all you got to see were three distinct frames. Despite FRAPS reporting 60 frames per second, you are effectively seeing 30.
This might be an extreme case, but it is nevertheless not unrealistic.
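If you want to check the arithmetic yourself, here's a tiny simulation of the example above (the frame and refresh times are the ones from the post; the "show the newest completed frame" rule is a simplification that ignores tearing):
Code:
frame_done = [10, 35, 45, 70, 80, 105]                 # frame completion times in ms
refreshes = [16.7, 33.3, 50.0, 66.7, 83.3, 100.0]      # 60Hz vertical refresh instants

shown = []
for t in refreshes:
    ready = [i for i, done in enumerate(frame_done) if done <= t]
    shown.append(ready[-1])                            # display the newest completed frame

print(shown)              # [0, 0, 2, 2, 4, 4] -> frames 1, 3 and 5 only
print(len(set(shown)))    # 3 distinct frames in 100 ms, i.e. ~30 effective FPS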
The German website pcgameshardware.de put a lot of effort into proving microstuttering and trying to show it with videos. AFAIK, PCGH were also the guys who first came up with the microstuttering issue (I think they reported stuttering issues for the first time with the 7950GX2).
For everyone who wants a direct comparison of single-GPU and multi-GPU frame time graphs, PCGH has a few articles that show a huge difference between these two setups.
e.g.:
- http://www.pcgameshardware.de/aid,63...fikkarte/Test/
- http://www.pcgameshardware.de/aid,65...fikkarte/Test/
I think the English website of PCGH (pcgameshardware.com) has some other tests as well; however, I can't search for them right now as the work proxy blocks it :p:
You'll also want to check PCGH's YouTube channel for videos of microstuttering.
If you have a single GPU and want to experience microstuttering yourself, there's a demo application a guy over at computerbase.de created to showcase what microstuttering looks like:
http://www.computerbase.de/downloads...er_single-gpu/
I experienced microstuttering myself on a rig with a 3870X2, and it really felt like the demo application I linked above. However, I never saw that amount of microstuttering on, e.g., a 4870X2. I DO think that ATI as well as Nvidia were able to reduce the effect somehow with their current-gen cards; however, it's still there.
And this whole discussion of "I can see it" - "you suck, I can't, there's no MS!" - "there is, you monkey!" etc. :p: is pretty ridiculous imo.
It's the same with discussions like "I can't see a difference between 30 and 123895623875 fps", "I don't notice any tearing at all!", "mouse lag with Vsync enabled? I've never experienced anything like that!", "120Hz, why would anyone want more than 60Hz!?!??!?!?!"
You know, the problem is that maybe all of these people are right. Everybody has different perception. Some people may notice or be bothered by something other people don't notice or see at all. It's all purely subjective.
If someone is playing a game running at 30 FPS with a crapload of screen tearing and microstuttering as well, and isn't bothered by that at all, I'm the first to congratulate that person. Because I notice all of the above, and I'm quite unhappy about it. Sometimes I wish I didn't notice it, so I'd never have to bother about these issues again.