Watching for more developments
This text might be of some interest here
Studies of Threading Successes in Popular PC Games and Engines
Is the final review ready for publishing? Eagerly awaiting results. Thanks.
Microstuttering is in every game with every config.
Why can't we see MS in Sampsa's test?
Because with 48x0 CF, GRID is CPU bottlenecked. When there is a CPU bottleneck, MS disappears.
Also, Vsync only helps with that when you're above 60 FPS average. When you're under that, it's even worse.
I'll test it with an X2 when I get one :P
These conclusions were made after a few weeks of testing with 4 different CF setups, a Phenom, a QX9770 and 16 games.
Article with nice graphs soon :P
ftp://bf2.xweb.org/Movie.wmv
Video showing FPS in GRID while encoding a video at the same time.
CPU bottlenecking is what fixes M/S? "When there is a CPU bottleneck, MS disappears."
i7 3610QM 1.2-3.2GHz
Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)
I'm going by PCGH's HD4870X2 preview, hence my points.
Bedlamite claims he has the card and will test it. The fact is, even if one person tests one set of games on one system, his/her results may be different from another person's experience with M/S.
The buyer must decide how much of an effect it is going to have on his/her experience. And though no one can come out and claim the HD4870X2 is useless because of M/S, the fact remains that if ATi has solved the xfire scaling issues alone by the time it launches, you're looking at one MONSTER of a card for $499.
Perkam
A CPU bottleneck somehow "synchronizes" frames, because the GPUs have to wait for the next frame.
The worst case of MS is in CoJ, which is completely unplayable below 70 FPS.
GRID is completely unplayable below 40 FPS, and at 50 FPS you can already see it start to slow down.
Crysis is quite OK at 40 FPS with CF because of motion blur.
Today I'll finish the 3850 CF tests, and next week I'll write up the article.
I have no idea what it looks like on the 4870X2, because I still don't have one, but I hope I'll get one soon for testing.
One alternative explanation for why the game could be smoother when the processor works harder could be how games handle threading.
I think that most games that use threads have one render thread. This thread is probably the most important one; it may have a higher priority, or the game may even reserve one core for it. If the game ties one thread to one core like this, then the game has more control over how it behaves on the processor and how the GPU is fed with data for rendering. There are some drawbacks to this technique: it scales well to two cores, but it will not use the full power of the processor if the processor has more than two cores.
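To show what I mean by reserving a core, here is a rough sketch using the Win32 affinity and priority calls (those two API functions are real; the render loop and everything around it is just a placeholder, not how any actual engine does it):

```cpp
#include <windows.h>
#include <thread>

// Placeholder for the game's actual render work (draw calls, present, ...).
static void RenderFrame() {}

static void RenderThreadMain()
{
    // Pin this thread to core 0 so the OS scheduler won't move it around.
    SetThreadAffinityMask(GetCurrentThread(), 1);   // bit 0 = core 0
    // Give the render thread a higher priority than the other game threads.
    SetThreadPriority(GetCurrentThread(), THREAD_PRIORITY_ABOVE_NORMAL);

    for (int frame = 0; frame < 1000; ++frame)
        RenderFrame();
}

int main()
{
    std::thread render(RenderThreadMain);   // dedicated render thread
    // ... game logic, physics, audio etc. would run on the remaining cores ...
    render.join();
}
```

With a setup like this the render thread always feeds the GPU from the same core, which is the "more control" part, but it also means the game can't spread that work over a quad.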
If the game doesn't take control of how its threads are scheduled and where they are placed, then this task is handled by the operating system. The operating system will place threads according to its own rules. Maybe it sees one core that isn't working while another core has two threads, and it decides to move one thread to that idle core. I don't know the exact logic of the thread scheduler; I think Vista has a much improved scheduler compared to XP, though. It could also be that it places the render thread on the same core as another thread, and if one core has to handle two threads it will of course slow down the game.
If you are running a C2Q and a thread is moved from one C2D to the other C2D (a C2Q is two C2Ds glued together, and the caches of the two halves don't communicate with each other), the cached data is invalidated and the core needs to fetch the data from memory again. This stresses the FSB and the computer slows down for a fraction of a second.
It could be that when the processor isn't working as hard, and it is a quad, threads are moved around more than when the processor is working harder.
This is just speculation.
Well, your speculation doesn't have much to do with the source of MS, and I think most multithreaded games are a bit more complicated in the way they thread.
About MS, some people have explained it like this:
Let's say that the GPU takes 30 milliseconds to render a frame and the CPU is fast enough to provide the GPU with a new frame to render every 5 milliseconds (just some made-up numbers). On a single-GPU system this would result in around 33 fps, while a CFX/SLI system would be able to deliver 66 fps when fully optimized.
When using AFR, though, frame 1 will have taken 35 ms to appear and frame 2 will be there only 5 ms later. Frame 3 will then again take 30 ms, while frame 4 will be there only 5 ms later. That's because GPU0 starts rendering the first frame it gets from the CPU while GPU1 gets the second one; there is only a 5 ms difference between those render starts, but both GPUs need 30 ms to render their frame. This would be the case with AFR at least, and it could have been a nice explanation for the problem, but it seems that other rendering techniques also sometimes suffer from the same issue, which makes the problem look more complicated.
This inconsistent rendering with AFR can be solved by simply starting to render the second frame a little later, in this case 15 ms after the first render start, and this is what NVIDIA and AMD tell the GPUs to do in some (if not most) cases.
If in this setup you replaced the CPU with one that can only pass on a new frame to render every 15-20 ms, you would solve the problem by simply inducing a CPU bottleneck.
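To make those numbers concrete, here is a little toy timeline with the same made-up 5 ms CPU / 30 ms GPU figures (this is only an illustration of the idea, not how the drivers actually schedule anything):

```cpp
#include <algorithm>
#include <cstdio>

// Toy AFR timeline: the CPU hands over a frame every cpuTime ms, each GPU
// needs gpuTime ms per frame, and "pace" is the minimum spacing enforced
// between render starts (0 = plain AFR, 15 = delayed second start).
static void simulate(double pace)
{
    const double cpuTime = 5.0, gpuTime = 30.0;
    double gpuFree[2] = {0.0, 0.0};      // when each GPU becomes available
    double prevStart = -1e9, prevDone = 0.0;

    printf("pace = %.0f ms\nframe  done(ms)  gap(ms)\n", pace);
    for (int frame = 0; frame < 8; ++frame)
    {
        int    gpu   = frame % 2;                              // AFR: alternate GPUs
        double ready = (frame + 1) * cpuTime;                  // CPU hands the frame over
        double start = std::max({ready, gpuFree[gpu], prevStart + pace});
        double done  = start + gpuTime;
        gpuFree[gpu] = done;
        printf("%5d  %8.1f  %7.1f\n", frame, done, done - prevDone);
        prevStart = start;
        prevDone  = done;
    }
}

int main()
{
    simulate(0.0);    // plain AFR: uneven frame gaps
    simulate(15.0);   // paced render starts: even frame gaps
}
```

With pace = 0 the gaps settle into an alternating ~25/5 ms pattern even though the average is still ~66 fps; with pace = 15 every gap comes out at an even 15 ms.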
This too is all a bit of speculation, as I'm still not completely sure what the reasons for micro-stuttering are.
"When in doubt, C-4!" -- Jamie Hyneman
Silverstone TJ-09 Case | Seasonic X-750 PSU | Intel Core i5 750 CPU | ASUS P7P55D PRO Mobo | OCZ 4GB DDR3 RAM | ATI Radeon 5850 GPU | Intel X-25M 80GB SSD | WD 2TB HDD | Windows 7 x64 | NEC EA23WMi 23" Monitor |Auzentech X-Fi Forte Soundcard | Creative T3 2.1 Speakers | AudioTechnica AD900 Headphone |
So.... there was no conclusion in the original post, just graphs. Is microstuttering solved or not? From a quick glance the 4870X2 looks the same as the 4870... so microstuttering = solved??
Crysis is ok at 40 FPS with SLI, but with a single card 40 FPS Crysis means "really smooth", at least to me.
My 8800GT SLI setup was unplayable below 40 FPS in Crysis. However a single 4870, which gives somewhat lower FPS, is much, much more playable.
I wish I hadn't sold my 8800GT's before I got my 4870. I would illustrate the difference perfectly with a Handycam.
INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"
Has anyone really been far even as decided to use even go want to do look more like?
annihilat0r, I have a really nice graph with 4850 CF frame logs and one for a GTX 280.
Both cards show an average of 40 FPS in the place where the logs were taken.
It's quite interesting to see what happens there.
And yes, Crysis at 40 multi-GPU FPS is "OK", but 40 FPS from a single card is much smoother.
When I have a bit of time today I'll post it here.
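If anyone wants to check their own logs in the meantime, here's a quick sketch; it assumes the fraps frametimes CSV layout (a header line, then frame number plus a cumulative timestamp in ms per row), so adjust if your log looks different. It just prints the gap between consecutive frames; with MS the gaps alternate short/long even though the average FPS looks fine:

```cpp
#include <cstdio>
#include <fstream>
#include <sstream>
#include <string>

// Reads a frametimes CSV (frame number, cumulative time in ms) and prints the
// time between consecutive frames. Even gaps = smooth; alternating gaps like
// 10, 28, 10, 28 ms at the same average FPS is the micro-stutter pattern.
int main(int argc, char** argv)
{
    std::ifstream in(argc > 1 ? argv[1] : "frametimes.csv");
    std::string line;
    std::getline(in, line);                          // skip the header line

    double prev = 0.0;
    bool   first = true;
    while (std::getline(in, line))
    {
        std::istringstream row(line);
        std::string frameCol, timeCol;
        if (!std::getline(row, frameCol, ',') || !std::getline(row, timeCol, ','))
            continue;                                // skip malformed rows
        double t = std::stod(timeCol);
        if (!first)
            printf("%8.2f ms\n", t - prev);          // gap to the previous frame
        prev  = t;
        first = false;
    }
}
```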
^^^The one in red can't be a GTX 280. I have a GTX 260 and the frame rates don't sync like that at all.
I have run a few tests in Crysis, Oblivion and COD4 with 4870 crossfire and the frames were rendered evenly in all 3 cases using fraps...
I truly believe the results can change from system to system, because in all my testing there was no evidence of micro stutter....
What resolution?
CoD4 with 4870 CF is CPU bottlenecked almost all the time, so it's the wrong title for such a test.
Try HL2: Episode 2 or STALKER. These titles have really good engines which are almost never CPU bottlenecked, so it's most visible in these games.
Or just try Call of Juarez and enjoy some nice megastuttering.
You have a Q6600. At resolutions below 1920x1200 you can be CPU bottlenecked almost all the time with your CF, so you won't notice stuttering.
Yup, something I implied a while back. MS has been talked about for over 8 months now (that I'm aware of) and no one has provided even rudimentary evidence to show MS is specific to current-gen single cards, SLI/CF or the X2 when actually playing PC games (induced stutter, graphs and fraps don't count). So far many are starting to see that stutter (in one form or another) isn't video-card specific but can be caused by other things, and is therefore not necessarily MS.
Some will see a red flag in the length of time that has elapsed between the word "microstutter" first being used and today. Meaning: after all this time there is still no rudimentary, concrete proof of its existence on current-gen X2, CF/SLI or any specific brand of video card.
Eh...
You want me to show that none of the points you've mentioned in your post apply to my tests?
If you want, I can do that.
But before you claim that MS doesn't exist, just take a good 30" display, get CoJ, get a 4870 CF setup, run around a bit in the grass and tell me you can't notice that something is wrong.
Or just take two 3870s with CoJ at a decent res (1680x1050).