Originally Posted by eRazorzEDGE
You're absolutely right. I come in here stating false truths to get a reply.
If you have an ATI card, or cards, look in the registry. Search for MVPUmode... the default is 33, and 33 IS SUPERTILING.
It's all over Google; just search for "mvpumode" or "mvpu".
If the game itself overrides the default driver preferences, then I'm wrong. However, I was just playing Crysis with MVPUmode set to 32 (Split Frame Rendering), and boy howdy can you see the difference between the top and bottom halves of the screen when moving around... it's like image tearing, only it happens below my monitor's refresh rate. It definitely didn't do that when I had AFR (31) or the default Supertiling (33) on.
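For anyone who wants to check their own setting without digging through regedit by hand, here's a rough Python sketch. I'm assuming Windows, and that the driver keeps MVPUmode somewhere under the standard display-adapter class key; the numbered subkey varies per install (and some driver versions may stash it elsewhere), so the script just walks all of them:

```python
# Sketch: locate the MVPUmode value under the display-adapter class key.
# Assumptions: Windows, Python 3, and that the ATI driver stores MVPUmode
# under the standard display class GUID. The numbered subkey (0000, 0001, ...)
# differs per install, so walk them all until the value turns up.
import winreg

DISPLAY_CLASS = r"SYSTEM\CurrentControlSet\Control\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}"
MODES = {31: "AFR", 32: "SFR (split frame)", 33: "SuperTiling (default)"}

def find_mvpumode():
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_CLASS) as cls:
        i = 0
        while True:
            try:
                sub = winreg.EnumKey(cls, i)   # next numbered subkey
            except OSError:
                break                          # no more subkeys
            i += 1
            try:
                with winreg.OpenKey(cls, sub) as k:
                    value, _type = winreg.QueryValueEx(k, "MVPUmode")
                    return sub, value
            except OSError:
                continue                       # value not here, or access denied
    return None, None

sub, value = find_mvpumode()
if value is not None:
    print(f"MVPUmode = {value} ({MODES.get(int(value), 'unknown')}) under subkey {sub}")
else:
    print("MVPUmode not found here -- your driver may keep it somewhere else")
```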
@ Xoulz... no, it doesn't have to be anything. It uses whatever you tell it to use. Look at the program by AgentGOD, ATI Profiles or something (I know I'm butchering the name, I'm sorry); it lets you force different rendering techniques.
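If you'd rather skip the tool, this is roughly what I assume those utilities do under the hood: rewrite the value and let the driver pick it up the next time a 3D app launches. The subkey name, and whether the value is stored as a string or a DWORD, depend on your install, so treat this as a sketch and not gospel:

```python
# Sketch: force a rendering mode by overwriting MVPUmode directly.
# Assumptions: the subkey found by the earlier search, value stored as a string
# (swap in winreg.REG_DWORD and an int if your install uses a DWORD), and that
# you run this elevated -- writing HKLM needs admin rights.
import winreg

DISPLAY_CLASS = r"SYSTEM\CurrentControlSet\Control\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}"

def set_mvpumode(subkey: str, mode: int) -> None:
    """Write MVPUmode (31=AFR, 32=SFR, 33=SuperTiling) under the given display subkey."""
    path = DISPLAY_CLASS + "\\" + subkey
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path, 0, winreg.KEY_SET_VALUE) as k:
        winreg.SetValueEx(k, "MVPUmode", 0, winreg.REG_SZ, str(mode))

# Example (hypothetical subkey): set_mvpumode("0000", 32)  # force split-frame rendering
```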
Also, everyone is just in love with the word "microstutter", and its generally accepted meaning has spread like wildfire across the internet to people who love to talk about these things when, in fact, they know nothing about them.
If you look at the article's graphs, they got virtually no stutter at 1680x1050, but once they went to 1920x1200 it was all over the place... seems peculiar to me. If you've ever noticed the hard drive being accessed a lot while playing a game, it slows the flow of frames to the screen and makes it uneven. Since we don't have a baseline control example, there is no definitive way to conclude that the results are caused simply by multiple GPUs.
There's a thing called the scientific method; if it isn't followed, you can't draw a conclusion from the resulting data with any real certainty.
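To be clear about what a controlled comparison would even look like: here's a rough sketch that takes two frame-time logs for the same scene, one from a single-GPU run and one from a CrossFire run, and compares how uneven consecutive frames are. I'm assuming a FRAPS-style log with one cumulative millisecond timestamp per frame (check your own log's format), and the file names are made up:

```python
# Sketch: compare frame-time unevenness between a single-GPU and a multi-GPU
# run of the same scene. Assumes FRAPS-style logs: a header line, then one
# cumulative timestamp in milliseconds per frame as the last CSV column.
import statistics

def frame_deltas(path):
    """Read cumulative timestamps (ms) and return per-frame durations."""
    with open(path) as f:
        stamps = [float(line.split(",")[-1]) for line in f if line[0].isdigit()]
    return [b - a for a, b in zip(stamps, stamps[1:])]

def unevenness(deltas):
    """Mean absolute difference between consecutive frame times (ms).
    A rough 'microstutter' figure -- plain FPS averages hide this entirely."""
    return statistics.mean(abs(b - a) for a, b in zip(deltas, deltas[1:]))

single = frame_deltas("single_gpu_frametimes.csv")   # hypothetical file names
multi = frame_deltas("crossfire_frametimes.csv")
print(f"single GPU: {unevenness(single):.2f} ms  |  CrossFire: {unevenness(multi):.2f} ms")
```

Run both captures on the same system, same scene, same settings, with nothing hammering the hard drive in the background, and only then does a difference in that number actually tell you something about the second GPU.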