maybe something that could calculate the number of milliseconds (?) between each frame or something?
Go into FRAPS and tell it to log frame times; that's exactly what you get.
I'm not a pro with Excel, but I have enough know-how to make it do calculations, charts, and so on. If someone wanted to run these tests with FRAPS and send me the data, I'd be glad to do some analysis of it. Maybe the FRAPS data from Chew's testing?
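To make that concrete, here's a minimal Python sketch of what the Excel step could do automatically. I'm assuming the FRAPS "frametimes" log is a two-column file of frame number plus a cumulative timestamp in ms; check your own log's header before trusting it:

import csv

def load_frame_deltas(path):
    """Read a FRAPS frametimes log (assumed columns: frame number,
    cumulative time in ms) and return the ms spent on each frame."""
    with open(path, newline="") as f:
        rows = csv.reader(f)
        next(rows)  # skip the header line
        times = [float(r[1]) for r in rows if len(r) >= 2]
    # difference of consecutive cumulative timestamps = time per frame
    return [b - a for a, b in zip(times, times[1:])]

Once you have the per-frame deltas, everything else in this thread (average deviation, histograms, drop counts) falls out of that one list.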
If you knew anything about benchmarking, you would know that lower-resolution tests stress the CPU more, while higher resolutions shift the load to the GPU.
Many CPU reviews use low-resolution benchmarks to show exactly this.
Meaning: when you want to show CPU gains, you bench at low resolution.
I agree, but the guy you're quoting has nothing to do with this forum, and this is not about who gets higher frame rates at what resolution. It's about:
Hey, this CPU gets 90fps average in this game, but goes all the way down to 30fps every now and then
While this CPU gets 75fps average, but only goes down to 55fps every now and then
Someone wanted a definition of smoothness; there it is: the difference between the lows and the highs, and how often it happens. I mean really, who cares if you get 200fps if it drops to 1fps every 10 seconds? I wouldn't want to play like that.
http://forums.overclockers.co.uk/sho...php?t=17896754
Maybe someone can ask the author for that program, as it works out the average deviation from the FRAPS data for you.
The DL links are dead.
New link http://www.mediafire.com/download.php?2moyugibhez
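For anyone who can't get the tool to run: my guess is the "avg deviation" it reports is just the mean absolute deviation of the per-frame times, which is a few lines of Python to reproduce (the metric here is my assumption, not confirmed by the author):

def avg_deviation(deltas_ms):
    """Mean absolute deviation of per-frame times in ms.
    0 = perfectly even frame pacing; bigger = more stutter."""
    mean = sum(deltas_ms) / len(deltas_ms)
    return sum(abs(d - mean) for d in deltas_ms) / len(deltas_ms)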
There were already many threads on this topic that came to similar ideas. If in fact the only thing that influences the "smoothness" factor is the difference between max and min fps, then there is an easy fix for the Intel system: limit the max fps or enable vsync.
If that hypothesis turned out to be true, an Intel rig would sometimes be even smoother, because of its often higher min fps compared to an AMD rig, or the other way round if the AMD rig has the higher min fps and the max fps is limited.
IMHO the "smoothness" debate is the same thing as the "the eye can't see more than 30fps" debate. It is based too much on the individual to make a general statement, just like people saying they can see the difference between 60fps and 100fps. Maybe some individuals can, but the majority can't. It's the same with audiophiles: play Joe Average a 192kbit/s MP3 and a FLAC file on high-end headphones and they won't notice the difference; hell, maybe they'd even say the MP3 sounded better. :p:
Well, just as said above, there are often cases where the min fps is higher on the Intel rig: enable vsync or limit the max fps and you have the same smoothness as the AMD rig, but then you're artificially keeping your graphics card from showing what power it has.
But with the same or lower min fps on the AMD systems, this doesn't clarify anything. It's more like this:
System A has an avg fps of 120 but a min fps of 40, which happens every five seconds.
System B has an avg fps of 100 and a min fps of 35, but this happens only every 40 seconds.
System B would probably be the more fluid experience of the two. The main issue is, as many have already stated, fps stability. A system that fluctuates between 50 and 150 fps, sometimes dropping to 35 and averaging 120, will probably feel more stuttery than a system fluctuating between 50 and 70 fps, dropping occasionally to 35, with an average of 55. Microstuttering should be a similar thing.
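If you have the frame-time data, that comparison is easy to quantify. A rough Python sketch (the 40fps drop threshold is arbitrary, and this is an illustrative metric, not a standard one) that would tell systems A and B apart where average fps can't:

def stutter_summary(deltas_ms, drop_fps=40.0):
    """Count how often a frame-time trace dips below drop_fps."""
    total_s = sum(deltas_ms) / 1000.0
    threshold_ms = 1000.0 / drop_fps   # frames slower than this count as drops
    drops = sum(1 for d in deltas_ms if d > threshold_ms)
    return {"avg_fps": len(deltas_ms) / total_s,
            "drops": drops,
            "drops_per_minute": drops / (total_s / 60.0)}

With the numbers above, system A would log a drop roughly every five seconds (about 12 per minute) against B's roughly 1.5 per minute, even though A clearly wins on average fps.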
AFAIK JumpingJack already showed framerate over time with a QX9650 and a Phenom 9850 on quadfire.
edit:
Just look at these, for example:
Company of Heroes
http://forum.xcpus.com/gallery/d/756...ps_compare.JPG
Race Driver: GRID (done on an 8800 GTX)
http://forum.xcpus.com/gallery/d/701...15SEC_FRAP.JPG
Also, this smashes the theory that the systems behave differently from each other: if one system gets a drop every 30 seconds, the other system will also get a drop every 30 seconds when the same scene is played/displayed.
And if you knew anything about benchmarking, you would know that using results from a different setup, with different settings, at a different resolution is totally useless here.
The tests AT did weren't benchmarking; they were actually playing the games at real-world resolutions.
You can't say a game felt smoother by watching a benchmark.
There's no arguing for the "the eye can't see more than 30fps" point, though; that statement is completely wrong. It's not even a matter of "some people can, some people can't".
Very slight but unless you're playing online it's negligible :up:
Very interesting graphs; at a glance, the AMD systems do generally show less framerate variance.
I wouldn't necessarily say those graphs "smash" anything, though; we are only seeing 1/40th of the available data. Each plotted point is the average fps over one second, so any hiccups get folded into that average. We would need to see a non-smoothed raw-data graph at the very least, but a histogram of frame render delay would be absolutely ideal.
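For what it's worth, such a histogram is trivial to produce from a frametimes log. A quick sketch in Python with matplotlib, where deltas_a and deltas_b are placeholders for the two systems' per-frame times (loaded however you like):

import matplotlib.pyplot as plt

def frame_delay_histogram(deltas_ms, label):
    # 1 ms buckets up to 100 ms: a tight spike means smooth pacing,
    # a long right tail is the stutter that per-second averages hide.
    plt.hist(deltas_ms, bins=range(0, 101), alpha=0.5, label=label)

frame_delay_histogram(deltas_a, "System A")   # deltas_a/deltas_b: placeholder data
frame_delay_histogram(deltas_b, "System B")
plt.xlabel("ms per frame")
plt.ylabel("frame count")
plt.legend()
plt.show()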
You need to measure at a more detailed level to understand how smooth the game is. The graphics card and the CPU work asynchronously: the moment a picture is drawn by the GPU is not the moment the CPU calculated the points for it. The GPU is always behind the CPU, and it also smears out the time spans between the CPU's calculations.
To get a better value you need to time the EndScene API calls, if we are talking about Direct3D.
How the game feels depends on when the picture is calculated, not when it is drawn on the screen. The mouse is important too; reading mouse data needs to be very exact.
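I can't post a real D3D hook here, but the principle is just to timestamp the moment each frame is submitted rather than the moment it appears on screen. Schematically, in Python, with render_one_frame as a stand-in for the game's actual frame function:

import time

def run_frames(render_one_frame, n_frames):
    """Timestamp each frame *submission* (the analogue of EndScene
    returning), not the on-screen flip, and return ms between them."""
    stamps = []
    for _ in range(n_frames):
        render_one_frame()                  # game logic + draw-call submission
        stamps.append(time.perf_counter())  # CPU-side completion time
    return [1000.0 * (b - a) for a, b in zip(stamps, stamps[1:])]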
You know, guys, graphs & charts do not prove everything :)
The CoH chart from hornet actually "proves" this thread's point.
I think a better term than "smooth" might be "stable framerates", as the drops and speed-ups in fps for the Q9550 are more noticeable while playing, and thus not so smooth.
Obviously vsync will help, but then we're comparing apples to oranges in this thread.
I think that the chart you displayed with the frames and the time would be much more useful than the "max/min" idea or any graph of FPS over time.
Using the data you displayed you could make a graph of the time DELTA (change in time) between frames. (So it would show the time spent between each frame.)
Optimally this graph should be as flat as possible, although realistically it won't be, due to the operating system, rendering different things in the game, communication between CPU cores in multi-threaded games, cache contention in the CPU, and a zillion other factors. (I.e., some factors are related to the CPU and some are not; a good tester would attempt to eliminate most factors except for the CPU.)
BUT if you look at a graph of the delta versus the frame number, generally the system with the flatter graph would probably be the "smoother" system, or the one without the occasional major dip that the other system has. Of course this would have to be reproducible; a good tester would throw out weird one-time anomalies and not use them. (A sketch of how to produce such a plot is below.)
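Something like this would produce the plot I'm describing, again in Python with matplotlib; deltas_a and deltas_b are placeholders for the two systems' per-frame time traces:

import matplotlib.pyplot as plt

def plot_frame_deltas(deltas_ms, label):
    # Flatter trace = smoother play; isolated spikes are the stutters
    # that an average-fps number completely hides.
    plt.plot(deltas_ms, label=label, linewidth=0.7)

plot_frame_deltas(deltas_a, "System A")   # placeholder traces
plot_frame_deltas(deltas_b, "System B")
plt.xlabel("frame number")
plt.ylabel("ms since previous frame")
plt.legend()
plt.show()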
NOTE: I know the image I am showing below is NOT REALLY RELATED. I am throwing it into this post as an example of something similar to what we might expect to see from the type of graph I propose. POP QUIZ: if this WAS a graph of the time delta versus frame, which system would you rather have? (And amusingly, the system with the higher min and max in this unrelated example is also not the "better" system.)
http://img3.imageshack.us/img3/3779/...examplevj5.jpg
<sarcasm mode ON>
We're all just shocked that you, of all people, find this to be a non-issue.
<sarcasm mode OFF>
This is most especially true since we almost have a constructive conversation going on. That can NOT be allowed. You'll need to run to the other forum subsection and gather some of your cohorts.
Actually I'm sure I speak for many people when I say your comment is worthy of a very large yawn.
And changing the vsync setting won't actually fix anything; it might just hide the issue from any method that could be used to measure the difference. (But then, hiding the issue is probably acceptable to some people.)