
Thread: Kepler Nvidia GeForce GTX 780


  1. #1
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    480
    Quote Originally Posted by bill_d View Post
    I don't understand, it still beats the 7970 at 1600p in most games by the looks of it.

    Quote Originally Posted by Kristoferr View Post
    Umm... that is exactly what I was afraid of. Minimum FPS sucks badly. Let's hope that's a fresh-drivers issue and not Nvidia's new dynamic OC issue. Avg FPS says nothing. Minimum FPS is what counts.
    Minimum fps is a worthless statistic that tells you absolutely nothing about what it's like to play on the system. If you have one slowdown it completely skews the result and will show that system A is worse than system B, yet system B could be in microstuttering hell and you'd never know it.
    Last edited by Cold Fussion; 03-21-2012 at 10:51 PM.

  2. #2
    Xtreme Member
    Join Date
    Dec 2006
    Posts
    247
    Quote Originally Posted by Cold Fussion View Post
    Minimum fps is a worthless statistic that tells you absolutely nothing about what it's like to play on the system. If you have one slowdown it completely skews the result and will show that system A is worse than system B, yet system B could be in microstuttering hell and you'd never know it.
    I'm not even going to make a counter argument to this. You are just so wrong and I think you know it.

  3. #3
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    480
    Quote Originally Posted by Kristoferr View Post
    I'm not even going to make a counter argument to this. You are just so wrong and I think you know it.
    Actually I'm not. Let me show you.

    Here is a short run through BF3 on my system; the min fps is 10, the average is 35 and the max is 74.

    [frame render time plot]

    The y axis is render time in ms and the x axis is time through the run. The largest spike is around 96 ms, which is an instantaneous fps of about 10. There are very few moments when the fps is this low, and while it is annoying, it doesn't really constitute a tangible degradation of the experience, because these worst-case frames are such a small percentage of the overall run.
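    For anyone following along, the instantaneous fps figure is just the reciprocal of the frame time; a one-line sketch of the conversion used above:

    ```python
    def instantaneous_fps(frame_time_ms: float) -> float:
        """Instantaneous fps is the reciprocal of a single frame's render time (in ms)."""
        return 1000.0 / frame_time_ms

    # The ~96 ms spike above corresponds to roughly 10 fps:
    print(round(instantaneous_fps(96), 1))  # → 10.4
    ```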

    Now let's look at a boxplot with a 1.5 IQR whisker length:

    [boxplot of frame render times]

    This paints an interesting picture of what is going on. We can already see an enormous number of outliers in the data (roughly 10% of all frames are outliers by this definition), but near the top, where our min fps is, there is a very, very low concentration of frames, so small that it's insignificant compared to the clusters from 35-50 ms. If we look at the percentiles of this data, we find the 99th percentile is 59 ms; that means only 1% of frames ever take over 59 ms. Going further, the 99.5th percentile is 64 ms and the 99.9th percentile is 86 ms. Have you noticed something yet? Less than 0.1% of the frames are anywhere near the maximum rendering time. To reach the maximum rendering time you need to jump to the 99.99th percentile: only 0.01% of the data is at the minimum fps. The minimum fps is insignificant to the overall dataset because it sits at its outermost extreme. To see what really degrades performance you have to dig deeper into the data. So what happens if we throw away all the data we considered outliers before? We get this:


    [frame render time plot, outliers removed]
    (This plot starts from zero so we don't exaggerate the spikes too much.)

    Now this is really worrisome: we've thrown away all our outliers and the rendering time is still all over the place. This is a textbook example of microstuttering, and if you simply looked at min, average and maximum fps you would have absolutely no idea that the gameplay was this abysmal. Even looking at percentiles, we do not see the full extent of how degraded the gameplay on this setup is. Now I've done some further analysis on this data set just to give you an idea of what is going on. What I've done is count the number of frame spikes and the number of periods of sustained low fps (excerpt from the code comments so you can see the methodology):

    Code:
    	%% Frame render time difference of 10 ms between frames
        % This is to find frame spikes; a large difference in rendering time
        % between consecutive frames is an indicator of stuttering-type
        % gameplay.
    Code:
    	%% Find how many frames constitute a lowering of fps.
        % A lowering of FPS is taken as a frame time difference of at least 8
        % milliseconds, with a rendering time of at least 33.5 ms. A
        % steady rate of 33.5 ms isn't considered a lowering of fps.
    Using this, we find that for 20% of the frames the very next frame differs in rendering time by at least 10 ms, and 2.6% of the time there is a sustained period of low fps. This is exactly what we saw from the graphs: while the bulk of the time the fps wasn't low, it was all over the place, and that is what really degrades the experience, not just the minimum fps.
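    A rough Python translation of that methodology (the 10 ms, 8 ms and 33.5 ms thresholds come from the comments above; the frame-time log here is made up for illustration since the original isn't posted):

    ```python
    import numpy as np

    def count_spikes(frame_times_ms, spike_delta=10.0):
        """Count frame-to-frame render time jumps of at least spike_delta ms."""
        diffs = np.abs(np.diff(frame_times_ms))
        return int((diffs >= spike_delta).sum())

    def count_sustained_low_fps(frame_times_ms, delta=8.0, low=33.5):
        """Count frames that both jump by at least `delta` ms and render slower
        than `low` ms; a steady 33.5 ms rate alone is not counted as a lowering."""
        diffs = np.abs(np.diff(frame_times_ms))
        slow = frame_times_ms[1:] >= low
        return int(((diffs >= delta) & slow).sum())

    # Hypothetical log: mostly smooth ~25 ms frames with a few stutters
    times = np.array([25, 26, 24, 45, 25, 60, 34, 25, 25, 90], dtype=float)
    print(count_spikes(times))             # → 5 jumps of >= 10 ms
    print(count_sustained_low_fps(times))  # → 4 jumps landing above 33.5 ms
    ```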

    TLDR: Using the average, min and max doesn't give you an indication of how the game actually plays; it simply gives you the upper and lower bounds and a smoothed result.
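    The kind of percentile breakdown used above is easy to reproduce; here's a minimal numpy sketch, with synthetic frame times standing in for the original FRAPS-style log (which isn't attached):

    ```python
    import numpy as np

    # Synthetic frame-time log in ms (stand-in for the BF3 run above);
    # a gamma distribution gives a realistic bulk around 35-50 ms with a long tail
    rng = np.random.default_rng(42)
    frame_times = rng.gamma(shape=9.0, scale=4.5, size=10000)

    print(f"min fps: {1000 / frame_times.max():.1f}")
    print(f"avg fps: {1000 / frame_times.mean():.1f}")
    print(f"max fps: {1000 / frame_times.min():.1f}")
    for p in (99, 99.5, 99.9, 99.99):
        print(f"{p}th percentile frame time: {np.percentile(frame_times, p):.1f} ms")
    ```

    The point the percentiles make is visible immediately: the worst single frame (the "min fps") sits far out in the tail, while the percentiles describe how often you actually get slow frames.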
    Last edited by Cold Fussion; 03-22-2012 at 12:33 AM.

  4. #4
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Quote Originally Posted by Cold Fussion View Post
    TLDR: Using the average,min,max doesn't give you an indication of how the game is actually playing, it simply gives you the upper and lower bounds and a smoothed result.
    The answer is that all of the factors matter, but you need a frame of reference (a time trend) for them to make sense.

    Quote Originally Posted by Mechanical Man View Post
    I would expect no help from AMD nor nVidia when doing fair review.
    Hahaha! So true

  5. #5
    Xtreme Member
    Join Date
    Jun 2005
    Posts
    442
    Quote Originally Posted by Cold Fussion View Post
    Minimum fps is a worthless statistic that tells you absolutely nothing about what it's like to play on the system. If you have one slowdown it completely skews the result and will show that system A is worse than system B, yet system B could be in microstuttering hell and you'd never know it.
    Minimum FPS tells you how consistent the framerate is between the highest and lowest frames during a section of the game. If the minimum framerate is lower, it means the game slows down more, and that is never good. You always want the minimum framerate to be as high as possible; that ensures a smooth gameplay experience throughout. As a metric of performance, it tells you how well your system copes with the data load in the most intense conditions of that scene.

    Basically, you're dead wrong.
    PII 965BE @ 3.8Ghz /|\ TRUE 120 w/ Scythe Gentle Typhoon 120mm fan /|\ XFX HD 5870 /|\ 4GB G.Skill 1600mhz DDR3 /|\ Gigabyte 790GPT-UD3H /|\ Two lovely 24" monitors (1920x1200) /|\ and a nice leather chair.

  6. #6
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594
    Quote Originally Posted by Mad Pistol View Post
    Minimum FPS tells you how consistent the framerate is between the highest and lowest frames during a section of the game. If the minimum framerate is lower, it means the game slows down more, and that is never good. You always want the minimum framerate to be as high as possible; that ensures a smooth gameplay experience throughout. As a metric of performance, it tells you how well your system copes with the data load in the most intense conditions of that scene.

    Basically, you're dead wrong.
    Minimum FPS is not very useful. If you have a benchmark that is 120 s long and the framerate dips quite low exactly once on card A but not on card B, what does that tell you about general playability?

    It would be much better to do something like the FEAR 1 benchmark, for example:

    x% above 40
    y% between 25 and 40
    z% below 25

    This way, you weight the frequency of occurrence of specific fps values or fps intervals rather than take only a single value that may very well have been an unrepeatable fluke.
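    That FEAR-style breakdown is trivial to compute from a per-second fps log; a minimal sketch, using the 25/40 fps thresholds from the example above (the sample data is invented):

    ```python
    def fps_buckets(fps_samples, low=25, high=40):
        """Return the share of samples above `high`, between `low` and `high`
        (inclusive), and below `low`."""
        n = len(fps_samples)
        above = sum(1 for f in fps_samples if f > high) / n
        below = sum(1 for f in fps_samples if f < low) / n
        between = 1.0 - above - below
        return above, between, below

    # Hypothetical per-second fps samples from a 120 s run
    samples = [55, 48, 42, 38, 30, 22, 60, 45, 33, 18]
    above, between, below = fps_buckets(samples)
    print(f"{above:.0%} above 40, {between:.0%} between 25 and 40, {below:.0%} below 25")
    ```

    Unlike a single min-fps number, the three shares distinguish a one-off dip from a run that spends a large fraction of its time below a playable threshold.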

  7. #7
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,656
    Look at the min fps hit in Batman: the 680 takes the big hit once, early in the run, while over the rest of the run it performs very strongly. You can't reasonably base a judgment about gameplay on one instance in the entire run. If it were repeatedly hitting that min fps, sure, it would be an issue.

    Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
    3x2048 GSkill pi Black DDR3 1600, Quadro 600
    PCPower & Cooling Silencer 750, CM Stacker 810

    Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
    3x4096 GSkill DDR3 1600, PNY 660ti
    PCPower & Cooling Silencer 750, CM Stacker 830

    AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
    2x2gb Patriot DDR2 800, PowerColor 4850
    Corsair VX450
