Defining framerates and methods of describing framerate fluctuations *spawn

Given a particular point in time, how are you proposing that we determine what the framerate is?

Like I said above, my preference would be to also include a table listing the relative distribution of frames per second (1..30 or 1..60) for the capture period. Folks preferring a simplified "mostly 30" get it; those who want to know how variable the framerate (fps) is get it too.
 
I wasn't asking you how you think framerate results should be published. I was asking you how you propose to get the framerate data in the first place. That is, how do we measure framerate from a video feed, if not by (number of new frames in last deltaT)/(deltaT)?

You're telling us that averaging is terrible, but you've provided no alternative way of figuring out what the framerate is.
 

Sure, it's not a massive departure from what DF do now. You capture your footage at 60 samples per second, analyse the capture to determine the number of unique (and torn) frames across each successive 60 samples (one second), then populate the distribution table for the number of unique frames per second. Record torn frames separately.

DF already do much of this analysis; what they don't do is populate a distribution table showing how common 1fps, 2fps ... 30/60fps are.
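As a rough sketch of that counting (not DF's actual pipeline; `frame_ids` and duplicate detection by simple equality are hypothetical stand-ins for real image comparison, and torn-frame handling is omitted):

```python
from collections import Counter

SAMPLES_PER_SEC = 60  # 60 Hz capture

def fps_distribution(frame_ids):
    """frame_ids: one identifier per capture sample; a repeated id means
    the same frame was displayed again. Returns a Counter mapping
    unique-frames-per-second -> number of seconds with that count."""
    table = Counter()
    for s in range(0, len(frame_ids) - SAMPLES_PER_SEC + 1, SAMPLES_PER_SEC):
        prev = frame_ids[s - 1] if s > 0 else object()  # sentinel: first sample counts as new
        new_frames = 0
        for fid in frame_ids[s:s + SAMPLES_PER_SEC]:
            if fid != prev:
                new_frames += 1
            prev = fid
        table[new_frames] += 1
    return table

# A locked 30fps minute yields Counter({30: 60}); a variable game might
# yield Counter({30: 52, 28: 5, 25: 3}), making the spread immediately visible.
```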
 
That's a moving average with a one-second deltaT.

You're not describing anything different from the "terrible" averaging that you say we're talking about, except that you've locked into the particular case of a 1-second (60 frames of a 60fps video stream) moving average, as opposed to a half-second (30-frame) average or a 4-second (240-frame) average or some such business.
 
No running mean for the reason I gave a few posts above (to Shifty). Given a capture at 60 samples per second, the first 60 samples represent the first second, the next 60 samples represent second 2.
 
I honestly don't see the difference between what you suggest and what everyone else (well, the function posse) is saying. :confused: You take 60 samples in one second. You find 54 unique frames. You end up with 54 frames in one second of 60 samples: you get 54 fps.

I get one second of video footage. I find 54 unique frames in it. I get 54 fps. As we're working with display outputs fixed at a 60 Hz interval, the one second of capture data is exactly 60 frames, as you describe. You then have a new average for each second of captured footage, which is one of the described options for calculating the averages, and that average fails to account for how it is produced (doubled frames or partial frames), although I'm guessing that at the moment we're sticking to the vsync discussion and don't have to worry about torn frames, so this last part is ignorable. But I'm still otherwise confused as to how "frames per 60 frames at 60 Hz in one second of video" is different to "frames per second of video clip at 60 Hz". :confused::confused:
 
No running mean for the reason I gave a few posts above (to Shifty). Given a capture at 60 samples per second, the first 60 samples represent the first second, the next 60 samples represent second 2.
Err, that doesn't really change anything; it's the same as taking the 1-second moving average, except that you're only recording said average at 1-second intervals. The individual corresponding data points are still calculated in the same way.

At any rate, it doesn't even begin to solve the core concern that was brought up: any given framerate measurement is an average throughput over a certain time interval, which means it still fails to distinguish the distribution of phenomena on a smaller scale than the time interval. You still won't be able to distinguish, for instance, a second that consists of 30 frames at 16ms intervals followed by a 0.5-second freeze from a second that consists of 30 frames at steady 33ms intervals.

Ironically, your approach actually makes the problem worse, since with the full moving average data, the fine-grained time scale allows you to infer info about where drops occur based on how the moving average is trending (i.e. a flat line in a moving average will start to turn downward at exactly the time a previously-locked frame output misses a refresh). That ability is lost if you only record the moving average at widely-separated intervals.
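To make the relationship concrete, a minimal sketch, assuming `new_frames` is a per-sample list of 0/1 flags marking whether that capture sample showed a new unique frame; the per-second count is literally the 1-second moving average kept only at every 60th sample:

```python
def moving_average_fps(new_frames, window=60):
    # New-frame count over the last `window` samples, recorded at every
    # sample: the 1-second moving average at a 60 Hz capture rate.
    return [sum(new_frames[i - window:i])
            for i in range(window, len(new_frames) + 1)]

def per_second_counts(new_frames, window=60):
    # The "count unique frames per second" method: the same series,
    # sampled once per `window` samples instead of at every sample.
    return moving_average_fps(new_frames, window)[::window]
```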
 
Err, that doesn't really change anything; it's the same as taking the 1-second moving average, except that you're only recording said average at 1-second intervals. The individual corresponding data points are still calculated in the same way.
It's not an average, it's the exact number of unique frames in the sample period (one second - 60 samples).

At any rate, it doesn't even begin to solve the core concern that was brought up: any given framerate measurement is an average throughput over a certain time interval, which means it still fails to distinguish the distribution of phenomena on a smaller scale than the time interval. You still won't be able to distinguish, for instance, a second that consists of 30 frames at 16ms intervals followed by a 0.5-second freeze from a second that consists of 30 frames at steady 33ms intervals.

I entered at function's post here, which wasn't about this. Inconsistent frame pacing (or stuttering framerate) is a different discussion which has emerged along the way. But this isn't an unsolvable problem and can also be expressed - again using empirical capture data. Instead of counting unique frames within the sample period (1 second), at each sample (a 60th of a second) you record the number of samples passed since the last unique frame.

At 60 samples per second, a 30 fps game should be outputting a new unique frame every second sample, so a sample recording would show 0, 1, 0, 1, 0, 1 - again, you put this into a distribution table so that hitches (no unique frames for 2+ samples, etc.) are recorded.
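A hypothetical way to build that table from the same per-sample 0/1 new-frame flags; this variant tallies, at each new frame, how many samples elapsed since the previous one, so a hitch shows up as a large gap:

```python
from collections import Counter

def gap_distribution(new_frames):
    """Tally how many capture samples each unique frame took to arrive."""
    gaps = Counter()
    since_last = 0
    for is_new in new_frames:
        since_last += 1
        if is_new:
            gaps[since_last] += 1
            since_last = 0
    return gaps

# Steady 30 fps: gap_distribution([0, 1, 0, 1, 0, 1]) -> Counter({2: 3})
# With a hitch:  gap_distribution([0, 1, 0, 0, 0, 1]) -> Counter({2: 1, 4: 1})
```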

Ironically, your approach actually makes the problem worse, since with the full moving average data

For the third time, not an average and not a running mean :rolleyes:
 
Okay, don't call it an average but call it a rate. Same difference. 60 frames per second (or 54, or whatever), whether you count them as frames out of 60 or frames in 1 second.

No. A rate and an average are very different things. A rate can be, as used here, a measure of something (unique frames) against another measure (60 samples). An average distils a single value from a list of values. There is no list of values here, there is just one: the number of unique frames.
 
It's not an average, it's the exact number of unique frames in the sample period (one second - 60 samples).
Which just so happens to be equivalent to the average rate of frame output over the course of the second. Semantics.

Look at it from the perspective of using half-second intervals instead of full-second intervals. If you count that 15 frames were spat out over the first half-second, and 10 frames were spat out over the second half-second, and 60 frames were spat out over the next 2 seconds, isn't it reasonable to say "during the first half-second there was an average framerate of 30fps, during the second half-second there was an average framerate of 20fps, and over the next two seconds there was an average framerate of 30fps"? There's nothing special about intervals of 1 second that exclude them from such declarations.

Alternately (although this requires thinking of each frame as a certain sort of bin): At each sample (frame), you're recording how many new frames showed up. With a vsync'd game this is either 0 or 1, but with vsync off this can theoretically be higher. Now does "average" seem like a vaguely appropriate term (albeit scaled to 1/second as opposed to the native frame interval)?
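Spelling out the arithmetic from the half-second example: whatever interval you count over, the result normalises to the same per-second unit.

```python
def average_fps(frame_count, interval_seconds):
    # An interval's frame count divided by its length, expressed in
    # the familiar per-second unit regardless of the interval chosen.
    return frame_count / interval_seconds

print(average_fps(15, 0.5))    # 30.0 fps over the first half-second
print(average_fps(10, 0.5))    # 20.0 fps over the second half-second
print(average_fps(60, 2.0))    # 30.0 fps over the next two seconds
print(average_fps(1, 1 / 60))  # 60.0 fps from one new frame in a single sample
```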

(Although at some level, the fact that I'm calling it an "average" isn't really very important; it's the understanding of filter structure, and in particular, width.)

I entered at function's post here, which wasn't about this. Inconsistent frame pacing (or stuttering framerate) is a different discussion which has emerged along the way.
It isn't all that different of a discussion at all. Different choices of interval result in expressing frame throughput phenomena on different scales; if DF and the other videos use different intervals and such, they're going to express phenomena differently. Longer intervals will cause shorter phenomena to be represented lower in magnitude but broader in time-scale.

A stutter is just a very short phenomenon (and, in particular, requires a very brief interval to be clearly shown, if we're trying to represent it in terms of fps).

But this isn't an unsolvable problem and can also be expressed - again using empirical capture data. Instead of counting unique frames within the sample period (1 second), at each sample (a 60th of a second) you record the number of samples passed since the last unique frame.

At 60 samples per second, a 30 fps game should be outputting a new unique frame every second sample, so a sample recording would show 0, 1, 0, 1, 0, 1 - again, you put this into a distribution table so that hitches (no unique frames for 2+ samples, etc.) are recorded.
Yes, those are all options for expressing fine aspects of frame output behavior.
 
Which just so happens to be equivalent to the average rate of frame output over the course of the second. Semantics.

It's not the average frame rate, it's the actual number of unique frames recorded over 60 samples. How exactly are you deriving the mean from one number?

Look at it from the perspective of using half-second intervals instead of full-second intervals. If you count that 15 frames were spat out over the first half-second, and 10 frames were spat out over the second half-second, and 60 frames were spat out over the next 2 seconds, isn't it reasonable to say "during the first half-second there was an average framerate of 30fps, during the second half-second there was an average framerate of 20fps, and over the next two seconds there was an average framerate of 30fps"? There's nothing special about intervals of 1 second that exclude them from such declarations.

The reason intervals of 1 second are so common for measuring frame rates is because frames per second is the most easily understood and relatable metric for your average user. That's why you'll never see DF (or anybody else, I'll wager) refer to frames-per-half-second (FPHS) or frames-per-hour (FPH).

But I refer you back to my post to Shifty a page back, when he proposed using shortened sample periods. If your goal, as Digital Foundry's is, is to give the reader a representative idea of the typical number of frames per second a game will deliver, then you want to use as much sample data as possible. What if you focus on a small sample where there is a hitch caused by I/O waiting? Is that representative of the performance? Of course not. That's why it's important to capture accurate raw data (and not calculate it from tiny samples), then express it in a way which is accurate: avoid the mean at all costs; use the mode for preference.

A stutter is just a very short phenomenon (and, in particular, requires a very brief interval to be clearly shown, if we're trying to represent it in terms of fps).
My method was intended to accommodate the exaggerated example that Shifty, I think, gave earlier. And I think that was less about occasional stutter and more about generally inconsistent pacing of frames. Pacing of anything is generally easy to measure and even easier to express visually: a consistent 1-0-1-0-1 output plotted graphically will produce a nice sawtooth, and inconsistent pacing will stick out.
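For illustration, one way to plot such recordings (assuming matplotlib; both sequences are made up for the example):

```python
import matplotlib.pyplot as plt

# Per-sample "samples since last unique frame" values at a 60 Hz capture.
# Steady 30 fps alternates 0,1; a hitch climbs 0,1,2,3 before resetting.
steady = [0, 1] * 30
hitchy = [0, 1] * 13 + [0, 1, 2, 3] + [0, 1] * 15

plt.plot(steady, drawstyle="steps-post", label="steady 30 fps")
plt.plot(hitchy, drawstyle="steps-post", label="with a hitch")
plt.xlabel("capture sample (60 Hz)")
plt.ylabel("samples since last unique frame")
plt.legend()
plt.show()
```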
 
It's not the average frame rate, it's the actual number of unique frames recorded over 60 samples.
Unless you're using some number of samples other than 60.

How exactly are you deriving the mean from one number?
First, there's nothing technically invalid about doing that, it's just a trivial case of "mean."

Second, I'm not doing that. Every single frame of video gives you information about how many new frames were spewed out over the last 1/60th of a second, which is information that's dimensioned equivalent to fps. Add up 60 of those samples, and divide by the time interval, and you have an average rate over the interval.

The reason intervals of 1 second are so common for measuring frame rates is because frames per second is the most easily understood and relatable metric for your average user. That's why you'll never see DF (or anybody else, I'll wager) refer to frames-per-half-second (FPHS) or frames-per-hour (FPH).
Right, but that has absolutely nothing to do with how we're measuring it. You're conflating sample period with output unit. Sampling over half a second or one hour doesn't mean you can't still express the result in frames per second.

But I refer you back to my post to Shifty a page back, when he proposed using shortened sample periods. If your goal, as Digital Foundry's is, is to give the reader a representative idea of the typical number of frames per second a game will deliver, then you want to use as much sample data as possible. What if you focus on a small sample where there is a hitch caused by I/O waiting? Is that representative of the performance? Of course not.
Well, a graph of such information is representative of a certain aspect of performance. That's the point: there really isn't a "best" filter here. It depends on what you're trying to express.
 
Unless you're using some number of samples other than 60.

Which would be dumb because the consoles output at 60 frames per second.

First, there's nothing technically invalid about doing that, it's just a trivial case of "mean."

Nothing "technically invalid"? It's nonsense.

Second, I'm not doing that. Every single frame of video gives you information about how many new frames were spewed out over the last 1/60th of a second, which is information that's dimensioned equivalent to fps.

This is also nonsense. You're on my ignore list. Goodbye.
 
HardwarePal do it best IMO (in some of their reviews, anyway), showing average, max and min FPS along with average frame times for the 99%, 1% and 0.1% percentiles, as well as an actual frame time chart over the whole benchmark run. You can't really get much better than that for assessing frame rate. Digital Foundry's performance videos also give you everything you need, though, since they often show frame time there. Frame time is king.
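For reference, a sketch of how such a summary might be computed from a list of frame times in milliseconds (`frametime_summary` is a hypothetical helper, not HardwarePal's actual method, and it takes percentile cut-offs rather than averaging within each bucket):

```python
import statistics

def frametime_summary(frame_times_ms):
    """Average/min/max FPS plus the worst 1% and 0.1% frame time marks."""
    times = sorted(frame_times_ms)
    n = len(times)
    return {
        "avg_fps": 1000.0 / statistics.mean(times),
        "min_fps": 1000.0 / times[-1],              # slowest frame
        "max_fps": 1000.0 / times[0],               # fastest frame
        "1%_frametime_ms": times[int(n * 0.99)],    # boundary of the worst 1%
        "0.1%_frametime_ms": times[int(n * 0.999)], # boundary of the worst 0.1%
    }
```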
 
HardwarePal do it best IMO (in some of their reviews, anyway), showing average, max and min FPS along with average frame times for the 99%, 1% and 0.1% percentiles, as well as an actual frame time chart over the whole benchmark run. You can't really get much better than that for assessing frame rate.

Definitely better than mean alone but the issues with mean are why median and mode exist.

Take the unlikely FPS over six seconds of 1fps, 30fps, 1fps, 30fps, 1fps and finally 30fps. The mean is 15.5fps, but at no point did the user experience 15.5fps. Unlikely, but illustrative, and why I wage an eternal war against the mean being used anywhere in statistics. ;)
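Worked through with the standard library (Python 3.8+ for `multimode`); notably the median fares no better than the mean on this particular series, while the modes recover the two rates actually experienced:

```python
import statistics

fps = [1, 30, 1, 30, 1, 30]
print(statistics.mean(fps))       # 15.5 -- never experienced
print(statistics.median(fps))     # 15.5 -- also never experienced here
print(statistics.multimode(fps))  # [1, 30] -- the rates the user actually saw
```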
 

But that's where the frame time chart comes in. And to a lesser extent, the frame time percentiles. Although it may be more useful to give the percentage of frames that took more than 33.3ms or 16.7ms to render, rather than the average frame time of the lowest 1%, etc.
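A minimal sketch of that threshold idea, assuming a list of frame times in milliseconds:

```python
def percent_slower_than(frame_times_ms, threshold_ms):
    # Percentage of frames whose frame time exceeded the threshold.
    slow = sum(1 for t in frame_times_ms if t > threshold_ms)
    return 100.0 * slow / len(frame_times_ms)

# percent_slower_than(times, 33.3)  -> share of frames that missed the 30 fps budget
# percent_slower_than(times, 16.7)  -> share of frames that missed the 60 fps budget
```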
 
This is against my better judgement, and it's sort of pointless, but I can't bring myself to not respond. :/

Nothing "technically invalid"? It's nonsense.
It's nonsense in the sense that it doesn't really embody what people mean when they talk about averages, but it's a perfectly clearly-defined usage of the concept of "mean." Obviously the mean of a single sample doesn't offer a ton of confidence in determining population means, but as ultra-simplified cases in mathematics go, if anything it's relatively non-degenerate.

This is also nonsense.
How so?

A frame of source video is, in a way, a sample entity allowing me to see what happened during a 60th of a second. If I see one new frame of game output in a frame of source footage, then I have (1 frame)/(1/60 s). If I rewrite this to separate the coefficient from the bottom unit, this is 60fps. Similarly, if I see 0 new frames of game output in a frame of source footage, that's 0fps. This extremely short sample period makes the sample values heavily quantized, but since it's quantized in a way which doesn't round "partial frames" away (rather, it counts them "as members of preceding/following frames"), it shouldn't apply a measurement bias and should be simply an imprecision that we can mitigate via averaging over multiple "samples".

Let's suppose we sample a video file and start seeing a pattern like 1 0 1 0 1 0 1 0. Our framerate interval samples thus look like 60fps 0fps 60fps 0fps 60fps 0fps 60fps 0fps.

If you start averaging those, you'll find it's pretty clearly converging to 30fps, which is what you'd intuitively expect from the every-other behavior. If you average them over a whole second, the result you get will be equivalent to the "count the number of frames that showed up in that second" method.
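Numerically, for one second of that pattern:

```python
per_sample_fps = [60, 0] * 30  # one second of 1,0,1,0,... seen as fps samples

running = [sum(per_sample_fps[:n]) / n for n in range(1, 61)]
print(running[:4])  # [60.0, 30.0, 40.0, 30.0] -- oscillating toward 30
print(running[-1])  # 30.0 -- identical to counting 30 new frames in the second
```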

FPS means "frames per second", which is merely a rate, not "number of frames counted during a 1-second sampling interval." Requiring 1 second (60 frames of a 60fps source video) of accumulation to comprise "a sample", as you have done, is an entirely arbitrary choice of sampling interval; there's nothing intrinsically natural about using 1 second. Does it produce a meaningful "sample" that you can work with using methods of statistical analysis? Absolutely, but it's also an average of values sampled over smaller time-scales (which can themselves be worked with using the same methods of statistical analysis). The only thing distinguishing "1 second" from other intervals is that we've picked our units such that the denominator of the frames/time fraction has a coefficient of 1, which makes the division "invisible" (and, as a result, the framerate measurement is "just a count" instead of "a count followed by a division operation").
 
But that's where the frame time chart comes in. And to a lesser extent, the frame time percentiles.

Yes, this is what I've been suggesting here - as far back as April. It's the only way to express a full and accurate distribution of frames per second :)
 