Defining framerates and methods of describing framerate fluctuations *spawn

What the frame rate comes out as depends entirely on the period you calculate it over.
Well, it's frames per second, and the final number will be the median of the FPS figures for each second over the period of capture.

With vsync enabled, I've always found it easier to think in terms of a 20 fps/Hz update, a 30 fps/Hz update, a 60, etc., because that's the rate at which successive buffers are flipping.

But you realise there's no fallback from 60 to 30, 30 to 20, 20 to 15? Actual frame rates (FPS) rarely jump like that. Tomb Raider Remastered on PS4 is typically in the high 50s, The Last of Us Remastered on PS4 is typically in the 50s except for some sequences where it drops into the high 40s. Far Cry 4 on PS4 is mostly 30 with some dips to the high 20s.

Unless your game/design/engine is lowballing what it's doing relative to the potential of the machine, the odd frame dip when things get hectic is the norm, but 30fps games are dropping a frame or two, not dropping from 30 to 20 frames.
 
The opening sequence alone shows that his numbers (and how they fluctuate), and therefore how they are produced, the averaging method, are very similar to DF's numbers. So whatever argument you make about the 20fps (or any number) indicated by his tools not being a real 20 frames per second produced by the XB1 can be made about any Digital Foundry video.

I never said it wasn't a "real" 20 fps (just like DF never put FC4 performance differences down to the CPU).

Perhaps you've missed the recent discussion I've had about frame rate, but I think it's entirely legitimate to use shorter trailing average periods that are more likely to highlight sharp dips.

What's important is that both methods of counting fps are consistent, so the numbers can be compared.

It's far better to compare his numbers with his numbers, especially as he seems to have played the game a lot more. But yes, they're very similar.

Finally, his video often shows a heavily fluctuating framerate. The 20fps image is just that: a short moment within a fluctuation lasting ~10 seconds, which is much longer than the very short, now famous, ~25fps drop seen in the opening sequence that DF used to conclude that the XB1 version performed slightly better.

It's called cherry picking. It was famously (and successfully at first) used by the tobacco companies to "prove" that tobacco wasn't harmful to people.

And you know this guy isn't cherry picking because...?

I don't think he is, but then again I don't think DF are either. If you're going to accuse one party, then what makes this case different?

Frankly, I think DF didn't play the game enough. Same as with lots of others. But they have limited time and deadlines to meet. That's an entirely different issue to the one you are focused on.
 
In general usage, though, the word "rate" implies a trend.

I don't think so. A rate is a ratio. A trend would require change and not a single, simple data point.

For years, FPS worship actually hid frame rate consistency and frame pacing problems. I think it's beneficial to see things in terms of the characteristics of the individual frames.

But you realise there's no fallback from 60 to 30, 30 to 20, 20 to 15?

Again, it depends on how you measure. If you're looking at the time for each frame then that's exactly how it goes. If you're looking at a longer period, then no.

It's strange that people are happy with the rounded and therefore lossy "16.7 ms" and "33.3 ms", but not the completely accurate 60 Hz (60 cycles in 1 second, 1/60 s ≈ 16.7 ms) and 30 Hz.

Flipping the buffers is a cycle! And it's best measured in terms of cycles per second! And yes, the number of cycles per second can change!

Wut is rong w/ u ppl! :D

Yes, but Function is using a "period of capture" of 1 frame :D

Actually, I'm arguing that "frame rate" can use a period of capture from as little as one frame to as much as several minutes or hours. And it can highlight different things as you change the period of capture.
 
FPS is a rate, and so doesn't need to be calculated over a full second (in the same way that calculating miles per hour doesn't require you measure in periods of an hour).
This, and a lot more besides. Function is right on the money with his recent posts. A rate doesn't need a specific interval to be determined - you don't have to drive for an hour to hit a speed of 30 mph. And a rate can be sampled at any point from a continuous source and needn't be taken at specific regular intervals, although we will when charting changes. And finally, the idea of dropping to 20fps is perhaps more descriptive of the perceptual difference than the average framerate over the past 60 or however many frames will give.

Let's use a simple illustration using a display that refreshes every 200 ms. Over a period of 1 second, maximum framerate will be achieved with frames of times (millisecs):

200,200,200,200,200

5 frames per second average. Now what if the sixth frame takes 201 ms to render? With VSync, it takes up two refresh intervals, so the sixth frame lasts for 400 ms:

200,200,200,200,200,400

The framerate for the past frame is 2.5 fps, the framerate for the past second is 4.17 fps, and the framerate since the beginning is 4.29 fps. Now if we return to a stable 200 ms per frame, the framerate for any second including that drop will be 4.17, while the average across all frames will tend towards 5fps.
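
Here's that arithmetic as a minimal sketch, using the illustrative 200 ms frame times from above (not a real capture); note that whether you count the frame straddling the one-second boundary changes the trailing figure slightly, and counting it gives the ~4.17 quoted:

```python
# Frame display durations (ms) from the illustrative example above.
frame_times = [200, 200, 200, 200, 200, 400]  # the sixth frame is held for two refreshes

# Framerate of the last frame alone: 1 frame over 400 ms = 2.5 fps.
last_frame_fps = 1000 / frame_times[-1]

# Trailing window covering the last second, counting the frame that straddles
# the one-second boundary (5 frames over 1200 ms = ~4.17 fps).
frames, elapsed = 0, 0
for t in reversed(frame_times):
    frames += 1
    elapsed += t
    if elapsed > 1000:
        break
trailing_fps = frames / (elapsed / 1000)

# Average over everything captured so far: 6 frames over 1400 ms = ~4.29 fps.
overall_fps = len(frame_times) / (sum(frame_times) / 1000)

print(round(last_frame_fps, 2), round(trailing_fps, 2), round(overall_fps, 2))
# -> 2.5 4.17 4.29
```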

The important thing for the results is that the drop is far more jarring than the slightly lower framerate. In the case of ~30 fps, a constant 29 fps on a display capable of syncing to that will look just as smooth as 30 fps, whereas a frame being shown for double duration is going to interrupt smoothness no matter what. Hence describing an average as 29.5 fps doesn't sound so bad, but in reality in vsync'd content, the results aren't like a constant 29.5 fps but are 30 fps only with interruptions. Function's last-frame framerate is more indicative of that, although of course it's only one way to present the data. There are always lots of ways to deal with numbers.
 
Does anyone know the interval times for Digital Foundry? I would think the displayed framerate is done in intervals of one second, but if it isn't, and it's longer intervals of time, then like Function is saying, large framerate dips could be hidden by the following recovery period. However, I don't feel it's necessary to break performance down into smaller metrics. We measure framerate in frames rendered per second. In a way, it's like comparing two drag racing cars that both run the 1/4 mile in 10 seconds: it doesn't really matter if after an 1/8th of a mile one car is on pace to do the 1/4 mile in 11 seconds and the other is on pace to do it in 9 seconds. If they both completed it in 10 seconds, that's all that matters, because that's what we are measuring and how we measure it. I understand that breaking it down into smaller segments of time would help showcase long rendering times, since a 60fps game could have a dip that leaves a frame onscreen for 50ms, three frames in a row, but then recovers, and the frames per second would still show 50fps. Even though that's still a high framerate for the second, if you're really sensitive to framerate dips, that 150ms passing with only three frame updates could be somewhat jarring.
 
Again, it depends on how you measure. If you're looking at the time for each frame then that's exactly how it goes. If you're looking at a longer period, then no.

DF's methodology for measuring framerate has been published many times. They measure the number of unique frames output each second (FPS or Frames Per Second), and within that measurement is a separate measurement intended to detect and quantify torn frames. Frames Per Second is how frame rates have been measured for as long as I can remember. I don't know of anybody using a different metric to FPS, or rather median FPS over whatever the capture period is.

So you measure the (unique) framerate for second 1, second 2 and for the entire capture period at 60Hz input, 60 seconds a minute for however many minutes/hours you are capturing. Then you quantify that as a median, i.e. average/median FPS.
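
A rough sketch of that per-second counting as described above (illustrative data only, not Digital Foundry's actual tooling):

```python
from statistics import median

def fps_per_second(captured_frames, refresh_hz=60):
    """Count unique frame IDs within each one-second block of 60 Hz capture samples."""
    return [len(set(captured_frames[i:i + refresh_hz]))
            for i in range(0, len(captured_frames), refresh_hz)]

# Two captured seconds: a perfect one, then one where every frame is shown twice.
capture = list(range(60)) + [100 + i // 2 for i in range(60)]

per_second = fps_per_second(capture)
print(per_second, median(per_second))   # -> [60, 30] 45.0
```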

It's strange that people are happy with the rounded and therefore lossy "16.7 ms" and "33.3 ms", but not the completely accurate 60 Hz (60 cycles in 1 second, 1/60 s ≈ 16.7 ms) and 30 Hz.

It's not strange. Milliseconds is a known constant, but the way you're using variable hertz to measure and quantify framerate (rather than FPS) is meaningless. Oh yeah, this framerate is 20Hz, but the frequency (a single metric of one hertz) changes constantly. How is this helpful? You may as well use glimbarts instead.

But to be clear, I'm only contesting your earlier claim that framerates/frequencies drop from 30 to 20 to 15, rather than at small granularities of 1-2 frames per second. This wouldn't be a common occurrence over several seconds, nor is it statistically likely when employing averages/medians as the period of capture is lengthened across many seconds, minutes and hours.

Actually, I'm arguing that "frame rate" can use a period of capture from as little as one frame to as much as several minutes or hours. And it can highlight different things as you change the period of capture.

Sure, you can capture a single frame. But the reason nobody does is because it's not representative of the performance of playing the game. That's why average/median FPS over many minutes/hours is the popular way to quantify framerate.
 
This, and a lot more besides. Function is right on the money with his recent posts. A rate doesn't need a specific interval to be determined - you don't have to drive for an hour to hit a speed of 30 mph. And a rate can be sampled at any point from a continuous source and needn't be taken at specific regular intervals, although we will when charting changes.
Perhaps I need to speak more clearly, but I maintain that rates are a trend of sorts. Actually, even for continuous sources, your claim that a rate can be sampled at any point without regard to intervals is incorrect, as there are functions that are continuous everywhere but cannot be differentiated in places (sometimes even anywhere). For a derivative to exist, a function needs to be sufficiently (*waves arms around*) smooth, which in turn forces the notion of "trend" to be reasonable to apply, even if sometimes only on a very small scale.

But also, frame outputs aren't a continuous source. Instantaneous speed can be pretty clearly defined, even according to sample-based formulas like (x2-x1)/(t2-t1), by taking the limit as the spacing between the two samples goes to zero. There really isn't an analogous "obvious" thing for framerate, which is why it starts to be a confusing thing as you use a sharper and sharper filter for determining it. The time at which a frame boundary occurs says nothing about rates unless you consider when other frame boundaries are occurring. Any reasonable formula that divides out a frame count by a fixed time delta to get framerate is going to fall apart and behave extremely weirdly (flopping between "0" and huge numbers, for instance) when the time delta gets very small. The way you get Function's numbers is quite different, as it works by using a variable time delta, defining said delta in terms of the time-width of the frame that the current sample location sits within, i.e. fps = (1 frame)/(display duration of current frame). I suppose this sort of definition "works", but it's a little bit weird, and at some level it needs to be described on account of not being an obvious natural choice for "instantaneous rate."
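
A hedged sketch of that per-frame definition (the frame durations, sample times and function name below are all illustrative, not taken from any real tool):

```python
# fps at a sample time = 1 / display duration of whichever frame is on screen then.
frame_durations = [1/30, 1/30, 1/20, 1/30]  # seconds: one 50 ms frame among 33 ms frames

def fps_at(sample_time, durations):
    """Return the per-frame 'framerate' for the frame being displayed at sample_time."""
    boundary = 0.0
    for d in durations:
        boundary += d
        if sample_time < boundary:
            return 1.0 / d
    return 1.0 / durations[-1]  # sample time falls after the last recorded frame

print(round(fps_at(0.01, frame_durations), 1))  # inside a 33 ms frame -> 30.0
print(round(fps_at(0.08, frame_durations), 1))  # inside the 50 ms frame -> 20.0
```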

Frame time is clear and unambiguous without any need to be explained or justified. Saying that a frame displays for 50ms is pretty much exactly what it says on the tin.

And finally, the idea of dropping to 20fps is perhaps more descriptive of the perceptual difference than the average framerate over the past 60 or however many frames will give.
Only if you take the time to describe what you mean by "20fps" and how long it's 20fps for. A long time-average by itself isn't significantly less useful than a single sharp "framerate" or "frame time" value by itself. A single 50ms frame in the midst of a bunch of 33ms frames doesn't have the same experiential feel that I associate with "20fps."

I agree that we shouldn't just use lengthy time-averages to describe performance, but I also think that using "count/time" as the dimension when describing fine characteristics on a discrete distribution is clumsy.
 
FPS is a rate, and so doesn't need to be calculated over a full second (in the same way that calculating miles per hour doesn't require you measure in periods of an hour).

This, and a lot more besides. Function is right on the money with his recent posts. A rate doesn't need a specific interval to be determined - you don't have to drive for an hour to hit a speed of 30 mph. And a rate can be sampled at any point from a continuous source and needn't be taken at specific regular intervals, although we will when charting changes.

You don't need to drive for an hour to hit 30Mph but you do need to render a game for at least a full second to measure anything like an accurate framerate.

If the thing being measured has a low rate of variance, or its rate of change is dictated by known and predictable rules (the laws of physics, for example) that make it impossible for a car to rapidly change speed from 30mph to 10mph to 25mph over a short sampling period, then you can make a case for using a lower sample rate, say one capture per second.

However, a variable framerate is generally not something you can reliably calculate, only something you can measure and then quantify. And because framerates in games are so variable, you need to sample at least at the highest possible output rate to capture the maximum performance possible, at a granularity capable of detecting all of the sudden variations. If you capture five frames that happen to be produced in ~16.7ms each, you could conclude the framerate is 60fps, but if the following three frames all take 18ms to render, then that 60fps figure is an erroneous rate calculation caused by under-sampling.

And this is why nobody calculates frame rates this way, but instead samples (captures) at 60Hz for minutes or hours. It's worth noting that many scientific principles have been overturned because of poor or limited sampling data. Physics has been re-invented many times because previous sampling was too low to observe accurate behaviours, particularly in particle physics. Similarly, geology and meteorology have advanced because of longer periods of sampling. Framerates, which are entirely arbitrary, are something of an unnatural, unpredictable product of man, where both fast and prolonged sampling are required.
 
You don't need to drive for an hour to hit 30Mph but you do need to render a game for at least a full second to measure anything like an accurate framerate.
That depends on what you mean by "accurate" and "framerate." You don't need to sample for a second if your intention is to describe only the behaviors within some half-second that you're interested in. Shifty and Function aren't talking about the average rate throughout a long run of play; they're specifically discussing how we can talk about framerates (in the sense of very short moving average filters) changing in spikes and such.
 
That depends on what you mean by "accurate" and "framerate." You don't need to sample for a second if your intention is to describe only the behaviors within some half-second that you're interested in. Shifty and Function aren't talking about the average rate throughout a long run of play; they're specifically discussing how we can talk about framerates (in the sense of very short moving average filters) changing in spikes and such.

Perhaps we need a separate thread: Performance of games in 1/2 second intervals.

In the DF thread it probably makes sense to use the same methodology for framerate as DF do.
 
In the DF thread it probably makes sense to use the same methodology for framerate as DF do.
DF uses a lengthy time-average filter for their framerate figure, but they discuss sharper phenomena and measure timings down to individual frames. Would you refuse to speak in centimeters if you had a meter stick that was graduated with only millimeter marks, or would you insist on only talking in meters and millimeters?

(Also, Shifty and Function are discussing framerates using a filter of single-frame time width, which is one-to-one and effectively equivalent to DF's frame time measurements. I think it's a silly way to describe things, but it's hardly contextually inappropriate.)
 
Run that past me again in English?
Framerate discusses how many frames get processed over some interval of time. "The number" in DF's performance videos uses a fairly lengthy time interval, which means you don't get information about how frames are distributed on a fine scale. But DF also includes a frame time chart to provide that otherwise smoothed-away information. From this frame time information, you could derive their "framerate" numbers, or numbers using a sharper filter (i.e. a half-second time average), or you could go even finer and talk about individual single-frame stutters. And DF does sometimes talk about those finer characteristics of game performance, so it makes sense to discuss them here, even if "the number" is a very smoothed-out representation of frame throughput.

(Also, even if that weren't the case, considerations of finer information are hardly out of context.)
 
Well, it's frames per second, and the final number will be the median of the FPS figures for each second over the period of capture.



But you realise there's no fallback from 60 to 30, 30 to 20, 20 to 15? Actual frame rates (FPS) rarely jump like that. Tomb Raider Remastered on PS4 is typically in the high 50s, The Last of Us Remastered on PS4 is typically in the 50s except for some sequences where it drops into the high 40s. Far Cry 4 on PS4 is mostly 30 with some dips to the high 20s.

Unless your game/design/engine is lowballing what it's doing relative to the potential of the machine, the odd frame dip when things get hectic is the norm, but 30fps games are dropping a frame or two, not dropping from 30 to 20 frames.

Please don't continue to feed the Digital Foundry-invented rumor. Have you even played the game? You can sometimes play 2 hours of TLOU at a locked 60fps. The game is typically running at a locked 60fps; it rarely drops into the 50s (like once every ~30 min, or at some spots in some big levels, for instance on top of the dam) and drops only once, once, in the whole 20-hour game into the 40s.

Personally the only fps drops I have noticed were during exploration in some big levels at some precise spots (like in the dam level). I have never experienced any drops in any combat scene, not once, and I am very sensitive to judder.

The scenes DF picked in the video are literally the worst moments of the whole 20-hour game. That's the first thing I thought watching the video: those moments were the worst-performing moments of the whole game. This video is not representative of the whole game. Technically TLOU has a solid 60fps framerate, very similar to the solid 30fps framerate of FC4 on both consoles.
 
Framerate discusses how many frames get processed over some interval of time. "The number" in DF's performance videos uses a fairly lengthy time interval, which means you don't get information about how frames are distributed on a fine scale. But DF also includes a frame time chart to provide that otherwise smoothed-away information. From this frame time information, you could derive their "framerate" numbers, or numbers using a sharper filter (i.e. a half-second time average)

I'm still none the wiser here, sorry. Are you saying DF do not capture the actual number of unique frames per second for the entire capture period then report the average/median? I know they don't publish the entire captures, it's usually far less than an hour.

Also, in statistical terms, what is a sharper filter? What's being done to the actual figures?
 
Please don't continue to feed the Digital Foundry invented rumor. Have you even played the game? You can sometimes play 2 hours of TLOU at a locked 60fps.

Yeah I played the original to death and have completed the Remaster twice. There are stretches that hold up well and stretches where 60fps is more of a rarity, generally those with lots of enemies like Alone and Forsaken - the hunters at the bookstore in Pittsburgh.
 
The real problem is not the definition of fps measures or whatever; the problem is that lately DF doesn't seem to be investing enough time in their tests. Despite this, they still draw quite strong and absolute conclusions.

I don't like this trend to be honest.
 
I'm still none the wiser here, sorry. Are you saying DF do not capture the actual number of unique frames per second for the entire capture period then report the average/median? I know they don't publish the entire captures, it's usually far less than an hour.

Also, in statistical terms, what is a sharper filter? What's being done to the actual figures?
I mean "sharper" as in it doesn't smooth out brief behaviors as much. As in, instead of counting how many frames were spat out in the last second and then dividing that by 1 to get "frames per second", a sharper filter might count how many frames appeared in the last quarter-second and divide that by .25 to get "frames per second." Or you can measure how many frames go by over four seconds, and divide that by 4 to get a "less sharp" reading of "frames per second." My language was imprecise.

I'm just saying that the filter DF uses to produce the "framerate" number in their videos smooths over short phenomena to some degree, since it uses a moving average over a "lengthy" time interval. This shows trends and is very readable, but does not clearly show exactly how the frames are spewing out at any given moment; to clear up those questions, however, they also display frame-time information.
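
For example, something along these lines (illustrative frame times; trailing windows of 0.25 s, 1 s and 4 s standing in for "sharper" and "smoother" filters):

```python
def windowed_fps(frame_times_ms, window_ms):
    """Frames counted over (at least) the trailing window, scaled to frames per second."""
    frames, elapsed = 0, 0.0
    for t in reversed(frame_times_ms):
        frames += 1
        elapsed += t
        if elapsed >= window_ms:
            break
    return frames / (elapsed / 1000.0)

# Mostly 60 fps (16.7 ms frames) with one 50 ms stutter right at the end.
trace = [16.7] * 300 + [50.0]

for window in (250, 1000, 4000):
    print(window, round(windowed_fps(trace, window), 1))
# The 250 ms window reacts strongly to the stutter; the 4 s window barely notices it.
```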
 
I mean "sharper" as in it doesn't smooth out brief behaviors as much. As in, instead of counting how many frames were spat out in the last second and then dividing that by 1 to get "frames per second", a sharper filter might count how many frames appeared in the last quarter-second and divide that by .25 to get "frames per second."

OK, I understand. But if you're that focused on the variations, you probably want to avoid dividing and deriving numbers; instead you just publish tables of values (e.g. 1fps to 30/60fps) and show how many seconds in the capture period were at 30fps, 29fps, 28fps and so on.

That way it's clear what proportion of the capture was at what performance.
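
Something like this, perhaps (made-up per-second counts, just to show the presentation):

```python
from collections import Counter

# One minute of per-second unique-frame counts for a notional 30fps game.
per_second_fps = [30] * 55 + [29] * 3 + [28, 25]

for fps, seconds in sorted(Counter(per_second_fps).items(), reverse=True):
    share = 100 * seconds / len(per_second_fps)
    print(f"{fps:>2} fps: {seconds:>2} s ({share:.0f}%)")
# -> 30 fps: 55 s (92%), 29 fps: 3 s (5%), 28 fps: 1 s (2%), 25 fps: 1 s (2%)
```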
 