Defining framerates and methods of describing framerate fluctuations *spawn

Any time you don't meet 30hz and don't tear you drop to 20hz (on a 60hz display).

He and DF are using different trailing averages.
 
Any time you don't meet 30hz and don't tear you drop to 20hz (on a 60hz display).

He and DF are using different trailing averages.

Just watch the video. You'll see plenty of different drops like 25fps or 29fps etc.

The opening sequence alone shows that his tools are calibrated almost like DF's: they give similar results in the opening sequence, where we can see the fps meter displaying roughly the same drops shown in the DF video. First the 25fps drop during the explosion on PS4 (DF showed 26fps, but the graph indicated a lower fps), then similar mid-20s drops on both consoles during the streamed loading just after.
 
Any time you don't meet 30hz and don't tear you drop to 20hz (on a 60hz display).

He and DF are using different trailing averages.
Can you expand on this? I don't get what would cause that behaviour.
 
Any time you don't meet 30hz and don't tear you drop to 20hz (on a 60hz display).

He and DF are using different trailing averages.

That's not really true with triple buffering. For example, in a 60fps game, when you have a framerate drop to 50fps, the updated frame times could look like this: 16.66-16.66-16.66-16.66-33.33-16.66 and so on. If the framerate drops to 50fps in a non-vsynced game, then the game updates every 20ms with torn frames.
 
Any time you don't meet 30hz and don't tear you drop to 20hz (on a 60hz display).

How so? If one frame in a second takes 40ms you'll get 29 frames not 20. Drop 2 frames and you'll get 28, drop 3 and you get 27 and so on. You'd have to have 10 frames exceed 32ms to get 20fps.
 
Just watch the video. You'll see plenty of different drops like 25fps or 29fps etc.

Yeah, but he still seems to be using a different length of trailing average, so there will be differences, as shown when he goes over the sequence DF did. Any idea how he handles measuring tearing and associated frame rate measurements?

He says in the video that DF put the frame rate dips on console down to the CPU, but I can't find DF doing this in their article. An additional check of the DF article for the string "CPU" or "processor" yielded precisely no results either. He appears to have made this up (unless I've missed it), and then criticised DF for it in his video commentary. Which is decidedly un-cool.

As in the DF article, performance differences between the two platforms are minimal, with the extra resolution of the PS4 being the clear advantage.
 
Can you expand on this? I don't get what would cause that behaviour.

How so? If one frame in a second takes 40ms you'll get 29 frames not 20. Drop 2 frames and you'll get 28, drop 3 and you get 27 and so on. You'd have to have 10 frames exceed 32ms to get 20fps.

With vsync on a 60 hz display, each frame-display cycle will complete at 60hz, 30hz, 20hz, 15hz etc (these are all rates, and have a ms equivalent per cycle). If you miss one period you drop back to the next, longer one. This is what the frame time graph on DF shows.

How you calculate the average (usually an arithmetic mean) affects the actual "fps" figure you get. The same results can be used to calculate different fps figures. The time period (or alternatively the number of frames) you calculate the average over is arbitrary.

For example, if you calculate over the last 10 cycles of the display @ 60hz, then a 30 hz (game) frame preceded by 60, 60, 60, 60, 60, 60, 60, 60 would give an average of 54 fps.

If you only calculated over the last 4 cycles of the display, the same results would give you 45 fps.

... that is probably an overly convoluted explanation. :(
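
Maybe a toy calculation is clearer. Here's a rough Python sketch (made-up numbers, not anyone's actual measuring tool) that reproduces the 54 fps and 45 fps figures above:

```python
# Each entry is how many 60 Hz display cycles a game frame stayed on screen:
# eight frames delivered at 60 fps (1 cycle each), then one "30 hz" frame (2 cycles).
frames_in_cycles = [1, 1, 1, 1, 1, 1, 1, 1, 2]

def trailing_fps(frames, window_cycles, refresh_hz=60):
    """Average fps over (roughly) the last `window_cycles` display refreshes."""
    cycles_used = 0
    frames_counted = 0
    for held in reversed(frames):        # walk back from the newest frame
        cycles_used += held
        frames_counted += 1
        if cycles_used >= window_cycles:
            break
    return frames_counted * refresh_hz / cycles_used

print(trailing_fps(frames_in_cycles, 10))  # 54.0 -> averaged over the last 10 cycles
print(trailing_fps(frames_in_cycles, 4))   # 45.0 -> averaged over the last 4 cycles
```

Same frame data, different window length, noticeably different "fps".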
 
That's not really true with triple buffering. For example, in a 60fps game, when you have a framerate drop to 50fps, the updated frame times could look like this: 16.66-16.66-16.66-16.66-33.33-16.66 and so on.

What you're describing here is an average, and that's true with or without triple buffering!

If the framerate drops to 50fps in a non-vsynced game, then the game updates every 20ms with torn frames.

Potentially. But it's more likely that there will be some variation in frame times and so 20 ms / 50hz will still be an average.
 
What you're describing here is an average, and that's true with or without triple buffering!



Potentially. But it's more likely that there will be some variation in frame times and so 20 ms / 50hz will still be an average.

Yes, but that's the metric that we are measuring: how many frames per "second". What's true with vsync isn't a certain number of frames per second, but that frames can only update every 16.66ms, 33.33ms, 50ms, or 66.66ms. People tend to say that if a game dips below 60fps, then it goes to 30fps, and that's simply not true, since you could have only three frames in that second that take longer than the 16.66ms threshold. So with frames per second being the metric of measurement, the game never ran at 30fps, seeing as the number of frames rendered per second is the measurement at hand.

@function, you know this, I don't mean to be condescending; you are much better versed in the technical side of things than I am. I just believe that when people hear that vsynced games can only run at 60, 30, 20, and 15 frames a second, they take this very literally. So when watching performance tests from DF, it creates some confusion.
 
With vsync on a 60 hz display, each frame-display cycle will complete at 60hz, 30hz, 20hz, 15hz etc (these are all rates, and have a ms equivalent per cycle). If you miss one period you drop back to the next, longer one. This is what the frame time graph on DF shows.

I don't think I follow - it sounds like you're describing a panel that adapts its frequency to the video input, but I don't know any display that does this. Every monitor and TV I've seen (barring the G-Sync things) will set their resolution and frequency according to what the output device (PC video card, console etc) tells them. My PlayStation 4 outputs everything at 1920x1080p @ 60Hz with no deviation, except for Blu-ray movies, which output at 24p. The panel does not chop and change between 60Hz, 30Hz, 20Hz and 15Hz.

Assuming you don't want torn frames, a 60hz (60fps) feed needs each frame completed in 16.7ms or less, and 30fps needs each frame completed in 33.3ms or less. If just a few frames miss their budget, the entire framerate doesn't go sideways to a lower refresh rate; you just lose a few frames in any given second.
 
What you're describing here is an average, and that's true with or without triple buffering!
But framerate counter programs like that do use moving averages; if it just showed 1/(frame time), you'd need rapid update for it to be meaningful, which would make the output hard to read.

In the case where a game uses double-buffered vsync to a 60Hz display, if you're handling frames in 34ms, the GPU will stall that to 50ms, and so you'll get a stable 20fps output. But if you're triple-buffering, the GPU will keep on rendering, and you'll still output close to 30fps in a moving average, albeit stuttered out with extremely wonky frame timing.

If the game uses triple-buffering (or double-buffering with vsync off, or some compromise of the two) and the game drops to 20fps, it means that the game probably is actually dropping to where it takes 50ms for the processing to churn through a frame.
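
If it helps, here's a very idealised Python sketch of that difference (my own simplification, not how any real swap chain or capture tool behaves): a steady 34ms frame gets quantised to 50ms flips (20fps) with double-buffered vsync, but averages out near the ~29fps the GPU is actually producing once there's a spare buffer, just with uneven pacing.

```python
import math

REFRESH = 1 / 60  # 60 Hz display, one vblank every ~16.7 ms

def flips_double_buffered(render_times):
    """Double-buffered vsync: the GPU stalls until the flip, so every frame
    occupies a whole number of refresh cycles and the next frame starts late."""
    return [max(1, math.ceil(t / REFRESH)) * REFRESH for t in render_times]

def flips_triple_buffered(render_times):
    """Spare buffer: the GPU keeps rendering; each frame is shown at the first
    vblank after it finishes (at most one new frame per refresh)."""
    deltas, done, prev_flip = [], 0.0, 0.0
    for t in render_times:
        done += t
        flip = max(math.ceil(done / REFRESH) * REFRESH, prev_flip + REFRESH)
        deltas.append(flip - prev_flip)
        prev_flip = flip
    return deltas

frames = [0.034] * 30  # a run of 34 ms frames
for label, deltas in (("double", flips_double_buffered(frames)),
                      ("triple", flips_triple_buffered(frames))):
    print(label, round(len(deltas) / sum(deltas), 1), "fps average")
# double -> 20.0 fps average; triple -> ~29 fps average, but with wobbly frame pacing
```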
 
With vsync on a 60 hz display, each frame-display cycle will complete at 60hz, 30hz, 20hz, 15hz etc (these are all rates, and have a ms equivalent per cycle). If you miss one period you drop back to the next, longer one. This is what the frame time graph on DF shows.

Interesting. I didn't think the refresh rate changed. I thought the latency between frames when vsync was engaged on a 60Hz panel is frame time mod 16. Meaning in the worst case scenario the graphics card would hold frame for 15ms to catch the next refresh cycle.
 
Yes, but that's the metric that we are measuring: how many frames per "second". What's true with vsync isn't a certain number of frames per second, but that frames can only update every 16.66ms, 33.33ms, 50ms, or 66.66ms. People tend to say that if a game dips below 60fps, then it goes to 30fps, and that's simply not true, since you could have only three frames in that second that take longer than the 16.66ms threshold. So with frames per second being the metric of measurement, the game never ran at 30fps, seeing as the number of frames rendered per second is the measurement at hand.

@function, you know this, I don't mean to be condescending; you are much better versed in the technical side of things than I am. I just believe that when people hear that vsynced games can only run at 60, 30, 20, and 15 frames a second, they take this very literally. So when watching performance tests from DF, it creates some confusion.

It all depends on the period you calculate your average over.

If you average over a long period then a few frames will have little impact on the fps figure, but over a shorter period the drop will be much bigger. The same 60hz game with a single '30hz' frame could be described as being 59.99 fps or 30 fps. Both would be accurate, but I think the point you're getting at is the meaning that people give to these figures.

Different folks are sensitive to different things. One 29.9 fps game could be more annoying to a person than another 29.9 fps game. It would depend on what sequence of results generated the figure, and what that particular person's sensitivities and preferences were. :)
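
To put numbers on it (a throwaway Python sketch with a hypothetical trace): two minutes at 60fps containing a single '30hz' frame averages ~59.99 fps over the whole trace, but 30 fps over the window containing just that one frame.

```python
# Hypothetical trace: two minutes at 60 fps containing a single "30 hz" (33.3 ms) frame.
frame_times = [1 / 60] * 7199 + [2 / 60]

whole_trace_fps = len(frame_times) / sum(frame_times)  # ~59.99 fps over the full trace
worst_frame_fps = 1 / max(frame_times)                 # 30.0 fps over just that frame
print(round(whole_trace_fps, 2), round(worst_frame_fps, 1))
```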
 
I don't think I follow - it sounds like you're describing a panel that adapts its frequency to the video input, but I don't know any display that does this.

I think we're mis-communicating. My use of "hz" for both the display and the game's update is probably confusing.

The display has a fixed update cycle, but the vsynced game is flipping its buffer at a (potentially) variable update cycle. It's this variation in flipping the buffer I'm talking about. That, too, can be described in hz.

Assuming you don't want torn frames, a 60hz (60fps) feed needs each frame completed in 16.7ms or less, and 30fps needs each frame completed in 33.3ms or less. If just a few frames miss their budget, the entire framerate doesn't go sideways to a lower refresh rate; you just lose a few frames in any given second.

How sideways your frame rate goes depends on the number of frames you miss and the period of time or number of cycles you're calculating your rate over. That's the point I've been trying (apparently quite badly!) to make.


I don't disagree with anything you've said! ;)

Interesting. I didn't think the refresh rate changed. I thought the latency between frames when vsync was engaged on a 60Hz panel is frame time mod 16. Meaning in the worst case scenario the graphics card would hold frame for 15ms to catch the next refresh cycle.

The refresh rate of the display doesn't change (G-sync aside) but the number of cycles that the monitor completes before the console tells it to display a new frame does.

I appear to have confused people.

I shall go back to sleep.
 
It all depends on the period you calculate your average over.

If you average over a long period then a few frames will have little impact on the fps figure, but over a shorter period the drop will be much bigger. The same 60hz game with a single '30hz' frame could be described as being 59.99 fps or 30 fps. Both would be accurate, but I think the point you're getting at is the meaning that people give to these figures.

Different folks are sensitive to different things. One 29.9 fps game could be more annoying to a person than another 29.9 fps game. It would depend on what sequence of results generated the figure, and what that particular person's sensitivities and preferences were. :)

I think that is where you're losing some people, because you're using a different measurement than frames per second. Breaking it down into smaller units of time will change the perceived framerate. At a given time, you could have a 60fps game that has a sudden spike, and suddenly frames take upwards of 50ms to complete, but if that only lasts a few frames, then your framerate will not be perceived as 20fps, even though for a few frames it was updating like a game that runs at 20fps.
 
I think we're mis-communicating. My use of "hz" for both the display and the game's update is probably confusing.
Yes. Very yes :) Although there's no right or wrong way to use Hz (hertz), its origin is as the SI unit for something that repeats a fixed number of cycles per second. So referring to a variable framerate in hz will certainly cause confusion. I present exhibit A ;)

The display has a fixed update cycle, but the vsynced game is flipping its buffer at a (potentially) variable update cycle. It's this variation in flipping the buffer I'm talking about. That, too, can be described in hz.

You can, but you probably shouldn't because.. exhibit A.

How sideways your frame rate goes depends on the number of frames you miss and the period of time or number of cycles you're calculating your rate over. That's the point I've been trying (apparently quite badly!) to make.

Ok, accepting you're using framerate and hz interchangeably, it's still not the case that if a game can't maintain 30fps constantly it drops to 20fps, and if it can't maintain 20fps it drops to 15fps. Not if the game is able to maintain 21fps, 22fps ... 29fps. It's just variable. That's assuming double buffering; Goodtwin has already mentioned how triple buffering can make this even more complicated by smoothing out the framerate.

I.e. this frame takes 10ms to produce, the next takes 20ms, and in a triple buffering scenario you can feed both to maintain 60fps even if the turnaround time for 60fps is 16.7ms.
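
A tiny sanity check of that 10ms + 20ms example (made-up timings in Python, assuming the spare buffer lets the second frame start rendering straight away):

```python
# Frame A takes 10 ms, frame B takes 20 ms; vblanks every ~16.7 ms on a 60 Hz display.
REFRESH_MS = 1000 / 60

render_done = [10, 10 + 20]  # completion times of frames A and B in ms
flips = [(t // REFRESH_MS + 1) * REFRESH_MS for t in render_done]
print([round(f, 1) for f in flips])  # [16.7, 33.3] -> a new frame at every vblank, 60 fps holds
```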
 
Yeah, but he still seems to be using a different length of trailing average, so there will be differences, as shown when he goes over the sequence DF did. Any idea how he handles measuring tearing and associated frame rate measurements?

The opening sequence alone shows that his numbers (and how they fluctuate), and therefore how they are produced (the averaging method), are very similar to DF's numbers. So whatever reasoning you have about a 20fps reading (or any other number) from his tools not being a real 20 frames per second produced by the XB1 can equally be made about any Digital Foundry video.

What's important is that both methods to count fps are consistent and so numbers can be compared.

Finally, his video often shows a heavily fluctuating framerate. The 20fps image is just that: a short moment within a fluctuating framerate lasting ~10 seconds, which is much longer than the now famous, unique and very short ~25fps drop seen in the opening sequence that DF used to conclude that the XB1 game performed slightly better.

It's called cherry picking. It was famously (and successfully at first) used by the tobacco companies to "prove" that tobacco wasn't harmful to people.
 
I think that is where you're losing some people, because you're using a different measurement than frames per second. Breaking it down into smaller units of time will change the perceived framerate. At a given time, you could have a 60fps game that has a sudden spike, and suddenly frames take upwards of 50ms to complete, but if that only lasts a few frames, then your framerate will not be perceived as 20fps, even though for a few frames it was updating like a game that runs at 20fps.

FPS is a rate, and so doesn't need to be calculated over a full second (in the same way that calculating miles per hour doesn't require you to measure in periods of an hour).

In the example you give above, for the few frames where the game is flipping the buffer every 50 ms, if you calculate across just that period, the game will indeed be running at 20 fps. How much this is noticed or annoys will be down to the individual.

Ok, accepting you're using framerate and hz interchangeably, it's still not the case that if a game can't maintain 30fps constantly it drops to 20fps, and if it can't maintain 20fps it drops to 15fps. Not if the game is able to maintain 21fps, 22fps ... 29fps. It's just variable. That's assuming double buffering; Goodtwin has already mentioned how triple buffering can make this even more complicated by smoothing out the framerate.

What the frame rate will come out as will all depend on the period you calculate the frame rate over.

With vsync enabled, I've always found it easier to think in terms of a 20 fps/hz update or a 30 fps/hz update or a 60 ... etc, because that's the rate at which subsequent buffers are flipping.

Most fps counters average frame times over some fraction of a second, of course, as HTupolev said.
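
Presumably something along these lines (just a guess at the general approach, in Python; not any particular tool's implementation):

```python
from collections import deque

class FpsCounter:
    """Rolling fps over roughly the last `window` seconds of frame timestamps
    (made-up example, not how any specific overlay actually works)."""
    def __init__(self, window=0.5):
        self.window = window
        self.stamps = deque()

    def tick(self, now):
        """Call once per presented frame with its timestamp in seconds; returns fps."""
        self.stamps.append(now)
        while now - self.stamps[0] > self.window:
            self.stamps.popleft()
        span = now - self.stamps[0]
        return (len(self.stamps) - 1) / span if span > 0 else 0.0

counter = FpsCounter(window=0.5)
for k in range(120):                  # four seconds of a steady 30 fps feed
    fps = counter.tick(k / 30)
print(round(fps, 1))                  # 30.0
```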
 
With vsync enabled, I've always found it easier to think in terms of a 20 fps/hz update or a 30 fps/hz update or a 60 ... etc, because that's the rate at which subsequent buffers are flipping.
In general usage, though, the word "rate" implies a trend. Saying that "a game is running at 10fps" because a single frame displayed for 100ms is like saying "I'm looking at a 5Hz square wave" when you see a 100ms rectangular pulse on an oscilloscope.

I do generally agree that we need to be careful with comparing moving averages, but you are describing things in unusual language.
 