Any time you don't meet 30hz and don't tear you drop to 20hz (on a 60hz display).
He and DF are using different trailing averages.
Can you expand on this? I don't get what would cause that behaviour.
Just watch the video. You'll see plenty of different drops like 25fps or 29fps, etc.
Can you expand on this? I don't get what would cause that behaviour.
How so? If one frame in a second takes 40ms you'll get 29 frames not 20. Drop 2 frames and you'll get 28, drop 3 and you get 27 and so on. You'd have to have 10 frames exceed 32ms to get 20fps.
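For anyone who wants to check that arithmetic, here's a rough sketch (Python; it assumes a 60hz display and double-buffered vsync, and the 40ms trace is just made up to mirror the example above):

```python
import math

REFRESH_MS = 1000 / 60  # one refresh on a 60hz display, ~16.67 ms

def display_time(render_ms):
    # With vsync (double buffered), a finished frame waits for the next refresh,
    # so its time on screen rounds up to a whole number of refresh intervals.
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def frames_in_one_second(render_times_ms):
    # Count how many vsynced frames fit inside a 1000 ms window.
    elapsed, frames = 0.0, 0
    for t in render_times_ms:
        elapsed += display_time(t)
        if elapsed > 1000:
            break
        frames += 1
    return frames

# A '30fps' game where one frame takes 40 ms and the rest stay under 33.33 ms:
trace = [40.0] + [33.0] * 60
print(frames_in_one_second(trace))  # -> 29, not 20
```

The 40ms frame does occupy three refreshes (50ms on screen), but over the whole second you only lose about one frame; the count doesn't collapse to 20.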
That's not really true with triple buffering. For example, in a 60fps game, when the framerate drops to 50fps, the frame updates could look like this: 16.66-16.66-16.66-16.66-33.33-16.66 and so on.
If a non-vsynced game drops to 50fps, then it updates every 20ms with torn frames.
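Here's roughly where that 16.66-16.66-16.66-16.66-33.33 pattern comes from (a sketch only: it assumes a 60hz display, vsync plus triple buffering so the renderer is never blocked, and a renderer that takes a steady 20ms per frame):

```python
import math

REFRESH_MS = 1000 / 60  # 60hz display

def display_durations(render_ms, n_frames):
    # Triple buffering + vsync: rendering is never blocked, so frame i finishes
    # at (i + 1) * render_ms and is picked up at the next refresh after that.
    # Assumes render_ms >= REFRESH_MS so no frames get skipped.
    appear = [math.ceil((i + 1) * render_ms / REFRESH_MS) * REFRESH_MS
              for i in range(n_frames)]
    # How long each frame then stays on screen:
    return [later - earlier for earlier, later in zip(appear, appear[1:])]

durations = display_durations(20.0, 61)              # steady 20 ms renderer (50fps)
print([round(d, 2) for d in durations[:10]])          # 16.67 x4, then 33.33, repeating
print(round(1000 * len(durations) / sum(durations)))  # long-run average: 50 fps
```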
What you're describing here is an average, and that's true with or without triple buffering!
Potentially. But it's more likely that there will be some variation in frame times and so 20 ms / 50hz will still be an average.
With vsync on a 60 hz display, each frame-display cycle will complete at 60hz, 30hz, 20hz, 15hz etc (these are all rates, and have a ms equivalent per cycle). If you miss one period you drop back to the next, longer one. This is what the frame time graph on DF shows.
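If it helps, the stepping just falls out of counting how many refreshes each frame has to sit on screen (a quick sketch, assuming a 60hz display and no tearing):

```python
import math

REFRESH_MS = 1000 / 60  # 60hz display

def per_frame_rate(render_ms):
    # With vsync a frame stays up for a whole number of refreshes, so the only
    # possible per-frame rates on a 60hz display are 60/1, 60/2, 60/3, ... hz.
    refreshes = math.ceil(render_ms / REFRESH_MS)
    return 60 / refreshes

for t in (10, 17, 20, 34, 40, 55):
    print(f"{t:>2} ms frame -> {math.ceil(t / REFRESH_MS)} refreshes -> {per_frame_rate(t):.0f} hz")
```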
But framerate counter programs like that do use moving averages; if it just showed 1/(frame time), you'd need rapid update for it to be meaningful, which would make the output hard to read.
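Something like this is what a trailing average means in practice (a minimal sketch, not how DF or any particular tool actually does it; the 1-second window is an assumption):

```python
from collections import deque

class TrailingFps:
    # A simple trailing-average counter of the kind overlay tools use:
    # reports frames per second over roughly the last `window_ms` of frame times.
    def __init__(self, window_ms=1000.0):
        self.window_ms = window_ms
        self.samples = deque()

    def push(self, frame_ms):
        self.samples.append(frame_ms)
        # Drop old samples once the window is longer than window_ms.
        while sum(self.samples) > self.window_ms and len(self.samples) > 1:
            self.samples.popleft()
        return 1000.0 * len(self.samples) / sum(self.samples)

counter = TrailingFps(window_ms=1000.0)
for frame_ms in [1000 / 60] * 50 + [2 * 1000 / 60] + [1000 / 60] * 20:
    fps = counter.push(frame_ms)   # one missed refresh in an otherwise steady 60fps run
print(round(fps, 1))               # ends up around 59, nowhere near 30
```

The single 33.33ms frame barely moves the readout, which is why counters show 59 rather than snapping to 30.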
Yes, but that's the metric that we are measuring: how many frames per "second". What's true with Vsync is that frames can only update every 16.66ms, 33.33ms, 50ms, or 66.66ms. People tend to say that if a game dips below 60fps then it goes to 30fps, and that's simply not true, since you could have only three frames in that second that take longer than the 16.66ms threshold. So with frames per second being the metric of measurement, the game never ran at 30fps, seeing as the number of frames rendered per second is the measurement at hand.

@function, you know this, and I don't mean to be condescending; you are much better versed in the technical side of things than I am. I just believe that when people hear that Vsynced games can only run at 60, 30, 20, and 15 frames a second, they take this very literally. So when watching performance tests from DF, it creates some confusion.
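Putting that three-slow-frames example into numbers (a sketch, assuming a 60hz display with vsync and that each slow frame misses its refresh by less than one full interval):

```python
REFRESH_MS = 1000 / 60  # 60hz display

# Three frames miss the 16.67 ms budget and occupy two refreshes each;
# the other 54 frames hit the budget and occupy one refresh each.
frames = [2 * REFRESH_MS] * 3 + [REFRESH_MS] * 54

print(round(sum(frames)))                    # 1000 ms -> exactly one second of display time
print(len(frames), "frames in that second")  # 57 -> 57 fps for that second, not 30
```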
I don't think I follow - it sounds like you're describing a panel that adapts its frequency to the video input, but I don't know of any display that does this.
Assuming you don't want torn frames, a 60hz (60fps) feed needs each frame completed in 16.66ms or less, and a 30fps feed needs each frame completed in 33.33ms or less. If just a few frames miss their budget, the entire framerate doesn't go sideways to a lower refresh rate; you just lose a few frames in any given second.
<snip>
Interesting. I didn't think the refresh rate changed. I thought the latency between frames when vsync was engaged on a 60Hz panel was frame time mod 16, meaning in the worst case scenario the graphics card would hold the frame for up to ~16ms to catch the next refresh cycle.
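Here's that idea as a quick sketch (my numbers, assuming a 60hz panel and no tearing): the card holds a finished frame until the next refresh boundary, so the wait is what the mod works out to and the worst case is just under one full refresh.

```python
REFRESH_MS = 1000 / 60  # 60hz panel, ~16.67 ms per refresh

def vsync_hold(render_ms):
    # How long a finished frame waits for the next refresh before it is shown.
    remainder = render_ms % REFRESH_MS
    return 0.0 if remainder == 0 else REFRESH_MS - remainder

for t in (10.0, 16.0, 17.0, 33.0, 40.0):
    print(f"{t:>4} ms render -> held {vsync_hold(t):5.2f} ms -> on screen at {t + vsync_hold(t):.2f} ms")
```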
It all depends on the period you calculate your average over.
If you average over a long period then a few frames will have little impact on the fps figure, but over a shorter period the drop will be much bigger. The same 60hz game with a single '30hz' frame could be described as being 59.99 fps or 30 fps. Both would be accurate, but I think the point you're getting at is the meaning that people give to these figures.
Different folks are sensitive to different things. A 29.9 fps game could be more annoying to a person than another 29.9 fps game. It would depend on what sequence of results generated the figure, and what that particular person's sensitivities and preferences were.
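To put numbers on the averaging-period point (the frame-time trace here is made up, purely to illustrate):

```python
REFRESH_MS = 1000 / 60

# A minute of steady 60fps with a single frame that takes two refreshes (a '30hz' frame).
trace = [REFRESH_MS] * 3599 + [2 * REFRESH_MS]

def fps_over(frame_times_ms):
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

print(round(fps_over(trace), 2))        # over the whole minute: ~59.98 fps
print(round(fps_over(trace[-60:]), 2))  # over the last second:  ~59.02 fps
print(round(fps_over(trace[-1:]), 2))   # over that one frame:    30.0 fps
```

Same trace, three different "framerates", depending only on the window you average over.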
Yes. Very yes. Although there's no right or wrong way to use Hz (hertz), it originated as the SI (International System of Units) unit for things that have an immutable number of cycles per second. So referring to a variable framerate in hz will certainly cause confusion. I present exhibit A.

I think we're mis-communicating. My use of "hz" for both the display and the game's update is probably confusing.
The display has a fixed update cycle, but the vsynced game is flipping its buffer at a (potentially) variable update cycle. It's this variation in flipping the buffer I'm talking about. That, too, can be described in hz.
How sideways your frame rate goes depends on the number of frames you miss and the period of time or number of cycles you're calculating your rate over. That's the point I've been trying (apparently quite badly!) to make.
Yeah, but he still seems to be using a different length of trailing average, so there will be differences, as shown when he goes over the sequence DF did. Any idea how he handles measuring tearing and associated frame rate measurements?
I think that is where you're losing some people, because you're using a different measurement than frames per second. Breaking it down into smaller units of time will change the perceived framerate. At a given time, you could have a 60fps game that has a sudden spike, and suddenly frames take upwards of 50ms to complete, but if that only lasts a few frames, then your framerate will not be perceived as 20fps, even though for a few frames it was updating like a game that runs at 20fps.
Ok, accepting you're using framerate and hz interchangeably, it's still not the case that if a game can't maintain 30fps constantly it drops to 20fps, and if it can't maintain 20fps it drops to 15fps. Not if the game is able to maintain 21fps, 22fps ... 29fps. It's just variable. That's assuming double buffering; Goodtwin has already mentioned how triple buffering can make this even more complicated by smoothing out the framerate.
In general usage, though, the word "rate" implies a trend. Saying that "a game is running at 10fps" because a single frame displayed for 100ms is like saying "I'm looking at a 5Hz square wave" when you see a 100ms rectangular pulse on an oscilloscope.

With vsync enabled, I've always found it easier to think in terms of a 20 fps/hz update or a 30 fps/hz update or a 60 ... etc, because that's the rate at which subsequent buffers are flipping.