Deleted member 11852
That is actually in there, right under the other statistics.
If it doesn't update within the 1/60 time frame, it gets dropped (with vsync on).
Technically it's not "dropped"; quite the contrary, it stays on screen longer than expected.
But what does this metric mean? If the target was 30 fps, the relative amount of "dropped" frames would be 50% and the absolute number probably huge. Why do they state absolute amounts then?
Can somebody explain to me what they mean by "dropped frames"?
The CPU, GPU, and presentation queue work together to produce a visible frame:
- The CPU submits rendering commands that are followed by a call to Present().
- The GPU executes rendering commands to prepare visible data in a back buffer.
- The presentation queue shows the completed back buffer on the screen, optionally synchronizing to VSync.
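That pipeline can be sketched as a toy model (hypothetical, not any real graphics API): frame production is collapsed into one serial cost per frame, and the presentation queue shows the newest completed back buffer at each vsync tick, so a late frame just means the previous buffer stays up longer.

```python
import math

def scanout(render_times, vsync=1.0):
    """Simulate a vsync'd presentation queue (toy model, not a real API).

    render_times[i] is how long frame i takes to produce (CPU submit +
    GPU execution collapsed into one serial cost), in the same unit as
    vsync. At each vsync tick the display shows the newest *completed*
    back buffer; if nothing new completed, the previous buffer stays up.
    Returns the frame index shown at each tick (None = nothing ready yet).
    """
    completions, t = [], 0.0
    for rt in render_times:
        t += rt
        completions.append(t)

    ticks = math.ceil(completions[-1] / vsync)
    shown, latest, nxt = [], None, 0
    for k in range(1, ticks + 1):
        deadline = k * vsync + 1e-9
        while nxt < len(completions) and completions[nxt] <= deadline:
            latest, nxt = nxt, nxt + 1
        shown.append(latest)
    return shown

# Four frames on time, one taking two vsync intervals: frame 2 is shown twice.
print(scanout([1, 1, 1, 2, 1]))  # [0, 1, 2, 2, 3, 4]
```

Note that nothing is ever discarded here: the late frame still appears, one tick later than intended, which is exactly the "it stays longer than expected" point above.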
I can only speak on behalf of Dx and XB1 ...
...
BUT to sum up... there should be no dropped frames as of Dec 2013, as all "pending" back buffers become "visible" and are composited via the presentation queue.
Sorry for the long-winded answer, BUT I found presentation on the XB1 and DX interesting and wanted to share.
In an uncapped 60fps game outputting at 60Hz: 2 consecutive identical frames = 1 dropped frame.
You only drop frames when the renderer cannot catch up with the target refresh rate
VSync can automatically flip these BB's between these states
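Under that definition, counting from a 60Hz capture is just comparing each frame to its predecessor. A minimal sketch, assuming the captured frames have already been reduced to comparable values (hashes, checksums, whatever):

```python
def count_dropped(frames):
    """Count 'dropped' frames the way a capture analysis would:
    each frame identical to its predecessor is one dropped frame."""
    return sum(1 for prev, cur in zip(frames, frames[1:]) if prev == cur)

# 60Hz capture of an uncapped game: "B" was held for one extra interval.
print(count_dropped(["A", "B", "B", "C", "D"]))  # 1
```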
In an uncapped 60fps game outputting at 60Hz: 2 consecutive identical frames = 1 dropped frame.
It's obviously not capped to 30, and the consoles both output on 60Hz progressive-scan video signals.
That's also cool and all that, but how does DF know that we are in an "uncapped 60fps game outputting at 60Hz"?
It's obviously not capped to 30
but the assumption is that "a 30fps game" implies capping to 30fps
If it were capped at 30fps, they'd assume dropped frames relative to the (presumed) 30fps target, and the number would be much lower.
Ok, if it's capped. What will be the "dropped frame" number?
If you're vsync'd and your target is 30, it's usually considered a good idea to cap at 30fps, since going over will add judder. Being smooth while resting at 30 is the entire reason to use the cap, so any games capped at 30fps are generally considered to be "targeting" 30fps.
To me "30 fps game" means "30 fps target", no more, no less.
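One way to make "dropped relative to a 30fps target" concrete: in a 60Hz capture of a perfectly capped 30fps game, every image appears exactly twice, so only repeats beyond the second copy count as misses. This is a hypothetical counting scheme for illustration, not necessarily how DF actually counts:

```python
def dropped_vs_30fps_target(frames):
    """Count missed 30fps-target frames in a 60Hz capture (sketch).
    A perfectly capped 30fps game shows every image exactly twice;
    each consecutive repeat beyond the second copy is one missed frame."""
    misses, run = 0, 1
    for prev, cur in zip(frames, frames[1:]):
        if cur == prev:
            run += 1
        else:
            misses += max(0, run - 2)
            run = 1
    misses += max(0, run - 2)
    return misses

perfect_30 = ["A", "A", "B", "B", "C", "C"]
hitch      = ["A", "A", "B", "B", "B", "C"]  # "B" held one extra interval
print(dropped_vs_30fps_target(perfect_30))  # 0
print(dropped_vs_30fps_target(hitch))       # 1
```

Against a 30fps target the perfect capture scores zero, whereas naive counting against 60Hz (consecutive identical frames) would call half of it "dropped", which is exactly why the presumed target matters.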
I think it's a little weird, but it's not something wildly abstract.
TL;DR as I suspected this whole "dropped frame" business is some strange number that does not represent anything of value.
since going over will add judder
TL;DR as I suspected this whole "dropped frame" business is some strange number that does not represent anything of value.
Well, not without better granularity. Simply giving a number of dropped (relative to a standard) frames over a duration is equivalent data to giving the number of new frames over that duration, and thus is the same as giving the average framerate over the duration, except in a way which is rather weird and requires more context to understand what it means.
It's the same information, presented differently, that could make up a frame-time graph for a vsynced game.
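The equivalence is plain arithmetic: over N scanouts at 60Hz, new frames = N - dropped, so a dropped-frame count and an average framerate over the same duration are the same datum. A quick sketch:

```python
def avg_fps_from_dropped(ticks, dropped, refresh=60.0):
    """Average framerate implied by a dropped-frame count over a capture
    of `ticks` scanouts at `refresh` Hz: every non-dropped tick showed
    a new frame."""
    new_frames = ticks - dropped
    duration = ticks / refresh
    return new_frames / duration

# 10 seconds of 60Hz capture with 150 dropped frames -> 45 fps average.
print(avg_fps_from_dropped(600, 150))  # 45.0
```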
I don't follow.
So, to measure the "imperfection" we just produce a derivative of the frame times. And because the first-order derivative will not tell us much (sampling, linear approximation and all that jazz) we use the second derivative, i.e. "average of abs of second-order central difference" is probably the best estimator we will ever get. That's it.
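That estimator is short to write down. A sketch, assuming a list of per-frame times (here in ms): the second-order central difference at frame i is t[i+1] - 2*t[i] + t[i-1], and the metric is the mean of its absolute values; perfectly even delivery scores 0, judder scores high even when the average framerate looks fine.

```python
def frametime_roughness(frame_times):
    """Mean absolute second-order central difference of frame times.
    Zero for perfectly even frame delivery; grows with judder."""
    diffs = [abs(frame_times[i + 1] - 2 * frame_times[i] + frame_times[i - 1])
             for i in range(1, len(frame_times) - 1)]
    return sum(diffs) / len(diffs)

even   = [17, 17, 17, 17, 17]  # steady delivery
judder = [17, 33, 17, 33, 17]  # alternating long/short frames
print(frametime_roughness(even))    # 0.0
print(frametime_roughness(judder))  # 32.0
```

Both sequences have roughly the same average framerate, but only the second one would feel rough, which is the information a bare "dropped frames" total throws away.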
Ok, if it's capped. What will be the "dropped frame" number?
Which is a very strange thing to say.
To me "30 fps game" means "30 fps target", no more, no less.
And obviously there is no way to know that using video stream analysis.
TL;DR as I suspected this whole "dropped frame" business is some strange number that does not represent anything of value.
To many (most?) who like to talk about these things it's neither strange nor valueless.