Current Generation Games Analysis Technical Discussion [2024] [XBSX|S, PS5, PC]

"framerate" has always been how often we game state and draw a new frame. At minimum that means updating the camera matrix, although for almost any 3d game it also means updating player position if nothing else.

Which for Warzone, the game is updated 20 times a second, because the ground truth of that game is the server. But locally, it's rendering 120 frames every second, and those frames are interpolated from data sent to and from the server 20 times a second. How would this be different from a game running locally at 20 updates a second, but whose output is updated 120 times a second using a mix of rasterized, interpolated, and/or AI generated frames? If a player is running from the left side of the screen to the right, the server (again, the ground truth simulation for that title) is only updating 20 times a second, and the local machine running the game is interpolating the movement, making the character move smoothly through the scene. If we are tying performance to simulation or game state, how is this different from frame generation?
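
To make that concrete, here's a toy sketch of the kind of interpolation I'm describing; the numbers and the Snapshot/render_position names are made up for illustration, not anything from Warzone's actual code:

```
from dataclasses import dataclass

@dataclass
class Snapshot:
    time: float   # server timestamp, in seconds
    pos: float    # 1D position, just to keep the example simple

SERVER_RATE = 20    # snapshots per second (the server's "ground truth")
RENDER_RATE = 120   # frames the local machine presents per second

def lerp(a, b, t):
    return a + (b - a) * t

def render_position(prev: Snapshot, nxt: Snapshot, render_time: float) -> float:
    """Interpolate an entity's position between the two most recent snapshots."""
    t = (render_time - prev.time) / (nxt.time - prev.time)   # 0..1 within the 1/20 s window
    return lerp(prev.pos, nxt.pos, min(max(t, 0.0), 1.0))

# Two snapshots 1/SERVER_RATE apart, rendered somewhere in between:
a, b = Snapshot(0.0, 0.0), Snapshot(1 / SERVER_RATE, 1.0)
print(render_position(a, b, 1 / RENDER_RATE))   # ~0.167

# The renderer calls render_position() 120 times a second even though
# the underlying data only changes 20 times a second.
```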

And I'll bring this up again: what about games that lock certain parts of the simulation to lower rates? Beyond the examples others and I offered above, there are famous cases of this, like Bioshock (pre-console gen 8 patch) having all of its physics and some of its other animations locked to 30fps. But probably most relevant to a conversation about performance is Quake, which was a benchmark staple for years: it had its characters rendered in what amounts to keyframe-only animations, and IIRC they aren't even all animated at the same rate. There were also limited physics objects in games that ran on that engine (Hexen 2 comes to mind) where the physics objects (bodies, barrels and the like) were moved around the game world at lower than per-frame rates. Later releases of both Bioshock and Quake would smooth out those animations, but no one would claim that the real performance of those games was ever limited to those lower numbers.
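
For what it's worth, the pattern those games use boils down to something like a fixed-timestep accumulator. This is a generic sketch with hypothetical update_input_and_camera/step_rigid_bodies/render calls, not actual Bioshock or Quake code:

```
PHYSICS_DT = 1.0 / 30.0   # rigid-body physics locked to 30 updates per second

def run_frame(game, frame_dt, accumulator):
    """One rendered frame: camera/input every frame, physics only when due."""
    game.update_input_and_camera(frame_dt)   # runs at the full framerate
    accumulator += frame_dt
    while accumulator >= PHYSICS_DT:         # catch up in fixed 1/30 s steps
        game.step_rigid_bodies(PHYSICS_DT)   # runs at most 30 times a second
        accumulator -= PHYSICS_DT
    game.render()                            # also runs at the full framerate
    return accumulator
```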
 
Which for Warzone, the game is updated 20 times a second, because the ground truth of that game is the server.
Absolutely not correct. The game isn't broadcasting an update 20 times a second and then spending the other frames linearly interpolating between those values. Certain key events -- enemy player position, projectile hit resolution, and other multiplayer constants -- are broadcast 20 times a second. These are very important parts of the game, but to call them "the game" is wildly misleading. Aside from the obvious player-experience parts of the game that make the higher framerate obvious (moment-to-moment movement physics, aiming, etc.), essential data used in the logic of the 20hz updates is generated in between updates -- the specific higher-framerate details of player position, bullet physics, etc., on both sides of a fight, are integrated to decide which is "true" when the next tick comes down. The ground truth of who won the match, what happened, and how can all be reconstructed accurately from the sum of the 20hz updates, but that isn't the same as saying that the whole game ran at 20hz.
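
To sketch what "generated in between updates" looks like in practice, here's a toy version of client-side prediction with server reconciliation; the names and structure are illustrative, not Warzone's actual netcode:

```
from collections import deque

class PredictedPlayer:
    """Toy client-side prediction: simulate every rendered frame, then
    reconcile when the ~20hz server tick arrives."""

    def __init__(self):
        self.pos = 0.0
        self.pending = deque()   # inputs the server hasn't acknowledged yet

    def local_update(self, move, dt, seq):
        # Runs every rendered frame: apply the input immediately and remember it.
        self.pos += move * dt
        self.pending.append((seq, move, dt))

    def on_server_tick(self, server_pos, last_acked_seq):
        # Runs ~20 times a second: adopt the server's authoritative result,
        # then replay the locally simulated inputs it hasn't processed yet.
        self.pos = server_pos
        while self.pending and self.pending[0][0] <= last_acked_seq:
            self.pending.popleft()
        for _, move, dt in self.pending:
            self.pos += move * dt
```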

I also take issue with the way you use the word "physics" -- yes, rigid body and dynamic objects update at a lower rate in Bioshock, but player and enemy movement, raycasts, collisions, etc., all update at the "real" framerate. Taking a small portion of the physics updates and deciding they're all of physics serves your point, but obscures what's going on in those games -- in order to update what the developers consider the "core" of the game at the target frame rate, other, less important (according to the devs) work is updated less frequently. This doesn't make the framerate lower or ambiguous; it further emphasizes the depth of the tradeoffs devs and designers are willing to make to keep framerate high.

These are all ultimately arbitrary distinctions, but we've been using one (consistent) definition for framerate forever and I don't think it makes sense to try to change it now -- like you said, games had low-framerate animations decades ago (you point to Quake, but you don't have to go much further back to find 60fps games where all of the character animations are four-frame sprites), but nobody was confused about what "game framerate" meant.
 
Absolutely not correct.
I think we've come to an agreement of sorts, then. If Warzone isn't confined by its 20-tick-per-second update, then frame generation is more performance, because the end user gets more frequent updates, even if those updates are not the ground truth of the simulation.
 
I think we've come to an agreement of sorts, then. If Warzone isn't confined by its 20-tick-per-second update, then frame generation is more performance, because the end user gets more frequent updates, even if those updates are not the ground truth of the simulation.
If you process input, run physics and collision tests, update the camera matrix based on input or gameplay simulation, and dispatch a new render frame with that matrix at the same rate you do frame generation (which nobody has done so far, and which sounds kind of absurd at first blush, but maybe could be useful if you wanted to use frame-gen data to improve the quality of something like DLSS?), I would call that a higher framerate even if some arbitrary "ground truth" updated at a lower hz.

(However, if you actually think Warzone truly "runs" at 20hz in a similar sense to a console game running at 60hz, and aren't being facetious or hyperbolic to try and prove your point, you may want to learn more about multiplayer game design and programming)
 
(However, if you actually think Warzone truly "runs" at 20hz in a similar sense to a console game running at 60hz, and aren't being facetious or hyperbolic to try and prove your point, you may want to learn more about multiplayer game design and programming)
I'm arguing against that point, in fact. There is some disagreement regarding frame generation being performance because, essentially, it's generating frames independent of the game's logic, animation, physics, and whatever arbitrary criteria people are using to distinguish a "real" framerate from one that includes interpolated or AI generated frames. I'm pointing out that a lot of those things aren't running at the same rate as the output framerate already, so I fail to see how frame generation frames are any different.

Warzone's 20hz tick rate is simply an example of this. I don't think anyone could play Warzone at 20fps and at 60fps and not say that the 60fps is better, assuming the image quality is the same. But that doesn't change the fact that much of the game is updated at 20hz, regardless of what the end user is seeing. And that end-user experience is what's really important.

A side note, @doob suggested earlier that people be forced to run games at 15fps with frame gen to hit 30, and I can testify that I just tried that with CP2077, using the DLSS to FSR3 FG mod (I have an RTX 2070 Super), and while it's terrible, it's noticeably less terrible than just running the game at 15fps. Not only do the inputs feel heavy, but the frame generation artifacts are also that much more noticeable. 20 to 40 is better, and 30 to 60 is actually quite good.
 
Netcode is complicated. Lots of games have sub-ticks and can rewind and reorder to resolve sequences, do lag compensation, etc. I highly doubt Warzone is doing straight 20 Hz fixed ticks for the simulation.
 
I'm pointing out that a lot of those things aren't running at the same rate as the output framerate already, so I fail to see how frame generation frames are any different.

<...> and while it's terrible, it's noticeably less terrible than just running the game at 15fps. Not only do the inputs feel heavy, but the frame generation artifacts are also that much more noticeable. 20 to 40 is better, and 30 to 60 is actually quite good.
Not to belabor the point even further, but:

I think this idea is somewhat interesting, but you're doing a disservice to it by treating the very clear division between "some part of the gameplay loop waits n frames between updates, or is synchronized by another system at a lower refresh rate" and "in between updates, with no input from any game systems, an external system uses past frame data to generate a new image that makes the game look and feel better" as if it's totally arbitrary or illusory. Nobody found this confusing when Oculus rolled out Spacewarp, even though it has a profound impact on user experience. TV motion smoothing does the same thing, albeit with vastly worse tradeoffs (by blending past frames). I guess it's a little fuzzier in that, for DLSS frame gen, the specific games are making choices about the feature rather than it being platform-wide, but we have an existing way to talk about framerate, refresh rate, frame generation, etc.
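
For the motion smoothing comparison, the naive "blend past frames" version amounts to something like this toy sketch, with zero knowledge of game state -- not how any particular TV or DLSS actually does it:

```
import numpy as np

def blended_intermediate(prev_frame: np.ndarray, next_frame: np.ndarray, t: float = 0.5):
    """Make an in-between image with no input from any game system:
    just a weighted blend of two already-produced frames."""
    return ((1.0 - t) * prev_frame + t * next_frame).astype(prev_frame.dtype)
```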
 
A side note, @doob suggested earlier that people be forced to run games at 15fps with frame gen to hit 30, and I can testify that I just tried that with CP2077, using the DLSS to FSR3 FG mod (I have an RTX 2070 Super), and while it's terrible, it's noticeably less terrible than just running the game at 15fps. Not only do the inputs feel heavy, but the frame generation artifacts are also that much more noticeable. 20 to 40 is better, and 30 to 60 is actually quite good.

I would definitely agree with this from my own experience.
 
Netcode is complicated. Lots of games have sub-ticks and can rewind and reorder to resolve sequences, do lag compensation, etc. I highly doubt Warzone is doing straight 20 Hz fixed ticks for the simulation.
It'll be 20 ticks on the servers for physics resolution (though ironically you'll no doubt have multiple iterations, so lots of physics steps every 1/20th of a second), but there's lots more going on with the client, including higher-frequency physics. But yeah, the physics updates on the server are few and far between, causing shonky gameplay and lots of player complaints. On your game you can see yourself shooting their arm, but the server only got around to tracing the bullet when the arm had moved. And yes, the doors were closed on your game, but the server still had them closing when the bullet was shot.
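
The arm and door examples are what lag compensation (or the lack of it) looks like from the player's side. Here's a toy sketch of the rewind idea, purely illustrative and not Warzone's actual implementation:

```
import bisect

class HitboxHistory:
    """Toy server-side lag compensation: keep a short history of where a
    player was, so a shot can be resolved against where the shooter saw them."""

    def __init__(self):
        self.times = []       # server timestamps, ascending
        self.positions = []   # recorded position at each timestamp

    def record(self, t, pos):
        self.times.append(t)
        self.positions.append(pos)

    def position_at(self, t):
        # Most recent recorded position at or before time t.
        i = bisect.bisect_right(self.times, t) - 1
        return self.positions[max(i, 0)]

def resolve_shot(history, shot_time, shot_pos, hit_radius=0.5):
    """Rewind the target to when the client says it fired, then test the hit."""
    return abs(history.position_at(shot_time) - shot_pos) <= hit_radius
```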
 
I'm arguing against that point, in fact. There is some disagreement regarding frame generation being performance because
The issue is using the word 'performance'. It used to mean something; now it doesn't, so we need finer granularity in our discussion of rendering. We can talk about how many unique frames rendered per second, or rather probably how many pixels rendered per second to account for upscaling in all dimensions, as an amount of work the computer is doing. Then we can talk about the amount of unique frames/pixels presented per second for the visual clarity provided to the user. Lastly we can talk about the input latency. It now takes three values to talk about what used to take one when all these things were tied to the CRT scanline.
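
To illustrate with made-up numbers: for a hypothetical game rendering 1440p internally, upscaled to 4K, with 2x frame generation, the three values separate out like this:

```
render_fps    = 60                  # unique frames the game actually renders
presented_fps = render_fps * 2      # after 2x frame generation
render_res    = 2560 * 1440         # internal resolution before upscaling
output_res    = 3840 * 2160         # resolution presented to the display

rendered_pixels_per_s  = render_fps * render_res       # work the machine is doing
presented_pixels_per_s = presented_fps * output_res    # what reaches the user's eyes
input_latency_ms       = 45                            # measured separately (e.g. with a latency tool)

print(f"rendered:  {rendered_pixels_per_s / 1e6:.0f} Mpx/s")
print(f"presented: {presented_pixels_per_s / 1e6:.0f} Mpx/s")
print(f"latency:   {input_latency_ms} ms")
```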
 
The issue is using the word 'performance'. It used to mean something; now it doesn't, so we need finer granularity in our discussion of rendering. We can talk about how many unique frames rendered per second, or rather probably how many pixels rendered per second to account for upscaling in all dimensions, as an amount of work the computer is doing. Then we can talk about the amount of unique frames/pixels presented per second for the visual clarity provided to the user. Lastly we can talk about the input latency. It now takes three values to talk about what used to take one when all these things were tied to the CRT scanline.
One of those (latency) is far more important than the other ones.
And frame generation doesn't help with that.
 
Not really. A 500ms latency on a 200 fps game is bad, but no worse than a 1ms latency on a 5 fps game. Each factor is in balance. If you can have a 60 fps game with 60 Hz input, that's better than a 30 fps game upscaled to 60 fps with 30 Hz input. However, having no interpolation option might lock you to 30 fps, which is worse than 60 fps interpolated + 30 Hz input.
 
One of those (latency) is far more important than the other ones.
And frame generation doesn't help with that.
Neither does running an online game at 20,000fps if the server updates at 20hz, though your perception of the latency may be different. I've also found that, by and large, frame generation + Nvidia Reflex feels as responsive as, or sometimes more responsive than, having both of those options off. I'm sure there are cases where it is not, but in the limited number of games I've tried, FG + Reflex isn't adding latency; it's just improving some of the other aspects that influence "performance".

So, yes, FG isn't helping with latency, but it isn't hurting latency either, it's improving other things, and it feels better to the end user because of the increase in motion fluidity. If some of the other variables in performance increase but the most important one stays the same, wouldn't that be better performance?
 
One of those (latency) is far more important than the other ones.

That’s very subjective and depends on the type of game and the sensitivity of the player. Not to mention that adding a frame of latency may be inconsequential relative to other high latency parts of the input->display chain.
 
Neither does running an online game at 20,000fps if the server updates at 20hz, though your perception of the latency may be different. I've also found that, by and large, frame generation + Nvidia Reflex feels as responsive as, or sometimes more responsive than, having both of those options off. I'm sure there are cases where it is not, but in the limited number of games I've tried, FG + Reflex isn't adding latency; it's just improving some of the other aspects that influence "performance".

So, yes, FG isn't helping with latency, but it isn't hurting latency either, it's improving other things, and it feels better to the end user because of the increase in motion fluidity. If some of the other variables in performance increase but the most important one stays the same, wouldn't that be better performance?
Yes, FG + Reflex doesn't add much latency, but what about just Reflex without FG? Let's say 60 to 120 with frame gen and Reflex versus just 60 with Reflex. The first may be more fluid, but the gameplay experience will be worse where it counts: player control.

I think that frame gen is an awesome feature, but to me it's not performance, it's a visual feature.
 
That’s very subjective and depends on the type of game and the sensitivity of the player. Not to mention that adding a frame of latency may be inconsequential relative to other high latency parts of the input->display chain.
I don't find that subjective. I'd argue that latency can make or break a game, and there isn't much else on a technical level that comes close to making a game feel as good.

For example, Prey and Bloodborne on PS4. Both run at 30, but only one has correct frame pacing. Prey is almost unplayable on PS4; it has a similar feel to Killzone 2 on PS3. Meanwhile Bloodborne has really good input lag at 30, and it's better for it.

Of course this is an exaggerated example, but it's probably one of the reasons why series like cod got so successful. They feel responsive, even if the average user couldn't tell you why.
 
I don't find that subjective. I'd argue that latency can make or break a game, and there isn't much else on a technical level that comes close to making a game feel as good.

For example, Prey and Bloodborne on PS4. Both run at 30, but only one has correct frame pacing. Prey is almost unplayable on PS4; it has a similar feel to Killzone 2 on PS3. Meanwhile Bloodborne has really good input lag at 30, and it's better for it.

Of course this is an exaggerated example, but it's probably one of the reasons why series like cod got so successful. They feel responsive, even if the average user couldn't tell you why.

That kinda proves the point. Bloodborne at 30fps and COD at 60fps would be considered horrendously high latency among some gamers today. Hence subjective.
 
That kinda proves the point. Bloodborne at 30fps and COD at 60fps would be considered horrendously high latency among some gamers today. Hence subjective.
I want to see this "gamer" person that thinks that playing cod at 60 fps is somehow unplayable due to high latency, sitting on his 480 fps gamer throne.

People that think that are a small minority, and shouldn't be relevant to the discussion.
 
I want to see this "gamer" person that thinks that playing cod at 60 fps is somehow unplayable due to high latency, sitting on his 480 fps gamer throne.

People that think that are a small minority, and shouldn't be relevant to the discussion.
The people who think Prey is 'unplayable' on PS4 are also undoubtedly a very small minority, yet you still thought it was relevant to the discussion.
 
The people who think Prey is 'unplayable' on PS4 are also undoubtedly a very small minority, yet you still thought it was relevant to the discussion.
By "unplayable" I meant so bad that it makes me not want to play it. And 150-200 ms of latency in an FPS, plus an old LCD TV, was a horrible experience... so bad that even someone who doesn't know anything about latency would notice it.
 