Current Generation Games Analysis Technical Discussion [2024] [XBSX|S, PS5, PC]

Well that's my issue too, the original point I was making: the over-eager marketing presenting generated frames as equivalent to rendered ones (world geometry, and yes, yes, with delayed/accumulated/interpolated shadow & light tricks included) as if it were a performance increase, independent of physics simulation and, mostly, animation. But some people love to nip at extrapolations and try to bend the original argument. I won't bother going around in circles.
 
Not to try to draw you back into this, but this is exactly the problem with having an undefined term. Any review of Halo 5 that talks about performance will tell you it's a 60fps game, and some will mention that animation/simulation of elements beyond a certain distance drops to half that rate, and IIRC half again at extreme distances. So in one second of 60 frames, you have some elements updating at 30fps and some at 15. What is the "real" performance of that game, and how is it quantifiable against a game that runs all elements at 60fps?

And what about online games? Call of Duty always strove for 60fps, but server tick rates were often below that. I believe the modern mainline ones are 60 now, but don't those games offer a 120fps mode? And isn't Warzone lower still, at 20 or 30? And there are games, I believe S.T.A.L.K.E.R. was like this on PC, though I might be misremembering the game, where single player is played with a client connecting to a server instance running on the host machine. So the game's actual update is limited by tick rate regardless of how many frames are being rendered.
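As a rough illustration of the distance-based throttling described above, here is a minimal C++ sketch with made-up thresholds and names; it is not how Halo 5 actually implements it, just the general idea of advancing distant characters' animation only on every 2nd or 4th rendered frame.

```cpp
// Hypothetical sketch (made-up thresholds, not Halo 5's actual code): animation for
// distant characters is only advanced on every 2nd or 4th rendered frame, so at 60fps
// output those characters effectively animate at 30 or 15 fps.
#include <cstdint>
#include <vector>

struct Character {
    float distanceToCamera;
    float animTime;   // seconds of animation actually advanced
};

// How many rendered frames to wait between animation updates.
int animStep(float distance) {
    if (distance > 80.0f) return 4;   // ~15 Hz at a 60 fps render rate
    if (distance > 40.0f) return 2;   // ~30 Hz
    return 1;                         // full rate
}

void updateAnimations(std::vector<Character>& chars, std::uint64_t frameIndex, float dt) {
    for (auto& c : chars) {
        const int step = animStep(c.distanceToCamera);
        if (frameIndex % step == 0) {
            // Advance by the accumulated time so playback speed stays correct;
            // only the update granularity gets coarser with distance.
            c.animTime += dt * static_cast<float>(step);
        }
    }
}

int main() {
    std::vector<Character> chars = {{10.0f, 0.0f}, {50.0f, 0.0f}, {120.0f, 0.0f}};
    for (std::uint64_t frame = 0; frame < 60; ++frame)
        updateAnimations(chars, frame, 1.0f / 60.0f);
}
```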

This discussion is complicated, because a term we've used for decades is of debatable definition.
 
Didn't DF do a whole article on that recently?
I actually have no idea, I don't keep up with their output like I used to. They definitely covered Halo 5 at the time, and I think the written article had a tagline something like "The cost of 60fps", the cost apparently being that not everything runs at 60fps.
 
There was also Plague Tale Requiem with its 60fps console update, where they decreased the animation rate of rats and distant NPCs to 30Hz to achieve 60fps.
 
The only recent DF videos I can think of that had games running things at 1/2 or 1/4 rate were the Resident Evil remakes.
AC Valhalla, if I remember correctly, has moving shadows running at half the frame rate on consoles as well.

There are many more. It’s not that uncommon.
 

I didn't say it was uncommon.

I merely pointed to the last set of DF videos where I remember it being highlighted.
 
Warzone runs at 120 FPS if you want on both current gen consoles, fairly consistently at that. You have to enable it, though; by default it runs at 60.

Warzone at 20 FPS sounds unplayable lol.
 
That's the point, though. The tick rate is the rate at which the server updates, and therefore the maximum rate at which physics, player locations and directions, animations, inputs... everything is updated. Everything except what the player sees, which is rendered at a frame rate faster than the tick rate, with that data interpolated on the client side to make motion appear smoother. Anyone who plays or has played competitive online FPS knows that higher framerates are better, even higher than the tick rate. How is this, fundamentally, any different from frame generation, where the simulation is locked to a certain level of performance, but the player-side output is a faster, smoother, interpolated representation of that simulation? And if they are fundamentally the same, then why is running Warzone at 120fps "better performance" when the tick rate is 20 or 30 or whatever it is, but frame generation is not?
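A minimal sketch of the client-side interpolation being described, assuming hypothetical snapshot and entity types; this is the general technique, not Warzone's actual netcode.

```cpp
// Hypothetical sketch of client-side entity interpolation: the server reports a remote
// player's position 20 times a second, but the client renders at 120 fps by blending
// between the two most recent snapshots it has received.
#include <cstdio>

struct Vec3 { float x, y, z; };

struct Snapshot {
    double serverTime;   // when the server sampled this state
    Vec3   position;
};

Vec3 lerp(const Vec3& a, const Vec3& b, float t) {
    return { a.x + (b.x - a.x) * t,
             a.y + (b.y - a.y) * t,
             a.z + (b.z - a.z) * t };
}

// Blend between two known states; renderTime is chosen slightly in the past so there
// are always two snapshots to blend between.
Vec3 interpolatedPosition(const Snapshot& older, const Snapshot& newer, double renderTime) {
    const double span = newer.serverTime - older.serverTime;
    float t = span > 0.0 ? static_cast<float>((renderTime - older.serverTime) / span) : 1.0f;
    if (t < 0.0f) t = 0.0f;
    if (t > 1.0f) t = 1.0f;
    return lerp(older.position, newer.position, t);
}

int main() {
    Snapshot a{1.00, {0.0f, 0.0f, 0.0f}};   // server tick at t = 1.00 s
    Snapshot b{1.05, {5.0f, 0.0f, 0.0f}};   // next tick 50 ms later (20 Hz)
    // Six rendered frames (120 fps) fall between those two ticks:
    for (int i = 0; i < 6; ++i) {
        Vec3 p = interpolatedPosition(a, b, 1.00 + i * (1.0 / 120.0));
        std::printf("frame %d: x = %.3f\n", i, p.x);
    }
}
```

The rendered positions between ticks are estimates derived from the last known states, which is why motion looks smooth even though the authoritative state only changes 20 times a second.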
OutRun in arcades is a game that updates the cars and roadside details on every other frame, and the road and some other elements on the opposite frames, producing unique frames 60 times a second, but your car only updates at 30fps, as does the driving surface.
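A hypothetical C++ restatement of that alternating-update scheme (the real game is 1986 arcade hardware, so this is purely illustrative): every displayed frame is unique, but each group of elements only advances on every other frame.

```cpp
#include <cstdint>

struct World { /* car, roadside sprites, road segments, sky ... */ };

void updateCarAndRoadside(World&) { /* advance player car and roadside objects */ }
void updateRoadAndSky(World&)     { /* advance road surface and background */ }
void drawFrame(const World&)      { /* present the current mix of old and new state */ }

void runFrame(World& world, std::uint64_t frameIndex) {
    if (frameIndex % 2 == 0)
        updateCarAndRoadside(world);   // effectively 30 Hz
    else
        updateRoadAndSky(world);       // also 30 Hz, offset by one frame
    drawFrame(world);                  // 60 unique images per second
}

int main() {
    World world;
    for (std::uint64_t f = 0; f < 120; ++f) runFrame(world, f);
}
```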
 
I don't believe this is how that works, though; a game doesn't need to synchronize with a server to perform game logic. Obviously it needs to phone home so players show up at the same positions in everyone's game, but comparing that to frame interpolation seems like a bit of a stretch. Tbh I don't know how modern netcode works, though, so I'd be interested to hear if anyone has industry knowledge on this.
 
Almost anything that affects gameplay has to be synced, or else each player will have a very different experience. Take physics, for example. There are client-side-only physics effects like debris from destruction, but if that destruction produces gameplay-altering physics objects, such as large chunks that will block bullets or impede a path, it needs to be synchronized across all players, or some players won't have those objects in those places.

You can see this in a modern game like Helldivers 2. If you watch the animation of the Bile Titans when they are walking, it's fairly smooth and looks as natural as a three-story-tall Terminid could move. When it dies, it ragdolls and becomes a networked physics object, and instead of a smoothly animated monster, its legs and body become a network-synced physics object that often jitters and twitches on the way to its final resting place; its body can block paths and bullets and be climbed on until it despawns. You can see the moment it stops being an animated mesh and becomes a networked physics object, because the quality of its falling animation isn't at the same level as the walking and attack animations, and I suspect that's down to the way Helldivers 2's networked physics are synced. The lack of dedicated servers probably doesn't help. I haven't counted the frames, but the Bile Titan falling looks to play back at 15-20fps to me.
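A minimal sketch of that split, with hypothetical names (client-only cosmetic debris versus network-replicated, gameplay-affecting objects); it is not Helldivers 2's actual code, just the kind of decision being described.

```cpp
// Hypothetical sketch: cosmetic debris can stay client-side, but anything that can
// change outcomes for other players has to be owned and replicated by the network
// authority, even if that means it only updates at the (much lower) network rate.
#include <cstdio>

struct DestructionPiece {
    bool blocksBulletsOrPaths;   // does it affect gameplay for everyone?
};

enum class SimulationDomain { ClientOnly, NetworkReplicated };

SimulationDomain chooseDomain(const DestructionPiece& piece) {
    return piece.blocksBulletsOrPaths ? SimulationDomain::NetworkReplicated
                                      : SimulationDomain::ClientOnly;
}

int main() {
    DestructionPiece dustAndShards{false};
    DestructionPiece collapsedWallChunk{true};
    std::printf("shards: %d, wall chunk: %d\n",
                static_cast<int>(chooseDomain(dustAndShards)),
                static_cast<int>(chooseDomain(collapsedWallChunk)));
}
```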

Not everything needs to be synced server-side, but more and more is, because the simulation needs to be consistent for everyone to create a level playing field. One of the biggest updates in Counter-Strike 2, for example, is that smoke now occupies the same space for all players; in previous games I believe it only had the same spawn point and the volume was created with client-side code, so sometimes a player who looked obscured by smoke from their own viewpoint wasn't actually obscured for other players.
 
Server tick rate and gameplay logic update rate are related, but not the same. The server resolves state from (data received from) each client at the tick rate; the clients update at a much higher rate.
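A rough sketch of that split on the server side, assuming a hypothetical 20 Hz authoritative loop (a generic pattern, not any specific game's code):

```cpp
// Hypothetical sketch of an authoritative server loop: gameplay-relevant state is only
// resolved at the fixed tick rate (20 Hz here), consuming whatever client input arrived
// since the last tick, no matter how many frames each client renders in between.
#include <deque>

struct InputCmd  { int clientId; float moveX, moveY; bool fire; };
struct GameState { /* authoritative positions, health, projectiles ... */ };

void applyInput(GameState&, const InputCmd&) { /* move player, register shots ... */ }
void stepSimulation(GameState&, double)      { /* physics, hit resolution ... */ }
void broadcastSnapshot(const GameState&)     { /* send updated state to every client */ }

void serverTick(GameState& state, std::deque<InputCmd>& pending, double tickDt) {
    while (!pending.empty()) {           // drain everything received since the last tick
        applyInput(state, pending.front());
        pending.pop_front();
    }
    stepSimulation(state, tickDt);
    broadcastSnapshot(state);
}

int main() {
    GameState state;
    std::deque<InputCmd> pending;
    const double tickDt = 1.0 / 20.0;    // 20 Hz tick
    for (int tick = 0; tick < 20; ++tick) serverTick(state, pending, tickDt);
}
```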
 
Depends on the setup. I'd expect local physics and server synchronisation. Unified physics over the network at 20 Hz from the server would be pretty ropey. Also, I'm not at all sure about 'physics at a lower rate'. The lower the rate you go, the more issues you get. You generally want to run physics faster for accuracy, decoupling it from rendering so you can have 60 Hz physics on a 30 fps variable-framerate drawing.
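That decoupling is commonly done with a fixed-timestep accumulator; a minimal sketch, assuming 60 Hz physics under a roughly 30 fps render loop (made-up numbers, just the pattern):

```cpp
// Hypothetical sketch of decoupled update rates: physics steps at a fixed 60 Hz
// regardless of how fast (or how unevenly) frames are drawn, and the render pass
// blends between the last two physics states so motion stays smooth.
#include <algorithm>

struct PhysicsState { float x = 0.0f, vx = 1.0f; };

PhysicsState step(PhysicsState s, float dt) {
    s.x += s.vx * dt;                 // integrate at a fixed, stable dt
    return s;
}

void render(const PhysicsState& prev, const PhysicsState& curr, float alpha) {
    // Draw a blend of the two most recent physics states.
    float x = prev.x + (curr.x - prev.x) * alpha;
    (void)x;                          // submit to the renderer here
}

int main() {
    const float physicsDt = 1.0f / 60.0f;          // 60 Hz simulation
    float accumulator = 0.0f;
    PhysicsState prev, curr;
    for (int frame = 0; frame < 300; ++frame) {
        const float frameDt = 1.0f / 30.0f;        // pretend we draw at ~30 fps
        accumulator = std::min(accumulator + frameDt, 0.25f);  // clamp runaway frames
        while (accumulator >= physicsDt) {
            prev = curr;
            curr = step(curr, physicsDt);
            accumulator -= physicsDt;
        }
        render(prev, curr, accumulator / physicsDt);
    }
}
```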

Networking doesn't just have to deal with differing framerates but also with different points in time. I get the analogy, but I also don't think it proves the argument. People aren't fine with low tick rate servers and want them faster. (Not that I'm sure what the argument is! ;))

To be clear, Warzone does indeed appear to run at '20 fps' on the server. https://earlygame.com/guides/what-is-tick-rate

Game | Tick Rate
Call of Duty: Warzone | 20
Apex Legends | 20
Fortnite | 30
Overwatch | 60
Counter-Strike: Global Offensive | 60
Valorant | 120

The reason it's not unplayable is that the client runs a lot faster, as fast as your PC can. It interpolates player movements so you can track a guy who's moving even though his real position is only known every 1/20th of a second. That gives you a smooth sense of motion and clarity in game, but it can also be inaccurate, and results in things like no-regs and getting shot through doors and around corners.
 
Framerate and gameplay logic are also related but not the same. Yet we instinctively tie frame rate to performance, even when the frame rate is well above the game logic rate. Again, I cited a couple of examples earlier where things like physics and character animation can be calculated, and sometimes even displayed, at rates below the frame rate, but we often disregard that when we discuss performance, or even the more specific topic of frame rate. And now we are struggling to describe how tech like frame generation is different from "performance" in the traditional sense. I just don't get it. Why is frame generation any different from any other increase in performance, if in many cases the game logic isn't running any faster? What we get on screen is just an interpolated representation of that logic, either calculated locally or remotely and then synced over a network. So why does it suddenly not count when we create more in-between images via frame generation, but it does count when we generate those frames via rasterization?
That's the point, though. People are saying that Warzone, a multiplayer-only game, performs at 120fps when the game, or at least all of the interactions that matter, runs at 20fps. So what is the "performance" of Warzone? And if we concede that Warzone is better at 120fps even though the tick rate is just 20, then the idea that frame generation isn't real performance because the game logic runs slower sort of goes out the window.
 
So I've been trying out the recent The Last of Us Part 1 patch on PC which added FSR3 and FSR3 Frame Generation... and I must say, that game is silky smooth now, WOW. Frame gen, for the amount I played, ran perfectly, and I didn't notice any of the issues that typically come with FG, such as juddery frame times or anything else like that. This was literally perfectly smooth.

I did a quick comparison of the "performance" improvement between settings, and it's huge. The first three images all have frame gen off and simply compare performance with FSR off, Quality, and Native AA. The last one is with FSR Quality + Frame Gen + Vsync off. All images are of the game maxed out at 3840x1600.

FSR3 Off: 1600pdefaultmax.jpg
FSR3 Quality: 1600p-FSR3quality.jpg
FSR3 Native AA: 1600p-AAnativemax.jpg
FSR3 Quality, FG On, Vsync Off: 1600p-FSR3-FGunlocked.jpg

Color me very impressed. If TLOU Part 2 runs like this out of the box, I will be very happy. It's come a long way.
 

I think you're making it more complicated than it is. "Framerate" has always been how often we update game state and draw a new frame. At minimum that means updating the camera matrix, although for almost any 3D game it also means updating the player position if nothing else. If certain elements in the world, like animations, global illumination, VFX physics, etc., are updated at a lower rate, that's a graphical artifact worth commenting on. Interpolating those frames up to a higher rate (via TAA or prediction, or something more special-purpose like Oculus Timewarp) can have a good effect and is a different thing.
 

It doesn’t help to cherry pick which updates are important and which ones are just artifacts. There’s no objective way to do that. I think we’ve established that the requirement to update game state every frame is already not being met by many games, even before FG became a thing.
 
So what is the "performance" of Warzone. And if we concede that Warzone is better at 120fps even though the tick rate is just 20, then the concept of frame generation not being real performance because the game logic is running lower that sort of goes out the window.
Not quite. 120 FPS rendering is 120 FPS performance, but 20 Hz physics is not as good as 60 or 120 Hz. 120 FPS rendering is not the same quality as 120 Hz native everything. In the same way, 120 fps interpolated rendering is better than 60 fps rendering, but not as good as 120 Hz native rendering. And 2160p upscaled from DLSS is better than 1440p native rendering but not as good as 2160p native rendering.

I don't really disagree with the argument, and I think the other examples of offline games with mixed update rates prove it better. We know 'resolution' isn't a real thing and hasn't been for generations, as mixed-resolution buffers are used; it was established back in the PS3 era that pixel counting applies to the 'opaque geometry' buffer only. I'm not really sure what the argument against frame interpolation is. Is it whether we call 120 fps interpolated output "120 fps"? If that's the confusion, the problem is just in the definition. Establish that 'framerate' or 'performance' means "how many times a unique image is presented to the screen" and call it a day. Input responsiveness is then a separate metric.
 