You're equating "more performance" with lower input latency, which is certainly one way of doing it.

And even this doesn't work anymore. Reflex lowers input latency while slightly reducing FPS.
Well, that's my issue too, the original point I was making: the over-eager marketing presenting generated frames as equivalent to rendered ones (world geometry, and yes, with delayed/accumulated/interpolated shadow & light tricks included) as if it is a performance increase (independent of physics simulation and animation, mostly), but some people love to nip at extrapolations and try to bend the original argument. I won't bother going around in circles.

Not to try to draw you back into this, but this is exactly the problem with having an undefined term. If you look at any review of Halo 5 that talks about performance, they will tell you that it's a 60fps game, and some will mention that animation/simulation of elements at a certain distance drops that framerate by half, and IIRC by half again at extreme distances. So in one second of 60 frames, you have some elements performing at 30fps and some at 15 as well. So what is the "real" performance in that game, and how is it quantifiable against a game that runs all elements at 60fps?

And what about online games? Call of Duty always strived for 60fps, but server tick rates were often below that. I believe the modern mainline ones are 60 now, but don't those games offer a 120fps mode? And isn't Warzone lower, at 20 or 30 still? And there are games, I believe S.T.A.L.K.E.R. was like this on PC, though I might be misremembering the game, where single-player games are played with a client connecting to a server instance running on the host machine. So the game's actual update is limited by tick rate regardless of how many frames are being rendered.
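To make that Halo 5 example concrete, here's a rough sketch (my own illustration, not Halo's actual code) of how a game can keep drawing every element every frame while stepping distant elements' animation at half or quarter rate. The distance thresholds and names are made up.

```python
from dataclasses import dataclass

@dataclass
class Character:
    distance_to_camera: float
    anim_time: float = 0.0  # seconds of animation played so far

def animation_divisor(distance: float) -> int:
    # Hypothetical distance bands: full rate up close, half rate at mid range,
    # quarter rate far away (60 -> 30 -> 15 Hz when rendering at 60fps).
    if distance < 20.0:
        return 1
    if distance < 60.0:
        return 2
    return 4

def update_animations(characters: list[Character], frame_index: int, dt: float) -> None:
    for c in characters:
        div = animation_divisor(c.distance_to_camera)
        if frame_index % div == 0:
            # Take a bigger step, less often, so playback speed stays correct;
            # the character is still drawn every frame with its last pose.
            c.anim_time += dt * div
```

Every frame still renders, so the frame counter says 60, but the far-away stuff only advances 15 or 30 times a second, which is exactly what makes "performance" hard to pin down.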
Didn't DF do a whole article on that recently?

I actually have no idea, I don't keep up with their output like I used to. They definitely covered Halo 5 at the time, and I think the written article had a tagline something like "The cost of 60fps", which was apparently not running everything at 60fps.
The only recent DF videos that I can think of that had games running things at 1/2 or 1/4 rate were the Resident Evil remakes.

AC Valhalla, if I remember correctly, has moving shadows running at half the frame rate on consoles as well.
There are many more. It’s not that uncommon.
Warzone runs at 120 FPS if you want on both current-gen consoles, fairly consistently at that. You have to enable it, though; by default it runs at 60.
This discussion is complicated because a term we've used for decades has a debatable definition.
That's the point, though. The tick rate, the rate at which the server updates, is the maximum for the physics, player locations and directions, animations, inputs... everything is tied to that. Except what the player sees, which is updated at a frame rate faster than the tick rate, and that data is interpolated on the user side to make motion appear smoother. Anyone who plays or has played competitive online FPS knows that higher framerates are better, even higher than the tick rate.

How is this, fundamentally, any different from frame generation, where the simulation is locked to a certain level of performance, but the player-side output is represented by a faster, smoother, interpolated representation of that simulation? And if that's the case, that they are fundamentally the same, then why is running Warzone at 120fps "better performance" when the tick rate is 20 or 30 or whatever it is, but frame generation is not?
Warzone at 20 FPS sounds unplayable lol.
There was also Plague Tale Requiem with its 60fps console update, where they decreased the animation of rats and distant NPCs to 30Hz to achieve 60fps.

Outrun in arcades is a game that updates the cars and roadside details every other frame, and the road and some other elements on the opposite frames, making unique frames 60 times a second, but your car only updates at 30fps, as does the driving surface.
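A toy sketch of that alternating-update idea (my own illustration, not OutRun's actual code): split the scene into two groups and advance each group on alternating frames, so a unique image still goes out 60 times a second while any individual element only moves at 30 Hz. The element names and the scalar "state" are made up.

```python
GROUP_A = ["player_car", "road_surface"]
GROUP_B = ["traffic_cars", "roadside_objects"]

def tick(frame_index: int, world: dict, dt: float) -> None:
    group = GROUP_A if frame_index % 2 == 0 else GROUP_B
    for name in group:
        # Each element is only touched every other frame, so it steps by 2*dt
        # to keep its on-screen speed correct.
        world[name] += 2 * dt

# Usage sketch:
# world = {name: 0.0 for name in GROUP_A + GROUP_B}
# for frame in range(60): tick(frame, world, 1/60)
```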
I don't believe this is how that works, though; a game doesn't need to synchronize with a server to perform game logic. Obviously it needs to phone home so players are showing at the same positions in everyone's game, but comparing that to frame interpolation seems like a bit of a stretch. Tbh I don't know how modern netcode works, though, so I'd be interested to hear if anyone has industry knowledge on this.
Almost anything that affects gameplay has to be synced, or else each player will have a much different experience. Take physics, for example. There are client-side-only physics effects like debris from destruction, but if that destruction is to produce gameplay-altering physics objects, such as large chunks that will block bullets or impede a path, that needs to be synchronized with all players, or else some players will not have those objects in those places.

You can see this in modern games like Helldivers 2. If you watch the animation of the Bile Titans when they are walking, it's fairly smooth and looks as natural as a three-story-tall terminid could move. When it dies, it becomes a networked physics object as it ragdolls, and instead of being a smoothly animated monster, its legs and body become a network-synced physics object that often jitters and twitches to its final resting place, and its body can block paths or bullets and be climbed on until it despawns. You can see the point when it stops being an animated mesh and becomes a networked physics object, because the quality of its falling animation isn't at the same level as the walking and attack animations, and I suspect it's because of the way Helldivers 2's networked physics are synced. The lack of dedicated servers probably doesn't help. I haven't counted the frames, but the Bile Titan falling looks to play back at 15-20fps to me.
To be clear, Warzone does indeed appear to run at '20 fps' on the server: https://earlygame.com/guides/what-is-tick-rate
Game | Tick Rate (Hz) |
Call of Duty: Warzone | 20 |
Apex Legends | 20 |
Fortnite | 30 |
Overwatch | 60 |
Counter-Strike: Global Offensive | 60 |
Valorant | 120 |
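For scale (my own back-of-the-envelope numbers, not from that article): a 20 Hz tick means a new authoritative update every 50 ms, while a 120fps client draws a frame roughly every 8.3 ms, so about six rendered frames go by between server updates.

```python
# Back-of-the-envelope: rendered frames per server tick.
tick_rate_hz = 20        # e.g. Warzone, per the table above
frame_rate_fps = 120     # the consoles' optional 120 FPS mode
print(f"{1000 / tick_rate_hz:.1f} ms between ticks, "
      f"{1000 / frame_rate_fps:.1f} ms between frames, "
      f"~{frame_rate_fps / tick_rate_hz:.0f} frames per server update")
# -> 50.0 ms between ticks, 8.3 ms between frames, ~6 frames per server update
```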
Server tick rate and gameplay logic update rate are related, but not the same. The server resolves state from (data received from) each client at the tick rate; the clients update at a much higher rate.

Framerate and gameplay logic are also related but not the same. But we instinctively tie frame rate to performance, even if the frame rate is well above the game logic rate. Again, I cited a couple of examples earlier where things like physics and character animation can be calculated, and sometimes even displayed, at rates that are below the frame rate, but we often disregard that when we discuss performance, or even in the more specific discussion of frame rate.

And now we are struggling to describe how tech like frame generation is different from "performance" in the traditional sense. And I just don't get it. Why is frame generation any different from any other increase in performance, if in many cases the game logic isn't running any faster? What we are getting on screen is just an interpolated representation of that logic, either calculated locally or remotely and then synced over a network. So why does it suddenly not count if we are creating more in-between images via frame generation, but it does count if we are generating those frames via rasterization?
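That split between logic rate and frame rate is baked into how most engines are structured. A minimal sketch of the classic fixed-timestep loop (the usual "fix your timestep" pattern, with made-up function names, not any particular engine's code):

```python
import time

SIM_DT = 1.0 / 20.0  # gameplay logic at a fixed 20 Hz (pick any rate)

def run(sim_update, render, get_time=time.perf_counter):
    """Run logic at a fixed rate while rendering as fast as the GPU allows.

    render() gets `alpha`, how far we are between the last two logic steps,
    so it can interpolate positions and draw smooth motion even though the
    simulation only advanced 20 times this second.
    """
    prev_time = get_time()
    accumulator = 0.0
    while True:
        now = get_time()
        accumulator += now - prev_time
        prev_time = now
        while accumulator >= SIM_DT:   # run zero or more logic steps this frame
            sim_update(SIM_DT)
            accumulator -= SIM_DT
        alpha = accumulator / SIM_DT   # 0..1 blend between previous and current state
        render(alpha)
```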
The reason it's not unplayable is that the client is running a lot faster, as fast as your PC can. It interpolates the player movements so you can track a guy who's moving even though their real position is only known every 1/20th of a second. It gives you a smooth sense of motion and clarity in game, but it can also be inaccurate and results in things like no-regs and getting shot through doors and around corners.

That's the point, though. People are saying that Warzone, a multiplayer-only game, performs at 120fps when the game, at least all of the interactions that matter, runs at 20fps. So what is the "performance" of Warzone? And if we concede that Warzone is better at 120fps even though the tick rate is just 20, then the concept of frame generation not being real performance because the game logic is running slower sort of goes out the window.
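A minimal sketch of the client-side interpolation being described, assuming positions arrive in 20 Hz snapshots and the client renders at 120fps; the function names are mine, not from any particular engine:

```python
def lerp(a: float, b: float, t: float) -> float:
    return a + (b - a) * t

def interpolated_position(prev_snap, next_snap, render_time: float) -> float:
    """Blend between the two most recent 20 Hz server snapshots.

    prev_snap / next_snap are (timestamp, position) pairs, 1/20 s apart.
    The client usually renders about one tick in the past so a pair to
    interpolate between always exists; that small delay is part of why you
    can get shot around corners.
    """
    t0, p0 = prev_snap
    t1, p1 = next_snap
    alpha = (render_time - t0) / (t1 - t0)      # 0..1 progress within the tick
    alpha = max(0.0, min(1.0, alpha))
    return lerp(p0, p1, alpha)

# Example: an enemy was at x=10.0 at t=0.00 s and x=10.8 at t=0.05 s.
# A 120 fps client draws ~6 frames in that window, each at an interpolated x:
for i in range(6):
    print(interpolated_position((0.00, 10.0), (0.05, 10.8), i / 120))
```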
I think you're making it more complicated than it is: "framerate" has always been how often we update game state and draw a new frame. At minimum that means updating the camera matrix, although for almost any 3D game it also means updating player position if nothing else. If certain elements in the world, like animations, global illumination, VFX physics, etc., are updated at a lower rate, that's a graphical artifact worth commenting on. Interpolating those frames at a higher rate (via TAA or prediction, or something more special-purpose like Oculus timewarp) can have a good effect and is a different thing.
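For what it's worth, the simplest possible version of that "different thing" looks something like this (a naive 50/50 blend just to show the idea; real frame generation such as DLSS FG or timewarp-style reprojection uses motion vectors and reprojection rather than a plain blend):

```python
import numpy as np

def midpoint_frame(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Synthesize an in-between image from two rendered frames.

    No game-state update happens for this frame; it is derived entirely from
    images that were already rendered, which is the crux of the debate above.
    """
    blended = (frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2.0
    return blended.astype(frame_a.dtype)

# e.g. insert midpoint_frame(prev, curr) between each pair of rendered frames
# to turn a 60fps stream into a 120fps one (at the cost of roughly a frame of latency,
# since you have to wait for the next rendered frame before blending).
```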
Not quite. 120 FPS rendering is 120 FPS performance, but 20 Hz physics is not as good as 60 or 120 Hz. 120 FPS rendering is not the same quality as 120 Hz native everything. In the same way, 120fps interpolated rendering is better than 60fps rendering, but not as good as 120 Hz native rendering. And 2160p upscaled with DLSS is better than 1440p native rendering, but not as good as 2160p native rendering.