Frame Rates [2019+ Edition] *spawn*

It’ll be similar to this generation I think. Perhaps with more flexibility on performance/quality modes. But I expect most twitch games to have a baseline 60, and other titles to be 30/60. Maybe a twitch game may support 60/120. But I’m not sure about that.
 
Well, it's also true that in motion, a 60fps title does look better than a 30fps one.
Might need to introduce an autoban for people using the word 'better' without qualifying it.

It's subjective. Some people will prefer the superior eye-candy of 30 fps; others, the smoother updates and better motion fidelity of 60; and for others it'll depend on the game, and on the TV.
 
A few games on base PS4 already run at 120fps. As usual with framerate it's going to be up to the devs.

As it should be.

The one thing I will add to the fps discussion is that the next generation of consoles will be objectively more capable of achieving 60 fps or higher due to the increased CPU capability. It will be less work and developers will have to make fewer sacrifices to achieve 60fps, so I honestly believe more will choose to do so.
 
I meant for animations/movement.
Image clarity and smoothness while the screen is changing is the most noticeable visual difference of a higher frame rate for most people.
If you show something moving at 1 m/s over the course of 120 frames rather than 30, the delta between consecutive frames is so much smaller that you see the image almost as if it were static, versus the ghosting effect that occurs at 30fps.
This is also, imo, another way that higher resolutions benefit. If the frame rate is so high that the delta between frames isn't enough to change a pixel at a lower resolution, the render is just the same static image again; at a higher resolution, i.e. 4K, the delta would have to be even smaller for there to be no movement across pixels.

So 4K + HDR + a super high frame rate working in tandem should create a very obvious difference from 1080p, for those that cannot see the difference yet.
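A rough back-of-the-envelope sketch of that delta argument (the crossing time and numbers below are arbitrary, just to show how per-frame movement shrinks as frame rate rises and spans more pixels as resolution rises):

```python
# Toy numbers: an object crossing the full screen width in 2 seconds.
CROSSING_TIME_S = 2.0

for width, label in [(1920, "1080p"), (3840, "4K")]:
    for fps in (30, 60, 120):
        pixels_per_frame = width / (CROSSING_TIME_S * fps)
        print(f"{label} @ {fps:3d}fps: ~{pixels_per_frame:4.0f} px of movement per frame")
```

At 120fps the same motion only shifts a handful of pixels per frame, while at 4K there are twice as many pixels for that motion to register on.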
 
For me it's obvious that many games that now run at 30 fps on PS4/XBOne (or are planned to, as they will be released in the next 2 years)... will run at double the fps rate on PS5/XBScarlett as the only benefit of the new gen... then the few developed only for PS5 may be 30 fps... 120 fps? Not many of the 4K TVs people own can run 120 fps... so we'll maybe see that in 5 years...
 
A big consideration is frame pacing, since I game on a 2017 OLED (no VRR). Playing at 30fps without any dropped frames and with good frame pacing is a much "smoother" experience than 60fps with dropped frames.

For example, the performance mode in God of War does not provide a locked 60fps (45-60fps) and looks extremely choppy on an OLED. The quality mode is not only smoother, but a more consistent, polished and cinematic experience. The response time of the controls is also less variable, making the controls feel more predictable. Even a VRR-capable display cannot fully negate the flaws of a variable framerate.

The icing on top is the eye candy. But keeping a locked framerate has always been a fetish for me. Nothing pulls me out of an immersive experience more than stuttering. Panning an in-game camera using joysticks makes frame pacing issues far more obvious, whereas the twitch-like motion of a mouse can mask the problem.
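To put some illustrative numbers on that (the frame times below are made up, just to contrast even pacing with a higher but uneven rate):

```python
# Frame times in milliseconds for six consecutive frames.
locked_30   = [33.3] * 6                            # locked 30fps, perfectly paced
unlocked_60 = [16.7, 16.7, 33.3, 16.7, 16.7, 33.3]  # ~45fps average, uneven pacing

def describe(name, frame_times):
    avg_fps = 1000 / (sum(frame_times) / len(frame_times))
    spread = max(frame_times) - min(frame_times)
    print(f"{name}: ~{avg_fps:.0f}fps average, {spread:.1f} ms spread between shortest and longest frame")

describe("Locked 30fps   ", locked_30)
describe("Unlocked ~60fps", unlocked_60)
```

The second mode delivers more frames on average, but every 16.6 ms hiccup shows up as judder, which is the "choppy" feeling described above.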
 
For me it's obvious that many games that now run at 30 fps on PS4/XBOne (or are planned to, as they will be released in the next 2 years)... will run at double the fps rate on PS5/XBScarlett as the only benefit of the new gen... then the few developed only for PS5 may be 30 fps... 120 fps? Not many of the 4K TVs people own can run 120 fps... so we'll maybe see that in 5 years...

If that were true, the FFX remaster would run at something like 600fps.

It's going to be trivial for cross-generation games to run at 4K60, yes, but only if there are no generational improvements applied beyond resolution and framerate. Throw some ray tracing into the mix, and something's got to give. Same with things like draw distance, shadow map resolution, texture resolution, quantity of NPCs, quality of AI.

There's a whole bunch of settings open to the next generation - without going to the developmental expense of implementing ray tracing - that can be tweaked to result in a 4K30 version of a current generation game.
 
A big consideration is frame pacing, since I game on a 2017 OLED (no VRR). Playing at 30fps without any dropped frames and with good frame pacing is a much "smoother" experience than 60fps with dropped frames.

For example, the performance mode in God of War does not provide a locked 60fps (45-60fps) and looks extremely choppy on an OLED. The quality mode is not only smoother, but a more consistent, polished and cinematic experience. The response time of the controls is also less variable, making the controls feel more predictable. Even a VRR-capable display cannot fully negate the flaws of a variable framerate.

The icing on top is the eye candy. But keeping a locked framerate has always been a fetish for me. Nothing pulls me out of an immersive experience more than stuttering. Panning an in-game camera using joysticks makes frame pacing issues far more obvious, whereas the twitch-like motion of a mouse can mask the problem.

VRR provides an interesting wrinkle. Will games be able to detect if the console is in VRR mode and automatically activate an uncapped FPS mode when it is? That would be interesting.
 
Image clarity and smoothness while the screen is changing is the most noticeable visual difference of a higher frame rate for most people.
If you show something moving at 1 m/s over the course of 120 frames rather than 30, the delta between consecutive frames is so much smaller that you see the image almost as if it were static, versus the ghosting effect that occurs at 30fps.
This is also, imo, another way that higher resolutions benefit. If the frame rate is so high that the delta between frames isn't enough to change a pixel at a lower resolution, the render is just the same static image again; at a higher resolution, i.e. 4K, the delta would have to be even smaller for there to be no movement across pixels.

So 4K + HDR + a super high frame rate working in tandem should create a very obvious difference from 1080p, for those that cannot see the difference yet.

Yes, I clearly said better and why I thought so; I didn't just say 'higher frame rate is better'. To me it's clear that 60fps has advantages over 30, aside from meaning less can be put on screen. It also depends on the game: an example is MGS2, which ran a smooth 60fps for the most part, while MGS3 was obviously doing more, with the jungle and all, but at an unstable 30fps (almost never really reaching that target). MGS2 felt and looked better in its animations and action, while for more static scenes MGS3 looked nicer, but even then I don't think there's that much of a difference between the two, aside from the different setting.

I’m most excited to see how the uncapped games of this gen fare on next-gen hardware.
That, combined with dynamic res, could produce some pretty exciting results from past-gen games.

Shouldn't be a problem since current gen will be 7 years behind PS5 by then.
 
VRR provides an interesting wrinkle. Will games be able to detect if the console is in VRR mode and automatically activate an uncapped FPS mode when it is? That would be interesting.
Why wouldn’t they? TVs report that info to consoles via CEC and the game can query/exchange the status via the console.
 
Why wouldn’t they? TVs report that info to consoles via CEC and the game can query/exchange the status via the console.

I expect the option will be there. I just never thought about it before. Though, if the frame rate is too variable, a lower, locked, frame rate may be preferable to one that has wild swings. Even so, the developer at least has more than the binary 30/60 choice if the display can sync to a frame rate in between.
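Purely hypothetical sketch of what that could look like game-side; none of these function names correspond to a real console SDK, and how the display's VRR support actually reaches the game (the CEC route mentioned above, EDID, or something else) is left as an assumption:

```python
def display_supports_vrr() -> bool:
    """Stand-in for a platform query; a real SDK would expose whatever
    display capabilities the console has negotiated over HDMI."""
    return True  # assume a VRR-capable display for this example

def pick_framerate_mode():
    # If the display can sync to arbitrary refresh rates, run uncapped;
    # otherwise fall back to a fixed, vsynced target.
    if display_supports_vrr():
        return {"frame_cap": None, "vsync": False}
    return {"frame_cap": 60, "vsync": True}

print(pick_framerate_mode())
```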
 
A big consideration is frame pacing, since I game on a 2017 OLED (no VRR). Playing at 30fps without any dropped frames and with good frame pacing is a much "smoother" experience than 60fps with dropped frames.

For example, the performance mode in God of War does not provide a locked 60fps (45-60fps) and looks extremely choppy on an OLED. The quality mode is not only smoother, but a more consistent, polished and cinematic experience. The response time of the controls is also less variable, making the controls feel more predictable. Even a VRR-capable display cannot fully negate the flaws of a variable framerate.

The icing on top is the eye candy. But keeping a locked framerate has always been a fetish for me. Nothing pulls me out of an immersive experience more than stuttering. Panning an in-game camera using joysticks makes frame pacing issues far more obvious, whereas the twitch-like motion of a mouse can mask the problem.

Yup, I do everything I can to lock it at 60 FPS (been doing this ever since I competed professionally in UT back in 1999). There will, of course, always be things that will make it drop (loading assets will almost invariably cause a short hitch, for example) for a frame or two, but that happens at 30 FPS as well in the games where this happens.

VRR doesn't really help this. It does make the screen look smoother versus V-sync off when the framerate does drop, but I can immediately feel the variability in control response feedback. Games just don't feel good with VRR. At least when limited to 60 FPS at the high end.

It's possible that with a 120 FPS cap and frame rate varying between say 110 and 120 that VRR won't feel so bad, but right now dropping from 60 FPS to 55 FPS on a VRR display is immediately noticeable to me from a control feedback response loop POV. I know most gamers probably wouldn't notice, however.

I'd imagine that control feedback response loop would be even more noticeable at 30 FPS. Although again, I wonder how many console gamers will notice as they are already used to the sluggishness of 30 FPS gaming.
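For reference, the per-frame times behind those numbers (one frame is roughly the smallest unit of extra input delay you can feel):

```python
for fps in (120, 110, 60, 55, 30):
    print(f"{fps:3d} fps -> {1000 / fps:5.2f} ms per frame")
```

A 110→120 swing is under a millisecond of frame-time variation, 55→60 is about 1.5 ms, and a proportional swing below 30 fps costs a few milliseconds per frame, which fits the point about it being even more noticeable there.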

Regards,
SB
 
Next gen won't be 30 FPS nor 60. It'll be VRR.

Stop trying to gatekeep "good videogames" through the one feature you like, people. We should have grown beyond that by now.

...
As much as I'm looking forward to VRR (I've been dreaming of this tech for 15yrs), the only issue is that it isn't a feature devs can rely on being readily available any time soon.

VRR won't be prevalent or dominant, and certainly not standard, for some time, so they will always have to provide a capped 30fps or 60fps mode if they don't want a stuttery experience for console users who don't have VRR.

Unless you're ok with gamers having a rough experience, it won't be reasonable to put out a game that only runs at 40-50fps and rely on VRR to keep it smooth at the other end.

I expect we'll see 30fps remain dominant (though somewhat less dominant) next-gen, but I also expect we'll see VRR as an option on ~"60fps" performance modes that gives you a few extra frames when they're available and tidies up the rough edges when it's below. Hopefully as the tech advances we'll see the VRR range in displays go from ~48-120, to something like ~25-120; and the same VRR option could be applied to the 30fps modes (as long as it doesn't become a major developer crutch for sub 30fps drops).

To summarise, it's not reasonable to mandate VRR for a smooth experience; I expect there to remain a ~30fps and/or ~60fps mode as default, with VRR as an option to tighten up the experience. I don't expect many games doing, e.g., ~45fps with that being the only option.

Hopefully by next, next gen, it'll be reasonable to mandate VRR for a smooth experience (just as we pretty much mandate HD as a minimum now). Of course, you still don't want massive swings, particularly at the very low end, as VRR doesn't fix major swings in response.
 
Sorry if this ends up a double post, as I didn't want to jam two subjects into one, but on the subject of 30fps vs 60fps, and slightly contradicting my prior post...

There is one wildcard and that's that temporal reconstruction @ 60fps will be much better than temporal reconstruction @ 30fps. The same will apply for 120fps vs 60fps in less demanding games.

Doubling the samples (frames) for reconstruction means fewer artefacts and more convincing results. Most people would probably be hard-pressed to tell a major difference in spatial resolution between, say, 2688x1512 + temporal injection derived from 60 samples and a native 3840x2160 at 30fps; plus, with the former, the increase in temporal resolution increases your perception of spatial resolution, you get inherently smoother motion, and controls have the potential to feel more responsive. This of course assumes you're not severely bottlenecked elsewhere.
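As a rough sanity check on those figures (just multiplying the quoted resolutions by their frame rates, ignoring how good the reconstruction actually is):

```python
modes = {
    "2688x1512 @ 60fps (reconstruction input)": 2688 * 1512 * 60,
    "3840x2160 @ 30fps (native)":               3840 * 2160 * 30,
}
for name, pixels_per_second in modes.items():
    print(f"{name}: {pixels_per_second / 1e6:.0f}M freshly rendered pixels per second")
```

Both land around 245-250M pixels per second, so the raw sample budget is nearly identical; the 60fps mode just spreads it over twice as many, lower-resolution frames for the reconstruction to work with.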

I agree that 60 will never be standard and is unlikely to become dominant: prettier images = more marketable. Though I do feel it is slowly becoming a more prominent component of gaming in consumers' minds and subsequently more marketable. I also think that a CPU better balanced with the rest of the system, and solutions like the one I've mentioned above, may help sway more edge cases.
 