Are higher framerates likely to proliferate?

Now, I know this comes up all the time, in one form or another of "next generation, we'll have more power, so we'll see 60fps!"

The response, of course, is always that it's up to the developers.

In recent years though, I seem to regularly happen upon mentions of techniques that benefit from the additional information a higher framerate provides.

Quantum Break takes 4 previous 720p frames to output a final 1080p image, and I think Ratchet and Clank does something similar on the Pro. Checkerboard rendering uses information from previous frames as well.
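
To make the "previous frames" idea a bit more concrete, here's a minimal sketch of the accumulation step these techniques share, assuming a simple exponential blend. It's an illustration, not Remedy's or Insomniac's actual pipeline; the names (Pixel, accumulate) and the 0.1 blend factor are made up. In a real implementation the history is reprojected with motion vectors first, and that reprojection is exactly the part that gets easier at higher framerates, since everything moves less between frames.

Code:
#include <cstdio>

// Toy single-channel "pixel" accumulation: each new frame contributes a
// jittered sample, and the running history is blended toward it. More
// frames per second means more samples per unit of real time, so the
// history converges faster and frame-to-frame reprojection errors shrink.
struct Pixel { float value; };

// Hypothetical blend: keep most of the (already reprojected) history and
// add a little of the current frame's sample - the classic exponential
// moving average used by TAA-style accumulation.
Pixel accumulate(Pixel history, float currentSample, float blend = 0.1f)
{
    return { history.value * (1.0f - blend) + currentSample * blend };
}

int main()
{
    Pixel history{ 0.0f };
    const float groundTruth = 1.0f; // what the pixel "should" resolve to

    // Each iteration stands in for one rendered frame.
    for (int frame = 0; frame < 16; ++frame) {
        history = accumulate(history, groundTruth);
        std::printf("frame %2d: %.3f\n", frame, history.value);
    }
}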

I'm pretty sure there are more examples, but none are springing to mind right now.

So, as we head towards 4K resolutions, and the utilisation of techniques like temporal injection, will 45 and 60fps become more common? Does a higher framerate give better results?
 
60fps has already become more common this gen than it was last gen. I think the trend will continue. And yes, temporal reconstruction benefits from it, as does temporal AA, which is the most popular AA tech in most big games currently.
 
60fps has already become more common this gen than it was last gen.

Indeed. I always roll my eyes when people get their nostalgia glasses out and state that the current generation is the worst there's ever been in terms of framerates. Arguably it's the best. Sure, there were quite a few 60fps games on the PS2, but that was because the PS2 library was packed with fighting games, racers and arcade experiences in general. Those games still run at 60fps, but now we also have quite a few 60fps shooters (all of the big ones in fact, well, minus Destiny), Nioh, Nier and a 60fps Resident Evil bolstering the 60fps portfolio. Plus the games that run at 30fps do so quite a bit more stably as well.

As for next generation, I don't think much is gonna change, really. Devs are gonna need every last drop of hardware juice in order to make games with any kind of appreciable visual jump from the current generation's offerings, especially because affordable computational power hasn't really caught up to the demands of current screen resolutions.
 
Some of us remember the days before the PlayStation, when games were 50/60Hz... So since 3D arrived things went south and are only starting to get better...
 
I remember those days as well. I also remember plenty of 60Hz 2D games on my Game Boy, SNES and MS-DOS PC (those things were certainly not made with scrolling planes in mind) with plenty of substantial slowdown. That usually didn't manifest itself in dropped frames, of course, just in gameplay that felt like wading through mud. There were actually a couple of 30Hz games on these old machines as well.
But yeah, overall these simplistic games had much fewer problems hitting that TV refresh rate target.
 
Some of us remember the days before the PlayStation, when games were 50/60Hz... So since 3D arrived things went south and are only starting to get better...
60fps was a thing on consoles and arcades (which also had performance issues, but they manifested as slowdown rather than framerate drops).
But arguably, during that same period many people were playing on C64s, Amigas and IBM PCs, where 30, 20 and sometimes even fewer fps were very much a thing too.
 
I no longer consider 30fps acceptable. I'm done with it. It's not going to stop me from playing a really good game, because that's just the reality you have to deal with on console, but 30Hz is never a good choice. There isn't a single game that wouldn't be better at 60Hz, except maybe card games and turn-based games. Really, with sample-and-hold displays, 90-100Hz is a nice point where you get a really nice blur reduction before diminishing returns start to kick in. I would say 60Hz is basically the minimum that's acceptable now. And at 100Hz you can do things like strobing backlights to get a really nice blur reduction. I'm not sure of the point of high resolution and high detail at 30Hz, where your display is going to have a serious impact on visual clarity in any kind of movement, especially when you pile on more intentional motion blur to try to hide 30fps judder.
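
To put rough numbers on the sample-and-hold point: while your eye tracks motion on a full-persistence display, each frame gets smeared across roughly the distance the object travels during the time that frame stays on screen. A back-of-the-envelope sketch, where the 960 px/s pan speed and the 25% strobe duty cycle are assumptions rather than measurements:

Code:
#include <cstdio>

// Rough rule of thumb for sample-and-hold displays: perceived smear while
// eye-tracking is on-screen speed multiplied by how long a frame is held.
double smearPixels(double speedPixelsPerSecond, double framerateHz,
                   double persistenceFraction = 1.0) // 1.0 = full persistence
{
    return speedPixelsPerSecond * (persistenceFraction / framerateHz);
}

int main()
{
    const double panSpeed = 960.0; // px/s, an assumed gentle camera pan on a 1080p set

    std::printf("30 fps, full persistence:  %.0f px of smear\n", smearPixels(panSpeed, 30.0));
    std::printf("60 fps, full persistence:  %.0f px of smear\n", smearPixels(panSpeed, 60.0));
    std::printf("120 fps, full persistence: %.0f px of smear\n", smearPixels(panSpeed, 120.0));
    std::printf("60 fps, strobed (25%% on):  %.0f px of smear\n", smearPixels(panSpeed, 60.0, 0.25));
}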

I think most developers are starting to get it, and they're doing 60Hz when they can. Dynamic resolution is becoming popular, and that's a great thing. My hope is that next-gen consoles will use a good chunk of that power to prioritize framerate over resolution. I honestly don't care about native 4K. Give me 60fps GTA6 at 1800p.
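
Since dynamic resolution came up: the core of it is just a feedback loop that scales the render target based on how the last frame did against its budget. A toy sketch assuming a 16.6 ms (60fps) budget; real engines use frame-time history, hysteresis and per-axis scaling, and the numbers here are invented:

Code:
#include <algorithm>
#include <cstdio>

// Toy dynamic-resolution controller: nudge the render scale up or down so
// GPU frame time stays under budget.
struct DynamicRes {
    double scale = 1.0;     // fraction of native resolution per axis
    double budgetMs = 16.6; // target: 60 fps

    void update(double lastGpuFrameMs)
    {
        // If we blew the budget, drop resolution proportionally; if we had
        // headroom, creep back up toward native (clamped to a sane range).
        double adjust = budgetMs / lastGpuFrameMs;
        scale = std::clamp(scale * adjust, 0.6, 1.0);
    }
};

int main()
{
    DynamicRes dr;
    const double frameTimes[] = { 15.0, 19.0, 22.0, 17.0, 14.0, 13.0 }; // made-up ms samples

    for (double t : frameTimes) {
        dr.update(t);
        std::printf("gpu %.1f ms -> render at %.0f%% of native\n", t, dr.scale * 100.0);
    }
}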
 
But yeah, overall these simplistic games had much fewer problems hitting that TV refresh rate target.

They were incentivized to do so by those machines' architectures. They didn't have enough memory to keep a full framebuffer, so they had to render graphics as they were being drawn to screen. So even if you failed to update a frame on time, the graphics hardware still had to re-render everything anyway, and failing to hit 60fps felt like a bigger lost opportunity.
Also, because they rendered the graphics as the signal was being sent to the TV, they had hard limits on how much stuff they could do on each line. If you went above that limit, some sprites would simply disappear. So that was a bigger incentive to keep devs within a sane amount of stuff happening at the same time.
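
For a concrete picture of that per-line limit: classic sprite hardware evaluated sprites per scanline and stopped after a fixed count (famously 8 per line on the NES), so anything past the limit simply vanished on that line. A rough model of the behaviour, not any specific chip's logic:

Code:
#include <cstdio>
#include <vector>

// Toy model of per-scanline sprite evaluation on old video hardware:
// only the first N sprites overlapping a line get drawn; the rest vanish.
struct Sprite { int y; int height; };

int spritesVisibleOnLine(const std::vector<Sprite>& sprites, int line, int hardwareLimit)
{
    int found = 0;
    for (const Sprite& s : sprites) {
        if (line >= s.y && line < s.y + s.height) {
            if (found == hardwareLimit)
                break; // hardware stops evaluating: this sprite disappears
            ++found;
        }
    }
    return found;
}

int main()
{
    // Twelve 8-pixel-tall sprites stacked on the same rows.
    std::vector<Sprite> sprites(12, Sprite{ 100, 8 });
    const int limit = 8; // e.g. the NES's 8-sprites-per-scanline limit

    std::printf("sprites on line 104: %d drawn, %zu requested\n",
                spritesVisibleOnLine(sprites, 104, limit), sprites.size());
}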
 
So the current trend is three modes: Visual, Resolution and Framerate. As long as developers keep the Resolution and Visual modes no matter what, I'll be more than content, since I couldn't care less about 60fps. A high frame rate on consoles just automatically cheapens the visual presentation when you know that with 30fps you could push things so much further.
60fps does improve the smoothness, but 30 still feels good enough, and the immense visual improvement the latter offers greatly outweighs the benefit of a faster framerate. The majority will always prefer a 30fps Horizon 2 with visuals looking like Avatar far more than a 60fps version that's borderline Horizon up-rezzed. It's science, you can quote me on that :yep2:
 
30Hz games look great when you're not actively playing them. If you stand motionless and stare at particular things, they'll look good. Otherwise, 30fps is pretty shit. The amount of blur on LCD-based screens is horrible. OLEDs are a big improvement if you can afford them, and most people can't, but they are still sample-and-hold displays that will cause a form of motion blur. I don't think any of the OLED TVs are low persistence.

For most people, 60Hz on their LCD tv will be a big improvement in image clarity, and the games just play a lot better.
 
Not image clarity, motion clarity. And at the expense of native resolution, AF, LODs, texture resolution etc? No thanks. In the end you'll just get a worse looking image overall. Of course if all else is equal, 60fps is the better choice.
Also, the highest-end LCDs cost more than OLEDs, mainly due to superior HDR capability, and the motion clarity really doesn't look that much worse even when sitting side by side. You'd think if 24fps Blu-rays looked so terrible in motion, people would start a riot, not to mention the 5/5 stars for picture quality they've been getting from professional review sites, who have never raised concerns about the "low" framerate :). On the contrary, people were quite opposed to the high-fps transfer of The Hobbit Blu-ray.
 
You can't really compare movies to games. Movies tend to be sequences of static framing with movement within the frame, and when there is movement, the people shooting them are experienced in working around it. Not to mention, film is shot to be inherently blurry, but that is different from display motion blur, which is a problem for movies too. Lots of people use those horrible interpolation modes on their TVs.

I mean, tell me which one of these scrolling banners has the best image quality, even if you turn the speed down.
www.testufo.com/framerates#count=3&background=stars&pps=480

If you move the camera at all in a game, you will lose more detail in a 30fps game than in a 60fps game, and most of the time in games the camera is moving. Motion clarity is a huge part of image quality. When you take input lag and responsiveness into the equation, it's really no contest for 60fps.
 
I think people are experienced in working around 30fps in games too; the brain simply compensates as much as it can for any loss of motion clarity. Not saying it's the same as 60fps tho, but it's nowhere near as disastrous as you make it out to be, at least for me.
Also, like I said earlier, when all else is equal 60fps does look smoother, but at a heavy cost to other areas of the image quality. Running any 60fps mode (dynamic 1080p or native 1080p at low-to-medium settings) vs a resolution mode (1440p, 4K CB or native 4K at medium-to-high settings) would result in a dramatic loss of image quality on the 60fps mode, even taking into account the smoother motion clarity. It's well documented by DF, in fact. The biggest and most visible benefit you get with 60fps is smoother gameplay; image clarity is secondary and far less prominent than the immense visual and image quality boost offered by the 30fps mode.
60fps on consoles would just be blazing fast low-res textures, low to non-existent AF, low native res, pop-in and god knows what other sacrifices, which heavily degrade the image quality in their own way.
 
There's no doubt that motion clarity is only one aspect of image quality. 30fps judder and the persistence blur caused by eye movement at 30fps are awful. Not to mention 30fps games are just not very responsive. There's really not much a game can do to work around the limitations of 30fps except introduce motion blur to try to hide the judder. And then you're back to what I mentioned before: the game looks great if you're not really playing it and not moving the camera to get a good look at things. Once you start moving the camera, it starts going out the window. 30fps is great for advertising and screenshot wars. It's not good for playing. I'd give 30fps a D grade if it were an essay handed in at an American school. Anything less than 30 is a failure. 60fps would be an acceptable pass, like a B.
 
60 fps is by far more desirable for me than 30 fps if given the option. I will gladly take a hit in overall graphics to have an overall better and smoother gaming experience. It's what I do on my PC, and I wouldn't mind it as an option for consoles.

This is an age old argument that will never be solved. People have all types of arguments for their personal preferences and that's just what they are. Some will want an overall prettier image and will gladly sacrifice frame rate. Others will want an overall higher frame rate and will gladly sacrifice a prettier image.

But for some god damn reason 60 fps is like cutting butter with a knife at room temperature compared to 30 fps that is cutting that butter right out of the refrigerator. It's as if something just clicks in my brain when I see a game at 60 fps. It just feels better and is vastly more visually pleasing to me when the image is moving. The image is more persistent. The game definitely plays smoother. And with a game playing better I will always prefer that.
 
Indeed.

What I would like to see is someone trying 30fps-to-60fps interpolation or something similar, giving a typically-30fps game better readability during motion (an on/off toggle in the options would be nice).

A game using texture-space shading, with shading at 30fps and rasterization at 60fps, sounds fun, but at that point you either interpolate the objects or run the logic at 60.
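
On the 30-to-60 idea, the cheapest in-engine flavour would be interpolating object transforms between two 30Hz simulation ticks while presenting at 60Hz. A very rough sketch of that interpolation (the types and numbers are made up, and input latency would still be tied to the 30Hz simulation):

Code:
#include <cstdio>
#include <initializer_list>

// Toy version of presenting at 60 Hz from a 30 Hz simulation: render the
// in-between frames by interpolating transforms between two known ticks.
// This improves readability in motion, but logic and input stay at 30 Hz.
struct Vec2 { float x, y; };

Vec2 lerp(Vec2 a, Vec2 b, float t)
{
    return { a.x + (b.x - a.x) * t, a.y + (b.y - a.y) * t };
}

int main()
{
    Vec2 prevTick{ 0.0f, 0.0f };  // object position at simulation tick N
    Vec2 currTick{ 10.0f, 4.0f }; // object position at simulation tick N+1

    // Two presented frames per simulation tick: t = 0.5 is the extra
    // in-between frame, t = 1.0 is the "real" one.
    for (float t : { 0.5f, 1.0f }) {
        Vec2 p = lerp(prevTick, currTick, t);
        std::printf("t=%.1f -> (%.1f, %.1f)\n", t, p.x, p.y);
    }
}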
 
It sounds crazy, but there should be a lot of 120Hz TVs coming out, and I'd like to see next-gen gaming offer 60Hz or 120Hz play. Give gamers the option to play at 120Hz on their TV ... maybe that's at 1080p on a 4K HDMI 2.1 TV with a PS5. I bet some people would choose that option, especially with HDMI VRR. There are diminishing returns for increasing frame rates in terms of responsiveness and motion clarity, but I think the cutoff is closer to 100Hz than 60Hz. Then there's black frame insertion, or backlight strobing, but you're probably going to want a game running at least 60Hz on a 120Hz screen for those to look good.
 
The spanner in the works is the move to 4K and great visuals: even with dynamic scaling and temporal AA, developers still need to push as far beyond 1440p as possible while, importantly, providing a stable, engine-managed 30fps as consistently as possible. Case in point: Assassin's Creed Origins and Horizon Zero Dawn (whose 1080p performance modes aren't really a major improvement over their 4K modes), amongst others.
Seems like it will be a while until 60fps is frequently offered in console games; next generation at the earliest IMO.
 
The higher the resolution, the higher the framerate we need. Well, that's just a theory.

Also there is a strong diminishing return with resolutions higher than 4K. At some point people certainly won't notice 2x more pixels, but they'll (finally) notice 2x (or 4x) more frames.

I'd say there is a '60fps' trend nowadays. It's slowly becoming a marketing feature.
 