40 fps, how difficult is it to implement?

ZebMacahan

30 fps on my OLED is almost unplayable because of the motion judder. I tried A Plague Tale: Requiem in the 40 fps mode, and it's super smooth. The motion blur and the fact that it isn't a very fast-paced game probably help a bit, but I tried it in 30 fps again and it was basically unplayable to me. 30 and 40 fps look totally different on my OLED. (I still wouldn't play Doom or Tsushima in 40 fps, though.)

Some games, like Guardians of the Galaxy and Callisto Protocol, don't have any RT features in their 60 fps modes. I would gladly choose performance mode if resolution were all that's different, but these games look much worse in performance mode. I would choose 40 fps over resolution every day of the week, especially if the RT features were kept.

So how difficult is it to implement 40 fps? Is this something we can expect to see more of? Maybe I'm wrong about this, but it seems like Sony is pushing its OLED TVs for gaming, and high-end LCDs are getting faster pixel response times, so I'm hoping there will be a stronger incentive to implement 40 fps.
 
Well, developers will need to make sure there's enough headroom for their game to lock to 40 fps, which, depending on the game, may be something they have to plan for during the dev cycle so that there is enough left in the frame-time budget to allow it.
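
To put rough numbers on that frame-time budget point, here is a minimal sketch, with made-up frame times and a hypothetical safety margin, of the kind of headroom check a 40 fps lock implies:
Code:
# Minimal sketch of a 40 fps headroom check; the margin and sample frame
# times below are made up for illustration.

TARGET_FPS = 40
BUDGET_MS = 1000.0 / TARGET_FPS   # 25 ms per frame at 40 fps
SAFETY_MARGIN_MS = 2.0            # hypothetical cushion for occasional spikes

def can_lock_to_40(frame_times_ms):
    """True if the slowest observed frame still fits the 40 fps budget."""
    return max(frame_times_ms) + SAFETY_MARGIN_MS <= BUDGET_MS

samples = [21.4, 22.9, 23.1, 24.8, 22.0]   # hypothetical captured frame times (ms)
print(can_lock_to_40(samples))             # False: the 24.8 ms frame leaves no margin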

But while I get that it looks janky for you on your OLED, is it enough of a general problem for developers to bother with? I think not.
 
I don't know, but there is a resolution change as my TV switches modes. Maybe DF knows?
Well, of course: the 40 FPS mode runs in the TV's 120 Hz mode, and I bet quality mode is standard 60 Hz. Even if the resolution is not as high as in quality mode, it is higher than in performance mode. It's especially noticeable on transparencies.
 
And some 120 Hz TVs don't have VRR, so if you unlock the framerate it will look juddery.
There are so few TVs that support 120 Hz at 4K resolution but do not support VRR. Some of the TVs with 120 Hz but not VRR are limited to 1080p resolution mode for that 120 Hz refresh. I posted a list of them before, but can't find it now.
 
Yeah, I wager there's something up with the motion interpolation and/or frame pull-down function on your TV if 30 is abhorrent and 40 is buttery smooth. I get that it's another 33% increase in framerate, but it still doesn't seem right. Maybe @Reynaldo is onto something with the 120 Hz (40 Hz native) vs 60 Hz (30 Hz native) thoughts.
 
Finally found the previous discussion about 120 Hz displays. The list of sets that are NOT Sony and would benefit looks to be just one display, the Samsung KS8000.

The Samsung KS8000.
Hisense H9G and H9F.
At least one more that I can't recall atm.

The Samsung KS8000, despite being a set from 2016, is quite impressive for its time in offering 120 Hz.

The older Hisense sets you listed have this note on RTings: "Unfortunately, it doesn't have many gaming features like variable refresh rate (VRR) support, and despite having a 120Hz panel, it doesn't properly display any 120Hz signal." So would they even be able to display a 40 FPS game mode, given that the console has to output 120 Hz with vsync? Not suitable for either enhanced mode, then.
 
Why the need for a 40 fps mode instead of simply VRR?
Sometimes it is an active choice I make on PC, for a couple of reasons:

1) Targeting 40 FPS, you can have higher resolution, better effects, etc. But this is most likely the least important reason.
2) Targeting a locked 40 FPS with a variable refresh rate and without Vsync allows you to have 40 FPS with low input lag. Sometimes a locked 40 FPS is better than a game that wildly swings between 45 and 60. This is completely understandable, and it is the reason that in some games I simply lock it to 40 and forget about it (a rough frame-limiter sketch follows this list).
3) CPU limitations. A Plague Tale: Requiem without a frame cap is a game that exemplifies my second point: the game wildly varied between 40 and 70 FPS on my 2700 (not the GPU, mind you, the CPU). These consoles have CPUs that are undersized relative to their GPUs, so naturally they run into bottlenecks when targeting higher framerates. It is better to target a lower framerate so the CPU can keep up, and steer the load towards the GPU by way of resolution instead. That way you both utilize otherwise free GPU resources and enjoy a higher resolution at a playable, pleasant framerate, which is practically what I do with my own mismatched CPU and GPU combo all the time.
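
As referenced in point 2, a minimal sketch of a CPU-side 40 fps frame limiter is below. It only illustrates the pacing idea; render_frame() is a placeholder, not any engine's actual API.
Code:
import time

TARGET_FRAME_TIME = 1.0 / 40.0   # 25 ms budget per frame

def render_frame():
    pass  # placeholder for the game's simulation + rendering work

def run(frames=40):
    deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()
        deadline += TARGET_FRAME_TIME
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            # Sleep off the leftover budget; a real limiter would spin-wait
            # the last millisecond or so for tighter pacing.
            time.sleep(remaining)
        else:
            deadline = time.perf_counter()  # missed the deadline, resync

run()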

For the OP: it shouldn't be hard; it is fairly easy. It just depends on whether devs are aware of such things and whether they care about player comfort. Dynamic resolution targeting is already there for most games that target either 30 or 60.
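
To illustrate the dynamic resolution point: retargeting it from 30 or 60 to 40 is essentially a matter of feeding the scaler a 25 ms target instead of 33.3 ms or 16.7 ms. The toy controller below is purely illustrative (made-up step sizes, bounds, and GPU timings), not any engine's actual DRS logic.
Code:
def adjust_resolution_scale(scale, gpu_time_ms, target_fps=40, step=0.05,
                            min_scale=0.5, max_scale=1.0):
    """Nudge the resolution scale so GPU time converges on the target budget."""
    target_ms = 1000.0 / target_fps
    if gpu_time_ms > target_ms:          # over budget: drop resolution
        scale -= step
    elif gpu_time_ms < 0.9 * target_ms:  # comfortably under budget: raise it
        scale += step
    return max(min_scale, min(max_scale, scale))

scale = 1.0
for gpu_ms in [27.0, 26.2, 24.5, 22.0, 21.5]:   # made-up GPU timings
    scale = adjust_resolution_scale(scale, gpu_ms)
    print(round(scale, 2))                      # 0.95, 0.9, 0.9, 0.95, 1.0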

Yeah, I wager there's something up with the motion interpolation and/or frame pull-down function on your TV if 30 is abhorrent and 40 is buttery smooth. I get that it's another 33% increase in framerate, but it still doesn't seem right. Maybe @Reynaldo is onto something with the 120 Hz (40 Hz native) vs 60 Hz (30 Hz native) thoughts.

Although it is only a 33% increase in framerate over 30 FPS, in frame-time terms 40 FPS gives you 50% more budget per frame than 60 FPS does:
30 FPS: 33.3 ms
40 FPS: 25 ms
60 FPS: 16.7 ms

40 FPS sits practically in the exact middle between 30 and 60 FPS in terms of frame times and "perceived" smoothness.
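
The arithmetic, spelled out:
Code:
# Frame time for each target framerate
for fps in (30, 40, 60):
    print(fps, "fps ->", round(1000 / fps, 1), "ms")
# 30 fps -> 33.3 ms, 40 fps -> 25.0 ms, 60 fps -> 16.7 ms
# 33.3 - 25.0 = 25.0 - 16.7 = ~8.3 ms, so 40 fps is the midpoint in frame-time terms.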

But it goes beyond that. Consoles often employ Vsync to lock to certain framerates, and Vsync adds more and more latency the lower the framerate gets. At 30 FPS it adds an enormous amount of latency, whereas at 40 FPS it adds much less. This, combined with a 120 Hz / 40 FPS pairing instead of a 60 Hz / 30 FPS pairing, makes 40 FPS far more responsive than 30 FPS could ever hope to be.

It is indeed possible to achieve that kind of snappy, responsive gameplay even at 30 FPS. You simply need to get rid of Vsync. But console devs can't seem to let go of their Vsync. On PC, you can employ Vsync only as a fallback to full VRR: a 144 Hz container for a 30/40 FPS game will never invoke Vsync. On consoles, however, Vsync is invoked all the time, despite VRR being available. That's a choice console devs make, most likely because a Vsync'ed image appears "smoother" and more pleasant to the eye than a VRR'ed one.

In other words, console devs are not using VRR and Vsync the way they are intended to be used on PC. On consoles, VRR is seen as something that enables dynamic framerates, which is why VRR modes are hugely more responsive than locked-FPS ones: in those modes they actually do disable Vsync, or rather only use it as a fallback. On PC, you can use Vsync as a fallback and lock to an arbitrary framerate with a frame limiter. That is the way to do it, but console devs would rather still use Vsync to lock framerates, which causes latency problems: severe at 30 FPS, milder at 40 FPS, and quite tolerable at 60 FPS.
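
A simplified illustration of that latency argument: with Vsync locked to a divisor of the refresh rate, a frame that just misses its flip slot waits almost a full frame interval for the next one. Real pipelines add buffering on top, so treat these numbers as the flip-quantization component only, not a measurement.
Code:
def worst_case_flip_wait_ms(refresh_hz, divisor):
    """Worst-case wait for the next eligible vsync flip slot."""
    return 1000.0 / refresh_hz * divisor

print(worst_case_flip_wait_ms(60, 2))    # 30 fps on 60 Hz  -> ~33.3 ms
print(worst_case_flip_wait_ms(120, 3))   # 40 fps on 120 Hz -> 25.0 ms
print(worst_case_flip_wait_ms(120, 2))   # 60 fps on 120 Hz -> ~16.7 ms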
 
40 FPS sits practically in the exact middle between 30 and 60 FPS in terms of frame times and "perceived" smoothness.

But it goes beyond that. Consoles often employ Vsync to lock to certain framerates, and Vsync adds more and more latency the lower the framerate gets. At 30 FPS it adds an enormous amount of latency, whereas at 40 FPS it adds much less. This, combined with a 120 Hz / 40 FPS pairing instead of a 60 Hz / 30 FPS pairing, makes 40 FPS far more responsive than 30 FPS could ever hope to be.
Yeah, I understand the frame-time math, but it still doesn't seem to explain the "buttery smooth" versus "jittery disaster" pejoratives. Also, I am naively presuming we're using these pejoratives to describe a perception of video output, not necessarily a perception of input lag. To address your point and mine at the same time: I can imagine a scenario where the video itself is buttery smooth but the input lag is a catastrophe, leading to a very discombobulated and "swimmy" experience. That explanation doesn't seem to jibe with what is being described.

Your callout of the 40 FPS/120 Hz vs 30 FPS/60 Hz thing is what I'm homing in on here. I feel like the description we've been given isn't indicative of input lag, but rather of the perception of video output smoothness. In that case, 30 FPS is being called out as very jarring because of the nearly zero image persistence between individual frames; I can't imagine 40 fps really being SO much better as to completely remove that jarring experience. Rather, I wonder if there's an internal (to the TV) scene-smoothing mechanism which applies differently to a 120 Hz input vs a 60 Hz one when the underlying framerate is less than the refresh rate.

Your suggestion of the console itself pushing a 60 Hz refresh rate for 30 FPS content, and thus the TV making different decisions as well, is what I'm really pondering here. Suppose the console decided to show 30 FPS content at a 120 Hz refresh rate; would the TV output somehow look better? Mathematically it shouldn't: the frame pacing of 30 FPS content at a 60 Hz refresh vs a 120 Hz refresh should be identical. But I wonder...
 
Your suggestion of the console itself pushing a 60 Hz refresh rate for 30 FPS content, and thus the TV making different decisions as well, is what I'm really pondering here. Suppose the console decided to show 30 FPS content at a 120 Hz refresh rate; would the TV output somehow look better? Mathematically it shouldn't: the frame pacing of 30 FPS content at a 60 Hz refresh vs a 120 Hz refresh should be identical. But I wonder...
Quite possible, if you want my honest opinion. You would have to compare 30 and 40 FPS in a 120 Hz container to put this theory to the test, though. Could it be that, being native 120 Hz panels, OLED screens do not play well at 60 Hz? TBH, 60 Hz output should be done away with for modern games, even for 30 FPS modes. Devs should use 1/4 Vsync in a 120 Hz container instead of 1/2 Vsync in a 60 Hz container.
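
For reference, these are the locked framerates each container divides into evenly, i.e. the Vsync divisors actually available:
Code:
for refresh in (60, 120):
    usable = [refresh // d for d in range(1, 5) if refresh % d == 0]
    print(refresh, "Hz ->", usable, "fps")
# 60 Hz  -> [60, 30, 20, 15] fps   (no way to lock 40)
# 120 Hz -> [120, 60, 40, 30] fps  (40 via 1/3 Vsync, 30 still available via 1/4 Vsync)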
 
Given the technological underpinnings of how OLED works, I suspect stability at any arbitrary refresh rate is guaranteed, at least while staying below whatever uppermost refresh rate the display is capable of. The proper question may not be whether OLED screens play nicely at 60 Hz, but rather whether the image-processing systems in TVs fail to provide a great experience at low refresh rates. At least, that's where my head is.

I'm sure 60Hz is here to stay for a long while, just like 24FPS is sticking around and has done so for the last however many decades. History matters, especially in the "feel" of legacy content that so many people believe is important to retain -- and who am I to argue? It's like the argument of keeping vinyl records alive, and there's certainly some merit to the feeling for so many people.

Circling back: I still feel like this is less to do with OLED and more to do with image processing on the TV. I'm confident in saying there must be a way for an OLED to "feel" like an LCD TV, even if it's as brute force as simply persisting / merging (x) number of prior frames at a determined decay rate per frame. Bluntly, that's exactly what old, slower-reacting LCD tech would've done physically instead of virtually.
 
Given the technological underpinnings of how OLED works, I suspect stability at any arbitrary refresh rate is guaranteed, at least while staying below whatever uppermost refresh rate the display is capable of. The proper question may not be whether OLED screens play nicely at 60 Hz, but rather whether the image-processing systems in TVs fail to provide a great experience at low refresh rates. At least, that's where my head is.

I'm sure 60Hz is here to stay for a long while, just like 24FPS is sticking around and has done so for the last however many decades. History matters, especially in the "feel" of legacy content that so many people believe is important to retain -- and who am I to argue? It's like the argument of keeping vinyl records alive, and there's certainly some merit to the feeling for so many people.

Circling back: I still feel like this is less to do with OLED and more to do with image processing on the TV. I'm confident in saying there must be a way for an OLED to "feel" like an LCD TV, even if it's as brute force as simply persisting / merging (x) number of prior frames at a determined decay rate per frame. Bluntly, that's exactly what old, slower-reacting LCD tech would've done physically instead of virtually.
Btw, I'm just talking about the refresh rate, not 60 FPS itself. 24 FPS content on a 120 Hz screen also looks less juddery than on a 60 Hz screen. I know that apps and software use something called 3:2 pulldown to make 24 FPS look as good as they can on 60 Hz, but it divides evenly into 120 Hz, which is why I advocate for 120 Hz in general :D
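
A quick way to see the cadence difference (classic 3:2 pulldown on 60 Hz vs an even five refreshes per frame on 120 Hz):
Code:
def pulldown_cadence(refresh_hz, fps=24, frames=8):
    """How many refreshes each source frame is held for."""
    step = refresh_hz / fps
    return [int((i + 1) * step) - int(i * step) for i in range(frames)]

print(pulldown_cadence(60))    # [2, 3, 2, 3, 2, 3, 2, 3] -> uneven hold times = judder
print(pulldown_cadence(120))   # [5, 5, 5, 5, 5, 5, 5, 5] -> perfectly even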
 
Yeah, I wager there's something up with the motion interpolation and/or frame pull-down function on your TV if 30 is abhorrent and 40 is buttery smooth. I get that it's another 33% increase in framerate, but it still doesn't seem right. Maybe @Reynaldo is onto something with the 120 Hz (40 Hz native) vs 60 Hz (30 Hz native) thoughts.

I tried the game in game mode on the TV, with no motion-enhancing effects enabled. 30 abhorrent and 40 buttery smooth is pretty much how I experienced it. The game is pretty slow: a slow-moving camera and lots of motion blur, so I guess that helps. It seems weird that 40 feels so much better than 30, but that's how I experienced it. Maybe someone else has tried it?

I remember that Ratchet didn't feel as good in 40 fps, but it's a much more fast-paced game.
 