Why the need for a 40 FPS mode instead of simply VRR?
Sometimes it is an active choice I make on PC, for a few reasons:
1) Targeting 40 FPS, you can run higher resolution, higher effects, etc. But this is probably the least compelling reason.
2) Targeting a locked 40 FPS with a dynamic refresh rate, without VSYNC, gives you 40 FPS with low input lag. Sometimes a locked 40 FPS is better than a game that wildly swings between 45 and 60. This is completely understandable, and it is the reason that in some games I simply lock it to 40 and forget about it.
3) CPU limitations. A Plague Tale: Requiem without a framecap exemplifies my second point: the game wildly varied between 40 and 70 FPS on my 2700 (not GPU-bound, mind you, CPU-bound). These consoles have CPUs that are underpowered relative to their GPUs, so naturally they run into bottlenecks when targeting higher framerates. It is better to target a lower framerate so the CPU can maintain its course, and steer the load toward the GPU by raising resolution instead. That way you both utilize otherwise idle GPU resources and enjoy a higher resolution at a playable, consistent framerate. Which is practically what I do with my own mismatched CPU and GPU combo all the time.
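The "lock it to 40 and forget about it" idea from point 2 is essentially an external frame limiter. A minimal sketch of how one works (names and structure are my own illustration, not any particular limiter's implementation): each frame gets a fixed 25 ms budget, and whatever the workload doesn't use is slept off, which also hands the CPU its headroom back.

```python
import time

TARGET_FPS = 40
FRAME_TIME = 1.0 / TARGET_FPS  # 25 ms budget per frame

def run_capped(n_frames, do_work=lambda: None):
    """Run n_frames iterations, sleeping off whatever is left of each
    25 ms budget so the loop never exceeds TARGET_FPS."""
    deadline = time.perf_counter()
    for _ in range(n_frames):
        do_work()  # game update + render would happen here
        deadline += FRAME_TIME
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)  # hand the leftover budget back to the CPU
```

Advancing a fixed deadline instead of sleeping a flat 25 ms each time keeps slow frames from dragging the average below the target.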
For the OP, it shouldn't be hard. It is fairly easy; it just depends on whether devs are aware of such things and whether they care about player comfort. Dynamic res targeting is already there for most games that target either 30 or 60.
Yeah, I wager there's something up with the motion interpolation and/or frame pull-down function on your TV if 30 is abhorrent and 40 is buttery smooth. I get that it's another 33% increase in framerate, but it still doesn't seem right. I wager maybe @Reynaldo is onto something with the 120Hz (40Hz native) vs 60Hz (30Hz native) thoughts.
Although it is only a 33% increase in framerate over 30, it actually delivers 50% of the frametime reduction you would get by going all the way to 60 FPS.
30 FPS, 33.3 ms
40 FPS, 25 ms
60 FPS, 16.6 ms
40 FPS practically stands exactly in the middle between 30 and 60 FPS in terms of frametimes and "perceived" smoothness.
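The midpoint claim above is easy to verify with the frametime formula (1000 ms divided by the framerate):

```python
def frametime_ms(fps):
    """Milliseconds spent per frame at a given framerate."""
    return 1000.0 / fps

# 40 FPS lands exactly on the midpoint of the 30 and 60 FPS frametimes:
mid = (frametime_ms(30) + frametime_ms(60)) / 2  # (33.33 + 16.67) / 2 = 25.0 ms
```

This is also why each extra FPS matters less the higher you go: frametime falls off as 1/fps, not linearly.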
But it goes beyond that. Consoles often employ VSYNC to lock to certain framerates, and VSYNC adds more and more latency the lower your framerate is. At 30 FPS it adds an insane amount of latency, whereas at 40 FPS it adds much less. This, combined with a 120 Hz/40 FPS coupling instead of a 60 Hz/30 FPS coupling, makes 40 FPS far more responsive than 30 FPS could ever hope to be.
It is indeed possible to achieve that kind of snappy, responsive gameplay even at 30 FPS. You simply need to get rid of VSYNC. But console devs can't seem to part with their VSYNC. On PC, you can employ VSYNC only as a fallback to full VRR: a 144 Hz container for a 30/40 FPS game will never invoke VSYNC. On consoles, however, they invoke VSYNC all the time despite using VRR. That's a choice console devs make, most likely because a VSYNC'ed image appears "smoother" and more pleasant to the eye than a VRR'ed one.
In other words, console devs are not using VRR and VSYNC the way they are intended on PC. On consoles, VRR is seen as a thing that enables dynamic framerates, and that is why VRR modes are hugely more responsive than locked-FPS ones: in that case they actually do disable VSYNC, or rather only use it as a fallback. On PC, you can still use VSYNC as a fallback and lock to an arbitrary framerate with a frame limiter. This is the way, but console devs would rather use VSYNC itself to lock framerates, which causes latency problems: severe at 30 FPS, milder at 40 FPS, and really tolerable at 60 FPS.