Why isn't framerate upscaling progressing when TVs already have it, even though it would be a better fit in the game engine?

Not quite, because AFAIK that just shifts the view to track motion. I think in-game animation is still 60 fps or whatever and not interpolated.


You shouldn't need any more data than is already present for temporal image reconstruction techniques. You have the motion from the last frame and the motion from the next/current frame. You can extrapolate that for the tween frame, or delay a frame and tween between known positions.
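Something like this, as a toy sketch (1D positions only, nothing engine-specific):

```python
# Minimal sketch (not any engine's actual API): two ways to place an object
# in a generated "tween" frame, given per-frame positions (or motion vectors).

def extrapolate(pos_prev, pos_curr, t=0.5):
    """Guess a future in-between position from the last two known frames.
    No added latency, but wrong whenever motion changes speed or direction."""
    velocity = pos_curr - pos_prev          # per-frame motion
    return pos_curr + velocity * t

def interpolate(pos_curr, pos_next, t=0.5):
    """Blend between two known frames. Accurate, but you must hold back
    (delay) the newest frame, which costs roughly one frame of latency."""
    return pos_curr + (pos_next - pos_curr) * t

# Example: an object decelerating to a stop at x = 10
print(extrapolate(8.0, 10.0))   # 11.0 -> overshoots; the object never went there
print(interpolate(10.0, 10.0))  # 10.0 -> correct, but one frame late
```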


That's not a solution where a low 30 fps framerate looks juddery. Rather than looking for a fix for high-framerate games that sometimes drop low, this is about a fix for displays that can't handle low-framerate games well and need a 60 fps stream. I mean, devs could just ditch 30 fps outright, problem solved. ;) But given the overheads of that, just as upscaling a lower resolution to 2160p is overall a better compromise than rendering full 2160p, rendering at lower framerates and upscaling seems a smart option.

I dunno. Are those TVs that struggle with 30 fps fine with 45 fps? Although devs can't really set 45 fps as a minimum yet because too many gaming displays are 30/60 Hz.
In my experience, 45fps is the same as, if not even worse than, 30fps: choppy as hell. When I set Elden Ring to native 4K I get 45fps on average, and even judder reduction and BFI don't do much. At 30fps they seem to be more effective.
 
Huh? TVs and shows have it way easier: they've already got the two frames to interpolate between, and 16ms or way longer to do so.

Unless you want a much bigger hit to latency (which, judging by some on here, they'd happily take, I guess? My fps counter is all that matters!), you have to guess what an "in between" frame is going to be while the next frame is rendering, in the span of, say, 8ms or less, and without taking enough GPU time to slow the next "real" frame down (much) either. And you have to do all this based on the previous 2 frames or so you actually have (maybe the current real frame's motion vectors, do they wait that long? That feels like a really good input).

So it's a really bad fit for game engines; it's super hard. DLSS3 is basically just lowering settings so you can go faster, judging by the average person's perceived drop in image quality with any fast movement anywhere. It's just that you go faster than you would by dropping other settings, so it's still a cool tradeoff to have if you want. But it's not easy.
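Rough numbers for those budgets, just as a back-of-envelope (nothing measured, purely illustrative):

```python
# Back-of-envelope numbers for the budgets described above (illustrative only).

def frame_ms(fps):
    return 1000.0 / fps

# TV-style interpolation waits for the *next* real frame before blending,
# so at a 30fps base you hold the newest frame back for roughly one frame:
print(frame_ms(30))    # ~33.3 ms of extra latency

# Generating a tween while the next real frame is still rendering has to fit
# inside one output slot, e.g. a 60fps base pushed to a 120fps output:
print(frame_ms(120))   # ~8.3 ms, roughly the "8 ms or less" guessed above
```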
 
Curious whether, in any system that's doing a linear interp from 2 samples/images, you end up with detectable motion artifacts when the motion is accelerating/decelerating. The image being constructed shouldn't be the halfway point between them, but with only 2 samples you don't have the information to weight the interpolation any differently than 0.5 and 0.5. I'd expect the artifacts to be much more visible for low-FPS frame generation. Presumably when TVs do 24/30->60 they're able to use as many frames as they want and can thus do some kind of spline/cubic interp.
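To make that concrete, a toy check (made-up 1D motion, not how any particular TV or DLSS3 actually does it): with accelerating motion, a two-sample lerp places the tween ahead of where the object really was, while a four-sample spline lands on it.

```python
# Toy check: motion x(t) = t^2 (accelerating), sampled once per frame.
# The true position halfway between frames 1 and 2 is x(1.5) = 2.25.

samples = [t * t for t in (0, 1, 2, 3)]   # [0, 1, 4, 9]

# Two-sample linear interpolation can only split the difference 0.5/0.5:
linear_mid = (samples[1] + samples[2]) / 2            # 2.5 -> visibly "ahead"

# With four samples, a Catmull-Rom spline reproduces the acceleration:
def catmull_rom(p0, p1, p2, p3, t):
    return 0.5 * ((2 * p1) + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t ** 2
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t ** 3)

spline_mid = catmull_rom(*samples, 0.5)               # 2.25 -> exact here

print(linear_mid, spline_mid)  # 2.5 2.25
```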
 
Huh? TVs and shows have it way easier: they've already got the two frames to interpolate between, and 16ms or way longer to do so.

Unless you want a much bigger hit to latency,
I think you misunderstand. No one wants the hit to latency. However, if you want smoother visuals and the game is only capable of managing 30 fps, the only solution is frame interpolation, and at the moment the only place that happens is on the TV. If choosing between motion upscaling on the TV and in the game, surely the latter is better? Obviously a higher initial framerate is the best option, but where you have to render at lower framerates, interpolation is better than plain 30 fps.
 
In my experience, 45fps is the same as, if not even worse than, 30fps: choppy as hell. When I set Elden Ring to native 4K I get 45fps on average, and even judder reduction and BFI don't do much. At 30fps they seem to be more effective.
My experience with the LG CX is that 40fps is noticeably smoother than 30fps.

Seriously, 30fps is so bad that some scenes in some games even feel like they're hitting me with strobes of frames. It's mainly at its worst with little to no motion blur, a bright sky, and camera pans (or rotation).
 
Well, Elden Ring isn't the most technically sound game to be using for judging anything framerate-related. Instead one should be using games capable of delivering smooth framerates regardless of the target.
 
My experience with the LG CX is that 40fps is noticeably smoother than 30fps.

Seriously, 30fps is so bad that some scenes in some games even feel like they're hitting me with strobes of frames. It's mainly at its worst with little to no motion blur, a bright sky, and camera pans (or rotation).
This was the feeling I had before delving into the TV options, whether on a regular monitor or on the TV, except when you use one of those TV techniques, which makes it more palatable. I'd rather play Elden Ring at 4K 30fps with judder reduction set to max than at 1440p or 1080p 60fps, 'cos at that framerate the game isn't usually stable, even if VRR helps. The animation smoothness from the TV could pass it off as a 60Hz game, until you rotate the camera, where the TV sometimes can't keep up. It's just that I prefer to play at native resolution and those settings help. Even so, I wish the game had options for DLSS/XeSS/FSR2, 'cos I had never played a Souls game before and I am now beginning to understand why they are so highly regarded.

In fact, they seem like difficult games, but to me Elden Ring is like life itself: learning from your mistakes every single time.
 
This was the feeling I had before delving into the TV options, whether on a regular monitor or on the TV, except when you use one of those TV techniques, which makes it more palatable. I'd rather play Elden Ring at 4K 30fps with judder reduction set to max than at 1440p or 1080p 60fps, 'cos at that framerate the game isn't usually stable, even if VRR helps. The animation smoothness from the TV could pass it off as a 60Hz game, until you rotate the camera, where the TV sometimes can't keep up. It's just that I prefer to play at native resolution and those settings help. Even so, I wish the game had options for DLSS/XeSS/FSR2, 'cos I had never played a Souls game before and I am now beginning to understand why they are so highly regarded.

In fact, they seem like difficult games, but to me Elden Ring is like life itself: learning from your mistakes every single time.

LG CX with motion interpolation is simply too laggy for most games
 
LG CX with motion interpolation is simply too laggy for most games
You might not believe me, but I am totally impressed.

So... I was battling that great dragon in the first level of Elden Ring, a very, very tough opponent. I tried on 3 screens:

- My "old" Phillips TV (2013), native 1080p, 60fps -more or less stable, but stable overall-. No VRR.

I confronted that dragon a few times but had no luck; those 60fps looked choppy to me -maybe 'cos of the lack of VRR-.

- The 4K TV, at native 4K 30fps with judder reduction and all that stuff.

I came close to beating said dragon twice or so. I even tried similar settings in Redout 2, where I managed to get a few okay lap times.

- The 165Hz 1440p monitor (my favourite display at home), FreeSync, 1440p 60fps (relatively stable, to my surprise).

Well, I beat said dragon on the first try o_O a few minutes ago! It is my first "Great Enemy Felled" message in the game. :) I got so used to 4K 30fps with judder reduction and 1080p 60fps with no VRR that when I confronted him on my monitor with VRR and 60fps it became easy stuff.

After this experience, maybe you are right: input lag and VRR make the biggest difference in any videogame. Not that I didn't notice before, but lately I was trying to get used to playing most games at 60fps max once again on a native 4K TV (I got used to 100fps or more and now find 60fps a bit choppy), and it's so different from playing Shadow of the Tomb Raider on my monitor at close to 165fps, or FIFA 23 at 115fps, or Redout 2 at 165fps...

Moral: modern 4K TVs are lovely to watch, but a monitor with a high framerate can be a boon too. I am starting to talk like @Scott_Arm, but the more I play games lately, the more I think that's how I play at my best.
 
Moral: modern 4K TVs are lovely to watch, but a monitor with a high framerate can be a boon too. I am starting to talk like @Scott_Arm, but the more I play games lately, the more I think that's how I play at my best.
It's not one or the other -- any sufficiently high-end TV with VRR and low latency achieves the same results -- my LG C9's VRR range goes down to 40 fps, and I think newer models have it at 20.

(Also, Elden Ring is an amazing game, enjoy!)
 
It's not one or the other -- any sufficiently high-end TV with VRR and low latency achieves the same results -- my LG C9's VRR range goes down to 40 fps, and I think newer models have it at 20.

(Also, Elden Ring is an amazing game, enjoy!)
Enjoying it very much, yes! Awesome game. It's my first souls-like; I hadn't touched it for months, but once I started I got hooked, and I can understand why these games are so loved.

Perhaps the biggest advantage of the monitor is simply the framerate: 165fps is a decent amount, and at 4K without DLSS and the like it's hard to achieve.

An extra 22ms for using judder reduction is not that much, but the monitor's response time is 4ms, and it feels totally immediate. At 165Hz each refresh is about 6ms, and 4ms is less than that (there would be no point in having a 165Hz monitor with a 7ms response time). We might see a 1000Hz monitor or TV some day, but its response time MUST be 1ms or less.
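For reference, those frame-time figures are just 1000 divided by the refresh rate:

```python
# Just 1000 / refresh rate, to sanity-check the frame-time figures above.
for hz in (60, 165, 1000):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per refresh")
# 60 Hz -> 16.67 ms, 165 Hz -> 6.06 ms, 1000 Hz -> 1.00 ms
# The panel's pixel response time needs to stay below that interval.
```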
 
This type of frame interpolation is better handled in the game. Considering the great leaps and bounds made in resolution upscaling, why is framerate upscaling not getting the same love and progress? Why are we relying on TVs to scale down to 40 fps VRR instead of motion interpolating all games up to 60+?
I've been thinking of dynamic frame interpolation: for instance, instead of locking a game at 30 fps because it has trouble reaching 60, just lock it at 60 and interpolate any dropped frames to keep it there. Is that possible? If yes, then I think that could be a good performance option besides dynamic resolution or reducing graphics features to achieve 60.
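Something like this loop is what I have in mind (pure sketch; render_frame, generate_frame and present are hypothetical placeholders, not any real engine or driver API):

```python
import time

# Sketch of the idea above: present real frames when they arrive in time,
# and only drop in a generated frame when the renderer misses its 60Hz slot.

VSYNC_S = 1.0 / 60

def present_loop(render_frame, generate_frame, present):
    prev, curr = None, None
    while True:
        deadline = time.perf_counter() + VSYNC_S
        frame = render_frame(until=deadline)        # None if not done in time
        if frame is not None:
            prev, curr = curr, frame
            present(curr)                           # real frame hit its slot
        elif prev is not None:
            present(generate_frame(prev, curr))     # fill the dropped slot
        # else: still waiting for the first couple of real frames
```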
 
Considering the great leaps and bounds made in resolution upscaling, why is framerate upscaling not getting the same love and progress? Why are we relying on TVs to scale down to 40 fps VRR instead of motion interpolating all games up to 60+?
Some thoughts:
TV motion interpolation has an advantage: both frames already have motion blur in them, increasing smoothness and limiting the chance of visible artifacts.
For games we have the advantage of motion vectors, but they don't help for things like transparency and moving shadows.
Upscaling (approximately) 40 fps to 60 means an irregular distribution of generated and real frames. We get irregular sequences like r,g,r,r,g,r,g,r,r... (see the toy cadence sketch at the end of this post). Chances are this looks smoother, but still stuttery due to the inconsistency, which would be better handled with MB.
The cost of interpolation subtracts from the frame time budget, so missing the target will happen more often.

Now compare the promise with movies at 24fps, which look smoother than games because they have MB.
So alternatively we could consider this as the target: 30 fps games with better MB.
To get this, we would store the previous framebuffers including motion vectors, which allows resolving missing disocclusion data using the previous frame. (Has this been done already? idk)
The cost would be constant and not more than 2x that of the current approximate solutions, but less than fps upscaling.
I assume this would feel better than messing with 40 fps, because of the fixed 1:2 ratio.
Shadows and transparency are still a problem.

Would be interesting to see which is the better compromise. It may depend on the game.
Personally I think we just need those 60 fps for games. Upscaling only makes sense to achieve rates above that, to drive high-Hz displays.
Would be nice to see some 30->60 fps videos using DLSS3, but then I would still be unsure whether it only looks smoother, or also plays better.
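The toy cadence sketch referenced above (my own sketch; it just marks which 60Hz slots have a fresh real frame available, assuming frame k of an f-fps stream is finished at time k/f):

```python
def cadence(real_fps, output_hz=60, slots=12):
    """'r' if a new real frame became available since the previous display
    slot (so a real frame can be shown), 'g' if a generated frame fills it."""
    marks, newest_shown = [], -1
    for i in range(slots):
        newest_ready = (i * real_fps) // output_hz   # index of newest real frame
        marks.append("r" if newest_ready > newest_shown else "g")
        newest_shown = max(newest_shown, newest_ready)
    return ",".join(marks)

print(cadence(40))  # r,g,r,r,g,r,... -> a steady 40 fps is a regular 2:3 cadence
print(cadence(37))  # the pattern turns irregular as soon as the rate drifts off 40
```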
 
Now compare the promise with movies at 24fps, which look smoother than games because they have MB.
So alternatively we could consider this as the target: 30 fps games with better MB.

Then you would have to limit your game camera to movie camera movement. That is usually much slower.
 
Then you would have to limit your game camera to movie camera movement. That is usually much slower.
The usual argument, but real-world recordings are smoother with fast camera movement too.
What is (or at least was) very noticeable, though, is the conversion between various framerates, e.g. NTSC, PAL, and 24fps movies. So movies on TV are no longer smooth.
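For illustration, plain (non-motion-compensated) 3:2 pulldown of 24fps onto a 60Hz display holds frames for an uneven 3-2-3-2 pattern of refreshes, which is exactly that judder:

```python
# 3:2 pulldown: 24fps film on a 60Hz display repeats frames unevenly
# (3 refreshes, then 2, then 3, ...), the classic source of judder on
# non-motion-compensated displays.
def pulldown_32(film_frames=8):
    out = []
    for i in range(film_frames):
        out += [i] * (3 if i % 2 == 0 else 2)   # alternate 3 and 2 repeats
    return out

print(pulldown_32())  # [0,0,0,1,1,2,2,2,3,3,...] -> uneven hold times
```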

But of course it depends. TV used double the framerate of movies (50/60 fps interlaced), because TV productions had more moving cameras.
The question is: how much do we really need? And how subjective is it?
And what's better above that point: a high but dynamic, fluctuating refresh rate, or a constant framerate with better motion blur?

I guess the latter is better, also because it's easier to make good displays for a constant framerate, I assume.

Currently, there is at least one missed opportunity: if I have an RTX 4090 but a 60Hz display, there is no option to accumulate multiple frames into one, so I could turn the extra power into smoother motion.
That's a shame, because it would be very easy to implement, either for game devs or in GPU drivers.
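What I mean is something like this (just a sketch, using NumPy for the averaging; it's not an existing driver feature): render at a multiple of the display rate and average the extra frames into each displayed frame, which is effectively ground-truth motion blur.

```python
import numpy as np

# If the GPU can render, say, 240 fps but the display only shows 60 Hz,
# blend every 4 rendered frames into one displayed frame. Motion then
# looks smoother at the same 60 Hz because the blur is "real".

def accumulate(rendered_frames):
    """Average a list of HxWx3 float images into one displayed frame."""
    return np.mean(np.stack(rendered_frames), axis=0)

# e.g. 4 internally rendered frames per 60 Hz refresh (small toy resolution):
frames = [np.random.rand(270, 480, 3).astype(np.float32) for _ in range(4)]
displayed = accumulate(frames)
print(displayed.shape)  # (270, 480, 3)
```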
 
If I have an RTX 4090 but a 60Hz display, there is no option to accumulate multiple frames into one, so I could turn the extra power into smoother motion.
Can you explain that more? It goes way over my head but sounds interesting
 