Are higher framerates likely to proliferate?

http://www.eurogamer.net/articles/digitalfoundry-force-unleashed-60fps-tech-article

I read this some years ago, during the PSWii60 generation, and thought it might be the kind of thing that would make its way into games of the current generation. Alas, that hasn't come to pass. Maybe it didn't give great results?

From a brief skim of the article, it seems that they use tech similar to the frame interpolation in TVs. Given that the breakout box for PSVR reprojects 60fps to 120 (although, in my experience, those are often the games that cause people to experience some nausea), I do wonder if hardware could be included in consoles to facilitate it.
 
From a brief skim of the article, it seems that they use tech similar to the frame interpolation in TVs.
I despise framerate interpolation on TVs because it keeps breaking. One moment you're watching smooth video (on movies it makes things look like video - yes, I'm one of those :( - but for some content smoother == better), and then it suddenly drops down to 30 fps judder. The constant breaks are worse than sticking with 30 fps from the off.
 
I despise framerate interpolation on TVs because it keeps breaking. One moment you're watching smooth video (on movies it makes things look like video - yes, I'm one of those :( - but for some content smoother == better), and then it suddenly drops down to 30 fps judder. The constant breaks are worse than sticking with 30 fps from the off.
Which is why it should be done within the game engine, where you can either predict object locations or truly re-render them in the right positions with reuse of shaded samples.
Another thing that could be half decent is camera-only interpolation.
And an option to turn it off for those who don't like it.
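
A minimal sketch of what doing it in-engine might look like, purely as an illustration: predicting object positions from the last two simulation states to synthesize an in-between frame. All of the types and names here (SimState, Transform, predictFrame) are hypothetical rather than any engine's actual API; the point is only that the engine knows which object is which, so it doesn't have to guess motion from finished images the way a TV does.

```cpp
#include <cstdint>
#include <unordered_map>

// Hypothetical minimal types, just to illustrate the idea.
struct Vec3 { float x, y, z; };
struct Transform { Vec3 position; /* rotation omitted for brevity */ };
struct SimState {
    double time;                                           // simulation timestamp (seconds)
    std::unordered_map<std::uint32_t, Transform> objects;  // object id -> transform
};

static Vec3 lerp(const Vec3& a, const Vec3& b, float t) {
    return { a.x + (b.x - a.x) * t,
             a.y + (b.y - a.y) * t,
             a.z + (b.z - a.z) * t };
}

// Build a synthetic frame for 'renderTime' by predicting each object's position
// from the two most recent simulation states. With t in [0,1] this interpolates;
// with t > 1 it extrapolates ahead of the latest simulation step.
SimState predictFrame(const SimState& prev, const SimState& curr, double renderTime) {
    SimState out;
    out.time = renderTime;
    const float t = static_cast<float>((renderTime - prev.time) / (curr.time - prev.time));
    for (const auto& [id, currXf] : curr.objects) {
        auto it = prev.objects.find(id);
        if (it == prev.objects.end()) { out.objects[id] = currXf; continue; } // newly spawned object
        out.objects[id].position = lerp(it->second.position, currXf.position, t);
    }
    return out;
}
```

Camera-only interpolation would be simpler still: keep the objects where the last rendered frame put them and only re-project the view at the display rate, which is roughly the trade-off PSVR-style reprojection makes for head movement.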
 
I despise framerate interpolation on TVs because it keeps breaking. One moment you're watching smooth video (on movies it makes things look like video - yes, I'm one of those :( - but for some content smoother == better), and then it suddenly drops down to 30 fps judder. The constant breaks are worse than sticking with 30 fps from the off.
Also, some really dislike aliasing artifacts or softness, compounded by the fact that mainstream TVs are 4K these days. That pushes the dynamic resolution up as much as possible with TAA and calls for a stable, smoothly presented framerate locked to 30fps within the rendering engine rather than pushing for an unpredictable 60fps; something I remember Sebbi was quite passionate about.
It would be less of an issue if VRR were mainstream in TVs and, critically, supported by both console manufacturers, but that is not happening anytime soon; I'm not sure one could even call it mainstream in PC gaming.

But then I guess it comes down to game genre and gamers' expectations for what is in development; some games can get away with 60fps (whether locked or unpredictable) while others cannot.
 
It would be less of an issue if VRR were mainstream in TVs and, critically, supported by both console manufacturers, but that is not happening anytime soon; I'm not sure one could even call it mainstream in PC gaming.

While that certainly makes the presentation smoother, it still does nothing for variable control response. VRR, IMO, should die and developers should put more development effort into variable resolution with a fixed framerate (preferably minimum 60 FPS).

Then again, I'm obviously biased as I absolutely hate variable control response in games. Just going from 60 fps to 50 fps induces (to me) a rather noticeable change in how the game responds to my control inputs. That's with VRR, back when I still had my Radeon 290 and could use AdaptiveSync on my monitor. SIDE RANT - I really wish NV would get that stick out of their arse and support AdaptiveSync already, even if I'd only make occasional use of it. I'd still adjust settings in games for 60 FPS, but even then it still occasionally (1-3% of the time) drops below 60.

Regards,
SB
 
How much can be decoupled in terms of framerate?

For example: with the X1X version of Hitman, there's a 60fps mode where IIRC the character animations update at 30.

Can that be taken any further with any other aspects?
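
Quite a lot, in principle. As a hedged illustration of that kind of split (not Hitman's actual code; AnimationSystem and its members are made up), the renderer can run every frame while the animation system only advances its pose at half rate, with the last pose simply reused in between:

```cpp
#include <cstdio>

// Hypothetical subsystems, purely to show two update rates being decoupled.
struct AnimationSystem {
    float poseTime = 0.0f;                           // time the current pose represents
    void evaluatePose(float dt) { poseTime += dt; }  // stands in for skinning/blending work
};

int main() {
    AnimationSystem anim;
    const float frameDt = 1.0f / 60.0f;   // renderer runs at 60 fps
    float animAccumulator = 0.0f;

    for (int frame = 0; frame < 8; ++frame) {
        animAccumulator += frameDt;
        // Advance the animation pose only at ~30 Hz; camera, physics and input
        // could keep their own (higher) rates in the same way.
        if (animAccumulator >= 1.0f / 30.0f) {
            anim.evaluatePose(animAccumulator);
            animAccumulator = 0.0f;
        }
        std::printf("render frame %d: pose time %.3f s\n", frame, anim.poseTime);
        // renderScene(anim);  // drawn every frame, reusing the last pose in between
    }
    return 0;
}
```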

While that certainly makes the presentation smoother, it still does nothing for variable control response. VRR, IMO, should die and developers should put more development effort into variable resolution with a fixed framerate (preferably minimum 60 FPS).

Then again, I'm obviously biased as I absolutely hate variable control response in games. Just going from 60 fps to 50 fps induces (to me) a rather noticeable change in how the game responds to my control inputs. That's with VRR, back when I still had my Radeon 290 and could use AdaptiveSync on my monitor. SIDE RANT - I really wish NV would get that stick out of their arse and support AdaptiveSync already, even if I'd only make occasional use of it. I'd still adjust settings in games for 60 FPS, but even then it still occasionally (1-3% of the time) drops below 60.

Regards,
SB

Incoming possible stupid question alert

Could control input be decoupled from rendering output? E.g. controls update at 30/60fps, whilst the image is subject to the vicissitudes of VRR?
 
While that certainly makes the presentation smoother, it still does nothing for variable control response. VRR, IMO, should die and developers should put more development effort into variable resolution with a fixed framerate (preferably minimum 60 FPS).

Then again, I'm obviously biased as I absolutely hate variable control response in games. Just going from 60 fps to 50 fps induces (to me) a rather noticeable change in how the game responds to my control inputs. That's with VRR, back when I still had my Radeon 290 and could use AdaptiveSync on my monitor. SIDE RANT - I really wish NV would get that stick out of their arse and support AdaptiveSync already, even if I'd only make occasional use of it. I'd still adjust settings in games for 60 FPS, but even then it still occasionally (1-3% of the time) drops below 60.

Regards,
SB
Isn't AdaptiveSync exactly VRR?
So why would you support that if you are totally against VRR?
Also, the input lag is resolved with Gsync and with Freesync; this has been tested by various monitor review sites.

I am a fan of Gsync and Freesync after using one of them in games, but critically that is with this generation of GPUs and monitors.

Not sure, though, whether AdaptiveSync within the TV/HDMI spec, with consoles supporting it, also gives a constant input lag *shrug*; it should do in theory.
 
How much can be decoupled in terms of framerate?

For example: with the X1X version of Hitman, there's a 60fps mode where IIRC the character animations update at 30.

Can that be taken any further with any other aspects?



Incoming possible stupid question alert

Could control input be decoupled from rendering output? E.g. controls update at 30/60fps, whilst the image is subject to the vicissitudes of VRR?
Well, that is why 120/144Hz/etc Gsync or Freesync monitors paired with the right GPU are popular with those using them: you get the best input response possible with VRR.
Input lag is constant, in other words.
 
Could control input be decoupled from rendering output?
It is, and has been for many years. You run your physics and game logic on a framerate-independent loop and, parallel to that, create a visual sampling of the universe for the output. Some games (racing sims) update physics a couple of hundred times a second. I think you'll be hard-pushed to find any game where the input isn't a solid 60 fps. Of course, if the output varies, seeing a little variation between pressing a button and the animation changing may give the impression of input lag.
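
For anyone who hasn't seen it, the usual shape of this is the classic fixed-timestep loop: input and simulation tick at a constant rate, rendering runs as often as the display allows, and the renderer can blend between the last two simulation states. A minimal sketch, with pollInputAndSimulate and renderInterpolated as placeholder stubs rather than any particular engine's functions:

```cpp
#include <chrono>

// Trivial stand-ins for a real game's systems.
static void pollInputAndSimulate(double dt) { (void)dt; }   // fixed-rate logic + input sampling
static void renderInterpolated(double alpha) { (void)alpha; } // draw, blending the last two sim states

int main() {
    using clock = std::chrono::steady_clock;
    const double simDt = 1.0 / 120.0;      // e.g. 120 Hz logic, as some racing sims do
    double accumulator = 0.0;
    auto previous = clock::now();

    for (int frame = 0; frame < 1000; ++frame) {   // a real game loops until quit
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // Run as many fixed simulation steps as the elapsed real time requires;
        // input is sampled at this constant rate regardless of render speed.
        while (accumulator >= simDt) {
            pollInputAndSimulate(simDt);
            accumulator -= simDt;
        }
        // Render whenever the GPU/display is ready; 'alpha' says how far we are
        // between the last two simulation states, for smooth presentation.
        renderInterpolated(accumulator / simDt);
    }
    return 0;
}
```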

Definitely interested to hear examples of games Silent_Buddha feels have input tied to frame updates still.
 
It is, and has been for many years. You run your physics and game logic on a framerate-independent loop and, parallel to that, create a visual sampling of the universe for the output. Some games (racing sims) update physics a couple of hundred times a second. I think you'll be hard-pushed to find any game where the input isn't a solid 60 fps. Of course, if the output varies, seeing a little variation between pressing a button and the animation changing may give the impression of input lag.

Definitely interested to hear examples of games Silent_Buddha feels have input tied to frame updates still.

The control input/feedback loop. You make an input and you see the result. Even if control response is independent of framerate, the feedback is still entirely dependent on the framerate.

Basically all shooters and racing games will suffer from it. At 60 FPS, the scene will move X units of measurement in 1 frame. At 50 FPS the scene will move Y units of measurement in 1 frame. Those frames also don't take the same amount of time, obviously. If the framerate varies that means what I see as feedback to my control inputs changes constantly.
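
To put rough numbers on that (my own arithmetic, not from the posts above): even with a framerate-independent turn speed, the visible per-frame step scales with each frame's duration, so the step size changes whenever the framerate does.

```cpp
#include <cstdio>

int main() {
    const double turnRateDegPerSec = 360.0;   // hypothetical camera turn speed
    const double framerates[] = { 60.0, 50.0, 30.0 };

    for (double fps : framerates) {
        const double frameTimeMs = 1000.0 / fps;
        const double degPerFrame = turnRateDegPerSec / fps;
        // Same real-world speed, but the visible per-frame step differs:
        // 6 deg/frame at 60 fps vs 7.2 deg/frame at 50 fps, etc.
        std::printf("%5.1f fps: %6.2f ms/frame, %5.2f degrees per frame\n",
                    fps, frameTimeMs, degPerFrame);
    }
    return 0;
}
```

That jump from 6 to 7.2 degrees per frame between 60 and 50 fps is exactly the kind of inconsistency being described.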

I'm not talking about input lag, which is what CSI PC thought I was talking about.

This control/feedback loop is incredibly important for precise and more importantly predictable controls. It's why fighting games absolutely require a locked framerate. If it isn't locked then moves become unpredictable and random.

While I no longer game professionally and my reflexes are nowhere near as quick as they used to be, my perception of it is still the same. So with variable refresh rate, not only is my age frustrating (I see situations I used to be able to take advantage of but now no longer can), there's also the problem of seeing random and inconsistent responses to my control inputs.

So, for example in the past I could do a controlled 1-180 degree turn in 1 frame (1/60th of a second) and be able to shoot anything I see in that frame in the next 1-2 frames. That relies on controls being tuned such that every frame of every movement is predictable and repeatable. While I can still do the 1 frame turn, I no longer have the reflexes to respond and kill whatever I see in the next frame. It now takes me a few frames to do a precision shot. VRR makes it impossible to even do the first part, a controlled 1-180 degree turn in 1 frame.

As such I always shoot for a locked 60 FPS as much as possible.

The only place where VRR would be of benefit to me in any way is that I could up the settings so that instead of the game being at 60 FPS 99% of the time, perhaps I'd only be at 60 FPS 95% of the time. It'd still be slightly annoying on occasion, but at least VRR would keep the presentation smooth without tearing.

Also, this isn't even getting into the whole LCD versus CRT discussion on control/feedback loop. :)

Regards,
SB
 
It is, and has been for many years. You run your physics and game logic on a framerate-independent loop and, parallel to that, create a visual sampling of the universe for the output. Some games (racing sims) update physics a couple of hundred times a second. I think you'll be hard-pushed to find any game where the input isn't a solid 60 fps. Of course, if the output varies, seeing a little variation between pressing a button and the animation changing may give the impression of input lag.

Definitely interested to hear examples of games Silent_Buddha feels have input tied to frame updates still.

Isn't it more a case of two different types of input relative to frame delivery: one, as you mention, is physics, where changing the framerate can break physics/algorithms/etc., while the other is the signal processing between monitor, GPU, game and keyboard (ignoring other input lag factors)?
Both are valid and interesting in the context of higher framerates, games offering either 30fps or 60fps, and monitor or, specifically, TV refresh rates going forward for consoles.
 
The control input/feedback loop. You make an input and you see the result. Even if control response is independent of framerate, the feedback is still entirely dependent on the framerate.

Basically all shooters and racing games will suffer from it. At 60 FPS, the scene will move X units of measurement in 1 frame. At 50 FPS the scene will move Y units of measurement in 1 frame. Those frames also don't take the same amount of time, obviously. If the framerate varies that means what I see as feedback to my control inputs changes constantly.

..........

So, for example in the past I could do a controlled 1-180 degree turn in 1 frame (1/60th of a second) and be able to shoot anything I see in that frame in the next 1-2 frames. That relies on controls being tuned such that every frame of every movement is predictable and repeatable. While I can still do the 1 frame turn, I no longer have the reflexes to respond and kill whatever I see in the next frame. It now takes me a few frames to do a precision shot. VRR makes it impossible to even do the first part, a controlled 1-180 degree turn in 1 frame.

As such I always shoot for a locked 60 FPS as much as possible.

The only place where VRR would be of benefit to me in any way is that I could up the settings so that instead of the game being at 60 FPS 99% of the time, perhaps I'd only be at 60 FPS 95% of the time. It'd still be slightly annoying on occasion, but at least VRR would keep the presentation smooth without tearing.

Regards,
SB
That is an interesting point.
Are you basing this upon your experience with the 290 and Freesync? It also depends upon the monitor.
It feels like your example is jumping from, say, 40fps to 60fps rather than a 50-60fps range with current VRR; a lot of this comes down to having an even higher-refresh monitor/TV, with its improved scan-out cycle per frame, which should overcome this challenge.

Even at 60fps, many console games drop below 60fps by 1-5 frames, doubling input lag, which also kills competitive gaming.
I appreciate it comes down to personal preference and is quite subjective, and yes, there is a balance between frame-pacing within games and VRR.
Modern games with 30fps frame pacing controlled by the rendering engine quite often cannot even sustain 30fps; only a rare few have this implemented incredibly well.

Separate point: most professional CSGO gamers play with Vsync OFF, which breaks the input-to-frame-display relationship anyway, but provides the best input lag.
BlurBusters generally suggest a 240Hz VRR monitor solution for e-sports quality gaming.

Edit:
I assume one factor VRR could not overcome with your example is overshooting the turn due to such a large spike occurring when starting the turn.
But even with engine render frame-pacing it is difficult to design a game to not end up with intermittent large frametime spikes, which would also impact your example.
 
That is an interesting point.
Are you basing this upon your experience with the 290 and Freesync? It also depends upon the monitor.
It feels like your example is jumping from, say, 40fps to 60fps rather than a 50-60fps range with current VRR; a lot of this comes down to having an even higher-refresh monitor/TV, with its improved scan-out cycle per frame, which should overcome this challenge.

Even at 60fps, many console games drop below 60fps by 1-5 frames, doubling input lag, which also kills competitive gaming.
I appreciate it comes down to personal preference and is quite subjective, and yes, there is a balance between frame-pacing within games and VRR.
Modern games with 30fps frame pacing controlled by the rendering engine quite often cannot even sustain 30fps; only a rare few have this implemented incredibly well.

Separate point: most professional CSGO gamers play with Vsync OFF, which breaks the input-to-frame-display relationship anyway, but provides the best input lag.
BlurBusters generally suggest a 240Hz VRR monitor solution for e-sports quality gaming.

Edit:
I assume one factor VRR could not overcome with your example is overshooting the turn due to such a large spike occurring when starting the turn.
But even with engine render frame-pacing it is difficult to design a game to not end up with intermittent large frametime spikes, which would also impact your example.

Yes, as you go higher in frequency the difference between one frame and the next is less. If the frame drop is more than 1-2 frames for example, at 30 Hz the difference is hugely noticeable, while at 60 Hz most people are unlikely to notice and at 240 Hz it's going to be unnoticeable to virtually everyone. In a case like that VRR isn't going to help the 30 Hz experience much, would improve the experience for most people at 60 Hz, and would be almost unnoticeable to anyone at 240 Hz.
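
Putting approximate numbers on that (my own back-of-the-envelope figures): a single missed refresh is a much bigger slice of time at low refresh rates than at high ones, which is why the same one-frame hiccup reads so differently at 30 Hz, 60 Hz and 240 Hz.

```cpp
#include <cstdio>

int main() {
    const double refreshRates[] = { 30.0, 60.0, 120.0, 240.0 };
    for (double hz : refreshRates) {
        const double frameMs = 1000.0 / hz;
        // Without VRR, missing one refresh roughly doubles that frame's duration.
        std::printf("%6.1f Hz: %6.2f ms per frame, a missed refresh adds ~%.2f ms\n",
                    hz, frameMs, frameMs);
    }
    return 0;
}
```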

The testing I did back then was with locked 60 Hz versus 50-60 Hz variable. Anything lower than 50 Hz just feels torturous WRT the control/feedback loop. I also generally go with Vsync off to improve the control/feedback loop if it can't be locked to 60 Hz 99% of the time. But in most cases I can lock it to 60 Hz 99% of the time (still with Vsync off, usually).

Something I wish I had tried but didn't think of at the time was a locked 50 Hz. This was possible as you could set a custom max framerate in the Radeon control panel. I never thought to try setting max framerate to 50 Hz to see if that would have been acceptable to me. It likely would have taken some time to get used to, but might not have been horrible even if it would be noticeably inferior to the 60 Hz control/feedback loop.

Anyway, it's one of the reasons I haven't gotten a 120+ Hz monitor. Once I got used to that, 60 Hz would likely look horrible (a jittery/stuttery mess) like 30 Hz does today. Hell, 60 Hz already looks a bit stuttery to me after seeing 120+ Hz on a friend's monitor. And the control/feedback loop would take a bit to get used to, but in turn would mean I couldn't go back to 60 Hz. 60 Hz right now, I think, is a good compromise between control/feedback, image quality and budget (I no longer buy bleeding-edge enthusiast GPUs; they are way too expensive now).

Regards,
SB
 
While that certainly makes the presentation smoother, it still does nothing for variable control response. VRR, IMO, should die and developers should put more development effort into variable resolution with a fixed framerate (preferably minimum 60 FPS).

Then again, I'm obviously biased as I absolutely hate variable control response in games. Just going from 60 fps to 50 fps induces (to me) a rather noticeable change in how the game responds to my control inputs. That's with VRR, back when I still had my Radeon 290 and could use AdaptiveSync on my monitor. SIDE RANT - I really wish NV would get that stick out of their arse and support AdaptiveSync already, even if I'd only make occasional use of it. I'd still adjust settings in games for 60 FPS, but even then it still occasionally (1-3% of the time) drops below 60.

Regards,
SB

Damn, you are that sensitive to framerate changes that a difference of 3.4 ms bothers you?

I am definitely not envious of your circumstance. LOL.

I am going to start a variable framerate desensitization program. The program takes 30 days. It involves strapping a gamer to a chair and forcing them to game 18 hours a day while the framerate constantly changes within a range of 5 fps to 25 fps.

You should sign up. By the time I am finished with you, 30 fps console gaming will be like a miracle from heaven.
 
Sure @Silent_Buddha, that's why dynamic resolutions are here to stay, in order to keep that 60fps (and eventually 120fps) target as much as possible. VRR will be complementary to dynamic resolutions, at least in MP gaming.

The idea is to keep 60fps at all costs (using dynamic resolution and even dynamic reconstruction) and to use VRR when the target framerate still can't be held. But that's for 60fps games, because I can see VRR being used extensively for all AAA 30fps games on the next consoles.
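
A minimal sketch of the dynamic-resolution half of that idea, assuming a hypothetical engine where the previous frame's GPU time can be read back and a render scale chosen before the next frame; real implementations are considerably more sophisticated (prediction, per-pass scaling, reconstruction), so treat this as the shape of the feedback loop only:

```cpp
#include <algorithm>
#include <cstdio>

// Pick a resolution scale so the GPU stays under the frame budget; anything
// still over budget once the minimum scale is hit is where VRR would step in
// instead of a visible stutter or a torn frame.
float updateRenderScale(float currentScale, float gpuFrameMs, float budgetMs) {
    const float minScale = 0.70f;   // e.g. never drop below ~70% of native 4K per axis
    const float maxScale = 1.00f;
    // Nudge the scale toward whatever would have hit the budget last frame.
    // Note: GPU cost actually tracks pixel count (~scale squared); treating it
    // as linear here keeps the sketch simple.
    const float target  = currentScale * (budgetMs / gpuFrameMs);
    // Move only part of the way each frame to avoid visible resolution pumping.
    const float blended = currentScale + 0.25f * (target - currentScale);
    return std::clamp(blended, minScale, maxScale);
}

int main() {
    float scale = 1.0f;
    const float budgetMs = 16.6f;   // 60 fps target
    const float measuredGpuMs[] = { 15.0f, 18.5f, 19.0f, 17.0f, 16.0f, 15.5f };

    for (float gpuMs : measuredGpuMs) {
        scale = updateRenderScale(scale, gpuMs, budgetMs);
        std::printf("gpu %.1f ms -> render scale %.2f\n", gpuMs, scale);
    }
    return 0;
}
```

Anything the scaler can't absorb once it bottoms out at its minimum scale is where VRR would pick up the slack, which is the complementary relationship described above.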
 
Damn, you are that sensitive to framerate changes that a difference of 3.4 ms bothers you?

I am definitely not envious of your circumstance. LOL.

I am going to start a variable framerate desensitization program. The program takes 30 days. It involves strapping a gamer to a chair and forcing them to game 18 hours a day while the framerate constantly changes within a range of 5 fps to 25 fps.

You should sign up. By the time I am finished with you, 30 fps console gaming will be like a miracle from heaven.
It is an interesting and good point he raises, and it shows how perception and the associated sensitivity are not the same for everyone; as an example, I am really sensitive to input lag and stutter, so VRR works great for me, and I'd never really put much thought into those in his situation. A matter of perspective.

This, though, is a good example of the challenge for devs: what do they lower to maintain a consistent 60fps on modern 4K TVs, and just how dynamic do they make said options?
As I mentioned earlier, there are those who are very sensitive to aliasing artifacts and the softening that can happen when using TAA with a dynamic resolution that isn't high enough; there are probably also those sensitive to ambient lighting/occlusion solutions (which could be a nightmare to make dynamic), LoD, textures, etc.
Relying upon dynamic resolution on its own is unfortunately not enough unless the devs take the lowest common denominator for the visual quality that can sustain a specified frame-paced, locked fps, and that can mean giving up a fair chunk of what the engine/game can do; exacerbated by the fact that, as seen in the discussion between myself and Silent_Buddha and in the other factors I raised above, gamers have different sensitivities.
Of course this would not be applicable to all games, but probably quite a few from a developer perspective going forward with 4K and attaining a frame-paced 60fps.
 
Everyone is impacted by lowered frame rates, whether they're aware of it or not. Lowered frame rates mean increased input lag. Last gen, most people could tell that COD felt good to play, even if they were not aware that the difference was it running at 60Hz versus most other console titles running at 30Hz. They also were not aware it was sub-HD. I'd guess most people would not know whether games were 1800p or 1440p versus 2160p if Digital Foundry did not tell them. Frame rate stability will always be the best option.
 