Digital Foundry Article Technical Discussion [2024]

That's not good. Sure, but can't you also use 120Hz VRR without FSR3? The latency should be the same.
Why would it be the same? Frame generation adds latency by its very nature: you take a frame that is ready to be displayed, delay it, put a different interpolated frame in front of it, and display the original frame later. The reason there's a soft requirement for things like Reflex and/or a 60+fps base frame rate is that the added latency is definitely noticeable. The most you can do is mitigate it by getting the base latency as low as possible, but it will always add a bit of latency.
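To make that concrete, here's a toy latency calculation for simple 2x interpolation. All numbers are hypothetical (base frame rate, half-interval hold, interpolation cost); real pipelines are more complicated, but the structure is the same:

```python
# Toy model of render-side latency with 2x frame interpolation.
# Assumptions (hypothetical): 60fps base rate, the real frame is held
# back half a frame interval so the generated frame can be shown first,
# and the interpolation pass itself costs ~3 ms.

base_frame_time_ms = 1000 / 60  # a real frame is ready every ~16.7 ms

# Without frame generation: present the frame as soon as it's ready.
latency_without_fg = base_frame_time_ms

# With frame generation: the finished frame waits ~half an interval
# while the interpolated frame is displayed, plus the interpolation cost.
interp_cost_ms = 3.0
latency_with_fg = base_frame_time_ms + base_frame_time_ms / 2 + interp_cost_ms

print(f"without FG: ~{latency_without_fg:.1f} ms")  # ~16.7 ms
print(f"with FG:    ~{latency_with_fg:.1f} ms")     # ~28.0 ms
```

The exact numbers vary per game and implementation, but the structural point holds: the generated frame can only be shown by holding back a frame that was already done.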

It's part of the reason why I still don't really like people presenting frame generated "fps" numbers as if they were comparable. Even assuming the generated frames are "perfect" in terms of graphics quality (which they are not, but it may not matter much), presenting it that way still conflates too many things. At the very least, if people are going to present frame gen'd frame rates (which speak only to "motion smoothness" at best), I would request that they always pair them with an input-to-photons latency measurement as well. Frame rate has always been an imperfect proxy for latency in the past, but it becomes even less useful with frame generation.
 
I agree with you, it's not the same framerate if it costs you latency. I know I am very sensitive to input latency (I'd even prefer to play a responsive 30fps game than a laggy 60fps one) and I would never activate frame generation, whatever the initial framerate. And there is also an image quality cost with the technique. I really hope this will always stay optional in the future.
 
I disagree. Triple buffering at the same framerate is the same framerate, even though it adds latency. So if that is true, then why would the added latency of frame generation be the disqualifying metric?
 
Triple buffering does not necessarily add latency; it depends on how you do the present logic.

The reason it's different though is because you are choosing that additional latency; you can always switch to double buffering and get the same performance with the proportionally lower latency. With frame generation there's no option to get the same performance but without the added latency. Unlike with triple buffering, the frame rate no longer tells you how fast your machine can generate a frame from start to finish (and thus that portion of the latency pipeline). Without frame generation you can of course roughly compute the average frame times.
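As a sketch of that last point (assuming a simple 2x interpolation multiplier, which is an assumption for illustration, not a property of any specific implementation):

```python
# With 2x frame generation the displayed "fps" doubles, but the machine
# still only finishes a real frame at the base rate, so the render leg
# of the latency pipeline has to be derived from the base rate.

def render_frame_time_ms(displayed_fps: float, fg_multiplier: int = 2) -> float:
    """Average time to fully render one real frame, given the displayed
    fps and the frame-generation multiplier (1 = no frame generation)."""
    base_fps = displayed_fps / fg_multiplier
    return 1000.0 / base_fps

print(render_frame_time_ms(120))     # 120 "fps" with 2x FG: ~16.7 ms per real frame
print(render_frame_time_ms(120, 1))  # a true 120fps: ~8.3 ms per real frame
```

Same number on the fps counter, double the real frame time — which is exactly why the counter alone no longer tells you how fast the machine works.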

But yeah if you want to be strict about it then I wholeheartedly agree that we should start reporting end to end latency more commonly in game reviews along with frame times.

To be clear it's not that frame generation is a "bad" technology or anything; it's just smarter motion smoothing. We don't report post-motion-smoothing "frame rates" on our TVs, and similarly I'd argue the frame generation "frame rate" numbers are not particularly meaningful. It's generally just safe to assume that in most situations you can get some more animation fluidity by turning on frame generation.
 
For FG, you need to start with a high base frame rate to minimize the latency but also the motion artefacts you will get. Having tested and used this extensively on a PC with a 4090, unless you’re above 100fps with FG on, just avoid it.

120fps is about the sweet spot for FG to give the best value without the feel of added latency and visual artefacts. Anything above is just great for high hz monitors to maximize the display’s motion clarity.

I wouldn’t wish FG on a base of 30-45fps on anyone….
 
But yeah if you want to be strict about it then I wholeheartedly agree that we should start reporting end to end latency more commonly in game reviews along with frame times.
This is sort of the crux of the argument. People keep conflating framerate and performance and attaching variables to said performance (like latency, logic updates, etc.) when in reality these things are not always connected. Different games have different levels of end to end latency, and settings like triple buffering, vsync or framerate limits can affect that latency, even if they might affect the framerate in a way that is in direct opposition to the conventional wisdom of "higher framerate = lower latency". For example, triple buffered vsync can benchmark higher framerates than double buffered vsync, but because it's buffering one additional frame, it often has more latency.
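The back-of-the-envelope version of that example (a simplification that assumes each queued frame waits a full refresh interval in the worst case; real present logic varies):

```python
# Worst-case present-to-display wait under vsync with N frames queued
# ahead of the one being scanned out. A simplified model, not a measurement.

def queued_wait_ms(refresh_hz: float, frames_queued: int) -> float:
    return frames_queued * 1000.0 / refresh_hz

double_buffered = queued_wait_ms(60, 1)  # one frame queued: ~16.7 ms
triple_buffered = queued_wait_ms(60, 2)  # two frames queued: ~33.3 ms

# Triple buffering can sustain a higher benchmark framerate (the GPU is
# not stalled waiting for the swap), yet the extra queued frame adds
# roughly one refresh interval of latency.
print(triple_buffered - double_buffered)  # ~16.7 ms extra
```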
 
But yeah if you want to be strict about it then I wholeheartedly agree that we should start reporting end to end latency more commonly in game reviews along with frame times.
In technical analysis maybe, but not in general game reviews. That's just gonna produce one more metric for people to start freaking out about and acting like it's a big deal when they would have never noticed otherwise.

Cuz I really don't think even most gaming enthusiasts are aware of how high such end to end latencies are in games.

For FG, you need to start with a high base frame rate to minimize the latency but also the motion artefacts you will get. Having tested and used this extensively on a PC with a 4090, unless you’re above 100fps with FG on, just avoid it.

120fps is about the sweet spot for FG to give the best value without the feel of added latency and visual artefacts. Anything above is just great for high hz monitors to maximize the display’s motion clarity.

I wouldn’t wish FG on a base of 30-45fps on anyone….
I mean, we already have an example here where something like 40-50fps base with frame gen can feel good enough, and that's for people who are fairly sensitive to this stuff.

Just saying, maybe be careful making blanket statements. I've generally found most gamers to overestimate their sensitivity to input lag, not to mention that we are quite adaptable. In situations where there's no 'back to back' option to test, most gamers will usually just get on with whatever the game gives them. There's extreme cases where people might complain about poor controls or whatever stemming from high input lag, but for the most part gamers just get used to things quite quickly and never think about it.

So overall, I think it could depend a lot on the game, in terms of the type of game it is, the base input lag numbers, and how much FG actually adds (cuz it's not some flat cost for any game), but also on the player and how much sensitivity and tolerance they have to this, weighed against how much they care about visual fluidity.

For PC, I do think it should remain an option, and off by default, but I can maybe see a world where if the visual outcome is good enough and original input lag isn't terrible, using FG to prop up lower framerates on consoles could be....entertained....
 
Steam is superb. The only flaws I could find with Steam are game optimization and things like gamepad handling, where PC Gamepass is the best imho. Steam Input gets in the way and causes a lot of havoc, to the point that I have to manually disable it for most games, while on the PC Gamepass app it's a breeze (at least the other stores have that advantage).
Steam Input is usually disabled by default for Xbox/Xinput controllers.
 
For FG, you need to start with a high base frame rate to minimize the latency but also the motion artefacts you will get. Having tested and used this extensively on a PC with a 4090, unless you’re above 100fps with FG on, just avoid it.

120fps is about the sweet spot for FG to give the best value without the feel of added latency and visual artefacts. Anything above is just great for high hz monitors to maximize the display’s motion clarity.

I wouldn’t wish FG on a base of 30-45fps on anyone….
Maybe it depends on the game. I used settings that gave 50-60FPS most of the time in Witcher 3 and turning on framegen was a huge improvement. I could not notice the additional input lag. I assume once Reflex is enabled, the added input lag is too low for my human senses and brain to register. Also being third person probably helps, though I was playing with a mouse. Very surprising, from what I'd read about the technology I thought I would be able to tell. I've always been one to complain about input latency when nobody else I knew had any clue what I was complaining about except in Guitar Hero and Rock Band. My friends and I all got way worse when switching from CRTs to flat panels. We kept some old CRTs around for the sole purpose of rocking out.
 
Steam Input is usually disabled by default for Xbox/Xinput controllers.
Do you mean that it doesn't work with those controllers, or that it gets disabled by default when it detects an Xbox/XInput controller? In my experience it's enabled by default for all games and I have to manually disable it. It's easy to do, but when you just launch a game you don't realize things aren't working properly until you try to play a 2-player game.
 
It's disabled by default for my Xbox 360 controller, but I enable it to increase the dead zone to work around some stick drift. Stick drift in turn-based RPGs is rage mode. Blocking when you meant to attack :mad:
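For anyone curious, the dead-zone adjustment that remapping tools like Steam Input expose is conceptually just a radial deadzone with rescaling. A minimal sketch — the 0.25 threshold and the function name are made up for illustration, not Steam's actual implementation:

```python
import math

def apply_radial_deadzone(x: float, y: float, deadzone: float = 0.25):
    """Zero out small stick displacements (e.g. drift), then rescale the
    remaining range so a full tilt still maps to magnitude 1.0.
    Inputs are normalized stick axes in [-1, 1]."""
    magnitude = math.hypot(x, y)
    if magnitude < deadzone:
        return 0.0, 0.0  # drift below the threshold reads as centered
    # Rescale so output grows smoothly from 0 at the deadzone edge
    # up to 1.0 at full tilt, avoiding a sudden jump.
    scale = (magnitude - deadzone) / (1.0 - deadzone) / magnitude
    return x * scale, y * scale

print(apply_radial_deadzone(0.10, 0.05))  # drift-level input -> (0.0, 0.0)
print(apply_radial_deadzone(1.00, 0.00))  # full tilt still -> (1.0, 0.0)
```

A bigger deadzone hides worse drift, at the cost of a larger "dead" region before the stick registers — which is exactly the trade-off the setting lets you tune.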
 
Have you tried to fix your controller? My two Xbox-like controllers are clones: one of the Xbox 360 gamepad, and the other is based on the Xbox One controller driver-wise, though its design is akin to the Switch Pro controller. Maybe that's the reason Steam Input is always enabled for my controllers. When you set a preferred gamepad in Windows, it works well with Steam, and it's the controller that Steam chooses (if you have more than one gamepad plugged in) for games. That's all fine and dandy until you want to play the game with another person: the second gamepad acts as if it doesn't exist. That cost me many headaches until I found out that disabling Steam Input fixed the issue. On PC Gamepass this was never an issue; at least it has something good compared to Steam.
 
do you mean that it doesn't work with those controllers or that it gets disabled by default when it detects a Xbox/Xinput controller? In my experience it's enabled by default for all games and I have to manually disable it. It's easy to do that but when you just launch a game you don't realize things aren't working properly until you try to play a 2 players game.
Well, Steam Input works with basically all controllers; it's just emulating an XInput controller, which is somewhat less useful for Xbox controllers since they're XInput already (though for games with no controller support it's obviously still useful). By default, Steam is configured not to auto-enable Steam Input when an Xbox controller is in use; you have to specify in the settings to use Steam Input by default for Xbox controllers.

Recently they added the option to have it be the default but only if the controller isn't natively supported by the game, specifically for PS4/PS5 controllers, since those are sometimes natively supported but are otherwise treated as DirectInput controllers, where you'd need Steam Input. It figures out the support from the game's Steam listing.
 
have you tried to fix your controller? My two Xbox like controllers are clones, one of the Xbox 360 gamepad and the other is based on the Xbox One controller, drivers wise, because its design is akin to the Switch Pro controller. Maybe that's the reason Steam Input is always enabled for my controllers. When you set a preferred gamepad in Windows, it works well with Steam and its the controller that Steam chooses -if you have more than one gamepad plugged in- for games. That's all fine and dandy until you want to play the game with another person. The second gamepad acts as if doesn't exist. That cost me many headaches until I found out that disabling Steam Input fixed the issue. On PC Gamepass this was never an issue, at least it has something good compared to Steam.
I haven't tried to fix it. Considering it's ~15 years old, it works remarkably well. It is a genuine Xbox 360 wireless controller with the wireless USB dongle. It still works in every game by default, and having multiple controllers hasn't been an issue in the few splitscreen/local multiplayer games I've tried. Even the lights around the Xbox logo show which one is controller 1, 2, etc. But these are all genuine Xbox controllers.
 
Oh man, I had forgotten the lights around the Xbox logo. I have some of those, but in pretty bad condition somewhere in the house, without the wireless dongle. The gamepads I currently use are generic Xbox- and Switch Pro-style pads that accept XInput and DInput (maybe that's where the issue lies), and they work with both Steam and PC Gamepass, but much better on PC Gamepass and on the Epic Store.

Steam Input is always enabled for me; I don't remember enabling it in the settings menu. My Switch Pro-like gamepad, a Canyon GPW3, is like 4 years old but still looks new. The other gamepad, some generic X360 look-alike, is 6 years old and works fine, but my nephews peeled the rubber off the left joystick.

I got a couple of wired PlayStation gamepad lookalikes, but my nephews prefer to use the wireless ones, so I am going to use the 6-year-old gamepad for a while. It works well, but your left thumb can end up hurting after a while. Time to switch to something else.
 
Controller support on PC got infinitely better with XInput and Microsoft's support. I really hope Xbox dying doesn't hamper this because it is so nice to have my controller just work in every game with no fiddling.
 
It is a genuine Xbox 360 wireless controller with the wireless USB dongle
I haz one of those. I also haz a pad with a PlayStation-type layout because I prefer that in some games; it's also switchable between XInput and DirectInput for older games, and I can swap the left d-pad and thumbstick so it acts like an Xbox pad.
 
I think the most interesting way to play with a controller on PC is using gyro and flick stick. When I got my last controller, I should have bought a PS5 controller.
 
Seems cool, but I play those kinds of games with a mouse. It's mostly Japanese games that I use a controller for, since they sometimes have poor mouse/kb support, especially older games. In some games even the menus are weird: you have to click the top option to change the sub-options even if you can clearly see everything on the screen, so it's like your mouse is emulating the behavior of a controller. I think Alex mentioned this once.
 