This is copied from the DF thread, but is relevant here as well.
Latency varies from game to game at the same frame rate. Red Dead Redemption 2 will have 60ms of latency at 120fps, while Call of Duty will have 30ms of latency at that very same 120fps. So if latency is not universally tied to and fixed by the frame rate, you can't hold it up as the one true measure of performance.
Additionally, that very same Call of Duty will have 20ms of latency at 120fps with Reflex on; Vertical Sync on will increase latency at that very same 120fps, while turning it off will drastically reduce it, and Variable Refresh Rate will mildly increase latency too. So you have additional layers of tech further decoupling latency from frame rate.
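To see why frame rate alone doesn't pin down latency, here's a back-of-envelope sketch in Python. The stage names and numbers are purely illustrative assumptions, not measurements of any of the games above: end-to-end latency is a sum of several pipeline stages, and the frame time implied by a given fps is only one of them.

```python
# Back-of-envelope model: end-to-end input latency as a sum of pipeline stages.
# All numbers below are illustrative assumptions, not measurements.

def end_to_end_latency_ms(fps, queued_frames, input_sampling_ms, display_ms):
    """Rough latency estimate: input sampling + frames buffered in the
    render/present queue + display processing. Only the frame-time term
    depends on fps; the other terms do not."""
    frame_time_ms = 1000.0 / fps
    return input_sampling_ms + queued_frames * frame_time_ms + display_ms

# Two hypothetical pipelines, both running at 120fps (frame time ~8.3ms):
# one buffers three frames of work, the other (e.g. a Reflex-style
# low-latency path) buffers one and samples input later.
print(end_to_end_latency_ms(fps=120, queued_frames=3, input_sampling_ms=8, display_ms=10))  # ~43ms
print(end_to_end_latency_ms(fps=120, queued_frames=1, input_sampling_ms=2, display_ms=10))  # ~20ms
```

Same 120fps in both cases, very different latency, depending entirely on how much work the pipeline buffers and when input is sampled.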
Comparing the game to itself is also meaningless after all of these variables; the gamer will adjust to whatever he finds comfortable. He might play Call of Duty with frame generation at 180fps and Reflex on, at a latency of 30ms, and find it acceptable. We can't say that frame generation increased his latency any more than Variable Refresh Rate increased his latency. Is testing with VRR now bad because it increases latency?
You simply can't draw the line at frame generation after all of these variables.
There are clear benefits to frame generation, such as improving the frame pacing of the game, increasing the fluidity of the presentation, and unlocking frame rates beyond the CPU limit (a widespread issue in gaming nowadays). It's simply a very good tool in the box for achieving good performance, just like all the tools listed above.
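As a toy illustration of that last point (hypothetical numbers, assuming an interpolation-style generator that inserts one generated frame per rendered frame), frame generation lifts the presented frame rate past a CPU-imposed cap:

```python
# Illustrative only: if the CPU caps the rendered frame rate, a frame
# generator that inserts one generated frame per rendered frame roughly
# doubles the presented frame rate.

def presented_fps(cpu_limited_fps, generated_per_rendered=1):
    """Presented frames per second when frame generation adds
    `generated_per_rendered` frames for each rendered frame."""
    return cpu_limited_fps * (1 + generated_per_rendered)

print(presented_fps(70))  # a 70fps CPU-limited game presents ~140fps
```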