There can be a difference in latency between IHVs even without frame gen. Just turning on Reflex or Anti-Lag can make a noticeable difference in lag in many cases, and the two don't perform the same. nVidia's slides show a
33% reduction in input delay in Destiny just from turning on Reflex at 60FPS. And turning on Reflex often results in slightly lower FPS, so equating latency with performance gets even muddier.
We've had this conversation
here before, in fact. I don't think a consensus was reached on what "performance" means, and I doubt there ever will be. But for me, FPS is clearly defined: frames per second. I don't care how the frames are produced - rasterized, generated with AI, or simple interpolation - if they are frames, they count as frames. I think other people attach benefits to increasing frame rate that aren't always there. Look at that nVidia slide from earlier in my post: they're saying Destiny has 75ms of latency at 60hz! That's 4-5 frames of latency. With Reflex, moving to a 3080, and 360hz, they get down to 31ms. Valorant has 30ms of latency with Reflex off at 60hz. This is why I've never felt FPS and latency were interchangeable the way others do. Related, yes, but latency varies wildly by game and by settings, and worst of all, it's subjective to user perception.
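For anyone who wants to sanity-check the "frames of latency" math, here's a quick Python sketch. The millisecond figures are the ones from the slides above; the conversion is just latency divided by the frame time at a given refresh rate:

```python
# Convert end-to-end latency in ms into "frames of latency" at a given refresh rate.
def frames_of_latency(latency_ms: float, hz: float) -> float:
    frame_time_ms = 1000.0 / hz  # duration of one frame at this refresh rate
    return latency_ms / frame_time_ms

# Figures from the nVidia slide referenced above.
print(frames_of_latency(75.0, 60))   # Destiny, Reflex off @ 60hz    -> ~4.5 frames
print(frames_of_latency(30.0, 60))   # Valorant, Reflex off @ 60hz   -> ~1.8 frames
print(frames_of_latency(31.0, 360))  # Destiny, Reflex + 3080 @ 360hz -> ~11.2 frames
```

Same 60hz, wildly different frame counts between the two games, which is exactly my point.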
I've personally found the added latency from frame generation to be marginal in most cases, while the benefits of FG are extremely noticeable, especially at higher multipliers. I find games that run in the 40-50 range to be jittery messes, but frame gen'd up to 80-100 they're quite playable. The real question is: if I got new hardware that could hit 60-70fps native, would that be better than the 80-100fps I get using frame gen? Would I notice the latency difference? Or would I just keep frame gen on and go for 120-140, still living with frame generation's latency penalty?
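Here's a rough first-order model of that tradeoff, purely illustrative and not measured data. It assumes interpolation-style frame gen has to hold back one rendered frame, so the added latency is about one native frame time; actual penalties vary by implementation, and I'm ignoring the cost of generating the frames themselves:

```python
# Rough comparison of displayed FPS vs. added latency for interpolation-style
# frame generation. Assumption (not measured): FG holds back one rendered frame,
# adding roughly one native frame time of latency; generation overhead ignored.
def fg_added_latency_ms(native_fps: float) -> float:
    # One native frame held back for interpolation.
    return 1000.0 / native_fps

scenarios = [
    ("45fps native, 2x FG", 45, 2),
    ("65fps native, no FG", 65, 1),
    ("65fps native, 2x FG", 65, 2),
]
for label, fps, multiplier in scenarios:
    displayed = fps * multiplier
    penalty = fg_added_latency_ms(fps) if multiplier > 1 else 0.0
    print(f"{label}: {displayed}fps displayed, +{penalty:.1f}ms latency vs native")
```

Under those assumptions, 45fps doubled to 90fps costs about 22ms extra, while 65fps doubled to 130fps costs about 15ms, so the penalty actually shrinks as the native frame rate goes up. Whether I'd notice either number is the subjective part.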