> That's not good. Sure, but can't you also use 120 Hz VRR without FSR3? The difference in latency should be the same.

Why would it be the same? Frame generation adds latency by its very nature: you take a frame that is ready to be displayed, delay it, show a different interpolated frame in its place, and only then display the original one. The reason there's a soft requirement for things like Reflex and/or a 60+ fps base frame rate is that the added latency is definitely noticeable. The most you can do is mitigate it by getting the base latency as low as possible, but frame generation will never not add a bit of latency.
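To make that concrete, here's a minimal timing sketch (not any vendor's actual pipeline; the half-interval hold-back and the numbers are assumptions) showing how interpolation-based frame generation delays the newest real frame even as it doubles the displayed rate:

```python
# Toy model: with interpolation, the newest rendered frame is held back so an
# interpolated frame can be shown first. Numbers and the half-interval
# hold-back are illustrative assumptions, not measured values.

def present_times(render_interval_ms: float, frame_gen: bool):
    """Return (displayed fps, extra delay before the newest real frame is shown)."""
    if not frame_gen:
        # Without frame gen, the real frame is presented as soon as it's ready.
        return 1000.0 / render_interval_ms, 0.0
    # With frame gen, the real frame is delayed by roughly half a base interval
    # (interpolation cost ignored) so the generated frame can be shown first.
    hold_back_ms = render_interval_ms / 2
    return 2 * (1000.0 / render_interval_ms), hold_back_ms

for base_ms in (33.3, 16.7):  # 30 fps and 60 fps base render rates
    fps_off, delay_off = present_times(base_ms, frame_gen=False)
    fps_on, delay_on = present_times(base_ms, frame_gen=True)
    print(f"base {1000 / base_ms:.0f} fps: "
          f"off -> {fps_off:.0f} fps shown, +{delay_off:.1f} ms hold-back | "
          f"on -> {fps_on:.0f} fps shown, +{delay_on:.1f} ms hold-back")
```

The point of the sketch is just that the hold-back shrinks as the base frame rate rises, which is why a high base frame rate mitigates, but never eliminates, the added latency.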
It's part of the reason why I still don't like people presenting frame-generated "fps" numbers as if they were directly comparable to natively rendered ones. Even if you treat generated frames as "perfect" in terms of image quality (they are not, though that may not matter much), presenting them that way conflates too many things. At the very least, if people are going to report frame-generated frame rates (which speak only to "motion smoothness" at best), I would ask that they always pair them with an input-to-photon latency measurement as well. Frame rate has always been an imperfect proxy for latency, but it becomes even less useful with frame generation.
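As a rough illustration of why the headline number alone misleads, here's a toy comparison (all figures made up, the latency model is a crude assumption) of two setups that both display "120 fps" but differ sharply in input-to-photon latency:

```python
# Hypothetical illustration: same displayed fps, very different latency.
# All numbers are invented and the model is deliberately simplistic.

def input_to_photon_ms(render_ms: float, held_back_ms: float, display_ms: float = 5.0) -> float:
    """Crude latency estimate: render time + frame-gen hold-back + display scanout."""
    return render_ms + held_back_ms + display_ms

native_120 = input_to_photon_ms(render_ms=8.3, held_back_ms=0.0)     # true 120 fps
framegen_120 = input_to_photon_ms(render_ms=16.7, held_back_ms=8.3)  # 60 fps base, interpolated to "120"

print(f"native 120 fps  : ~{native_120:.0f} ms input-to-photon")
print(f"frame-gen '120' : ~{framegen_120:.0f} ms input-to-photon")
# Same headline fps, roughly double the latency in this toy model, which is
# why a latency measurement should accompany any frame-generated fps figure.
```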