> Strangely, upscaling, be it TSR or DLSS, sometimes seems to tank framerate, or not improve it by much.

Why do you expect it to improve frame rate? Upscaling costs something, since a chunk of the pipeline still runs at the upscaled size. The way NVIDIA has marketed it as a performance improvement is by putting "super-sampling" in the name and constantly comparing it to brute-force rendering at the upscaled resolution. Don't get me wrong, upscaling is great and in most cases a better use of resources than "native 4k" (as much as such a concept even really exists anymore), but the process itself costs a few ms compared to not upscaling at all.

Unless you're saying that upscaling some lower resolution *to* 1080 is slower than rendering at 1080? That would be a bit odd, but it's possible it isn't actually doing that, since there are a number of interacting cvars (also note that the upscaling you see in the editor is driven by the editor controls, while a cooked game is driven by a different set of cvars; see the sketch below).
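Roughly what I'd check in the cooked build, either in DefaultEngine.ini or typed into the console. Treat this as a sketch rather than a recipe: the exact cvar names shift a bit between engine versions, and the DLSS one only exists if the NGX/DLSS plugin is installed:

```ini
[SystemSettings]
; Internal resolution as a percentage of the output: 50 at a 4k output is roughly 1080p internal.
r.ScreenPercentage=50
; Anti-aliasing / upscaler selection: 2 = TAA, 4 = TSR (UE5 numbering).
r.AntiAliasingMethod=4
; Lets a third-party plugin's temporal upscaler (e.g. DLSS) take over the upscale pass.
r.TemporalAA.Upscaler=1
; DLSS toggle -- only present if the NGX/DLSS plugin is enabled.
r.NGX.DLSS.Enable=1
```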
It also costs vram, as discussed. A bunch of the buffers are higher resolution, and the texture LOD bias will cause more virtual texture usage (although I'm not sure how the pool is sized). If performance is taking a real dive, check vram usage (the stat commands below will show it), as that might be what's pushing it over the edge here.
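The stock stat commands are enough for a first look; run them in the cooked build with upscaling on and off and compare. The notes in parentheses are just annotations, not part of the commands:

```
stat RHI         (RHI memory totals: render targets, textures, buffers)
stat GPU         (per-pass GPU timings; the upscale pass usually shows up as its own entry)
stat Streaming   (texture streaming pool usage; I don't recall offhand which stat covers the VT pool specifically)
MemReport -full  (writes a detailed memory report under Saved/Profiling/MemReports)
```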
So it's not unexpected that upscaling 1080 -> 4k costs performance compared to just not upscaling the 1080 buffer at all. It should look notably better though... closer to the 4k image than the 1080 one in most cases.
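Back-of-the-envelope, just to make that concrete (rough split; exactly which passes run before vs. after the upscale depends on your post-process setup):

```
Native 1080 output:   1920 x 1080 = ~2.1M pixels for the whole frame
1080 -> 4k upscaled:  ~2.1M pixels for everything before the upscale
                      + 3840 x 2160 = ~8.3M pixels for everything after it
                        (post, UI, various buffers), plus the upscale pass itself
Native 4k:            ~8.3M pixels for the whole frame
```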