I definitely use 4K DLSS Performance over native 1080p every time too, it's no contest - but yeah - as your screenshots show, native 1080p has a significant performance advantage, and the gap can be even bigger in other games. DLSS has a cost, so it's only really fair to compare it against bilinear-upscaled modes that deliver the same performance; its internal rendering res on its own isn't that relevant.
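For anyone wanting the numbers behind that, here's a rough sketch of the internal render resolutions for each DLSS mode (the per-axis scale factors are the commonly cited ones; the function is just for illustration, not anything from NVIDIA's SDK):

```python
# Approximate per-axis scale factors for the standard DLSS modes.
DLSS_SCALE = {
    "Quality": 1 / 1.5,            # ~66.7% per axis
    "Balanced": 1 / 1.72,          # ~58% per axis
    "Performance": 1 / 2.0,        # 50% per axis
    "Ultra Performance": 1 / 3.0,  # ~33.3% per axis
}

def internal_res(out_w, out_h, mode):
    """Internal render resolution DLSS starts from for a given output res and mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 4K Performance mode renders internally at (1920, 1080) - the same pixel count as
# native 1080p - but the DLSS pass itself still adds a fixed cost on top of that.
print(internal_res(3840, 2160, "Performance"))
```

So "4K DLSS Performance" and "native 1080p" push the same number of pixels through the shading pipeline; the performance difference you're seeing is essentially the cost of the DLSS pass itself.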
Also, to properly judge image quality you need to compare shots with some camera motion, as that's where the lower DLSS modes can break down. My gripe with DLSS at these lower settings isn't that its artifacts merely look like what you'd get at the native resolution it's starting from - they can actually be worse. DLSS can amplify them at times. If it were just producing artifacts at the quality of native 1440p/1080p I wouldn't care much - they'd be a little blurrier, big whoop. It only bothers me when it produces artifacts you never see with regular upscaling, no matter what the starting res.
Also bear in mind these hair/fur artifacts only really occur with the High hair setting in Spider-Man; DLSS just doesn't play well with that implementation. It even affects the PS5 in Quality mode too, albeit not nearly to the same degree - and the hair in Performance mode, despite the lower res, is actually more stable than the hair in Quality mode, precisely because Performance uses something close to PC's Medium setting vs. High (or slightly below it) in Quality. You just get far more noticeable shimmer with High, to the point I don't think I'd ever go above Medium hair regardless of my GPU power. Medium's loss of fine detail is more than compensated for by its image stability imo.