Thanks, interesting. I'll play around with it a bit.
It's perhaps both, meaning there is a higher load, but it maybe shouldn't be affecting things this much. I have to figure (hope) that rendering the game properly with very high textures isn't meant to swamp a 12GB card without RT, but that's what happens when I apply this fix - my 3060 will eventually choke at 4K DLSS Performance after playing for a while, and that's just very high textures with a mix of high and medium settings.
Did you notice the LOD change when switching DLSS modes after boot, though? There are definitely more objects being drawn, at higher quality, after you switch DLSS modes, and comparing to the PS5, that's actually how it should look - switching DLSS modes just makes the game render the world properly.
This is what I see when zooming into a section after booting the game with DLSS Performance, then switching to another DLSS mode, then back to Performance. The video just swaps back and forth between the original and the 'corrected' version, with the exact same settings:
The VRAM cost of this is always around 1 extra GB, which eventually makes very high textures unplayable on my 12GB 3060 at 4K DLSS Performance. Whether that's down to a streaming-system bug, who knows at this point, but my suspicion is that at least some of these benchmarks are effectively being run at lower-than-console settings in a fairly significant respect.
The GPU load from this change, aside from VRAM, is highly variable too. In enclosed areas it's maybe 1-2 fps, but in areas like this I can see a 7+ fps hit, since it's simply drawing far more in the distance. That said, this game's performance is heavily tied to VRAM usage even when you're not technically 'at your limit', so who knows what it's doing with the extra detail - it could already be spilling over into main memory.
DLSS Performance at boot, compared to switching to DLSS Ultra Performance. Despite rendering internally at 720p rather than 1080p, DLSS Ultra Performance resolves more near/mid-range detail, simply because performing the switch resets the LOD. Even at Ultra Performance, VRAM usage still shoots up by ~800MB.
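For context on why a mode switch might matter: NVIDIA's general DLSS guidance is that texture mip LOD bias should track the internal render resolution (bias = log2(renderWidth / displayWidth)), so switching modes forces the engine to recompute that, and a buggy engine might only apply the correct value on a switch rather than at boot. A quick sketch of the numbers at 4K output - the function name is mine, and whether this game ties its geometry/streaming LOD to the same value is purely an assumption:

```python
import math

def mip_lod_bias(render_width: int, display_width: int) -> float:
    # Recommended texture mip LOD bias when upscaling:
    # negative values select sharper (higher-resolution) mips.
    return math.log2(render_width / display_width)

# 4K display is 3840 pixels wide.
print(mip_lod_bias(1920, 3840))  # DLSS Performance (1080p internal): -1.0
print(mip_lod_bias(1280, 3840))  # DLSS Ultra Performance (720p internal): ~ -1.58
```

If the game boots with a bias of 0 (or one computed for the wrong resolution) and only corrects it on a mode change, that would line up with both the sharper detail and the extra VRAM after switching, since sharper mips mean more texture data resident.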
There was nothing obvious to me from eyeballing it across runs in my test area, but that's likely just me not being good at noticing those things without a side-by-side. Clearly there is a difference, as you've shown. I need to take some comparison shots and flip between them.