Christ, I'm ping-ponging back and forth on this like the hamster on V from The Boys.
Remembered NvInspector and its LOD bias override, which is the go-to for games that fuck up their mip settings for DLSS and base them off the internal res instead of the output. So I set it to -1.5 for DLSS Performance, and... boom. On a fresh game load, same quality as resetting DLSS - except this time the vram doesn't skyrocket, basically no increase at all. Game is perfectly playable with VH textures displayed in their proper full glory, no stuttering.
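For reference, if I'm remembering NVIDIA's DLSS guideline right, the textbook bias is log2(render res / output res), and my -1.5 is just that plus some extra sharpness headroom. A quick sketch of the math (function name is mine, not from any API):

```python
import math

def dlss_mip_bias(render_width: float, display_width: float) -> float:
    # NVIDIA's DLSS guideline: texture LOD/mip bias = log2(render / display).
    # A negative value makes the sampler pick sharper mips, so textures are
    # selected for the output resolution instead of the internal one.
    return math.log2(render_width / display_width)

# DLSS Performance at 4K output renders internally at 1920x1080:
print(dlss_mip_bias(1920, 3840))  # -1.0, before any extra sharpness offset
```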
So, for my 150th variant of "What I think is happening": Nixxes didn't forget the proper lod bias for DLSS, it's just not applied on game load. Resetting DLSS does fix it, but that then triggers the settings bug that has affected the game since launch, which sends the streaming system haywire. So the game can handle the VH textures just fine, displayed at their proper LOD in 4K, on 12GB cards - their vram cost at the correct lod is not the issue.
However, there are two caveats to forcing the lod bias. First, there's a performance cost, as it's rendering more detail than before (mostly in areas with vegetation) - even more than the dlss reset trick, albeit without the vram thrashing, so it's still a win. Second, while object lod and texture detail in most areas look identical to the dlss fix, one weird addition is reflectivity - it's massively increased for surfaces that employ screen space reflections, or at least for some materials. Every metal surface in the game gets buffed to a mirror finish. It's actually kind of cool in some cutscenes, as you can see so much detail of the world reflected in Clank, but I don't think it's intended since it doesn't happen with native TAA/DLAA or on the PS5. Maybe that's why this is more costly than the DLSS reset fix, which doesn't do this.
As for why this wasn't picked up before: the degree to which it's noticeable depends heavily on your starting internal res, your screen size, and whether you have a PS5 nearby as a baseline for comparison. As the game was reviewed mostly on higher-end cards using DLSS Quality or DLAA, it's very hard if not impossible to spot there, since the LOD at 1440p internal and up is going to be higher than at the 1080p internal of DLSS Performance. At DLSS Performance, even viewing single objects in the in-game model viewer can make it evident, but only if you can either flip to a PS5 on the same display or do side-by-side comparisons as I show below.
The wrong lod bias on boot does resemble the PS5's performance mode more closely in scenes with vegetation like I said, but I now think that's just a case of that mode on the PS5 often being sub-4K, plus the weakness of IGTI compared to DLSS at resolving fine detail. The PS5's 4K Fidelity mode matches the LOD of DLSS with this fix more closely than its performance mode does.
The extra shimmering in some scenes was also likely partly due to me keeping the default sharpening at 10, which helps the textures a little with the wrong lod bias at boot. Once it's fixed, though, its aggressiveness just exacerbates DLSS trying to deal with the increased detail. Cutting it in half, or turning it off, helps.
PS5 vs PC DLSS Perf (fresh boot)
PS5 vs PC DLSS Perf (DLSS reset, lod fixed)
4K Native AA vs. DLSS Perf (dlss reset, lod fixed)
DLSS Perf with DLSS Reset Fix vs. DLSS Perf with LOD Bias Fix on Fresh boot (look how shiny! Also note the vram decrease with the lod bias fix vs. the dlss reset)
Showing that it's not just distant detail that's affected:
DLSS Perf vs DLSS reset fix, single object in model viewer
Awesome work! I hope you're reporting this to Nixxes lol. I might try that -LODBias fix, if for nothing else than those shiny metal textures - that looks pretty cool!