They accepted blame, but they never said they were the primary developer. Considering the game has the exact same issues as Uncharted 4, with the relative performance deficits being nearly identical, I'd say it's quite likely Iron Galaxy did most of the heavy lifting.
My theory is that ND did indeed do a significant portion of the development and 'oversaw' a lot of it - but the underlying engine they worked with was basically IG's UC4 PC port code, and their work there did, uh, not scale well to this more ambitious usage of it (it didn't exactly scale well even to the warmed-over PS4 code in UC4).
The PC, at least on my 12400F, manifests a little differently - probably for the worse. You can obviously adjust your resolution/DLSS settings so you're not nearly as GPU-limited, so combat encounters play out far more smoothly on my 3060 at 1440p output with DLSS Quality/Balanced.

But holy shit, those traversal stutters. Stutters when you approach a door. Stutters for a moment after. Random stutters from missed shader compiles. Stutters with no actual framedrop, where just the animation halts (the PS5 gets these too). These also come in clusters - and that's really the problem: we're not talking about periodic 1fps drops, we're talking about 4-6 stutters in a short span, each a 5-10fps drop. Those clusters can occasionally make the engine go haywire, where my GPU will top out at 80% in a room and be stuck in the 50s, and will only get 'unstuck' by going into another room, triggering another stutter that somehow fixes it and returns GPU utilization to normal. You're constantly revisiting sections and walking through doors in this game, so Alex's comment of "a stutter every few meters" really wasn't an exaggeration. There are even some DLSS oddities (even with the LOD bias fixed), and some indications that certain post-process effects are being reconstructed when they shouldn't be, albeit relatively minor in the scope of things.
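As an aside on quantifying this: if you capture frametimes (PresentMon-style, one value per frame), the cluster pattern I'm describing is easy to flag programmatically. A minimal sketch - the thresholds are my own arbitrary picks, nothing official:

```python
# Sketch: flag stutter *clusters* (several big hitches close together)
# in a frametime trace. Thresholds here are arbitrary illustrative picks.
import statistics

def find_stutter_clusters(frametimes_ms, spike_factor=2.5, window=60):
    """A frame is a 'spike' if it exceeds spike_factor x the trace median.
    A 'cluster' is 3+ spikes within `window` frames of each other
    (~1 second of gameplay at 60fps)."""
    median = statistics.median(frametimes_ms)
    spikes = [i for i, ft in enumerate(frametimes_ms) if ft > spike_factor * median]
    clusters, current = [], []
    for i in spikes:
        if current and i - current[-1] > window:
            if len(current) >= 3:
                clusters.append((current[0], current[-1]))
            current = []
        current.append(i)
    if len(current) >= 3:
        clusters.append((current[0], current[-1]))
    return clusters

# Mostly-smooth 16.7ms trace with five ~80ms hitches bunched together,
# like the door-transition runs described above:
trace = [16.7] * 200
for i in (50, 60, 72, 85, 95):
    trace[i] = 80.0
print(find_stutter_clusters(trace))  # [(50, 95)] - one cluster, frames 50-95
```

The point of counting clusters rather than individual spikes is exactly the distinction above: one isolated long frame is barely noticeable, but 4-6 of them in a second is what wrecks a traversal section.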
<snip>
So an update on this. Discovered a couple of surprising things.
First, the issue where GPU usage would get stuck in the 80% range and not recover until another hard stutter 'reset' it - disabling hardware-accelerated GPU scheduling appears to have remedied this. I haven't seen it since through hours of play after disabling that, and it seems to have remedied a lot of the smaller frametime stutters too. A hassle that you have to disable this, but it's always been a somewhat flaky option ime.
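For anyone wanting to confirm they're hitting the same 'stuck' state before and after toggling HAGS, this is essentially the detection I was doing by eye, sketched over a per-second GPU utilization log (any logger works; the thresholds are my guesses, not anything the game exposes):

```python
# Sketch: spot the 'stuck GPU' state in a per-second utilization log.
# Purely illustrative - the 85% ceiling and 10s minimum are my own picks.

def stuck_windows(util_samples, ceiling=85, min_len=10):
    """Return (start, end) index ranges where GPU utilization stays below
    `ceiling`% for at least `min_len` consecutive samples - i.e. the game
    is neither GPU-bound nor hitting a frame cap, it's just stuck."""
    windows, start = [], None
    for i, u in enumerate(util_samples):
        if u < ceiling:
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_len:
                windows.append((start, i - 1))
            start = None
    if start is not None and len(util_samples) - start >= min_len:
        windows.append((start, len(util_samples) - 1))
    return windows

# Normal ~97%, then ~20 seconds stuck around 78%, then recovered:
log = [97] * 30 + [78] * 20 + [96] * 10
print(stuck_windows(log))  # [(30, 49)]
```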
The far more surprising thing: SSD vs. HDD. I experimented with moving the game to a HDD, since I so often see people on other forums recommend 'move it to the fastest SSD you have!' as a way to 'solve' traversal stutters - when most of us know it's rarely, if ever, about pure throughput. It's memory contention and how the asset streaming is coded; in games with this problem you'll get traversal stutters even if you install the whole thing to a RAM drive. So I assumed there would be little difference here, outside of initial loading and perhaps some texture pop-in.
What I didn't expect was an actually noticeable reduction in traversal stutters. This is running back and forth through a long corridor and several doors. This area reliably produces some prominent traversal stutters - other areas in the game are much better, but this is one area where I'm sure they weren't shader-related:
SSD, PCIe 4.0 NVMe, 7GB/sec:
HDD, 5400RPM 6TB, through the same area:
Ok no, it's still pretty bad.
But the reduction in frequency and severity of the stutters is readily apparent in gameplay (pay attention to the scales on each graph - the HDD looks to have larger peaks at first glance, but that's just because its scale tops out at 60ms, whereas the SSD graph has to scale to 100ms to fit the spikes), and that was a particularly egregious section. Replaying the entire opening level on New Game+ produced only a handful of actual frametime stutters across an hour-plus when installed to a HDD, and that's going through 30+ doors. There's still the odd shader compile that wasn't covered on a first run, and those stutters that don't actually spike frametimes but where the world just 'skips' (which the consoles have too), so it's likely never going to be perfect - but going back and forth between a HDD install and an SSD install, the HDD basically cuts them in half, if not significantly more.
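Since mismatched graph scales nearly fooled me here, numbers are safer than eyeballing peaks. A toy sketch of the kind of per-trace stats worth pulling - the traces below are made up for illustration, not my actual captures:

```python
# Sketch: compare two frametime traces numerically so mismatched y-axis
# scaling can't mislead you. These traces are invented, not my captures.
import statistics

def stutter_stats(frametimes_ms, spike_ms=33.3):
    """Summarize a trace: anything over spike_ms (~2 frames at 60fps)
    counts as a stutter spike."""
    spikes = [ft for ft in frametimes_ms if ft > spike_ms]
    return {
        "avg": round(statistics.fmean(frametimes_ms), 1),
        "p99": round(sorted(frametimes_ms)[int(len(frametimes_ms) * 0.99)], 1),
        "spike_count": len(spikes),
        "worst": max(frametimes_ms),
    }

# SSD-like trace: more spikes, and taller ones (graph scale ~100ms):
ssd = [16.7] * 990 + [100.0] * 10
# HDD-like trace: roughly half the spikes, shallower (graph scale ~60ms):
hdd = [16.7] * 994 + [55.0] * 6

print(stutter_stats(ssd))  # spike_count 10, worst 100.0
print(stutter_stats(hdd))  # spike_count 6, worst 55.0
```

On numbers like these the HDD trace wins on every column even though its graph looks spikier at a glance, which is exactly the trap the differing scales set.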
Downside? Well, as you'd expect, some texture pop-in, and some instances where the sound effects trail the action - such as, say, opening a save point for the first time right after jumping into a level, where the sound of the mechanism trails the animation. Those are rare though, and the texture pop-in especially is only visible in select instances (and I cranked the brightness to full to try to catch them). You will see them on occasion, but it's nothing like turning a corner/opening a door and watching the world fade into view; by the time you've walked around a level for a bit, most of the textures will have loaded and your VRAM will sit consistently in the 6GB range for that level. By and large, my experience has been substantially improved playing this from a middling-performance HDD, at least when targeting 60fps as the ceiling.
My theory on this is that the engine's poor asset streaming parallelization is highlighted more when it does everything it can to avoid texture pop-in: the throughput of an NVMe lets that streaming thread drown itself, whereas a HDD forces it to take smaller, more frequent sips instead of huge gulps. Or it could be a defect in my motherboard's NVMe integration, though I'd think that kind of problem would have surfaced in other games before this one.
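To be clear, the following is a cartoon of that theory, not a claim about the engine's internals - it just shows the shape of the argument: if the streamer ingests whatever the disk can deliver each frame, a fast NVMe concentrates the decompress/upload cost into a few very long frames, while a slow HDD amortizes the same total work. All the numbers are invented:

```python
# Cartoon model of the 'huge gulps vs small sips' theory. NOT how this
# engine actually works - made-up costs, just the shape of the argument.

def simulate(total_mb, per_frame_mb, cost_ms_per_mb=0.5, base_frame_ms=12.0):
    """Stream `total_mb` of assets, ingesting at most `per_frame_mb` per
    frame, with each MB costing some main-thread time to process.
    Returns (worst frametime in ms, number of frames taken)."""
    frametimes, remaining = [], total_mb
    while remaining > 0:
        chunk = min(per_frame_mb, remaining)
        frametimes.append(base_frame_ms + chunk * cost_ms_per_mb)
        remaining -= chunk
    return max(frametimes), len(frametimes)

print(simulate(200, per_frame_mb=120))  # NVMe-ish huge gulps -> (72.0, 2)
print(simulate(200, per_frame_mb=4))    # HDD-ish small sips -> (14.0, 50)
```

Both runs move the same 200MB: the NVMe-like one finishes in 2 frames but its worst frame is 72ms (a visible hitch), while the HDD-like one takes 50 frames that each stay under 15ms. A well-behaved engine would enforce a per-frame budget regardless of disk speed; the behavior I'm seeing suggests this one doesn't.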
Extended playthrough early in the game from a HDD, so you can see the extent of the stutters and any pop-in: