Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

A lot of people keep pushing this even though Naughty Dog themselves said they were the lead devs on the port.
They accepted blame but they never said they were the primary developer. Considering the game has the exact same issues as Uncharted 4, almost down to the relative performance deficits being near identical, I'd say it's quite likely Iron galaxy did most of the heavy lifting.
 
They accepted blame but they never said they were the primary developer. Considering the game has the exact same issues as Uncharted 4, almost down to the relative performance deficits being near identical, I'd say it's quite likely Iron galaxy did most of the heavy lifting.

My theory is that ND did indeed do a significant portion of the development and 'oversaw' a lot of it - but the underlying engine they worked with was basically IG's UC4 PC port code, and their work there did, uh, not scale well to the more ambitious usage of it here (it didn't exactly scale well even with the warmed-over PS4 code in UC4).

The PC version, at least on my 12400F, manifests a little differently, but probably for the worse. You can obviously just adjust your resolution/DLSS settings to not be GPU-limited nearly as much, so combat encounters play out far more smoothly on my 3060 with DLSS Quality/Balanced at 1440p output res.

But holy shit, those traversal stutters. Stutters when you approach a door. Stutters for a moment after. Random stutters from missed shader compiles. Stutters where there are no actual framedrops, but the animation just halts (the PS5 gets these too). And these clusters of stutters (that's really the problem - we're not talking about periodic 1fps drops, we're talking about 4-6 stutters in a short span which are each 5-10fps drops) can occasionally make the engine go haywire: my GPU will top out at 80% utilization in a room and be stuck in the 50s, and will only get 'unstuck' by going into another room, triggering another stutter that somehow fixes it and returns GPU utilization to normal. You're constantly revisiting sections and walking through doors in this game, so Alex's comment of "a stutter every few meters" really wasn't an exaggeration. There are even some DLSS oddities (even with the LOD bias fixed), and some indications that a few post-process effects are being reconstructed where they shouldn't be, albeit relatively minor in the scope of things.
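To put rough numbers on what I mean by "clusters" rather than isolated hitches, here's the kind of thing I'd run over a frametime log. The CSV filename, the 2.5x-median spike threshold and the 2-second grouping window are all my own arbitrary picks, not anything from the game or from a particular capture tool:

```python
# Toy script for grouping frametime spikes into "clusters of stutters".
# Assumes a CSV with one per-frame time in milliseconds per line (e.g.
# exported from your capture tool of choice); thresholds are arbitrary.
import csv

SPIKE_FACTOR = 2.5        # a frame this many times the median counts as a stutter
CLUSTER_WINDOW_MS = 2000  # spikes within this span get grouped as one cluster

def load_frametimes(path):
    with open(path, newline="") as f:
        return [float(row[0]) for row in csv.reader(f) if row]

def find_clusters(frametimes):
    median = sorted(frametimes)[len(frametimes) // 2]
    threshold = median * SPIKE_FACTOR
    # Record the elapsed time at which each spike occurs.
    spike_times, t = [], 0.0
    for ft in frametimes:
        if ft > threshold:
            spike_times.append((t, ft))
        t += ft
    # Group spikes that land close together into clusters.
    clusters, current = [], []
    for time_ms, ft in spike_times:
        if current and time_ms - current[-1][0] > CLUSTER_WINDOW_MS:
            clusters.append(current)
            current = []
        current.append((time_ms, ft))
    if current:
        clusters.append(current)
    return median, threshold, clusters

if __name__ == "__main__":
    frametimes = load_frametimes("frametimes.csv")  # hypothetical log file
    median, threshold, clusters = find_clusters(frametimes)
    print(f"median {median:.1f} ms, spike threshold {threshold:.1f} ms")
    for i, c in enumerate(clusters, 1):
        worst = max(ft for _, ft in c)
        span_s = (c[-1][0] - c[0][0]) / 1000
        print(f"cluster {i}: {len(c)} spikes over {span_s:.1f}s, worst {worst:.1f} ms")
```

A single 30ms frame barely registers; four or five of them inside a couple of seconds is what makes traversal feel broken, and this makes that distinction visible in the numbers.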

<snip>

So an update on this. Discovered a couple of surprising things.

First, the issue where GPU usage would get stuck in the 80% range and not recover until another hard stutter 'reset' it - disabling hardware-accelerated GPU scheduling appears to have remedied this. I haven't seen it since across hours of play after disabling that, and it seems to have remedied a lot of the smaller frametime stutters too. It's a hassle that you have to disable this, but it's always been a somewhat flaky option ime.
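If anyone wants to confirm what state that option is actually in before and after testing, a read-only check like this works on my understanding that Windows stores it as the HwSchMode value under the GraphicsDrivers key (2 = on, 1 = off) - treat that as an assumption, and use the Settings UI to actually flip it:

```python
# Read-only check of the hardware-accelerated GPU scheduling (HAGS) state.
# Assumption: the toggle lives in the HwSchMode value under GraphicsDrivers,
# with 2 = enabled and 1 = disabled. Change the setting itself through
# Settings > Display > Graphics, not by writing to the registry here.
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

def hags_state():
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
            value, _ = winreg.QueryValueEx(key, "HwSchMode")
    except FileNotFoundError:
        return "HwSchMode not present (driver/OS may not expose HAGS)"
    return {2: "enabled", 1: "disabled"}.get(value, f"unknown value {value}")

if __name__ == "__main__":
    print("Hardware-accelerated GPU scheduling:", hags_state())
```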

The far more surprising thing - SSD vs HDD. I experimented with moving this to a HDD, because I so often see people on other forums recommend 'move to the fastest SSD you have!' as a way to 'solve' traversal stutters, when most of us know it's rarely, if ever, about pure throughput - it's memory contention and how the asset streaming is coded; you will still get traversal stutters in games with this problem even if you install the whole thing to a RAM drive. So I assumed there would be little difference here, outside of initial loading and perhaps some texture pop-in.

What I didn't expect was an actual, noticeable reduction in traversal stutters. This is running back and forth through a long corridor and through several doors. This area reliably produces some prominent traversal stutters; other areas in the game are much better, but this is one area where I'm sure these weren't shader-related:

SSD, 7GB/sec NVME PCIE4:


[Attachment: SSD frametime graph]

HDD, 5400RPM 6TB through same area:

[Attachments: HDD frametime graphs]

Ok no, it's still pretty bad. :) But the reduction in frequency and severity of the stutters is readily apparent in gameplay (pay attention to the scales in each graph - the HDD looks to have larger peaks at first glance, but that's just because its scale tops out at 60ms, whereas the SSD graph needs to scale to 100ms to fit the spikes), and that was a particularly egregious section. Replaying the entire opening level on New Game+ produced only a handful of actual frametime stutters across an hour or more when installed to a HDD, and that's going through 30+ doors. There are still the odd shaders not covered by precompilation on a first run, and those stutters that don't actually drop frametimes but make the world 'skip' (which the consoles have too), so it's likely never going to be perfect - but going back and forth between a HDD install and SSD, the HDD basically cuts them in half, if not significantly more.

Downside? Well, as you'd expect, some texture pop-in, and some instances where the sound effects will trail the action - such as, say, opening up a save point for the first time after jumping right into a level, where the sound of the mechanism will trail the animation. Those are rare though, and the texture pop-in especially is only visible in select instances (I cranked up the brightness to full to try and catch them). You will see them on occasion, but it's nothing like turning a corner/opening a door and watching the world fade into view; by the time you've walked around in a level for a bit, most of the textures will have loaded and your VRAM will sit consistently in the 6GB range for that level. By and large, my experience has been substantially improved playing this from a middling-performance HDD, at least when targeting 60fps as the ceiling.

My theory on this is that the engine's poor asset streaming parallelization is highlighted more when it does everything it can to avoid texture pop-in: the throughput of an NVMe lets that streaming thread drown itself, whereas a HDD forces it to take smaller, more frequent sips instead of huge gulps. Or it could be a defect with my motherboard's NVMe integration, though I think that kind of problem would have surfaced in other games before this.
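Purely as a toy model of that "gulps vs sips" hypothesis - emphatically not how this engine actually schedules its streaming - here's a sketch showing how the same total amount of streamed data can produce very different frametime spikes depending on whether it lands in a few big chunks or many small ones. Every number in it is made up:

```python
# Toy model of streaming contention: same total data, different chunking.
# All figures are invented for illustration; nothing here reflects the game.
TOTAL_MB = 400           # data streamed while crossing a level section (made up)
FRAMES = 600             # ~10 seconds at 60fps
BASE_FRAME_MS = 12.0     # frame cost with no streaming (made up)
STALL_MS_PER_MB = 0.15   # assumed cost the render thread eats per MB ingested

def simulate(chunk_mb, frames_between_chunks):
    """Per-frame times when streaming arrives in chunks of chunk_mb."""
    frametimes, remaining = [], TOTAL_MB
    for frame in range(FRAMES):
        cost = BASE_FRAME_MS
        if remaining > 0 and frame % frames_between_chunks == 0:
            chunk = min(chunk_mb, remaining)
            cost += chunk * STALL_MS_PER_MB  # contention hits this one frame
            remaining -= chunk
        frametimes.append(cost)
    return frametimes

def report(label, frametimes):
    spikes = [ft for ft in frametimes if ft > 25.0]  # arbitrary "visible hitch" line
    print(f"{label}: worst {max(frametimes):.1f} ms, {len(spikes)} frames over 25 ms")

if __name__ == "__main__":
    report("NVMe-style gulps (100 MB every 150 frames)", simulate(100, 150))
    report("HDD-style sips   (5 MB every 8 frames)", simulate(5, 8))
```

The gulp case delivers the same 400MB but concentrates the cost into a handful of frames that blow past the hitch threshold, while the sip case spreads it thinly enough that no single frame stands out - which is roughly the behaviour I'm seeing between the two drives.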

Extended playthrough early in the game from a HDD so you can see the extent of stutters and any pop-in:

 
They accepted blame but they never said they were the primary developer. Considering the game has the exact same issues as Uncharted 4, almost down to the relative performance deficits being near identical, I'd say it's quite likely Iron galaxy did most of the heavy lifting.
No man, we've all had this conversation already.

Naughty Dog themselves said they did this in-house:


Danny O'Dwyer asked Naughty Dog directly, and they confirmed they were doing it in-house.

And an Iron Galaxy rep said they only 'helped' on the game.

People only keep persisting with this Iron Galaxy-to-blame nonsense cuz it better fits a preconceived narrative people would prefer to believe.
 
PS5 and Xbox Series X in Performance Mode render at a resolution of 1920x1080 and use a form of temporal upsampling to reconstruct a 3840x2160 resolution.
PS5 and Xbox Series X in Quality Mode render at a resolution of 2560x1440 and use a form of temporal upsampling to reconstruct a 3840x2160 resolution.
Xbox Series S in Performance Mode renders at a resolution of 1600x900 and uses a form of temporal upsampling to reconstruct a 2560x1440 resolution.
Xbox Series S in Quality Mode renders at a resolution of 1920x1080 and uses a form of temporal upsampling to reconstruct a 2560x1440 resolution.
Cutscenes are letterboxed which results in a lower effective resolution during these scenes.
There appears to be a difference in sharpening and/or the temporal upsampling used between Performance Mode and Quality Mode, where Performance Mode can produce a sharper but noisier image. This applies to all three consoles: https://bit.ly/3G3oT8u
PS5 and Xbox Series X in Quality Mode appear to be using Lumen, which doesn't seem to be used in Performance Mode or on Xbox Series S. Quality Mode has improvements to foliage and shadow quality compared to Performance Mode on all three consoles: https://bit.ly/3ujm3d3
PS5 and Xbox Series X have improvements to foliage and shadow quality compared to Xbox Series S in both modes.
There appears to be a bug where the texture quality on Xbox Series X is worse than the other two consoles on the Desolate Island map which can be seen at 23 minutes and 7 seconds.

Looking at the performance metrics, PS5/XSX are evenly matched for the most part in Performance Mode with an almost locked 60fps. XSX holds a 2-3fps advantage in Quality Mode; however, both systems should really be capped at 30fps there, since neither can hit a stable 60fps (they're mostly in the 40s).
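For anyone who wants the raw numbers behind those resolution figures, the internal-to-output pixel ratios work out like this (just arithmetic on the modes quoted above):

```python
# Pixel counting on the internal/output resolutions quoted in the summary above.
modes = {
    "PS5/XSX Performance": ((1920, 1080), (3840, 2160)),
    "PS5/XSX Quality":     ((2560, 1440), (3840, 2160)),
    "XSS Performance":     ((1600,  900), (2560, 1440)),
    "XSS Quality":         ((1920, 1080), (2560, 1440)),
}

for name, ((iw, ih), (ow, oh)) in modes.items():
    internal, output = iw * ih, ow * oh
    print(f"{name}: {internal/1e6:.2f} MP internal -> {output/1e6:.2f} MP output, "
          f"{output/internal:.2f}x pixels reconstructed per frame")
```

So the Performance Mode on PS5/XSX is reconstructing 4x the rendered pixel count, versus 2.25x in Quality Mode, which lines up with the sharper-but-noisier observation.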
 
Under the PC and PS5/XSS/XSX labels in comparison videos, I'd like to see the cost that was paid for each hardware setup.

They can also show average cost per frame on a sterling, euro and dollar average.

edit: /s /joke #not_serious :-|
 
Oh my god, I think there was a smaller difference between Crysis on PC and the Xbox 360! That's ridiculous.
This was definitely a situation where the game and engine utilisation was wholly PC-centric with the console port being SEP.
 
Under the PC and PS5/XSS/XSX labels in comparison videos, I'd like to see the cost that was paid for each hardware setup.

🙄

You could say that about every video comparing PC max settings and consoles; the cost of high-end PC components and the value of consoles is well known by now. Should Digital Foundry be adding cost-per-frame analysis to all their comparison videos too? Make sure to add ~3 years of PS+ or Xbox Live into the equation when comparing anisotropic filtering, gotta be accurate!

As someone who is not going to pay $2500+ CAD for a GPU anytime soon myself, how games look on 4090s isn't particularly relevant to me in the here and now either - but it is still noteworthy to see what that technology can scale to if you want to invest that much. The argument against paying that price, aside from the large initial outlay, is that so few games actually take advantage of it enough to justify it, outside of Cyberpunk. We may start to see more games that can indeed show a gulf with more advanced engines, and I still want to see what that hardware is capable of.

All that being said, not really going to use the Ark games as any measuring stick for realistic optimization on any platform.

This was definitely a situation where the game and engine utilisation was wholly PC-centric with the console port being SEP.

Perhaps, but the opposite also goes for console ports like The Last of Us. If someone focuses on games like those to make value propositions, then that's just as skewed as focusing on a few select games with the development team being PC-focused.
 
Under the PC and PS5/XSS/XSX labels in comparison videos, I'd like to see the cost that was paid for each hardware setup.

They can also show average cost per frame on a sterling, euro and dollar average.
For those who missed it, the italicised text shows the poster is being sarcastic - 'cost per frame' is silly. Not to mention Sterling, Euro and Dollar are nonsensical as they are fluctuating reference points.

When someone misses your joke (humour is subjective, culturally influenced, and easily lost on the internet ¯\_(ツ)_/¯), gracefully point out you were kidding and everyone can move on.
 
For those who missed it, the italicised text shows the poster is being sarcastic - 'cost per frame' is silly. Not to mention Sterling, Euro and Dollar are nonsensical as they are fluctuating reference points.

When someone misses your joke (humour is subjective, culturally influenced, and easily lost on the internet ¯\_(ツ)_/¯), gracefully point out you were kidding and everyone can move on.
For the sake of clarity, "They can also show average cost per frame on a sterling, euro and dollar average." was not a part of the original post. That was edited in after.

Point stands.
 
For the sake of clarity, "They can also show average cost per frame on a sterling, euro and dollar average." was not a part of the original post. That was edited in after.

Point stands.
Okay, in that case it's not at all obvious it's a joke. It would have benefitted from an old-school ;) - these were invented for a damned good reason!
 
Came across this video, which I think is a well-presented explainer of common current game optimization methods that some of the non-developers here may find interesting:

When Your Game Is Bad But Your Optimisation Is Genius


In particular, I really appreciated the explanation of LOD presentation methods. I'd always wondered why some games, such as Horizon Zero Dawn, had this approach to LOD where the terrain appeared to "grow" as you approached it. Now I know why this 'terrain sinking' method is used.
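If I understood the video right - and this is just my own toy reading of the idea, not whatever Guerrilla actually ships - the trick is to bias the coarse LOD's heights downward and blend them back up as you approach, so new detail always rises out of the existing terrain instead of popping above or through it:

```python
# Toy sketch of "terrain sinking" as I understood it from the video, not any
# engine's real implementation. Coarse-LOD heights are pushed downward, and a
# vertex blends from the sunken coarse height to its true fine height as the
# camera gets closer, so detail appears to grow out of the ground.

SINK_BIAS = 2.0   # how far (metres) the coarse LOD is pushed down (made up)
LOD_NEAR = 50.0   # distance at which full detail is shown (made up)
LOD_FAR = 200.0   # distance at which only the sunken coarse LOD is shown

def lod_height(coarse_h, fine_h, distance):
    """Blend between the sunken coarse height and the true fine height."""
    sunken = min(coarse_h - SINK_BIAS, fine_h)  # never poke above the fine surface
    # t = 0.0 at LOD_NEAR (full detail), 1.0 at LOD_FAR (fully sunken coarse LOD)
    t = max(0.0, min(1.0, (distance - LOD_NEAR) / (LOD_FAR - LOD_NEAR)))
    return fine_h * (1.0 - t) + sunken * t

if __name__ == "__main__":
    # A bump that is 3 m tall in the fine mesh but flattened to 1 m in the coarse one.
    for d in (250, 150, 100, 60, 40):
        print(f"distance {d:>3} m -> rendered height {lod_height(1.0, 3.0, d):.2f} m")
```

Running it, the bump's rendered height climbs smoothly from below ground level up to its full 3 m as the distance shrinks, which is exactly that "growing terrain" look.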
 
What do people think of GTA VI's graphics? I've watched the trailer a few times and I think it looks cross-gen. It'll be interesting to hear what others think.

You're out of your mind to think that looks cross-gen. It's hard for most games to do any ONE of those things that trailer does at that level... let alone all of them with a city simulation on top of it. This game looks like it costs hundreds of millions more $$$ than other studios can afford.

The pixels may not be the cleanest pixels ever rendered... but the production quality is off the charts. There's a reason why it takes 10 years to make these games.
 
You're out of your mind to think that looks cross-gen. It's hard for most games to do any ONE of those things that trailer does at that level... let alone all of them with a city simulation on top of it. This game looks like it costs hundreds of millions more $$$ than other studios can afford.

The pixels may not be the cleanest pixels ever rendered... but the production quality is off the charts. There's a reason why it takes 10 years to make these games.
The real reason for 12 (!!!) years of development time is that "Rockstar" couldn't give a crap about what gamers actually need, because they sold 100+ million copies of a single game. Business...

Looks nice, two years later...
 
You're out of your mind to think that looks cross-gen. It's hard for most games to do any ONE of those things that trailer does at that level... let alone all of them with a city simulation on top of it. This game looks like it costs hundreds of millions more $$$ than other studios can afford.

The pixels may not be the cleanest pixels ever rendered... but the production quality is off the charts. There's a reason why it takes 10 years to make these games.
Really? I'm not talking about the scale or the production quality. Rockstar makes one-of-a-kind games. I mean, it's next-gen in the sense that GTA 5 was a PS3 game, so this is definitely a generational leap from that... However, the game started development in 2014, 6 years before the release of the PS5 and Series X. When I look at the visuals, it looks like the best of that era, just denser, with more of everything. Do you feel that it's more impressive than some of the stuff pushed out by UE5? Visually speaking...
 