Digital Foundry Article Technical Discussion [2023]

Stole my idea? šŸ˜‚ I already said previously they should have just ported TLOU Remastered. It would have been a disappointment to some, but the game would have run well with few issues

Oh you sweet summer child

But honestly... if they're smart... they'll start rewriting and fixing the deeper core issues this engine/port has, for the sake of the upcoming TLOU P2 and Factions games. It benefits them to do the dirty work now... because if TLOU P2 and Factions have anything close to the same issues as this one... they will absolutely sell like trash. Fixing TLOU P1 and making things right before they release P2 or Factions will help them in the long run.

This is my hope as well, that the (eventual) upcoming TLOU2/Factions ports provide the impetus they need to really address this at a deeper level now. I hope they take their time with it. Get the crashes fixed, maybe find a way to reduce the CPU load a bit in the meantime, but over the next few months look at an overhaul.
 
Stole my idea? šŸ˜‚ I already said previously they should have just ported TLOU Remastered. It would have been a disappointment to some, but the game would have run well with few issues

Sort of, they should have sold a cloud-only game. No additional work needed. šŸ˜Ž
 
..you guys can't be serious.. lol.

No, they need to get these birthing pains out of the way. This isn't a shitty port because PC sucks... this is a shitty port because they didn't do as good of a job as they could/should have.

I have a level of respect for Naughty Dog in the fact that, well, they claim to have done this mostly in-house. That's a big deal. So if we take things at face value, they might have bitten off a bit more than they could chew... but as the team gets experience, their products will get better. Of course I'm not making any excuses for them... this should never have released in the state that it did, and the fact that it's Naughty Dog, and one of Sony's crown jewel game franchises... it's pretty disheartening to see it put out in this state. Again, I think they pushed and made bad decisions to get the port done relatively quickly so it could be released around the same time that the show was in full swing. They clearly missed that mark, and I feel that Sony probably pressured them to get it done pronto, giving them only a few weeks' extension... ultimately leaving them in this situation. Yes, it's not an easy thing to port in the best of circumstances, but I can't help but think this all came about relatively quickly.

For all the issues the port currently has, and all the reasons we think may be causing them, we can also look at what this port does right. They incorporated a shader precompilation step, their options and menus are well done, and they've incorporated ultrawide support including cutscenes, while warning users that oddities may happen in the periphery in certain scenes. I'm about 10h in and have only noticed one small issue during a cutscene with the 21:9 aspect ratio. I suspect the warning is largely for the 32:9 support. They also support FSR2 and DLSS2. Yes, these may all seem like basic things... but we see a lot of games which don't even bother with those kinds of features and support.

Finally, the thing I always appreciate, no matter how upset I might get when these kinds of things happen... is communication from the developer. Naughty Dog has engaged with the community, accepted fault, and outlined their intentions to do right by players and this game. Another thing I appreciate is how direct and easy it is to leave feedback, as well as how they communicate the "Known Issues" they are working on.

So while it may be disappointing, there's some effort there. I hope they can turn this around. One of the best things about Sony's PC releases thus far is that they've all had relatively excellent post-release support. They could probably have just left Horizon: ZD alone after a certain point, but even many months later they were still optimizing/overhauling the shader compilation process and fixing other issues. So hopefully this title gets similar love shown to it.
 
Regarding decompression, here's a perf capture of the Oodle DLL. It seems to use ~1.5 logical cores while active, which might explain some of the 6-core CPU performance issues.
When are you testing this? Is it during the initial load, or just during streaming while playing the game? If it's the former, I don't understand why they wouldn't be going all out on the CPU during that time.
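By "going all out" I mean something like fanning archive chunks across every hardware thread during the load screen. Below is a purely hypothetical sketch with a stand-in decompress function - not a claim about how ND's loader actually works:

```cpp
// Hypothetical sketch: saturating the CPU during a load screen by decompressing
// archive chunks in parallel. decompress_chunk is a stand-in for whatever codec
// the engine really calls (Oodle, per the VTune capture).
#include <algorithm>
#include <atomic>
#include <cstddef>
#include <thread>
#include <vector>

struct Chunk {
    const std::byte* src; std::size_t srcSize;   // compressed data
    std::byte*       dst; std::size_t dstSize;   // decompressed output
};

// Placeholder: a real loader would invoke the actual decompressor here.
void decompress_chunk(const Chunk& /*c*/) {}

void decompress_all(std::vector<Chunk>& chunks)
{
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::atomic<std::size_t> next{0};
    std::vector<std::thread> pool;
    for (unsigned i = 0; i < workers; ++i)
        pool.emplace_back([&] {
            // Each worker keeps grabbing the next unclaimed chunk until none are left.
            for (std::size_t j = next++; j < chunks.size(); j = next++)
                decompress_chunk(chunks[j]);
        });
    for (auto& t : pool) t.join();
}
```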

What's interesting is I actually noticed something kinda weird. The RTSS overlay shows my overall CPU utilization at 46% during load.. whereas the in-game performance metrics show the CPU utilization at just ~2-3% at most during load. And this is actually true during gameplay as well. RTSS shows the CPU cores working hard doing something, while the game shows CPU utilization doing basically nothing. The GPU utilization matches much more closely between the two overlays in general... and so does the VRAM utilization, 9.1GB on both. So I guess the in-game CPU utilization display is only accounting for certain processes? I dunno.. weird.

20230402201422-1.jpg
 
TLOU has brought some of the best discussion to life on this forum. Really. There are a lot of different topics being covered here that often get mixed up:
1) the divergence in coding styles between console and PC architecture. We are now getting a much better perspective of how developers fully utilize console hardware. Worth discussing as a separate thread.
2) whether a port of a game has had sufficient resources to convert the game to support PC architecture (no, in this case)
3) how impactful the bugs are and how much would be resolved by fixing them.
 
When are you testing this? Is it during the initial load, or just during streaming while playing the game? If it's the former, I don't understand why they wouldn't be going all out on the CPU during that time.

What's interesting is I actually noticed something kinda weird. The RTSS overlay shows my overall CPU utilization at 46% during load.. whereas the in-game performance metrics show the CPU utilization at just ~2-3% at most during load. And this is actually true during gameplay as well. RTSS shows the CPU cores working hard doing something, while the game shows CPU utilization doing basically nothing. The GPU utilization matches much more closely between the two overlays in general... and so does the VRAM utilization, 9.1GB on both. So I guess the in-game CPU utilization display is only accounting for certain processes? I dunno.. weird.

20230402201422-1.jpg
Keep investigating! It's this process that is making the difference, not random speculation!
 
What's interesting is I actually noticed something kinda weird. The RTSS overlay shows my overall CPU utilization at 46% during load.. whereas the in-game performance metrics show the CPU utilization at just ~2-3% at most during load.

Windows 11 made some fundamental changes to how the CPU performance counters can be captured, and it broke a lot of monitoring tools - Rivatuner was updated, but even stuff like the Xbox Game Bar overlay still hasn't been; it will show you ~5% CPU for any game. My guess is ND never updated their code to reflect the change.
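For anyone curious, here's a minimal sketch of one counter-independent way to read overall CPU load on Windows: plain GetSystemTimes deltas rather than the perf-counter plumbing overlays typically go through. It's just an illustration of the kind of calculation involved, not what the game or RTSS actually does internally:

```cpp
// Minimal sketch of measuring overall CPU utilization on Windows with
// GetSystemTimes. Kernel time includes idle time, so busy = total - idle.
#include <windows.h>
#include <cstdint>
#include <cstdio>

static uint64_t to_u64(FILETIME ft)
{
    return (uint64_t(ft.dwHighDateTime) << 32) | ft.dwLowDateTime;
}

int main()
{
    FILETIME idle0, kern0, user0, idle1, kern1, user1;
    GetSystemTimes(&idle0, &kern0, &user0);
    Sleep(1000);                                  // sample over one second
    GetSystemTimes(&idle1, &kern1, &user1);

    const uint64_t idle  = to_u64(idle1) - to_u64(idle0);
    const uint64_t total = (to_u64(kern1) - to_u64(kern0)) + (to_u64(user1) - to_u64(user0));
    printf("CPU: %.1f%%\n", total ? 100.0 * (total - idle) / total : 0.0);
}
```

A tool built on a counter whose semantics changed between Windows versions could easily show the kind of mismatch described above while something like this stays sane.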
 
Keep investigating! It's this process that is making the difference, not random speculation!
Yea, it's strange. Something weird is going on... We all assume that the game is completely using all these cores... but the loading screen being so long, and, now that I think about it, the shader compilation process being abnormally long, might point to the cores not being utilized like we think they are. I'm almost tempted to delete the shader cache, start it up again with the in-game metrics displayed, and see if the CPU utilization is high.
 
Windows 11 made some fundamental changes to how the CPU performance counters can be captured, and it broke a lot of monitoring tools - Rivatuner was updated, but even stuff like the Xbox Game Bar overlay still hasn't been; it will show you ~5% CPU for any game. My guess is ND never updated their code to reflect the change.
Well in that case I wonder what Windows 10 users are reporting. If we have anyone here who could test it out as I have, that would settle that very quickly.
 
So yea, I quickly tested during the load screen, and Windows Task Manager shows similar usage to RTSS, so the cores are definitely being taxed as it shows.

In the name of science I deleted my shader cache and looked to see if the in-game metric changed during that process... and it went up to... 6% lol.

test3.png


So yea, I guess it's just bugged, as Flappy Pannus says. Oh well, nice to know. The game still has weird, unexplainable behavior... and we'll probably never understand it haha.
 
When are you testing this? Is it during the initial load, or just during streaming while playing the game? If it's the former, I don't understand why they wouldn't be going all out on the CPU during that time.

What's interesting is I actually noticed something kinda weird. The RTSS overlay shows my overall CPU utilization at 46% during load.. whereas the in-game performance metrics show the CPU utilization at just ~2-3% at most during load. And this is actually true during gameplay as well. RTSS shows the CPU cores working hard doing something, while the game shows CPU utilization doing basically nothing. The GPU utilization matches much more closely between the two overlays in general... and so does the VRAM utilization, 9.1GB on both. So I guess the in-game CPU utilization display is only accounting for certain processes? I dunno.. weird.

20230402201422-1.jpg
Just running around in game. I used my then current savegame and ran back and forth around a small area.

Regarding CPU perf counters, something has changed on Windows recently. My RTSS also shows numbers that are completely out of whack compared with Task Manager so I'd expect the game is just using those same (wrong) counters. VTune doesn't rely on any of those counters though so I think my results should be accurate.
 
My RTSS also shows numbers that are completely out of whack compared with Task Manager so I'd expect the game is just using those same (wrong) counters. VTune doesn't rely on any of those counters though so I think my results should be accurate.

You need to update RTSS to the latest beta.
 
So I'm partway into the DF video where they're comparing textures, and Alex said something which left me scratching my head. He says, "I think PS5 is using High quality textures." He then follows with, "I can't tell the difference between High and Ultra textures." So if you can't tell the difference between the two, how can you even make an educated guess as to which setting is which? Especially since we can't use performance metrics as a reliable indicator because the port is broken. A really head-scratching statement.
 
Seems pretty innocuous. It's at least high, maybe ultra, but the visual difference between the two is not easy to spot.
It's innocuous, but again, it's completely unnecessary, and there are more statements like this in the video. If you don't know, it's OK to say you don't know.

Anyway, I'm still watching the video and, as I suspected, the memory management issues appear to stem from the fact that they might not have rewritten memory management for PC. Also, decompressing on PS5 is almost free due to the dedicated hardware. Rich noticed the CPU usage as it attempts to decompress things in the background. I don't know that it will be fixed; it will be a lot of work to fix. Until DirectStorage is standard on PC, it's going to be a problem.

Finally, Alex keeps using Spider-Man as an example while forgetting that Nixxes did a ton of work to bypass the issues raised by the inherent architectural differences. Also, Spider-Man pales in asset variety. Since Sony bought the best PC porting studio, I expect things to get worse before they get better.
 
Final comments on the video. It highlighted only 2 of the many concerns with the port: memory management and CPU utilization. The video didn't need to be an hour long; 30 minutes would have sufficed.

The conclusion is a bit funny. One of the trio says that a PS5-spec PC should achieve adequate performance, but when you think about the statement, it doesn't make any sense. The PC Alex used is a PS5-spec PC, and yet it would slap the PS5 when it comes to ray tracing due to superior dedicated ray tracing hardware. Vice versa, a PS5 would slap an equivalent PC when on-the-fly decompression of assets is in play, due to superior dedicated hardware.

Another one of the trio says that Naughty Dog should fix the CPU issues. My question is exactly how they propose to do that. If on-the-fly decompression is a requirement to keep RAM utilization in check, how do you speed up decompression without a significant CPU penalty? Again, the PS5 has dedicated hardware for this, so it's almost free. I suppose they could use GPU decompression in an attempt to speed it up, but there will be a performance penalty?
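For context on what GPU decompression would even look like on the PC side, here's a rough sketch of a DirectStorage 1.1 read request using GDEFLATE, loosely based on Microsoft's public samples. The file path, offsets, and sizes are placeholders, and it says nothing about what Naughty Dog's engine actually does:

```cpp
// Rough sketch of a DirectStorage 1.1 read with GPU decompression (GDEFLATE),
// loosely based on Microsoft's public samples. Requires the DirectStorage SDK
// (dstorage.h / dstorage.lib) and a D3D12 device; error handling omitted.
#include <d3d12.h>
#include <dstorage.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void StreamChunk(ID3D12Device* device, ID3D12Resource* destBuffer)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"assets/chunk_000.bin", IID_PPV_ARGS(&file));   // placeholder path

    DSTORAGE_QUEUE_DESC queueDesc = {};
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.Device     = device;

    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    DSTORAGE_REQUEST request = {};
    request.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE; // decompressed on the GPU
    request.Source.File.Source          = file.Get();
    request.Source.File.Offset          = 0;
    request.Source.File.Size            = 4 * 1024 * 1024;     // compressed size on disk (placeholder)
    request.Destination.Buffer.Resource = destBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = 16 * 1024 * 1024;    // uncompressed size (placeholder)
    request.UncompressedSize            = 16 * 1024 * 1024;

    queue->EnqueueRequest(&request);
    queue->Submit();   // the decompression work lands on the GPU, not on game threads
    // A real loader would also enqueue a status block or fence and wait on it
    // before using destBuffer.
}
```

The trade-off raised above still applies: the decompression cost moves onto the GPU, so it isn't free either, it just stops competing with game threads for CPU time.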

Honestly, I think what will happen is that they'll try to fix the easy-win bugs. Maybe optimize where they can, but I don't know that this will be a complete turnaround. Does anyone know if all of Uncharted's issues were sorted? Finally, I think the most disappointing thing about this port is Naughty Dog's lack of attention to detail. Then again, I don't know if this is due to their team expanding, but ever since the Uncharted collection released on PS5, it has been evident that they had started to slip up.

Finally, I must say, I do not believe devs should care about or cater to the concerns of 8GB GPU holders. It's been a good 7 years, but it's time to move on. Devs should not shackle themselves to old hardware. For a long time, we read lots of comments about how old consoles were shackling PCs. Now consoles have stepped up the requirements and we're met with incessant whining. In the '90s and early 2000s, people would have upgraded without blinking an eye. Frankly, I blame Nvidia for this, because they create their products with planned obsolescence. Three generations of GPUs with 8GB as the primary VRAM buffer? Ridiculous. All so they can protect their 67% gross margin. A margin that makes the likes of Apple look saintly.
 
The fact of the matter is, the bulk of PC gamers - and even then I'm completely excising older systems - are going to have ~8GB GPUs and 6-core CPUs. Naughty Dog knows this. Sony knows this. As such, you have to target your project, no matter how tightly tied it may be to the PS5's architecture, to at least run acceptably on this target spec if you plan to do a PC port at all.
I sort of hard disagree here - this year we will finally be freed from the cross-gen shackles of the PS4. Asking Sony's first-party devs to plan ahead for PCs with (much less) CPU or GPU power would essentially put the shackles back on. Wider ones perhaps, but shackles nonetheless.

That doesn't mean it runs exactly like the PS5 version on such a system - Spiderman doesn't, it requires more CPU grunt than an equivalent CPU in the PS5 to deliver the exact same RT performance - yet Alex considers it an excellent port. The problem with TLOU:RM isn't that a 2070 Super isn't matching the PS5 1:1 or the 5600X doesn't exhibit some dropped frames, it's that it's wholly inadequate ..
It's a bad port because it's a bad fit to the hardware that the majority of your target market shares, full stop.
No! No full stop at all. If this mindset takes hold, no PS5 game will ever target the console properly from now on!! What the hell?
I absolutely don't care if the game runs well on someone's quad-core / GTX 1060. Neither should the devs.
That's ridiculous to even ask for.
PC-only titles can try and do that.
But PS5 first-party titles should, can, and will demand hardware that hovers roughly around PS5 specs. And HERE is the full stop! Nowhere else!

The fault may certainly have little to do with the developers, their hands may have been tied in trying to hit this schedule which is clearly focused on the TV series and they may have communicated this difficulty up the chain. But it's a bad port nonetheless. It's a product that's being sold for a hefty price, and runs very poorly on a huge swath of the userbase it's being sold to.
I can agree with you that communication is important; they should let people clearly know whether they can expect PS5 quality on their system.
If the system requirements chart now says something false, they should alter it.
Or maybe they already know they missed their target and are working on a performance patch right now.
Happens all the time.
 
In summary: very very long loading times, CPU limitations, very long shader compilation time, VRAM mismanagement and visual bugs .. it's the total package!

Why would a VRAM-heavy title waste VRAM like this? On 8GB GPUs the game wastes 1.6GB on imaginary "system reservation stuff"; on 24GB GPUs the game wastes 5GB! What the actual hell? Who thinks like that? If I didn't know better I would say the game is intentionally sabotaging VRAM! 20% of VRAM is thrown in the garbage for absolutely no reason!
It doesn't actually do that; the game will happily use whatever free VRAM you have. The stutters they see happen even within the boundaries of the threshold, so they were just seeing regular asset-streaming stutters that will happen on limited budgets whenever you're close to the card's limits. Their video has hitches here and there even with textures set to Medium and within the threshold...
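For reference, a game doesn't have to guess at a fixed "OS reservation" at all; it can ask Windows for its actual usable budget. A minimal sketch using the DXGI video-memory query, purely illustrative and not a claim about what this game does:

```cpp
// Minimal sketch: ask DXGI how much VRAM the OS will actually let this process use.
// The Budget field already reflects what the desktop and other apps are holding.
// Link with dxgi.lib.
#include <windows.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    factory->EnumAdapters1(0, &adapter);          // adapter 0: usually the primary GPU

    ComPtr<IDXGIAdapter3> adapter3;
    adapter.As(&adapter3);                        // IDXGIAdapter3 exposes the memory query

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    printf("VRAM budget: %.2f GB, currently used by this process: %.2f GB\n",
           info.Budget / 1e9, info.CurrentUsage / 1e9);
}
```

A budget derived this way scales with the card instead of throwing away a fixed slice, which is what makes the flat "1.6GB/5GB reserved" numbers in the quote look so odd.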

Native 1440p
Game application VRAM meter: 7671 MB
Total: 9277 MB
In-game usage: 7400-7500 MB

Played for 12 hrs with game application VRAM at 7600 MB. Only crashed 3 times. The stutters you see in the video happen even more infrequently if I'm not recording (due to recording, VRAM usage dropped a bit), and I'd say that's fair considering how unstable the game is in general. They're not really immersion-breaking or as severe as shown in the analysis video, and a frame cap really gets rid of them most of the time. The PS5 running 30% faster than equivalent hardware is in effect here, making the 3070 barely hit 55-60 FPS at native 1440p. Not much I can do about that. It sucks. What I question here is that stutters happen even if you're within the boundaries. But crashes are so rare I could see them happening regardless (as they crashed too).

1440p DLSS Q
Game application VRAM meter: 7233 MB
Total: 8839 MB
In-game usage: 7332 MB

Aside from rare stutters in transition scenes, stutters only happen when you open the doors to a new scene, which also happens in their video; in their video, the game also produced a huge stutter when the car explosion happened. Even that does not happen on my end (neither at 1440p nor 1440p DLSS Q). I specifically get a repeatable stutter when I move past the first soldier group and when the jeep arrives, and they're really microstutters. In their video, total VRAM usage is around 7-7.1 GB and dedicated game VRAM usage is around 6.2-6.4 GB (since they set the game to Medium textures to fit within the 8 GB boundary, based on the game's claim of 1.6 GB OS usage). Most stutters I experience, they also experience with 6.4 GB usage. So I don't even think the stutters I get have anything to do with the threshold.

These transitional stutters are preferable to N64 textures. I'd still like to see them give us a better texture quality option, though.
 