Digital Foundry Article Technical Discussion [2023]

Glad they're revisiting it.


Great revisit, Alex! I just recently double dipped on this title, and boy, the initial install was a pain in my ass! During the shader-compiling process, I had several crashes due to a memory leak reported on Steam surrounding Oodle's compression DLL. After replacing the dll (oo2core_9_win64.dll) with a prior Oodle version, not only did it work, it also improved the performance of compiling the shaders (20 minutes in all). You're absolutely right about the shader-compilation stutters in TLoU; however, I did mitigate some of the stuttering (not all!), which led to a better experience. I simply locked my framerate (VSYNC, 60fps), then created a registry file (.reg) to regulate CPU priority/processes specifically for TLoU, and lastly locked all my CPU cores to a single frequency (4GHz). By doing so, my framerates aren't bouncing around so much from shader stutter during scene transitions or new areas. Great work once again!
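For anyone wanting to replicate the CPU priority part of that, here's a minimal sketch of what the registry tweak amounts to (a rough equivalent, not the exact .reg file; the executable name is a guess, so adjust it to whatever the game actually ships as, and since it writes to HKLM you need an elevated prompt - tweak at your own risk):

```python
# Rough Python equivalent of the .reg tweak (Windows only, run elevated).
# Assumes the game's executable is "tlou-i.exe" -- check your install folder.
# Sets a per-process CPU priority via Image File Execution Options\PerfOptions.
import winreg

EXE = "tlou-i.exe"  # hypothetical name; substitute the real executable
KEY = (r"SOFTWARE\Microsoft\Windows NT\CurrentVersion"
       r"\Image File Execution Options\{}\PerfOptions".format(EXE))

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY) as key:
    # CpuPriorityClass 3 = High; Windows applies it every time the exe launches.
    winreg.SetValueEx(key, "CpuPriorityClass", 0, winreg.REG_DWORD, 3)
print("Priority tweak written for", EXE)
```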

I'm curious about one thing though: what is it about the Naughty Dog engine that makes the PS5 perform so remarkably well relative to PCs? We don't see that with other Sony first-party games such as Horizon, God of War, or Returnal. Generally, the PS5 tends to perform better than equivalent PC parts, which is a given, but in TLOUI and Uncharted 4 it performs like a 2080 Ti, which is kind of insane.
Probably poor GPU utilization due to suboptimal resource and barrier management. Lower-than-expected power draw has become more commonplace in these low-level API titles.
 
You've misinterpreted your own data reference. The dark shaded portion of the bar represents the average, not the absolute lows.
Please explain the correct interpretation. The image for PS5 shows 71 fps, I assume an average - it's not qualified. The graph for the 3080 shows an 83 fps average; 68 is the 1% low. The 71 fps in the PS5 image isn't matched to any value on the 3080 graph, so what does it represent?
 
Please explain the correct interpretation. The image for PS5 shows 71 fps, I assume an average - it's not qualified. The graph for the 3080 shows an 83 fps average; 68 is the 1% low. The 71 fps in the PS5 image isn't matched to any value on the 3080 graph, so what does it represent?
It isn't the average. It's a cherry-picked moment displaying the highest difference.


TLOUI 2070S vs PS5.PNG

Alex said that it's "nearly 50% faster in similar scenes". A quick glance at the short clip shows the PS5 being anywhere from 30% to 50% faster than the 2070S in that scene. A 2080 Ti will likely give you similar results. No need for a 3080.
 
Please explain the correct interpretation. The image for PS5 shows 71 fps, I assume an average - it's not qualified.

I'm afraid that if I tell you in particular you're wrong, I'll be banned for whatever frivolous reason. But I'll risk it - you're wrong. This isn't a representation of the PS5 average; it is a screen capture from the recent DF video we are discussing. It has nothing to do with the PS5 average, and I challenge you to identify any point where I said it did. You can see in my original post that I am referring to performance compared to the 2070S, as DF has done in the video.

The graph for the 3080 shows an 83 fps average; 68 is the 1% low. The 71 fps in the PS5 image isn't matched to any value on the 3080 graph, so what does it represent?

Again, this is a screen grab from the DF video, as narrated by Dictator. Even he says the PS5 GPU "absolutely blasts the 2070 Super into oblivion". Is he a troll too, or does this special treatment only apply to me? As it relates to PS5 performance, I can tell you anecdotally, as someone who's played the game for hours on end, that in performance mode at 1440p the PS5 averages stay in the 80s during most gameplay and can get into the 90s. Either believe it or don't. I never drew a comparison between that particular frame grab and the 3080.
 
Again, this is a screen grab from the DF video, as narrated by Dictator. Even he says the PS5 GPU "absolutely blasts the 2070 Super into oblivion". Is he a troll too, or does this special treatment only apply to me?
The 3080 is around 70% faster than the 2070S whereas Alex said that the PS5 is "nearly" 50% faster than the 2070S there. Where are you getting 3080+ level performance from? That's why you're being called out.
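Just to put rough numbers on it, using only the figures quoted in this thread (not a benchmark, purely the arithmetic):

```python
# Relative-performance arithmetic from the numbers quoted above.
ps5_vs_2070s = 1.50      # "nearly 50% faster" per the DF video
rtx3080_vs_2070s = 1.70  # ~70% faster, the 3080-vs-2070S gap claimed above

ps5_vs_3080 = ps5_vs_2070s / rtx3080_vs_2070s
print(f"PS5 ~= {ps5_vs_3080:.0%} of a 3080 on these figures")  # ~88%, i.e. 2080 Ti territory
```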
As it relates to PS5 performance, I can tell you anecdotally, as someone who's played the game for hours on end, that in performance mode at 1440p the PS5 averages stay in the 80s during most gameplay and can get into the 90s. Either believe it or don't. I never drew a comparison between that particular frame grab and the 3080.
Yeah, so you had an fps counter in your head and could tell it was in the 80s. Sure.
 
After replacing the dll (oo2core_9_win64.dll) with a prior Oodle version, not only did it work, it also improved the performance of compiling the shaders (20 minutes in all).

"At launch, this was around 41 minutes on a Ryzen 5 3600. It's 25 minutes now for the core game, with an additional four minutes for the Left Behind DLC"
In the 8 bit era with 1400 baud analogue tape loads, a new game could take 5 to 10 minutes to load before you could play it. A future of faster loads was sorely wanted. Floppy drives had a small impact, but HDDs greatly reduced load times. And carts provided effectively no load times with many kilobytes per second.

And now, decades later, with SSD storage and GDDR rates and processing power that weren't even imaginable back then, we have...longer load times than anyone could ever have foreseen. :runaway::ROFLMAO:

Doesn't stop here. The number of times I've wanted to play Apex Legends, only to face a 40-minute download and install... Someone needs to invent a time machine, head back in time, and tell them of the travesties ahead so they can be averted!!
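For what it's worth, the tape-era numbers above roughly check out - a quick back-of-envelope, assuming a ~48K program and treating 1400 baud as roughly 1400 payload bits per second (a simplification):

```python
# Back-of-envelope load time for a ~48 KB tape program at ~1400 baud.
program_bytes = 48 * 1024
bits_per_second = 1400  # ignoring leaders, headers and retries

seconds = program_bytes * 8 / bits_per_second
print(f"~{seconds / 60:.1f} minutes")  # roughly 4.7 minutes before any reloads
```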
 
Digital Foundry's review says it's an almost locked 60fps.

Almost is not the same as locked.

VGCharts' fps stats also show a minimum of 53fps for the performance mode.

So the PS5 is not able to play the game at native 1440p while maintaining a completely and utterly locked 60fps.

Something the 3080 can do with ease while offering higher quality settings.
 
I'm afraid that if I tell you in particular you're wrong, I'll be banned for whatever frivolous reason. But I'll risk it - you're wrong. This isn't a representation of the PS5 average; it is a screen capture from the recent DF video we are discussing. It has nothing to do with the PS5 average, and I challenge you to identify any point where I said it did. You can see in my original post that I am referring to performance compared to the 2070S, as DF has done in the video.
Stop being so dramatic and explain your point properly. I assumed it was an average metric because of the comparison with the 3080. As a random sample, it's kinda useless.
Again, this is a screen grab from the DF video, as narrated by Dictator. Even he says the PS5 GPU "absolutely blasts the 2070 Super into oblivion".
We acknowledge it's faster than a 2070. The comparison concerns your assertion that "PS5 is performing at 3080+ level." To counter your point, a graph of the 3080's performance was shown, which shows it 'never' dropping below 68 fps and averaging higher than your PS5 example framerate. You've said that's misinterpreted, and as such doesn't disprove your statement that the PS5 performs at 3080+ levels.

Can you please explain civilly how the graph reflects on your assertion that the PS5 performs at 3080 levels? Does it support your ideas, or contradict them?
 
Or I have a VRR-capable display with a frame counter, not unlike Dictator in this very video. Respectfully, apologize, as this was uncalled for, or I'll have no choice but to report you.
Or you could put in some effort and simply link to a test instead of your dubious anecdote.


This is in the docks, which is one of the lightest sections in the game. Go to areas with a lot of foliage and watch your performance drop off a cliff. The benchmarks @davis.anthony provided were done by Techspot, who says:

For testing we're using The Woods section of the game, as this appeared to be one of the more graphically demanding areas with a lot of foliage.

Again, everyone here acknowledges that the PS5 performs way better than equivalent PC parts. Simply not to the level of a 3080, more like a 2080 Ti.
 
In the 8 bit era with 1400 baud analogue tape loads, a new game could take 5 to 10 minutes to load before you could play it. A future of faster loads was sorely wanted. Floppy drives had a small impact, but HDDs greatly reduced load times. And carts provided effectively no load times with many kilobytes per second.

And now, decades later, with SSD storage and GDDR rates and processing power that weren't even imaginable back then, we have...longer load times than anyone could ever have foreseen. :runaway::ROFLMAO:

Doesn't stop here. The number of times I've wanted to play Apex Legends, only to face a 40-minute download and install... Someone needs to invent a time machine, head back in time, and tell them of the travesties ahead so they can be averted!!

Weren't MegaTexturing and greater compression techniques like RAD/Oodle SUPPOSED to be the saviors of gaming? One for reducing the need for so many repeating textures (and compilation), and the other for greater data compression and rapid decompression.
 
Or you could put in some effort and simply link to a test instead of your dubious anecdote.


This is in the docks, which is one of the lightest sections in the game. Go to areas with a lot of foliage and watch your performance drop off a cliff. The benchmarks @davis.anthony provided were done by Techspot, who says:

For testing we're using The Woods section of the game, as this appeared to be one of the more graphically demanding areas with a lot of foliage.

Surprised to see that even when looking at the sky, the PS5 struggles to break 100fps.
 
Or you could put in some effort and simply link to a test instead of your dubious anecdote.


This is in the docks, which is one of the lightest sections in the game. Go to areas with a lot of foliage and watch your performance drop off a cliff. The benchmarks @davis.anthony provided were done by Techspot, who says:

For testing we're using The Woods section of the game, as this appeared to be one of the more graphically demanding areas with a lot of foliage.
My friend with his 6700 XT and I with my 3070 did direct comparison tests at the "high" preset in that specific location (the docks). We both confirmed that the PS5 and the 6700 XT perform very close, though the PS5 is usually a bit faster; the 3070 is roughly on par with the 6700 XT, as expected.

It's still far from a 3080, as the 3080 itself is 30-40% faster than the 3070, but the game's GPU performance is truly horrendous. I hope it does not become a trend for future titles. Death Stranding was another title where the PS5 came dangerously close to a 3060 Ti and the like.
 
From everything I've seen, the 3070 Ti is the closest thing to PS5 performance in this and Uncharted 4. IIRC the lowest framerate point DF was able to find was the foggy/spore-filled hallway where you encounter one of the enemies whose head looks like a facehugger. Maybe someone can provide some data from their PC at that point.
 
In the 8 bit era with 1400 baud analogue tape loads, a new game could take 5 to 10 minutes to load before you could play it. A future of faster loads was sorely wanted. Floppy drives had a small impact, but HDDs greatly reduced load times. And carts provided effectively no load times with many kilobytes per second.
Before floppy drives, you may recall that loading from tape on computers like the ZX Spectrum and Commodore 64 went through an epic "turbo loader" phase. What this did was take the idle CPU time during the loading of uncompressed data and replace it with decompressing compressed data in real time during the load.

Why am I telling you this? I just like the stripey screen colours during 8-bit computer tape loads in the 1980s. :yes:
 
My friend with his 6700 XT and I with my 3070 did direct comparison tests at the "high" preset in that specific location (the docks). We both confirmed that the PS5 and the 6700 XT perform very close, though the PS5 is usually a bit faster; the 3070 is roughly on par with the 6700 XT, as expected.

It's still far from a 3080, as the 3080 itself is 30-40% faster than the 3070, but the game's GPU performance is truly horrendous. I hope it does not become a trend for future titles. Death Stranding was another title where the PS5 came dangerously close to a 3060 Ti and the like.
2080 / 2080 Super / 3060 Ti-like performance from a PS5 is understandable; those GPUs aren't overspecced compared to it. A 2080 Ti, though, trounces it badly in every metric, and unless something is terribly wrong, the PS5 shouldn't match it. But it does in this game and in Uncharted 4, so I assume it's something to do with the engine, perhaps?
 
My friend with his 6700 XT and I with my 3070 did direct comparison tests at the "high" preset in that specific location (the docks). We both confirmed that the PS5 and the 6700 XT perform very close, though the PS5 is usually a bit faster; the 3070 is roughly on par with the 6700 XT, as expected.

It's still far from a 3080, as the 3080 itself is 30-40% faster than the 3070, but the game's GPU performance is truly horrendous. I hope it does not become a trend for future titles. Death Stranding was another title where the PS5 came dangerously close to a 3060 Ti and the like.

I do wonder how much of the old PS3 (Cell) code has been carried over to the more modern version. Yes, from a graphical standpoint the engine looks more modern and well beyond the original PS3 title, but I have this weird feeling that when it comes to threading (or thread ordering, mostly), some old PS3 code is hampering the game's overall performance, more specifically in the PC edition. Cyberpunk 2077 on PC is far more graphically advanced and feature-rich, and performs WAY better than TLoU in its current form.
 
Please explain the correct interpretation. The image for PS5 shows 71 fps, I assume an average - it's not qualified. The graph for the 3080 shows an 83 fps average; 68 is the 1% low. The 71 fps in the PS5 image isn't matched to any value on the 3080 graph, so what does it represent?

With what CPU?
 
Glad they're revisiting it.


Agreed, very good to see the occasional update video. Frankly, I would have waited another month considering the state of it, but hey, 7 weeks is already a hell of a generous buffer before you 'officially' review a title. Like many have said, if this title is going to be fixed to the degree it needs to be, it's a months-long process. I've always had little hope of substantial improvements on the CPU/GPU side, and this doesn't exactly dissuade that pessimism.

Also very frustrating to see new shader-compilation stutter, as well as DLSS affecting shadow quality - an issue in Uncharted as well that was never addressed.

As for the whole kerfuffle over "you need a 3080 to beat a PS5 in TLOU", one of the problems with measuring the game is the wild swings in GPU usage; you have to keep the location in mind. Walking in the open during the day is far less of a strain on the game than (oddly enough) being indoors with your flashlight on, or outside in the rain. There are some pretty significant variances here, so a benchmark run done in one location really won't give you an accurate picture of how the game performs as a whole, or even in 30-minute increments.

That aside, DirectStorage could definitely help with these area-transition bottlenecks, but the PS5's I/O has nothing to do with the massive variance in GPU-limited scenes; that's just silly. You don't get a massive boost in an alpha-heavy explosion scene because you have assisted texture decompression, wtf?
 