Digital Foundry Article Technical Discussion [2023]

Asking Sony's first-party devs to plan ahead for PCs with (much less) CPU or GPU power would essentially put the shackles back on. Wider ones perhaps, but shackles nonetheless.

That's not how it works; having to create an extra, lower tier of assets will not put any shackles on PS5 at all.

And if developers want to make money from their PC ports then they absolutely need to ensure the ports are good, or they lose out.

Steam has already issued loads of refunds to those who bought TLOU on PC, meaning the developers have not only lost out on revenue, but their image as a developer has also taken a blow.
 
I could have put in A Plague Tale - I even downloaded it to perhaps make that comparison. But in the heat of the moment of the recording, I kept thinking about how Nixxes did a lot better, so I said that. But yeah, A Plague Tale is a great comparison point for texture quality and VRAM usage while being visually and design-wise extremely similar to TLOU Pt 1.
 
The comparison with Spider-Man was alright. Even on the medium texture setting it looks very good and only needs 5.2GB at 1440p. And this game has to render hundreds of dynamic objects every frame...

I sort of hard disagree here - this year we will finally be freed from the cross-gen shackles of PS4. Asking Sony's first-party devs to plan ahead for PCs with (much less) CPU or GPU power would essentially put the shackles back on. Wider ones perhaps, but shackles nonetheless.
Nobody asked them to port this game to the PC.
By this argument, ND should start using ray tracing (or go straight to path tracing), and then let's see how PS5 gamers react when their new game runs at 720p and 30 FPS...
 
I could have put in A Plague Tale - I even downloaded it to perhaps make that comparison. But in the heat of the moment of the recording, I kept thinking about how Nixxes did a lot better, so I said that. But yeah, A Plague Tale is a great comparison point for texture quality and VRAM usage while being visually and design-wise extremely similar to TLOU Pt 1.

It's crazy to think this game uses ~6GB of VRAM at 4K with this level of graphics.
 

[Attachments: three A Plague Tale Requiem screenshots]
Another one of the trio says that Naughty Dog should fix the CPU issues. My question is: how exactly do they propose they do that? If on-the-fly decompression is a requirement to keep RAM utilization in check, how do you speed up decompression without a significant CPU penalty? Again, the PS5 has dedicated hardware for this, so it's almost free. I suppose they could use GPU decompression in an attempt to speed it up, but there would be a performance penalty?
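
To put a number on that trade-off, here's a minimal sketch - zlib standing in for Kraken, since Oodle's SDK is commercial, and none of this is ND's actual code - of what software inflation of streamed chunks costs in CPU time:

    // Minimal sketch of streamed-asset decompression on worker threads.
    // zlib stands in for Kraken; the point is only that every chunk
    // inflated in software costs CPU time the game itself no longer gets,
    // whereas PS5 spends this work in a dedicated hardware unit.
    // Build: g++ -O2 demo.cpp -lz -pthread
    #include <zlib.h>
    #include <chrono>
    #include <cstdio>
    #include <thread>
    #include <vector>

    int main() {
        // Fake "asset": 64 MB of mildly compressible data.
        std::vector<unsigned char> raw(64u << 20);
        for (size_t i = 0; i < raw.size(); ++i)
            raw[i] = (unsigned char)(i % 251);

        // Packaging step: compress once, fast setting.
        uLongf compLen = compressBound((uLong)raw.size());
        std::vector<unsigned char> comp(compLen);
        compress2(comp.data(), &compLen, raw.data(), (uLong)raw.size(),
                  Z_BEST_SPEED);

        // "Streaming" step: inflate on every core at once and time it.
        unsigned workers = std::thread::hardware_concurrency();
        if (workers == 0) workers = 4;
        auto t0 = std::chrono::steady_clock::now();
        std::vector<std::thread> pool;
        for (unsigned w = 0; w < workers; ++w)
            pool.emplace_back([&] {
                std::vector<unsigned char> out(raw.size());
                uLongf outLen = (uLongf)out.size();
                uncompress(out.data(), &outLen, comp.data(), compLen);
            });
        for (auto& t : pool) t.join();
        double s = std::chrono::duration<double>(
                       std::chrono::steady_clock::now() - t0).count();
        printf("%u threads inflated %.0f MB total in %.2fs (%.0f MB/s)\n",
               workers, workers * 64.0, s, workers * 64.0 / s);
    }

Saturating every core like this is exactly what a streamer must not do mid-gameplay, which is why the decompression budget ends up competing with simulation and render threads.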

He references this and talks about how PC textures are usually packed in a different format that's far more CPU-friendly, instead of natively using the PS5's compression. Fixing this, assuming there is a far more CPU-friendly solution that can handle this volume (which, really, does not seem exorbitant), would require re-authoring all the textures, so I'm very skeptical that will happen regardless, at least within the next 2 months.

Honestly, I think what will happen is that they'll try to fix the easy-win bugs and maybe optimize where they can, but I don't know that this will be a complete turnaround. Does anyone know if all of Uncharted's issues were sorted?

Not the mouse issue, which apparently ND will fix on Tuesday for TLOU; maybe they will finally port the fix back to Uncharted. It's still an extremely GPU-hungry game, though not quite as bad as this one, but other than that it runs very decently on a far wider range of systems - you basically just have to run at a lower res than you may be accustomed to for ports.

It never required anything like the work needed here, and its release was far less in the limelight - Sony/ND wanted TLOU out to coincide with the interest in the TV series, and welp, they got their attention alright.

As @Remij mentioned, there are still two big releases - Factions and (I assume) TLOU2 - to come, so that may give them more of a push to make the necessary time investment to get this engine into a more performant state. Who knows.

Finally, I must say, I do not believe devs should care about or cater to the concerns of 8GB GPU owners.

That's just silly. Even among midrange-to-high cards, 8GB is the majority of the market; it's not the bargain basement. If you don't care about 8GB cards, then you don't care about the PC, as you're completely shutting out the huge majority.

Alex is not against running textures at medium; what should not happen is that choosing that setting completely breaks texture quality so it resembles something from 20 years ago. That's broken. Preventing that from happening isn't 'catering', it's just designing your port so it actually works on what your users have.
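
For a sense of what a sane 'medium' tier costs, a back-of-envelope sketch, assuming BC7 textures and the common trick of simply dropping top mip levels (whether TLOU's medium setting is built this way is unknown):

    // Back-of-envelope: BC7 is 16 bytes per 4x4 block = 1 byte per texel.
    // A 'medium' tier that simply drops the top mip(s) saves ~4x VRAM per
    // drop while only halving texel density - softer, not 20 years old.
    #include <cstdio>

    int main() {
        for (int dropped = 0; dropped <= 2; ++dropped) {
            double bytes = 0;
            // Sum the mip chain down to the 4x4 block minimum.
            for (int dim = 4096 >> dropped; dim >= 4; dim /= 2)
                bytes += (double)dim * dim;     // 1 byte/texel (BC7)
            printf("top mip %4d^2: full chain = %5.2f MiB\n",
                   4096 >> dropped, bytes / (1024.0 * 1024.0));
        }
        // ~21.3 -> ~5.3 -> ~1.3 MiB: each dropped mip cuts that texture's
        // footprint roughly 4x, which is why a medium tier collapsing to
        // PS2-era smears reads as a porting bug, not a law of VRAM.
    }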

It's been a good 7 years, but it's time to move on. Devs should not shackle themselves to old hardware. For a long time, we read lots of comments about how old consoles were shackling PCs. Now consoles have stepped up the requirements and we're met with incessant whining.

It's a product. It's being sold, not bestowed. It's not 'whining' if it doesn't live up to what was advertised; that's called a product review. 8GB of VRAM is not 'shackling' games - no one is asking to run this at 4K with Ultra textures and a new ray tracing feature on a 3070, they just expect textures that work correctly. There is just nothing on display here that warrants this game crippling 8GB cards. No other game that doesn't use ray tracing behaves this way. There are some games like Far Cry 6 where you can't use the Ultra HD texture pack, yes, but they scale down as you'd expect - the 'regular' textures are just a little less sharp, not a sea of toothpaste.

In the 90s and early 2000s, people would have upgraded without blinking an eye.

Lol, no they wouldn't. My man, I was gaming on the PC during that time as well, and the PC gaming market was absolutely tiny compared to today. The Voodoo2 was the new hotness in the late '90s, and it was $299, so ~$500 today. If a game came out that required SLI Voodoo2s, people would absolutely shit a brick.

Any volunteers for running TLOU shader comp on a PC with a mechanical HDD....?? 👀

Yeah, I could do it later if you want, though I'm not sure of the purpose - even with ~10GB of compiled shaders, the majority of the time is still going to be the CPU compiling, not the IO. Writing 10GB to an HDD takes a couple of minutes, if that (~100MB/s sequential puts it under two minutes).
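
If anyone wants to test the IO half of that claim, a quick sketch (the OS page cache will flatter the number somewhat unless the file exceeds RAM):

    // Writes 10 GB sequentially and times it. On a typical ~100-150 MB/s
    // mechanical HDD this lands around 70-100 seconds, well short of what
    // shader *compilation* takes on the CPU.
    #include <chrono>
    #include <cstdio>
    #include <vector>

    int main() {
        const size_t chunk = 64u << 20;            // 64 MB per write
        const int chunks = 160;                    // 160 * 64 MB = 10 GB
        std::vector<char> buf(chunk, 0x5A);
        FILE* f = fopen("io_test.bin", "wb");
        if (!f) return 1;
        auto t0 = std::chrono::steady_clock::now();
        for (int i = 0; i < chunks; ++i)
            fwrite(buf.data(), 1, chunk, f);
        fclose(f);                                 // flush to the OS
        double s = std::chrono::duration<double>(
                       std::chrono::steady_clock::now() - t0).count();
        printf("wrote 10 GB in %.1fs (%.0f MB/s)\n", s, 10240.0 / s);
        remove("io_test.bin");                     // clean up the test file
    }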
 
He references this and talks about how PC textures are usually packed in a different format that's far more CPU-friendly, instead of natively using the PS5's compression. Fixing this, assuming there is a far more CPU-friendly solution that can handle this volume (which, really, does not seem exorbitant), would require re-authoring all the textures, so I'm very skeptical that will happen regardless, at least within the next 2 months.



Oodle Kraken is not a PS5 format. It is used on PC too, and its usage began on PS4/Xbox One. For example, if I remember correctly, COD uses Oodle Kraken. Sony adopted Oodle Kraken because it was becoming the standard compression/decompression format for videogames. It will probably stay the standard for decompressing CPU data, but on PC and Xbox, Deflate will be the standard for GPU data; for PS5 GPU data, people will use Oodle Kraken. Kraken decompresses faster than zlib on the CPU and provides a bit more compression. The problem is not the encoding algorithm but the amount of data to decode, and probably the way ND's engine is implemented on PC.

WWE 2K19 went from zlib to Oodle Kraken.

If I remember correctly, Activision uses it too, along with tons of other devs. Oodle Kraken was so fast it beat the zlib hardware decompressors in the PS4 and Xbox One. That's why it became a de facto standard.



A 2019 article about the state of the art in decompression:
Finally, at the top of the hill of general-purpose compression algorithms sits a commercial package called Oodle[20]. Its high compression algorithms Kraken and Leviathan beat any other currently available compression algorithm in one or more categories.


How Oodle Kraken was created:
Kraken is following up on a bunch of leads we ran into last year while working on LZNA and BitKnit, several things we learned tuning LZNib for the PS4 (1.6GHz AMD Jaguar cores, 2 instrs/cycle max, ~200 cycle memory access latency, ~24 cycle L2 cache latency; relatively challenging target for compression, though heaven compared to older game consoles), and earlier stuff we've been experimenting with since about 2014. It's basically combining the more successful ideas from BitKnit, LZNib (that one Charles has written about extensively) and LZB (LZ-bytewise, essentially a LZ4 derivative, which Charles has been very explicit about from the beginning).

Thread about Oodle Kraken from 2016 on the encode forum
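
For anyone wanting to reproduce the zlib side of that comparison, a minimal harness - it measures the two axes in question, ratio and CPU decode speed; Kraken itself would need the commercial Oodle SDK plugged in, and the test-data generator is arbitrary:

    // Measures compression ratio and single-threaded decode speed for zlib
    // at a few levels; this is the baseline claims about Kraken are made
    // against. Build: g++ -O2 bench.cpp -lz
    #include <zlib.h>
    #include <chrono>
    #include <cstdio>
    #include <vector>

    int main() {
        // Arbitrary semi-redundant test data; real assets will differ.
        std::vector<unsigned char> raw(32u << 20);
        for (size_t i = 0; i < raw.size(); ++i)
            raw[i] = (unsigned char)((i * 31) ^ (i >> 7));

        const int levels[] = {1, 6, 9};
        for (int level : levels) {
            uLongf compLen = compressBound((uLong)raw.size());
            std::vector<unsigned char> comp(compLen);
            compress2(comp.data(), &compLen, raw.data(),
                      (uLong)raw.size(), level);

            std::vector<unsigned char> out(raw.size());
            auto t0 = std::chrono::steady_clock::now();
            for (int pass = 0; pass < 4; ++pass) {   // 4 timed decodes
                uLongf outLen = (uLongf)out.size();
                uncompress(out.data(), &outLen, comp.data(), compLen);
            }
            double s = std::chrono::duration<double>(
                           std::chrono::steady_clock::now() - t0).count();
            printf("zlib level %d: ratio %.2f:1, decode %.0f MB/s\n",
                   level, raw.size() / (double)compLen, 4 * 32.0 / s);
        }
    }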
 
I could have put in A Plague Tale - I even downloaded it to perhaps make that comparison. But in the heat of the moment of the recording, I kept thinking about how Nixxes did a lot better, so I said that. But yeah, A Plague Tale is a great comparison point for texture quality and VRAM usage while being visually and design-wise extremely similar to TLOU Pt 1.

I appreciate your work but respectfully disagree. High-quality animation can have a significant memory impact, and in this realm Part I is in an entirely different league. Character model geometry is comparable, although I would give the edge to Part I. Interior lighting is also a memory hog, and Part I outclasses APT here again. I think the art style of APT will resonate with more people than Part I's. I will agree the textures are similar. But I imagine many of those cutscenes in Part I are particularly psycho with memory requirements.
 
Sorry, I can't take you seriously anymore with these responses.

Do not even attempt to debate TLOU having better textures than APT.

I've played both on max settings and APT slaughters TLOU in the texture department.

And it also has these character models: [A Plague Tale Requiem screenshot]

In typical ND style, the model of Joel you get in cutscenes is vastly superior to what you get in gameplay.

This is a typical wall texture in APT - leagues above anything in TLOU.

[A Plague Tale Requiem screenshot]

If you feel TLOU is even close to those textures then I (and I'm sure many others on this forum) will not be taking you seriously.
 
That's not how it works; having to create an extra, lower tier of assets will not put any shackles on PS5 at all.

And if developers want to make money from their PC ports then they absolutely need to ensure the ports are good, or they lose out.

Steam has already issued loads of refunds to those who bought TLOU on PC, meaning the developers have not only lost out on revenue, but their image as a developer has also taken a blow.
What are you talking about?? Making a game run (even if barely) on old quad cores automatically puts restrictions on the game's design in terms of general complexity. AI (which includes weather systems and physics, btw) for enemies will suffer. World size will suffer. Are you going to tell me that Insomniac, when they planned Rift Apart, could have aimed for exactly the same game if they had to consider making it run on a quad core / GTX 1060 combo in a later PC port??
THAT'S not how it works. Several console gen cycles have already proven that it is development targeting one (new) hardware solution that brings the big step-ups in graphics and scope, not dragging along support for very old systems.
So NO - Sony had better not gimp their own first-party games in favor of PC. And anyway, there are people out there with very good gaming PCs. They are usually fine with the ports. It has so often been said that one of the biggest advantages a PC has over a console is that it can be upgraded. So that's what Kevin, Angie, Paul and Ismael are going to do - they go out for once, go to a hardware shop and upgrade their effing PC. Not Sony gimping their PS5 games.
Oh, and before the hard feelings hit - that included myself. I was gaming on a Ryzen 1500X / GTX 1650 Super for quite some time. Last autumn I upgraded to a Ryzen 5 5600 and an RTX 3060 12GB, and since my PC is connected to a 1080p screen I will be fine for the duration of the gen in terms of PS5 equivalency of ports. But for now I'm playing Warhammer 3 on the system and it runs quite well.
 
What are you talking about?? Making a game run (even if barely) on old quad cores automatically puts restrictions on the game's design in terms of general complexity.
This is false, as some of those 'old' quad cores still offer higher IPC than the early Ryzens, and as games are still mainly single-thread limited, your claim is ridiculous.
AI (which includes weather systems and physics, btw) for enemies will suffer.
No it won't, you just turn those things down or off for low-end systems (see the sketch at the end of this post).
World size will suffer.
In what way?
Are you going to tell me that Insomniac, when they planned Rift Apart, could have aimed for exactly the same game if they had to consider making it run on a quad core / GTX 1060 combo in a later PC port??
Spider-Man (which uses the same engine as R&C) runs just fine on a quad core and a GTX 1060, thank you, and it actually has a more complex world than R&C.

Now explain what you think R&C on PS5 is doing that won't work on a quad-core CPU?
THAT'S not how it works.
Again, yes it is.
Several console gen cycles have already proven that it is development targeting one (new) hardware solution that brings the big step-ups in graphics and scope, not dragging along support for very old systems.
You're right, PC should stop dragging these old systems (consoles) along with it.
So NO - Sony had better not gimp their own first-party games in favor of PC.
We've had several 1st-party games on PC now, all of them offering better graphics and options than what their respective PS versions offered.

Were these games gimped on PS?
And anyway, there are people out there with very good gaming PCs.
Like me.
They are usually fine with the ports.
No, we're not.
It has so often been said that one of the biggest advantages a PC has over a console is that it can be upgraded.
True.
So that's what Kevin, Angie, Paul and Ismael are going to do - they go out for once, go to a hardware shop and upgrade their effing PC.
No, we won't.
Not Sony gimping their PS5 games.
The only thing gimping PS5 games is that 2018-level PC GPU inside the PS5.
Oh, and before the hard feelings hit - that included myself.
What hard feelings?
I was gaming on a Ryzen 1500X / GTX 1650 Super for quite some time.
You poor thing.
Last autumn I upgraded to a Ryzen 5 5600 and an RTX 3060 12GB, and since my PC is connected to a 1080p screen I will be fine for the duration of the gen in terms of PS5 equivalency of ports.
The TLOU port says you won't.
But for now I'm playing Warhammer 3 on the system and it runs quite well.
That's nice.
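
On the 'turn those things down' point above, a sketch of the usual shape of a simulation scalability table - every field name and number here is invented for illustration, not taken from any shipped engine:

    // Hypothetical scalability tiers: ports typically scale simulation
    // knobs like these for weaker CPUs rather than redesigning the game.
    #include <cstdio>

    struct SimSettings {
        int   aiUpdateHz;      // how often off-screen agents think
        int   physicsSubsteps; // rigid-body solver iterations per frame
        float crowdDensity;    // fraction of ambient NPCs spawned
        bool  gpuParticles;    // offload particle sim to the GPU
    };

    // A PS5-class CPU would sit around the High tier.
    constexpr SimSettings kTiers[] = {
        /* Low    */ {10, 2, 0.25f, false},
        /* Medium */ {15, 4, 0.50f, true},
        /* High   */ {30, 6, 1.00f, true},
    };

    int main() {
        const char* names[] = {"Low", "Medium", "High"};
        for (int i = 0; i < 3; ++i)
            printf("%-6s ai=%dHz substeps=%d crowd=%.0f%% gpuFx=%d\n",
                   names[i], kTiers[i].aiUpdateHz,
                   kTiers[i].physicsSubsteps,
                   kTiers[i].crowdDensity * 100.f, kTiers[i].gpuParticles);
    }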
 
Do not even attempt to debate TLOU having better textures than APT.

I've played both on max settings and APT slaughters TLOU in the texture department.

And it also has these character models: [A Plague Tale Requiem screenshot]

In typical ND style, the model of Joel you get in cutscenes is vastly superior to what you get in gameplay.

This is a typical wall texture in APT - leagues above anything in TLOU.

[A Plague Tale Requiem screenshot]

If you feel TLOU is even close to those textures then I (and I'm sure many others on this forum) will not be taking you seriously.

Yeah, no, I'm not playing the screenshot game with you, but thanks anyway. And considering past consensus among members of this forum has turned out to be hilariously off the mark, you can imagine how much credibility I give to this forum's groupthink.
 
Yeah, no, I'm not playing the screenshot game with you, but thanks anyway. And considering past consensus among members of this forum has turned out to be hilariously off the mark, you can imagine how much credibility I give to this forum's groupthink.

You don't need to play the screenshot game; anyone who's played both games, or anyone who is unbiased, can see it's no contest and APT easily wins in the texture department.

Meanwhile in reality, a typical wall in TLOU.

[TLOU screenshot]
 
It seems this port has really brought the level of discourse down to NeoGAF/Twitter levels. 'Forum groupthink'? This is one of the most rational and constructive forums out there, IMO.

Oftentimes, yes. But I can point to many occasions where herd mentality seemed to take over, only to be proven incorrect later. Not really a big deal; I just brought it up to knock down the fallacy davis used in suggesting that consensus within a forum certifies whether or not his statements are correct.
 
Oftentimes, yes. But I can point to many occasions where herd mentality seemed to take over, only to be proven incorrect later. Not really a big deal; I just brought it up to knock down the fallacy davis used in suggesting that consensus within a forum certifies whether or not his statements are correct.

I suggest you work on being impartial and unbiased in comparisons rather than worrying about what I'm saying or doing.

You'll contribute so much more to this forum that way.
 