Digital Foundry Article Technical Discussion [2023]

Here is the performance at 1080p with low settings on an RTX 2060 (6 GB) and a 13600K:

Not even a stable 60 FPS in these outdoor parts. And image quality isn't even on par with the PS4 Pro version...
 
CPU-limited to below 60 fps on a 13600K with low settings? That's pretty wild. I'd be very curious to see what the dominant CPU bottlenecks are.
 
This is ND's first PC release, and DirectX 12 isn't easy to master. I hope more and more engines will use virtual texturing; it can help solve the problem of 8 GB GPUs. It's probably difficult to work with PCs that have less RAM available for graphics than the PS5 and no mandatory SSD. The next big ND single-player PC port will probably use DirectStorage.
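For anyone wondering what virtual texturing actually buys you here: the VRAM cost is bounded by a fixed tile pool rather than by how many textures the scene references. Below is a toy Python sketch of that idea; the tile size, pool budget, and tile IDs are made-up numbers for illustration, not anything from ND's engine.

```
from collections import OrderedDict

TILE_BYTES = 128 * 128 * 4          # assume 128x128 RGBA8 tiles (illustrative)
POOL_BUDGET = 256 * 2**20           # assume a 256 MB tile pool in VRAM (illustrative)

class TilePool:
    """Toy virtual-texture tile pool: VRAM cost is capped by the pool size,
    not by the total size of every texture the scene references."""
    def __init__(self, budget=POOL_BUDGET):
        self.budget = budget
        self.resident = OrderedDict()        # tile_id -> bytes, kept in LRU order

    def request(self, tile_id):
        if tile_id in self.resident:         # tile already resident: mark recently used
            self.resident.move_to_end(tile_id)
            return "hit"
        # miss: evict least-recently-used tiles until the new tile fits, then stream it in
        while sum(self.resident.values()) + TILE_BYTES > self.budget:
            self.resident.popitem(last=False)
        self.resident[tile_id] = TILE_BYTES
        return "miss (streamed from disk)"

pool = TilePool()
for tile in ["rock_03/mip0/(4,7)", "wall_12/mip1/(0,0)", "rock_03/mip0/(4,7)"]:
    print(tile, "->", pool.request(tile))
```

The trade-off is that misses have to be serviced fast from storage, which is exactly where a mandatory SSD and something like DirectStorage come in.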
 
The VRAM change between high and low is fairly small, if I'm remembering correctly. The quality change is like 1/4 resolution. It doesn't make a lot of sense.

@BitByte Cards with 8 GB or less are probably 80% of the market. Getting better results from medium and low would be a big improvement. High/ultra being for 10-16 GB cards is fine.
Textures aren't typically the biggest consumer of VRAM nowadays. Render targets and buffers often double or even triple that, which is possibly what's happening in TLOU.

Going from low to ultra textures only increases the VRAM allocation by ~3GB. Something else is taking up the remainder. RTs, geometry, and buffers all need VRAM as well and not all of them scale with resolution.
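To put some very rough numbers on the render-target side, here's a back-of-the-envelope Python sketch for a hypothetical deferred pipeline. The target list and formats are made up for illustration, not TLOU's actual setup, and a real engine stacks shadow cascades, SSAO/SSR, bloom chains, extra history buffers and so on on top of this.

```
# Bytes per pixel for a hypothetical set of render targets (illustrative only).
targets = {
    "gbuffer albedo   (RGBA8)":    4,
    "gbuffer normals  (RGBA16F)":  8,
    "gbuffer material (RGBA8)":    4,
    "depth/stencil    (D32S8)":    5,
    "HDR lighting     (RGBA16F)":  8,
    "motion vectors   (RG16F)":    4,
    "TAA history      (RGBA16F)":  8,
}

for label, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    total = sum(bpp * w * h for bpp in targets.values())
    print(f"{label}: ~{total / 2**20:.0f} MB across {len(targets)} targets")
```

Even this toy set quadruples from roughly 80 MB at 1080p to roughly 320 MB at 4K, while the texture pool barely moves with resolution; the full set of intermediates in a real engine pushes the non-texture share a lot further.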
 
Textures aren't typically the biggest consumer of VRAM nowadays. Render targets and buffers often double or even triple that, which is possibly what's happening in TLOU.

Going from low to ultra textures only increases the VRAM allocation by ~3GB. Something else is taking up the remainder. RTs, geometry, and buffers all need VRAM as well and not all of them scale with resolution.

Yah that’s fair. Just curious why the quality drop in the textures is so steep from high to medium.
 
CPU-limited to below 60 fps on a 13600K with low settings? That's pretty wild. I'd be very curious to see what the dominant CPU bottlenecks are.

It's strange; my overclocked 12400F is literally locked to 60 fps throughout the whole game, so something is off with that system.

As the game streams so much, I wonder how much influence the PCIe slot speed has on the frame rate.
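For a sense of scale, here's a quick Python sketch of the theoretical per-frame budget on the two common link speeds. These are peak numbers that ignore protocol overhead and real-world contention, and I'm not claiming the slot is the bottleneck here.

```
# Theoretical peak PCIe bandwidth per frame at 60 fps (rough, ignoring protocol overhead).
links = {"PCIe 3.0 x16": 15.75, "PCIe 4.0 x16": 31.5}   # approximate GB/s
fps = 60
for name, gb_per_s in links.items():
    print(f"{name}: ~{gb_per_s} GB/s peak, ~{gb_per_s * 1000 / fps:.0f} MB per frame at {fps} fps")
```

So even PCIe 3.0 x16 can, in theory, move a couple of hundred MB per frame; whether the streaming system ever asks for that much in a burst is the real question.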
 
The Ultra preset needs 10 GB at 1080p but only 14 GB at 4K, so it barely scales with resolution at all...

/edit:
VRAM at 1080p with Ultra: 9,309 MB
VRAM at 2160p with Ultra: 11,410 MB

2 GB more for 4x the resolution.
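If you treat the allocation as a fixed pool plus a part that scales linearly with pixel count (a big simplification, but a useful one), those two readings imply that only ~0.7 GB of the 1080p figure is resolution-dependent and roughly 8.6 GB isn't, which fits the idea that textures and geometry rather than render targets dominate here. Quick sketch of that arithmetic:

```
# Split the two measurements above into a resolution-independent part and a part
# assumed to scale linearly with pixel count (simplifying assumption).
vram_1080p, vram_4k = 9309, 11410   # MB, Ultra preset (numbers from the post above)
pixel_ratio = (3840 * 2160) / (1920 * 1080)          # = 4.0

per_res_1080p = (vram_4k - vram_1080p) / (pixel_ratio - 1)
fixed = vram_1080p - per_res_1080p

print(f"scales with resolution (at 1080p): ~{per_res_1080p:.0f} MB")   # ~700 MB
print(f"resolution-independent:            ~{fixed:.0f} MB")           # ~8600 MB
```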
 
Not even a stable 60 FPS in these outdoor parts. And image quality isn't even on par with the PS4 Pro version...

I think Sony figured out PC ports were not worth the effort, so now they release subpar ports so they can avoid releasing them in the future. J/K
 
Is A Plague Tale: Requiem a better comparison? It barely uses 6 GB at 4K on PC.
While I don't find APT to be impressive, it does have great textures. However, its asset variety pales in comparison to TLOU's. Again, A Plague Tale doesn't appear to have been architected solely for the PS5 and then ported back to PC, which is the problem here. It's clear Naughty Dog is decompressing textures like crazy on the PS5. Why? I don't know, but if that behaviour is intentional, then I don't expect the issue to be fixed easily.
 
It's strange; my overclocked 12400F is literally locked to 60 fps throughout the whole game, so something is off with that system.

As the game streams so much, I wonder how much influence the PCIe slot speed has on the frame rate.

Does a 2060 have PCIe 3? Can't remember.

Edit: CPU is showing 97% or so, though. On a 13600K at less than 60 fps, that's an achievement; it's a very strong gaming CPU. Unless the guy's memory timings are really bad.
 
Does a 2060 have PCIe 3? Can't remember.

Edit: CPU is showing 97% or so, though. On a 13600K at less than 60 fps, that's an achievement; it's a very strong gaming CPU. Unless the guy's memory timings are really bad.

It's not his CPU, as it's faster than mine and has more cores; it's the way he's set something up.

His power consumption looks low to me, as I would expect a 12-core 13th-gen Intel CPU to be around the 100 W mark at stock.

His temps are also lower than I would expect for a CPU architecture that's known for running scalding hot. I know he has a monster CPU cooler, but so do I, and I can hit the 70°C range at 90%+ CPU load with six fewer cores than he has.
 
He references this and talks about how PC textures are usually packed in a different format that's far more CPU-friendly, instead of natively using the PS5's compression. Fixing this, of course, assuming there is a far more CPU-friendly solution that can handle this volume (which really does not seem exorbitant), would require re-authoring all the textures, so I'm very skeptical that will happen regardless, at least within the next 2 months.
Yeah, he referenced ZLib. Again though, when you do a cost-benefit analysis, I don't know that it's worth fixing at all.
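For anyone curious what "ZLib on the CPU" roughly costs: general-purpose DEFLATE data has to be inflated by CPU cores before the GPU can touch it, whereas GPU block-compressed formats (BCn) are sampled directly by the texture units. The Python snippet below just times zlib on a texture-sized blob of synthetic data; it's an order-of-magnitude illustration, not a measurement of what TLOU actually does or how its data compresses.

```
import os, time, zlib

# Synthetic 64 MB "texture" payload: partly incompressible, partly trivially compressible.
# Real texture data behaves differently, so treat the ratio and speed as ballpark only.
raw = os.urandom(8 * 2**20) + bytes(56 * 2**20)
packed = zlib.compress(raw, 6)

t0 = time.perf_counter()
unpacked = zlib.decompress(packed)
elapsed = time.perf_counter() - t0

assert unpacked == raw
print(f"{len(packed) / 2**20:.1f} MB -> {len(raw) / 2**20:.0f} MB "
      f"in {elapsed * 1000:.0f} ms (~{len(raw) / 2**20 / elapsed:.0f} MB/s on one core)")
```

Multiply that per-megabyte cost by a heavy streaming workload and spread it across worker threads, and you can see how decompression alone eats a meaningful slice of the frame budget on mid-range CPUs.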
Not the mouse issue, which apparently ND will fix on Tuesday for TLOU; maybe they will finally port that fix back to Uncharted. Uncharted is still an extremely GPU-hungry game, though not quite as bad as this, but other than that it runs very decently on a far wider range of systems; you basically just have to run at a lower res than you may be accustomed to for ports.

It never required anything like the work here, and its release was far less in the limelight. Sony/ND wanted TLOU out to coincide with the interest in the TV series, and welp, they got their attention alright.

As @Remij mentioned, there are still two big releases to come (Factions and, I assume, TLOU2), so that may give them more push to make the necessary time investment to get this engine into a more performant state. Who knows.
This is good to know.
That's just silly. Even among midrange-to-high-end cards, it's the majority of the market; it's not the bargain basement. If you don't care about 8 GB cards, then you don't care about the PC, as you're completely shutting out the huge majority.


Alex is not against running textures at medium; what should not happen is that choosing that setting completely breaks texture quality so it resembles something from 20 years ago. That's broken. Preventing that from happening isn't 'catering', it's just designing your port so it actually works on what your users have.
In a way it is catering. Those who have >10 GB of VRAM can run high textures with no problem. The texture issue is solely with those who have 8 GB of VRAM. Even then, there are people on this forum providing evidence that they've found a way to run with high textures. The simple fact is that the base requirements have increased. IMO, the minimum acceptable VRAM on any card now is 12 GB.
It's a product. It's being sold, not bestowed. It's not 'whining' if it doesn't live up to what was advertised; that's called a product review. 8 GB of VRAM is not 'shackling' games. No one is asking to run this at 4K with ultra textures and a new ray tracing feature on a 3070; they just expect textures that work correctly. There is just nothing on display here which warrants this game crippling 8 GB cards. No other game that doesn't use ray tracing behaves this way. There are some games like Far Cry 6 where you can't use the Ultra HD texture pack, yes, but they scale down as you'd expect: the 'regular' textures are just a little less sharp, not a sea of toothpaste.
'Sea of toothpaste' is hyperbolic. In fact, some of DF's comments in the video about the textures were hyperbolic. It's not great, but they were complaining as if it were Rage-megatextures-on-PS3 levels of bad, and it's not.

The fact that an 8 GB GPU is still being sold doesn't make it a good product. Once the console specs were announced, it was very clear that if you plan to keep your GPU for an extended period of time, you shouldn't buy anything with 8 GB of VRAM. The minimum acceptable amount of VRAM was 12 GB, and even then that was barely enough. This happens every single console generation, so I'm not sure why people are surprised. Once a new console comes out, it's standard practice to buy a GPU with VRAM greater than or equal to the console's memory capacity. Plenty of people knew of the perils of 8 GB; go look back at Ampere's launch and you'll see lots of comments warning about it. If someone made the bad decision of purchasing an 8 GB card in 2020, when you could already easily exceed the 8 GB buffer with mods, that's a personal choice.

Lol no they wouldn't, my man; I was gaming on PC during that time as well. The PC gaming market was absolutely tiny by comparison to today. The Voodoo2 was the new hotness in the late '90s, and it was $299, so ~$500 today. If a game came out that required SLI Voodoo2s, people would absolutely shit a brick.
You realize that Crysis came out and most GPUs couldn't run it well? Now I'm not saying TLOU is equivalent to Crysis, because it's not, but I'm just providing an example to invalidate your statement. People upgraded their GPUs to play Crysis.
 
The fact that an 8 GB GPU is still being sold doesn't make it a good product. Once the console specs were announced, it was very clear that if you plan to keep your GPU for an extended period of time, you shouldn't buy anything with 8 GB of VRAM. The minimum acceptable amount of VRAM was 12 GB, and even then that was barely enough. This happens every single console generation, so I'm not sure why people are surprised. Once a new console comes out, it's standard practice to buy a GPU with VRAM greater than or equal to the console's memory capacity. Plenty of people knew of the perils of 8 GB; go look back at Ampere's launch and you'll see lots of comments warning about it. If someone made the bad decision of purchasing an 8 GB card in 2020, when you could already easily exceed the 8 GB buffer with mods, that's a personal choice.

The Series S offers 8 GB of total usable RAM.

So while your comment will be true for Sony exclusives that have a base minimum of 12 GB of usable RAM, for games built to run on Xbox, 8 GB of VRAM will and should be OK.
 
Historically, consoles are best-bang-for-the-buck machines. If you can't see the difference, developers will choose the better-performing option. It's the reason we continue to move away from native resolution to reconstruction and so forth.

On the question of texture quality, console games also must be designed with enough headroom to ensure a locked 60 fps, which, from what I understand, it doesn't even hold on PS5. So why take an additional hit on textures if there is no visual difference?

Secondly, ultra settings are usually terrible in terms of value for performance; they are unoptimized and usually only worthwhile on cards with enough power to brute-force them, which is not typically a console feature.

Sometimes consoles have custom settings that selectively allow them to run some items at ultra and some items below that, some weird mixture of low and high, and those settings have never been available on PC.
Using ultra textures should not incur an additional performance penalty if you have enough VRAM. We know the port has significant issues, so the performance metrics here cannot be trusted.
DF does more than just report on things; they aim to educate, and education takes longer than a few sound bites. At the end of the day, someone has to do the heavy lifting instead of rushing to put out over-sensationalized YouTube videos for the sake of monetization and getting there first.

Many people complaining about it are looking for an answer, or are often looking to weaponize data points for an argument. Taking the time to provide nuance stops a lot of that, and their core audience appreciates detail. That's what the patron surveys are for and about.
I don’t think you need an hour to provide nuance. Half the video was just retreading old points. If they were going to spend a whole hour, they should have talked about other issues plaguing the port.
 
The Series S offers 8 GB of total usable RAM.

So while your comment will be true for Sony exclusives that have a base minimum of 12 GB of usable RAM, for games built to run on Xbox, 8 GB of VRAM will and should be OK.
This is assuming devs won't take a "PS2 textures, take it or leave it" attitude on the console.

Which some of them have started to do:


(For the record, the Xbox Series X, PS5, and PC ultra textures don't look hot either, so I dunno what went wrong with that game, LOL)

I don't understand why it is so hard for some devs to provide decent-looking, acceptable textures for lower-end VRAM budgets. I too theorized that the Series S would be a "safety" check for 6-8 GB GPUs, since I thought devs would feel compelled to create a special set of textures tailored for 1080p and below that look "decent" (not groundbreaking or super good, just decent; something that does not destroy the art direction of the game and does not resemble PS2/N64-style textures).

So I dunno. Good luck to us folks I guess. We're practically at the mercy of devs.

Then again, A Plague Tale is a compelling example.


This is how it should be done... not PS2-like textures. If Asobo or Playground can do this (A Plague Tale / Forza Horizon 5), I don't see why others can't.
 
@BitByte We should never go back to the days when people had to upgrade their PCs to play new games. That was pure stupidity, and it's financial suicide for game companies. The reality is, most games can scale in reasonable ways, so low and medium settings actually look pretty good. Most PC games scale well, and that's a good thing. There are limits: you don't want to be supporting HDDs forever, or 8 GB of RAM, or CPUs that don't support AVX, or maybe a particular shader model for GPUs. But 8 GB of VRAM is still incredibly common; it's not time to drop it yet. I think in general people just want medium textures (and even low) to look a bit better. Right now it really does look like a very old game, like PS360-era stuff.

The CPU performance is a mystery to me. The Ryzen 3600 is a crappy gaming CPU; it always was. I know because I had one. Unfortunately, the way CPU reviews are done in general presented the early Ryzen parts as being better for gaming than they actually are. I know Alex kind of compares the 3600 to what's in the consoles, and maybe spec for spec it is, but I think the console environment plus the dedicated hardware for decompression etc. doesn't make for a great comparison. I'll have to dig a bit deeper into reliable sources to see how earlier Intel CPUs perform. That video posted above with the 13600K is wild, but I have no idea if that PC is messed up or not.
 
The Series S offers 8 GB of total usable RAM.

So while your comment will be true for Sony exclusives that have a base minimum of 12 GB of usable RAM, for games built to run on Xbox, 8 GB of VRAM will and should be OK.
I guess you missed the numerous complaints about the Series S? Anyway, I don't think devs will take the Series S into consideration when designing their games. They'll design their games around the constraints of the PS5 and Series X, then just deliver a half-assed port on Series S. The Xbox demographic doesn't really buy games anyway due to Game Pass, so it's not a big loss; Microsoft's own data shows this reality. If I examine my spending habits: I've owned a Series X off and on (buy -> sell -> rebuy) since launch and I've purchased exactly 0 games for it because of Game Pass. Did the $1 conversion trick and locked it in for 3 years.
 
Using ultra textures should not incur an additional performance penalty if you have enough VRAM. We know the port has significant issues, so the performance metrics here cannot be trusted.
So if there is no visual difference, why incur a larger hit to your VRAM allocation unnecessarily? I would want to know the benefit here, because that space could be used for render targets and other items.

As for the format, it was indicated at the beginning of the video that this would _not_ be their normal review of the game due to the constant hotfixes that were occurring. They instead opted to have a fun time playing through the game together for the first hour, discussing the differences they were seeing on their machines in real time, and fixing things up in post.

I don't see an issue with it, because this wasn't their formal review of the game and because they're also having fun while doing it. I would much rather listen to people having a good time than fake outrage.

I mean, I see the exact same types of arguments cropping up at NeoGAF, and honestly, those are people who are just finding anything to nitpick against DF because they hate DF. There's no point in discussing that, honestly; DF can never do right in their eyes.
 
@BitByte We should never go back to the days when people had to upgrade their PCs to play new games. That was pure stupidity, and it's financial suicide for game companies. The reality is, most games can scale in reasonable ways, so low and medium settings actually look pretty good. Most PC games scale well, and that's a good thing. There are limits: you don't want to be supporting HDDs forever, or 8 GB of RAM, or CPUs that don't support AVX, or maybe a particular shader model for GPUs. But 8 GB of VRAM is still incredibly common; it's not time to drop it yet. I think in general people just want medium textures (and even low) to look a bit better. Right now it really does look like a very old game, like PS360-era stuff.
Do you know what is funnier, Scott? The argument of "we had 8 GB on the RX 480 in 2016". And do you know the reality? Nearly half of that amount went unused at 1080p until 2021, unless you were into niche stuff like ultra settings with a 30 FPS lock (which awkward people like me could've done, actually). This is the brutal truth none of these people can admit: the RX 480/580 never had the grunt to run the 1440p/ultra settings that required 8 GB of VRAM back then. Only in recent years has 6 GB been saturated at 1080p. I'd say the 3060 Ti/6600 XT and the like are okay with 8 GB of VRAM, but yeah, the 4060 Ti/3070 Ti and such are pushing it a bit at 1440p.

I can understand 8 GB being the new 4 GB. But people act like it is the 2 GB of the PS4 era.

The PS4 had 8 GB, which made 2 GB GPUs obsolete, but games gracefully scaled back to 4 GB of VRAM at 1080p (the PS4's resolution).
The PS5 has 16 GB, which should, yes, make 4 GB GPUs obsolete. But it usually targets 1440p or 4K, even if upscaled. So at 1440p/1080p, 8 GB GPUs should scale gracefully with decent graphics. That would also help avoid alienating the large 8 GB userbase.

4 GB was never obsolete, and it never produced "PS2-like" graphics at 1080p with PS4 settings. Now we see 8 GB producing PS2-like graphics at LOWER resolutions and settings than the PS5 (I'm excluding myself).

8 GB being present on the 290 and 480 is irrelevant. Nowadays those GPUs may work better than their 4 GB counterparts, but we're at a point where both the 4 GB 580 and the 8 GB 580 would be deemed unusable for this game due to raw performance regardless.

4 GB at 1080p with decent settings was enough 1 YEAR AGO in almost all games. Now 8 GB is not enough to provide decent textures at 1080p all of a sudden? I'm not buying it.
 