Digital Foundry Article Technical Discussion [2023]

@yamaci17 Yeah, I think if companies are going to enter the PC market they need to think about scaling. Texture quality is one of those things in PC games that usually scales well. A lot of people set textures to ultra all the time, not realizing it's usually pointless at lower resolutions, but textures generally scale pretty nicely anyway. If I'm playing more competitive games I'll even lower my textures to avoid some small hitches, and they generally still look pretty good unless you go down to low or ultra low.

I'm guessing that The Last of Us probably has a lot more unique textures than games typically do. A lot of games reuse textures heavily, but just looking at TLOU, most things on screen look unique. That definitely increases the burden on VRAM. I'm just not sure why medium looks so horrible when the space saved is so small. If the total VRAM consumed by textures is small, that means they actually have a very space-efficient streaming system, which means they should be able to offer something between the current High and Medium settings.
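
(For scale, a rough back-of-envelope on per-texture cost, assuming block compression at roughly 1 byte per texel, which is BC7/BC3 territory; the numbers are illustrative, not TLOU's actual asset sizes:)

```python
# Rough per-texture VRAM cost under block compression (~1 byte/texel);
# a full mip chain adds about a third on top of the base level.
def texture_mib(width, height, bytes_per_texel=1.0, mip_chain=True):
    base = width * height * bytes_per_texel
    return base * (4 / 3 if mip_chain else 1) / (1024 ** 2)

for size in (4096, 2048, 1024):
    print(f"{size}x{size}: {texture_mib(size, size):.1f} MiB")
# 4096x4096: ~21.3 MiB, 2048x2048: ~5.3 MiB, 1024x1024: ~1.3 MiB
```

Each step down the mip chain quarters the cost, which is exactly why a tier between the current High and Medium should be cheap to offer.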
 
@BitByte We should never go back to the days when people had to upgrade their PCs to play new games. That was pure stupidity, and it's financial suicide for game companies. The reality is that most games can scale in reasonable ways, so low and medium settings actually look pretty good. Most PC games scale well, and that's a good thing. There are limits, of course. You don't want to be supporting HDDs forever, or 8GB of RAM, or CPUs that don't support AVX, or maybe a particular shader model for GPUs. But 8GB of VRAM is still incredibly common. It's not time to drop it yet. I think in general people just want medium (and even low) textures to look a bit better. Right now it really does look like a very old game, like PS360-era stuff.
Every gen, you always have to upgrade your PC? When the PS4 came out, I had a Radeon HD 6950, and I didn't complain that I had to upgrade my GPU to play "next gen" games. I just upgraded. People are not forced to upgrade if they don't want to. They can play on medium or low settings, but they don't want to, so they complain instead. There are lots of games you can play on 8GB GPUs: old games, indie games, even other new games. One does not have to play TLOU Part 1.
 
Yes, but 4 GB GPUs were enough to match PS4 graphics at PS4 resolution.

Technically, 8 GB GPUs should be enough to match PS5 graphics at lower resolutions (PS5 targets upscaled 4K with 4K textures on a ~10 GB budget; 8 GB GPUs would target upscaled 1440p or native 1080p with 1080p/1440p textures). Should be pretty easy to do the math. No need to create a massive pile of e-waste. Don't spin this like what happened with 2 GB GPUs versus the PS4. There, the console-to-GPU VRAM ratio was 4:1. Here, it is 2:1. A whole other beast, whether you accept it or not.
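
(The ratio claim as plain arithmetic; all budgets here are the poster's estimates, not measured figures:)

```python
# Console unified RAM vs. a common GPU's VRAM, per the post above.
ps4_ram, lastgen_gpu = 8, 2    # GB: PS4 vs. a 2 GB card -> 4:1
ps5_ram, common_gpu = 16, 8    # GB: PS5 vs. an 8 GB card -> 2:1
print(ps4_ram / lastgen_gpu, ps5_ram / common_gpu)   # 4.0 2.0

# The resolution side of the argument: 1080p pushes a quarter of the
# pixels of 4K, so resolution-dependent buffers shrink accordingly.
print((1920 * 1080) / (3840 * 2160))  # 0.25
```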
 
I disagree with this strongly. 16GB of memory was chosen for the new consoles due to cost constraints, but it's not enough for a generational leap. That's why Mark Cerny and co. looked at ways to utilize the 16GB more efficiently to compensate for the lack of memory. That's why they put a dedicated decompression chip in the device. They provide Oodle Kraken with their dev kits, as in Sony pays for the licensing costs. The PS5's memory architecture is far superior to that of a regular PC. The idea that 8GB of VRAM should be sufficient is frankly ridiculous.
 
This card had 8GB because nVidia released the GTX1060 with 6GB.
I don't know the reason, but it is unnecessarily brought into the discussion. My friend was playing God of War at 1080p/medium on his 1060, where the card was barely mustering a 45-50 FPS average. VRAM usage? 4-4.2 GB. That's it. High? 35-40 FPS. You get more VRAM usage, but the card's raster performance gives out first.

This is one of the peak last-gen PS4 titles.

5.2 gigs allocated, but:

it works decently enough with a 4 GB budget too.

But yeah... let's act like the 290 and 580 made good use of 8 GB. Or that any PS4 titles actually required 6-8 GB.
 
If 4 GB can produce peak last-gen graphics in games like Spider-Man and RDR2, why do you expect me to believe or accept that 8 GB's limit at 1080p or 1440p is N64-era textures? If 4 GB of VRAM can produce RDR2 levels of graphics, I can just as easily expect a generational graphical leap on 8 GB at lower resolutions.

Also, you're dismissing multiplatform development and the constraints of the Series S and X, and dismissing the hardware people actually have in hand.

If you think "console SSD magic" makes 8 GB obsolete, you can theorize 12 GB will be obsolete too. What is the limit? Why do you think 12 GB will be enough? Or won't it? If not, who do you think these devs will sell their games to? If you're throwing out "PS5 uses super-duper compression, so it will need a lot of VRAM on PC!", how much VRAM do you reckon people will need then? A 12 GB card barely has enough idle VRAM to give games the 10-11 GB that the PS5 uses.
 

I don't think anyone is saying their 2060 should be able to play the game on Ultra. The only complaints I'm seeing are that medium and low textures look very bad, which they do compared to pretty much any other game, and that the game is heavily CPU-bottlenecked even on CPUs that are much better than what's in the consoles. I'd like to see a texture comparison between the PC version on medium and the PS4 Pro version of The Last of Us. My guess is the PS4 Pro would win.

The Ryzen 3600X came out in 2019, same as the 2060 Super. They're not even old. Looking at the prices of PC parts, do we really want to go back to the days when people had to upgrade all the time? Is it not better for consumers to make games scale? This "in the old days we had to upgrade our PCs all the time" attitude is plain weird. Those days absolutely sucked. It was expensive, and that's why tons of people fled the PC market as soon as consoles had internet and multiplayer. It's bad for the health of the PC market.
 
This is funny, lol, but thanks for inadvertently proving my point. The PS4/XB1 had like 5-6 GB for games, and 2-3 GB was reserved for the OS. The VRAM usage of games varied, but a 6GB card worked well because devs couldn't really use more for games.

A PS5/Series X has 13-14 GB for games, and 2-3 GB is reserved for the OS. Yet you want to use a card with 8GB of VRAM. And before you say the PS5/XSX are targeting 4K: we've already seen a lot of 1080p games on these consoles. Funny joke.
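
(The splits claimed in this exchange, as arithmetic; the OS reserve is the midpoint of the quoted 2-3 GB range:)

```python
# Unified-RAM splits as claimed in these posts (approximate figures).
os_reserved = 2.5                       # GB, midpoint of the claimed 2-3 GB
for name, total in {"PS4/XB1": 8, "PS5/XSX": 16}.items():
    print(f"{name}: ~{total - os_reserved:.1f} GB left for the game")
# PS4/XB1: ~5.5 GB; PS5/XSX: ~13.5 GB. Note this budget also holds CPU-side
# data, not only what a PC would keep in VRAM.
```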
 
Nope, these games function perfectly fine on PS4-equivalent settings with 4 GB of VRAM on a 1650 Super/980. Games can always allocate more than they strictly need. If this is the takeaway you got from the videos I've shared... bravo.

Works brilliantly.

Works brilliantly and stable.

Works brilliantly and does not even saturate the full 4 GB at 1080p on One X equivalent settings (higher than PS4).

Even 1440p is doable on 4 gigs with One X equivalent settings, with stable frametimes.

Look at another game that scales gracefully back to 4 GB without looking like a PS2 game.

Aaand all of a sudden: here we go, obsoleting your 8 GB GPU at 1080p. PS2 textures, take it or leave it!

Do you think or believe any recent game that hard-requires 8 GB looks 2x better than these games?
 
PS4 had 8 GB, which made 2 GB GPUs obsolete. But games gracefully scaled back to 4 GB VRAM at 1080p (PS4 resolution).
To be pedantic, PS4 games only had about 5 GB of RAM in total. Some part of that has to be for code and CPU-side objects, so scaling to 4 GB of VRAM should have been easy.
 
I think it's you who's missing the point. You're the one who was talking about ratios while not understanding the issue at hand. A PS4 game could use a maximum of 6 GB of RAM, and not all of that would be dedicated to graphics. Using Killzone Shadow Fall as an example, it used just 3,072 MB for graphics, so obviously, if that got a PC port, a 4GB card would work fine.

Now devs have 13-14 GB to play with, and if more than 8 GB is allocated to graphics, then a GPU with 8GB of VRAM is insufficient. It's not really a hard concept to grasp. Like I said, thanks for inadvertently proving my point.
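
(Reduced to arithmetic, the disagreement here comes down to one hypothetical number; a rough sketch:)

```python
# Rough sketch of the claim above; the graphics share is hypothetical.
ps5_game_budget = 13.5   # GB available to games (estimate used in this thread)
graphics_share = 10.0    # GB hypothetically allocated to GPU data at console settings
print(graphics_share > 8)    # True: at console settings, an 8 GB card falls short
# The rebuttal below argues that share shrinks back under 8 GB once you
# drop resolution and texture tier on PC.
```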
 
No, you are the one who is not getting the point. At this point it's clear that both consoles most likely allocate around 10 GB for GPU operations: 4K textures, assets, and 4K upscaling (which requires a lot of memory for buffers, even if the internal resolution is low). A game that targets 10 GB of VRAM for 4K upscaling should gracefully scale back to 8 GB in 1080p/1440p upscaling situations with just slightly reduced texture detail. This shouldn't really be a hard concept to grasp. 8GB GPUs are not meant for 4K screens or 4K upscaling, unlike the Xbox Series X / PS5.

The fact that 7.5 GB is enough for 1440p/high/PS5-equivalent settings in Last of Us 2, where native 4K requires 10 GB, proves that scaling is indeed possible and even happens in this very port. It just so happens that there are no in-betweens between the medium and high textures, and the game's bogus "OS + apps" readout is spreading FUD because of how unnecessarily big a number it reports. If they offered a slightly reduced texture quality tier, most people would casually play at 1440p without any problems (and they should also fix that weird OS + apps readout, which is pure FUD and misinformation).
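
(For intuition on why a 4K budget shrinks at lower output resolutions: resolution-dependent buffers scale with pixel count. An illustrative sketch with made-up buffer counts, not TLOU's actual layout:)

```python
# Illustrative only: assume ~5 full-resolution render targets at 4 bytes
# per pixel, ignoring textures/geometry, which don't scale with resolution.
def rt_mib(w, h, targets=5, bytes_per_px=4):
    return w * h * targets * bytes_per_px / (1024 ** 2)

for label, (w, h) in {"4K": (3840, 2160), "1440p": (2560, 1440), "1080p": (1920, 1080)}.items():
    print(f"{label}: ~{rt_mib(w, h):.0f} MiB in render targets")
# 4K: ~158 MiB, 1440p: ~70 MiB, 1080p: ~40 MiB -- per set; real engines keep
# several sets (history buffers, post chain), multiplying the savings.
```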

As I said, you even refute your own Series S point by invoking decompression and the like. If the Series S's 8 GB can magically be enough but an 8 GB GPU goes jank, then technically "14 GB to work with" should easily destroy 12 GB GPUs too. Then, once again, who will you sell the games to if that happens? Just who? (Ah, but they'll half-ass the Series S ports, won't they? Because you'd find that appropriate. After all, death to 6-8 GB memory budgets. Damn the scalability. As if A Plague Tale: Requiem does not exist. Or Flight Simulator, you know...)
 
It seems that these presets do not really scale back the amount of detail or the number of objects. A GTX 1060 with 6GB can't use the high textures with the low preset (this is with FSR 2.0 Quality):

It has more detail than the PS4 Pro version, but it runs at a sub-1080p internal resolution at less than 50 FPS...
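
(The sub-1080p figure follows from FSR 2's fixed scale factors; Quality mode renders at 1.5x lower resolution per axis:)

```python
# FSR 2 Quality mode uses a 1.5x per-axis scale factor, so a 1080p output
# is reconstructed from a 720p internal render.
out_w, out_h = 1920, 1080
print(f"{out_w / 1.5:.0f}x{out_h / 1.5:.0f} internal")  # 1280x720 internal
```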
 
Confidently wrong, I like it. If 7.5GB were enough for TLOU at high, then 8GB users would set textures to high and only complain about CPU performance and other bugs. The rest of your post is just speculation, as you have absolutely no insight into the game; you didn't work on the port.

Finally, what some people are having trouble understanding is that consoles are the minimum requirement. As in, your PC must be superior to the consoles in all aspects. Whining that your underpowered PC can't run a game is a personal problem.
 