Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

AC Unity came 1 year into the PS4's life (Nov 2014), and the CPU gap was also way larger back then (each 4770K core was probably 4x the speed of a console Jaguar core, a gap we probably won't see on PC even when a PS6 hits). For context, the GTX 980 was released in Sept 2014 -


[attached benchmark graph]
I’m not sure what you are debating with this graph here.

Uncharted 4 runs super smooth for me. This game, on the other hand, needs at least a few CPU optimization passes.
On what GPU?
 
Yeah, finely tuned textures for 6 and 8 GB cards are going to be key for good PC ports moving forward. There's no reason these cards should get PS2-esque textures in 2023, like we have seen in Forspoken and TLOU.
 
I don't agree with this take at all. 8GB of VRAM should have become the base minimum a long time ago. DRAMeXchange lists 8GB of GDDR6 at an average of 3.409, which is completely negligible compared to the price increases of GPUs since 2018. Thankfully, some of us figured out that we could just buy Nvidia stock and let the stock price increases driven by their buybacks pay for our new GPUs.

As to why a game might need more VRAM at 1080p: texture variety, texture quality, RT, etc. Is that why TLOU1 is using 8GB? Nope. The memory subsystem on PC is very different from that of the PS5. If I were to hazard a guess, the cost of completely rewriting the system in place to better utilize PC hardware outweighed the potential benefit in terms of revenue for the port. Then again, if this were some nobody indie game, I doubt many would be talking about this at all.
What do you do with more than 8GB of VRAM at 1080p? Needing more VRAM for BVH is understandable. But when you already need ~6GB for textures at 1080p, how would it be possible to get away with 16GB of VRAM at 4K? With four times the pixels, you would need ~24GB for textures alone at the same quality...
 
Yeah, finely tuned textures for 6 and 8 GB cards are going to be key for good PC ports moving forward. There's no reason these cards should get PS2-esque textures in 2023, like we have seen in Forspoken and TLOU.
I sincerely hope not. Imo, an 8GB card today should quickly become like a 4GB card. I don't want games held back because people don't want to upgrade their hardware. This has never been the case on PC and I hope it doesn't start now.
 
I disagree with this post because there's a pattern now with recent game releases all looking like dogshit and stuttering a lot on cards with 8 GB and below.

IMO it's perfectly valid to raise concerns that 8 GB might not be enough in the long run, and that one GPU vendor we all know didn't plan VRAM needs far enough into the future, which now hurts their customers badly.

With techniques like VT in UE5 and SFS, this could change of course. But right now Sampler Feedback isn't used in any games, and UE5 titles are still quite far off.

I sincerely hope not. Imo, an 8GB card today should quickly become like a 4GB card. I don't want games held back because people don't want to upgrade their hardware. This has never been the case on PC and I hope it doesn't start now.
What are you worried about? Scalability has always been part of the PC experience. If you want the best textures, just crank the settings up. All I'm saying is that PC developers should implement good looking textures for cards with lower amounts of VRAM too.
 
What do you do with more than 8GB of VRAM at 1080p? Needing more VRAM for BVH is understandable. But when you already need ~6GB for textures at 1080p, how would it be possible to get away with 16GB of VRAM at 4K? With four times the pixels, you would need ~24GB for textures alone at the same quality...
Leave that to game developers to figure out. We've been stuck at 8GB since at least Pascal and frankly it's time to move on. The defence of 8GB in the PC space is surprising to me; it's almost console-like rhetoric. I hope GPU vendors increase the VRAM allocation next gen to the point where 12GB becomes the base minimum.
 
Nobody is defending 8GB. But VRAM doesn't determine a good graphics card. Here are two examples in which GPUs with 16GB+ are not faster than GPUs with 10GB to 12GB:
[Metro Exodus RT benchmark chart, 3840x2160]

[Cyberpunk 2077 RT benchmark chart, 1920x1080]

 
Leave that to game developers to figure out. We've been stuck at 8GB since at least Pascal and frankly it's time to move on. The defence of 8GB in the PC space is surprising to me; it's almost console-like rhetoric. I hope GPU vendors increase the VRAM allocation next gen to the point where 12GB becomes the base minimum.
We shouldn't move on from 8GB just to do it. We should do it if the visuals on display require it, and frankly, so far they don't. A Plague Tale: Requiem and The Callisto Protocol run just fine on 8GB cards and both look better than TLOU Part 1. The variety of assets in TLOU Part 1 is seriously impressive, but to the point of justifying 8GB+ at 1080p? Certainly not.
 
What are you worried about? Scalability has always been part of the PC experience. If you want the best textures, just crank the settings up. All I'm saying is that PC developers should implement good looking textures for cards with lower amounts of VRAM too.
Scalability has its limits. As a ridiculous example, you can't scale Cyberpunk to run adequately on a 2GB GPU. As developers aim to push hardware, it's inevitable that old hardware will get left behind. For example, in older games all the enemies look exactly the same; it's basically a limited asset set for each enemy type. Newer games may want to do away with this by providing more variety in assets, which in turn requires more resources.

Also, let's call a spade a spade. A lot of PC users have enjoyed almost a decade's worth of stagnation due to the length of the console gen. Now the thought of lowering settings is bothering a lot of people. There are people on the Steam forums complaining about how they have to use medium settings. A stark contrast from the 90s and early 2000s.
 
We shouldn't move on from 8GB just to do it. We should do it if the visuals on display require it, and frankly, so far they don't. A Plague Tale: Requiem and The Callisto Protocol run just fine on 8GB cards and both look better than TLOU Part 1. The variety of assets in TLOU Part 1 is seriously impressive, but to the point of justifying 8GB+ at 1080p? Certainly not.
The Callisto Protocol is a last-gen game and, tbf, I don't think A Plague Tale: Requiem looks good at all. I played it on Xbox, played it on PC, and I was not impressed. It's alright. As to whether the asset variety in TLOU justifies 8GB, I can't say until I understand how the game is architected. Many are operating from the assumption that it's inefficient. I don't know that to be the case and so I cannot comment on it.
 
There's no issue with asking people not to use it in GPU vendor wars, but frankly the suggestion is asinine. The game is one of a group of statistical outliers, but it's still a game that a lot of people clearly want to play on their various GPUs. That means people will naturally compare GPUs with it because they want to play it. It's the same as when any big game comes out, for example Cyberpunk: some people based their GPU purchase solely on how a GPU performed in Cyberpunk because, at the time, that was what mattered most to them.

Honestly, for Alex's sake, I hope he doesn't make the mistake of using this as a point to defend the validity of 8GB of VRAM. He'll catch an unbelievable amount of flak if he makes that mistake.
I don't think he's targeting VRAM; he's just discussing the load on the CPU affecting GPU performance. As Clukos has indicated here, there are draw calls being submitted that don't even do anything, costing 1ms of waste on a 4090. These are, I believe, what needs to be addressed in a patch.

I do agree it's high time to move on from 8GB of VRAM. But with virtual textures, this problem is largely contained, exceptions aside.

I think something is "up" with the VT system, and it's maybe worth exploring. But the idea that texture quality must be reduced to shrink the pool size, as opposed to reducing resolution, seems the opposite of what I understand. Unless they aren't doing virtual textures, in which case the need for more VRAM makes a lot of sense.

Here is a description of Unity's implementation:

Streaming Virtual Texturing (SVT) is a feature that reduces GPU memory usage and texture loading times when you have many high resolution textures in your Scene. It splits textures into tiles, and progressively uploads these tiles to GPU memory when they are needed.

SVT lets you set a fixed memory cost. For full texture quality, the required GPU cache size depends mostly on the frame resolution, and not the number or resolution of textures in the Scene. The more high resolution textures you have in your Scene, the more GPU memory you can save with SVT.

SVT uses the Granite SDK runtime. The workflow requires no additional import time, no additional build step, and no additional streaming files. You work with regular Unity Textures in the Unity Editor, and Unity generates the Granite SDK streaming files when it builds your project.
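
To make the fixed-memory-cost idea above concrete, here is a rough sketch of how a virtual-texture tile cache behaves. This is not Unity's or Naughty Dog's actual code (the class, tile size and budget are all invented for illustration); it just shows why the resident set tracks what the current frame samples rather than the total amount of texture data in the level.

```python
# Hypothetical sketch of a fixed-budget virtual-texture tile cache.
# Not any engine's real API; the names, tile size and budget are invented
# purely to illustrate why resident memory tracks what the frame samples.

from collections import OrderedDict

TILE_BYTES = 128 * 128            # one BC-compressed 128x128 tile at 1 byte/texel (~16 KB)
CACHE_BUDGET = 256 * 1024 * 1024  # fixed GPU cache size, e.g. 256 MB


def load_tile_from_disk(tile_key):
    # Placeholder for the streaming backend (Granite files in Unity's case).
    return bytes(TILE_BYTES)


class TileCache:
    def __init__(self, budget=CACHE_BUDGET):
        self.budget = budget
        self.resident = OrderedDict()  # (texture_id, mip, x, y) -> tile data

    def request(self, tile_key):
        """Called for every tile the current frame samples (feedback pass)."""
        if tile_key in self.resident:
            self.resident.move_to_end(tile_key)  # mark as recently used
            return self.resident[tile_key]
        data = load_tile_from_disk(tile_key)     # stream the missing tile in
        self.resident[tile_key] = data
        # Evict least-recently-used tiles until we fit the fixed budget again.
        while len(self.resident) * TILE_BYTES > self.budget:
            self.resident.popitem(last=False)
        return data
```

Because the number of tiles a frame can touch is bounded by the output resolution, a VT-based renderer can hold texture VRAM roughly constant even as asset variety grows; without VT, every unique texture that might be visible has to sit in memory at full quality, which is where the ballooning comes from.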
 
Nobody is defending 8GB. But VRAM doesn't determine a good graphics card. Here are two examples in which GPUs with 16GB+ are not faster than GPUs with 10GB to 12GB:
[Metro Exodus RT benchmark chart, 3840x2160]

[Cyberpunk 2077 RT benchmark chart, 1920x1080]

Both your examples are basically last-gen games with RT, though? I didn't say VRAM determines good graphics, but it does determine texture quality and asset variety. For example, the 3080 10GB is still faster than 12GB and even 16GB GPUs depending on your texture settings in this game.
 
The Callisto Protocol is a last-gen game and, tbf, I don't think A Plague Tale: Requiem looks good at all. I played it on Xbox, played it on PC, and I was not impressed. It's alright. As to whether the asset variety in TLOU justifies 8GB, I can't say until I understand how the game is architected. Many are operating from the assumption that it's inefficient. I don't know that to be the case and so I cannot comment on it.
That it is a last-gen game is irrelevant. Horizon Forbidden West is one of the best-looking games on the market with incredible assets, yet it runs on a PS4 just fine and still looks good there as well.

For TLOU Part I to justify this kind of performance it would have to look above everything else right now, but it simply doesn't. The fact that it has roots on the PS3 and features small, constrained environments with no RT to speak of makes its performance profile on PC suspect at best.
 
That it is a last-gen game is irrelevant. Horizon Forbidden West is one of the best-looking games on the market with incredible assets, yet it runs on a PS4 just fine and still looks good there as well.

For TLOU Part I to justify this kind of performance it would have to look above everything else right now, but it simply doesn't. The fact that it has roots on the PS3 and features small, constrained environments with no RT to speak of makes its performance profile on PC suspect at best.
You're free to speculate, but Naughty Dog claimed to have rebuilt this game in part to become more familiar with and take advantage of the PS5. I don't know why it runs like this at all. Maybe if Alex has the opportunity to talk to the devs who worked on the project, we will know more. Until then, I can't comment.
 
That it is a last-gen game is irrelevant. Horizon Forbidden West is one of the best-looking games on the market with incredible assets, yet it runs on a PS4 just fine and still looks good there as well.

For TLOU Part I to justify this kind of performance it would have to look above everything else right now, but it simply doesn't. The fact that it has roots on the PS3 and features small, constrained environments with no RT to speak of makes its performance profile on PC suspect at best.
Texturing is worth exploring on PC. I think they could not port the VT system from the PS5, so there's no VT on PC, which dramatically balloons VRAM usage.
 
You're free to speculate, but Naughty Dog claimed to have rebuilt this game in part to become more familiar with and take advantage of the PS5. I don't know why it runs like this at all. Maybe if Alex has the opportunity to talk to the devs who worked on the project, we will know more. Until then, I can't comment.
I don't doubt that they did, but you can see the PS3 constraints that were pushed further back in the PS4 game. For one, TLOU Part II features much, much larger environments uninterrupted by doors or other obstacles. I had a chuckle when I was playing TLOU Part 1 and saw Joel open a garage door that was completely out of place, knowing full well it was put there to separate the zones and prevent backtracking. You wouldn't see something like this in a PS5 game. It was a little trick used to circumvent hardware limitations at the time.

This, however, strengthens the point I'm making: the zones in TLOU Part 1 are quite small because of its PS3 roots, and this hasn't changed on the PS5. Hence I completely fail to imagine why it's so demanding when a better-looking game with far larger environments scales down to the little PS4, with around 5.5GB of available VRAM, without butchering the art and visuals.
 
Both your examples are basically last-gen games with RT, though? I didn't say VRAM determines good graphics, but it does determine texture quality and asset variety. For example, the 3080 10GB is still faster than 12GB and even 16GB GPUs depending on your texture settings in this game.
All assets have to scale with the resolution. It doesn't make sense to use 4K textures and objects at 1080p.

There is another current-gen PS5 game on PC: Miles Morales. With ray tracing it uses only 9.3GB on a 4090 at 1080p (TLoU with Ultra needs 13GB), and it runs fine on a 3070 with 8GB: https://en.gamegpu.com/action-/-fps-/-tps/marvel-s-spider-man-miles-morales-test-gpu-cpu
It's an open-world game where the player can go in any direction at any time, with better image quality than the PS5 version - and it runs just fine with 10GB at 1440p and 12GB at 4K.
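
To put rough numbers on the "assets have to scale with resolution" point, here's a generic back-of-the-envelope calculation. It's not tied to any particular game, and it assumes BC7 compression at 1 byte per texel just to keep the arithmetic honest:

```python
# Back-of-the-envelope texture memory for BC7 (1 byte per texel),
# including the ~33% overhead of the full mip chain below the top mip.

def texture_mb(top_mip_size, bytes_per_texel=1.0, with_mip_chain=True):
    base = top_mip_size * top_mip_size * bytes_per_texel
    return base * (4 / 3 if with_mip_chain else 1) / (1024 ** 2)

for size in (4096, 2048, 1024):
    print(f"{size}x{size} BC7: ~{texture_mb(size):.1f} MB")

# Prints roughly 21.3, 5.3 and 1.3 MB. Dropping the top mip at 1080p cuts
# each texture's cost by ~75%, which is why the same asset set can be made
# to fit very different VRAM budgets if the streamer picks mips sensibly.
```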
 
All assets have to scale with the resolution. It doesn't make sense to use 4K textures and objects at 1080p.

There is another current-gen PS5 game on PC: Miles Morales. With ray tracing it uses only 9.3GB on a 4090 at 1080p (TLoU with Ultra needs 13GB), and it runs fine on a 3070 with 8GB: https://en.gamegpu.com/action-/-fps-/-tps/marvel-s-spider-man-miles-morales-test-gpu-cpu
It's an open-world game where the player can go in any direction at any time, with better image quality than the PS5 version - and it runs just fine with 10GB at 1440p and 12GB at 4K.
I think an easy way to check is to just look at texture quality at each setting and the corresponding VRAM usage for TLOU1: a simple snapshot at the same spot with various texture settings. Then do the same test against a game that we know for sure uses virtual texturing, and compare and contrast the differences.
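
If someone wants to actually run that comparison, here's a rough way to capture the VRAM side of it. It assumes an Nvidia card with nvidia-smi on the PATH; the script and polling interval are just my own quick sketch, and the reading is total GPU memory in use, so treat it as a relative comparison rather than the game's exact allocation:

```python
# Quick logger: note the used-VRAM figure while standing at the same spot
# and toggling the texture-quality preset, then pair each reading with a
# screenshot. Assumes an Nvidia GPU and that nvidia-smi is on the PATH.

import subprocess
import time


def used_vram_mb():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])  # first GPU only


if __name__ == "__main__":
    while True:
        print(f"{time.strftime('%H:%M:%S')}  used VRAM: {used_vram_mb()} MB")
        time.sleep(5)
```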
 