Digital Foundry Article Technical Discussion [2023]

I want to couch this answer heavily: this is just my personal impression, not anything particular to do with job experience. There's also a lot of "it depends on your expectations for resolution, quality settings, etc." of course. That said, as games transition to treating PS5/XSX/XSS as the baseline and no longer supporting the older consoles, 8GB starts to feel a bit too constrained... especially if you're used to running a high res desktop with multiple monitors and a zillion Chrome tabs and Discord and other Electron apps all competing for resources. Windows is not great at sharing VRAM in oversubscription situations, so you really want to avoid them in the first place. I was sort of shocked when they released the 3080 with only 10GB of VRAM. If you're getting a high end GPU I'd really want at least 12GB, and ideally 16GB if you can.

Obviously PC games will still have to provide settings that work on GPUs with less VRAM, but IMO it sucks to have to sacrifice quality due to VRAM rather than the GPU's core speed. While things like textures, geometry and shadow maps are increasingly virtualized so resident memory is more fixed, they still need a pool of memory to operate in. Add on top of that all the new stuff coming in that takes quite a bit of room (ray tracing structures, GI structures, more complex animation and simulation, etc.) and I don't think we're going to go backwards on VRAM needs any time soon.
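As an illustrative aside on the oversubscription point (not from the post above): on Windows the VRAM budget an application actually gets can be queried through DXGI rather than assumed from the card's nominal capacity. Below is a minimal C++ sketch using IDXGIAdapter3::QueryVideoMemoryInfo; the enumeration loop and printout are just an assumed harness (link against dxgi.lib).

```cpp
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        ComPtr<IDXGIAdapter3> adapter3;
        if (FAILED(adapter.As(&adapter3)))
            continue;

        // QueryVideoMemoryInfo reports the OS-granted budget, which can sit well
        // below physical VRAM when other applications are competing for it.
        DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
        if (SUCCEEDED(adapter3->QueryVideoMemoryInfo(
                0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info)))
        {
            printf("Adapter %u: budget %llu MB, current usage %llu MB\n",
                   i,
                   (unsigned long long)(info.Budget / (1024 * 1024)),
                   (unsigned long long)(info.CurrentUsage / (1024 * 1024)));
        }
    }
    return 0;
}
```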

I find it hilarious that some of us were getting ridiculed for saying the exact same thing just a few weeks back and the PC diehards were up in arms. Now that Andrew here and Oliver from DF are providing the same message, it seems the value of a contribution in this forum depends much more on the messenger than on the message and its substance. Haha, fine with me, so long as we ultimately allow facts to dictate the discussions. So yes, with the newer consoles, and specifically thanks to their I/O and memory architecture with dedicated hardware decompression, 10GB cards and below will ultimately prove to be a not-so-great investment for the duration of the generation. And if you PC gamers are dead set on maintaining the console/PC relationship we had in prior generations, I suggest you wait until these GPU vendors (well, Nvidia specifically) get their acts together. Also, consider the likely release of Pro consoles as early as next year before you spend a silly amount of money on a card that ultimately gives you lower than expected performance measured against its console counterpart.

It's about time you all stop blaming developers and "lazy PC ports" and instead amp up the pressure on these vendors who are selling high-compute cards with unbalanced memory configurations.
 
Judging by this post it's quite clear why you were getting ridiculed, and why Andrew isn't.

Developers and publishers (whoever deserves the blame) are definitely responsible for a lot of these issues... we literally see it being addressed in some of the latest releases, like TLOU Part 1 for example. We've also seen other high end games which perform just fine, like CP2077. If Naughty Dog had done things correctly in the first place, it wouldn't have needlessly added to this conversation. Same with Hogwarts and others, whose VRAM issues have largely been addressed post launch.

What we're learning about VRAM requirements in the latest games is that consoles have finally gotten to the point where developers can't just transplant their console code over to PC and count on ample performance and memory headroom to make it work... they actually have to put some effort into optimization and resource allocation.
 
The reality is that devs need to optimize more. This is true for the PC conversions coming out now and especially when devs are fully getting the most out of the current gen consoles.

Devs are too used to PC being so far ahead that optimization for PC is a lot easier. Consoles gave them that out in both the 7th gen (for being harder to code for in general, and because the rate of PC upgrades was more significant in that gen) and the 8th gen, where the weakness of the Jaguar CPUs and the HDDs always allowed for a wide gap between PC and console.

But consoles have caught up and are much more balanced machines, lacking neither GPU power, CPU power, RAM capacity, bandwidth, nor storage speed.

I'm sure it's harder for devs now. But even as a console-only player, I think PC users deserve better. They pay for games just like anyone else.
 
As Remij says - the number of post-launch patches radically improving the visual quality of games on 8 GB GPUs shows that the issue lies on the development side. PC versions obviously get the short end of the stick for time and money investment (with Xbox not too far behind), and the PC crowd, myself included, is not going to just sit there and be happy with poor quality.

PC games need to scale regardless of what the consoles are - that is the point. These console-focused multiplatform games are not Crysis or something; they are not such utterly groundbreaking, unheard-of technical masterpieces that they necessitate only the best and most expensive hardware to run and look good. Getting angry at manufacturers long after the fact is a useless endeavour and does nothing to change the situation. Demanding better ports from ISVs actually changes things. Imagine if, instead of focusing on #Stutterstruggle as a problem, I had just made videos about how MS, NV, Intel and AMD are to blame for having different hardware. That would bring zero progress anywhere in the next 5 years - games would still launch with even more PSO issues than before, and Unreal most definitely would not have had the initiative to improve it so dramatically in UE5.

The problem is an ISV problem.
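To illustrate the PSO point in general terms (this is not UE5's actual precaching mechanism, just a minimal D3D12 sketch of the idea): if every unique pipeline state is compiled up front during loading, the driver's shader compilation cost never lands in the middle of a frame. The device and the collected state descriptions are assumed to come from the application's own asset pipeline.

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <future>
#include <vector>

using Microsoft::WRL::ComPtr;

// Hypothetical loader-side PSO warm-up: `device` and `psoDescs` are assumptions
// supplied by the application (e.g. descriptions collected on a previous run).
std::vector<ComPtr<ID3D12PipelineState>> PrecompilePipelines(
    ID3D12Device* device,
    const std::vector<D3D12_GRAPHICS_PIPELINE_STATE_DESC>& psoDescs)
{
    // Kick off one async compile per unique state description instead of
    // letting the first draw that needs it trigger the compile mid-frame.
    std::vector<std::future<ComPtr<ID3D12PipelineState>>> jobs;
    jobs.reserve(psoDescs.size());
    for (const auto& desc : psoDescs)
    {
        jobs.push_back(std::async(std::launch::async, [device, pDesc = &desc]()
        {
            ComPtr<ID3D12PipelineState> pso;
            // If this call only ever happens here, during loading, the shader
            // compilation cost is paid once and cached by the driver.
            device->CreateGraphicsPipelineState(pDesc, IID_PPV_ARGS(&pso));
            return pso;
        }));
    }

    std::vector<ComPtr<ID3D12PipelineState>> pipelines;
    pipelines.reserve(jobs.size());
    for (auto& job : jobs)
        pipelines.push_back(job.get()); // block until all PSOs are ready
    return pipelines;
}
```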
 
I'll definitely agree optimization passes could use some more time and effort. But even on console it's not as if certain publishers aren't launching with problems even in the optimal scenario. So it's not just a problem of focusing on specific hardware for optimization, but of not enough optimization in general to go around. I think things will probably become worse if the Pro consoles actually materialize, because now you're talking about 5 SKUs on console alone, and that's when we just barely got rid of cross-gen.

Do Sony and MS believe this will lead to better products?
 

Exactly, it's the complexity of modern games coupled with the number of SKUs and performance modes that is greatly contributing to this problem. There's just not enough optimisation time to go around, and PC will naturally be at the end of the queue in that regard in terms of time per hardware configuration.

I do think you have a valid point also on the performance front, where certainly on the CPU side the new consoles are much closer to PCs than the previous gen, especially when hardware decompression is being used in heavy streaming scenarios as we've seen in the likes of TLOU and Forspoken - which GPU decompression and DirectStorage should help to address.
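For context on what that looks like in practice, here is a rough C++ sketch of a DirectStorage 1.1 read using GDeflate GPU decompression, as I understand the API. The device, destination buffer, fence, file path and sizes are all assumptions supplied by a hypothetical caller; error handling is omitted and dstorage.lib must be linked.

```cpp
#include <dstorage.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void LoadCompressedAsset(ID3D12Device* device,
                         ID3D12Resource* destBuffer,
                         ID3D12Fence* fence,
                         UINT64 fenceValue)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"assets/textures.gdeflate", IID_PPV_ARGS(&file)); // hypothetical path

    DSTORAGE_QUEUE_DESC queueDesc = {};
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Capacity = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.Device = device;

    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    DSTORAGE_REQUEST request = {};
    request.Options.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    // GDeflate payloads are decompressed on the GPU, so the CPU never touches
    // the compressed bytes - the closest PC analogue to the consoles'
    // dedicated hardware decompression path.
    request.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE;
    request.Source.File.Source = file.Get();
    request.Source.File.Offset = 0;
    request.Source.File.Size = 4 * 1024 * 1024;          // compressed size (assumed)
    request.Destination.Buffer.Resource = destBuffer;
    request.Destination.Buffer.Offset = 0;
    request.Destination.Buffer.Size = 16 * 1024 * 1024;  // uncompressed size (assumed)
    request.UncompressedSize = 16 * 1024 * 1024;

    queue->EnqueueRequest(&request);
    queue->EnqueueSignal(fence, fenceValue); // wait on this fence before using the data
    queue->Submit();
}
```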

On the memory front, of course if you're going for a current gen high end GPU you don't want less than 12GB, because the expectation for current gen high end PC GPUs is to be able to go well beyond console settings. And higher settings require more memory.

I don't think that 8GB (let alone 10GB) should be a serious limiter in the class of GPUs it's used on though, not in a reasonably optimised game - as we've seen from all the games where this was initially an issue and held up as 'evidence' of the obsolescence of 8GB, and which were later resolved. That amount of memory does, as you say, require a level of optimisation from developers that in the current generation they may struggle to find time for.
 
We need to wait a little to be sure what the VRAM requirements for this generation will be after a very long cross-gen phase. TLOU Part 1 is not cross-gen, but it's not as if it will be the pinnacle of what gets done this gen.

Devs can do better, but the IHVs can too, and it begins with a 16 GB 4060 Ti...
 
Yea, I really wish developers would cool off with all these different modes for consoles, which, in my opinion, are getting to the point where they take away from the simplicity that is the main allure. You have performance, quality, RT, 40fps, 120Hz, VRR, HDR... and even VR...

I'm sure I'm in the vast minority.. but it be what it do. I'd like to see them just focus their games around the performance profile they think is the right choice for their game, and cut out the other stuff... instead focusing on polish and optimization.

Also, it would be nice if they only released one console SKU per generation like back in the old days, and not these mid-gen upgrades. I think they run the risk of seriously diminishing the impact of their next true console iterations, because the generational increases in power will shrink quickly. You can only do so much in the $400-600 range.

Bring it back to basics. Develop one console. Develop one mode for your game. And just polish/optimize the hell out of them.
 
I think that console users, having tasted standardized 60fps modes, will definitely be unreasonable about devs having a breaking point. It's an "expectation" now.

I was one of the minority of people who were against the graphical options introduced in the last iterative cycle, and especially at the start of this gen, because it was easy to see that it would become a problem once devs didn't have that kind of headroom to fall back on to make it "easy" to unlock the fps and gain automatic performance.

I also made the argument then that, whether it was 60 or 30, it was critical that devs set expectations early on for a singular performance metric. But then Insomniac added a 60fps mode for Spider-Man. Then a 60fps balanced mode. Then a 40fps mode. It's surely easier if you're only working on one SKU. But that doesn't apply to third party developers.

More powerful upgrades will only further complicate the situation for not much gain.
 
I would offer that graphical options for console games aren't a wholly new thing. A bunch of OG PlayStation games had options for this, including G-Police, where you have settings for draw distance and frame rate. As current generation consoles support display output at 1080p, 1440p and 4K, games having options for quality vs performance were an inevitable development.

I don't see this trend reverting.
 