Sony PlayStation cross-platform game strategy

If I were to take a guess, it'd be the almost infinite combinations (probably not close to infinite, but a prohibitive amount) of PC hardware and drivers. Especially drivers, as any driver change can alter how a shader needs to be compiled for a given graphics card. So, changes by NV or AMD to make game X run faster can alter how a shader needs to be compiled for more than just game X.

But it's only the GPU, right? So nVidia and AMD can provide a service where devs upload their game, it gets built across 300 different GPUs, and the shaders cached for download. When the game or drivers get updated, the shader library is updated.

I mean, we have the same work being done by millions of computers. How many 3060 Tis are compiling shaders locally? It could be bit-torrented across PCs as part of the driver deployment software from nVidia/AMD?
 
But it's only the GPU, right? So nVidia and AMD can provide a service where devs upload their game, it gets built across 300 different GPUs, and the shaders cached for download. ...

Oh, I'm not arguing against doing something like that, just postulating some reasons why it might not be happening.

There are also other factors like dev time, etc. What we need, IMO, is some industry standard for aggregating and distributing pre-cached shaders. But then getting all the major players to agree to a standard for that isn't a small undertaking.

Regards,
SB
 
All we REALLY need... is for developers to play the f out their games and properly create PSO caches and then include the full cache data with the game so we can compile every damn thing upon initial load...

This needs to just become industry standard... and accepted as a fact of life by PC gamers.

I swear this is a developer logistics issue more than anything technical. It's just too much of a hassle currently for developers to do a proper job of, IMO.
 
All we REALLY need... is for developers to play the f out their games and properly create PSO caches and then include the full cache data with the game ...

It's a bit unrealistic to expect every developer to have every single possible GPU that is on the market today (discrete and integrated) and then run each of them through the game with multiple different driver sets and then redo that every single time NV, AMD or Intel release a driver (both release and BETA).

Even if you limit it to say the last 5 years of GPUs and drivers plus let's say the next 5 years worth of GPUs and drivers as they are released, that's a LOT of iterations and a lot of time. But then that also removes one of the best things about PC gaming. The fact that you can run even the newest games on some really REALLY old hardware. So, now you'd be limited to running the game on the past 5 years and next 5 years worth of hardware and drivers and if you fall outside of that, you're SOL. Assuming smaller developers can even afford to do all of that.

What would be more useful but still potentially really expensive is having companies that can do that for a developer ... at a cost. But again, smaller developers may not be able to afford it.

"Crowdsourcing" seems to be the best option for precaching, but then if you're an early adopter, hello stutters if you still don't pre-generate the cache before running the game.

IMO, PC gaming developers should just bite the bullet and pre-compile the shaders at the start of the game. They can potentially check against a database to see if that hardware+driver combination already exists and download it instead of compiling it. Of course, then if you are a smaller developer you have to maintain a repository as well as pay any bandwidth fees incurred from downloads of that cache.

Of course, that's where a storefront can step in and host the shader caches per game. But it's possible bandwidth fees from that might be passed along to the developers anyway if it results in excessive extra bandwidth. IE - a user frequently downloading and installing drivers which may or may not trigger a need for a new shader cache.

Everything is always simple when looking from the outside in, but if it was so simple it'd be a solved problem.

IMO, the best and simplest option is to just pre-compile the shader cache when the game starts if it is needed, or, if a graphics setting change necessitates a re-compile, do it when applying those settings.
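
The download-or-compile idea above can be sketched with a content-addressed key derived from the hardware, driver, and game build. This is a toy Python sketch of the concept only; `repository`, the key scheme, and the stand-in "compile" step are all illustrative assumptions, not any real vendor or storefront service:

```python
import hashlib

def shader_cache_key(gpu_model: str, driver_version: str, game_build: str) -> str:
    """Derive a lookup key for a precompiled shader cache.

    A change to any of GPU, driver, or game build invalidates the cache,
    so all three feed into the key.
    """
    blob = f"{gpu_model}|{driver_version}|{game_build}".encode()
    return hashlib.sha256(blob).hexdigest()

def get_shader_cache(gpu_model, driver_version, game_build, repository):
    # `repository` is a hypothetical dict standing in for a hosted cache service.
    key = shader_cache_key(gpu_model, driver_version, game_build)
    cached = repository.get(key)
    if cached is not None:
        return cached  # download path: someone already compiled this combination
    compiled = f"compiled-for-{key[:8]}"  # stand-in for a local compile pass
    repository[key] = compiled            # upload so the next user can skip compiling
    return compiled
```

The point of hashing all three inputs together is exactly the invalidation problem discussed above: a new driver release silently changes the key, so stale caches are never served.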

Regards,
SB
 
It's a bit unrealistic to expect every developer to have every single possible GPU that is on the market today ...

Wouldn't it be easiest to just have AI run through the game on different graphics cards over and over and over again ?

I'd think it be easy for Nvidia/ AMD / intel to just set up servers to generate shader caches with and let these companies maintain the caches pc / linux and whatever else might exist out there.
 
Wouldn't it be easiest to just have AI run through the game on different graphics cards over and over and over again? ...

Indeed, with all this AI/ML power these days, something should be able to assist with this.
 
It's a bit unrealistic to expect every developer to have every single possible GPU that is on the market today ...
I'm not sure I understand what you're saying. I'm saying the same thing you are..

Devs need to play the games to collect PSO data to know which shaders/materials require which PSOs, right? That's what allows precompilation in the first place, which in turn lets us precompile upon boot for our respective systems.

The more PSOs they cover, by playing the game, the better and more we can precompile.

What I'm saying is what Unreal Engine already allows for... we just need devs to do it.

UE5 will hopefully make this automated for developers, and will actually compile on background threads upon asset load into memory, compiling any PSOs required, instead of just in time upon draw call.

At least that's my understanding.
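
The background-compile idea described here (compile recorded PSOs off the critical path instead of just-in-time at draw) can be illustrated with a worker pool draining a queue of recorded pipeline descriptions. This is a toy Python sketch, not Unreal's actual implementation; `compile_pso` merely stands in for the expensive driver-side compile:

```python
import queue
import threading

compiled = {}
lock = threading.Lock()

def compile_pso(desc: str) -> str:
    # Stand-in for the expensive driver-side pipeline compile.
    return desc.upper()

def worker(q: queue.Queue):
    while True:
        desc = q.get()
        if desc is None:       # sentinel: no more work for this thread
            q.task_done()
            return
        result = compile_pso(desc)
        with lock:             # dict writes guarded across workers
            compiled[desc] = result
        q.task_done()

def precompile(descs, n_workers=4):
    """Compile every recorded PSO description on background threads."""
    q = queue.Queue()
    threads = [threading.Thread(target=worker, args=(q,)) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for d in descs:
        q.put(d)
    for _ in threads:
        q.put(None)            # one sentinel per worker
    q.join()                   # block until all items are marked done
    for t in threads:
        t.join()
    return compiled
```

During actual gameplay, every draw whose PSO is already in `compiled` skips the stutter-causing compile entirely — which is the whole argument for shipping complete PSO cache data with the game.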
 
Yea, which is why it's imperative that those who understand what's happening speak up loudly.
Speaking, posting and complaining won't change anything. Actions change things.

This is why it's not going to change. Gamers can't, or don't want, to change. They sure do like complaining though.
 
Speaking, posting and complaining won't change anything. Actions change things. ...
Speaking, posting, and complaining are actions....

It's literally what gamers do to get things changed...

If you want to act like those 3 things have never had an effect on this industry.. then I don't know what else to say.
 
Speaking, posting, and complaining are actions.... If you want to act like those 3 things have never had an effect on this industry.. then I don't know what else to say.

They are actions, but not actions that bring about change. If gamers want things to change for the better then there is only one way: the one that deprives those adopting poor practices of financial remuneration. I remember the outcry of those pre-ordering Watch Dogs and those pre-ordering Cyberpunk 2077.

But if you can cite tangible evidence of people posting and complaining that has had an effect on the industry, then I welcome seeing it.
 
But if you can cite tangible evidence of people posting and complaining that has had an effect on the industry, then I welcome seeing it.
 
Not to mention:

or how about their offline BC policies?

How about Sony and cross-play?

Did someone mention NFTs?

I shouldn't even need to say more..

When a narrative starts forming online around your game engine and negative sentiment builds up, you take notice and act. Which is what Epic (I believe) are doing now with Unreal Engine 5.1. They simply can't ignore this growing issue any longer. They even mention "now that DX12 and Vulkan are the focus..." That means: we have to do something, or literally all the games which use our engine will have this very easily identifiable issue, and that will make us look bad, and that will eventually cause sales to drop.
 
But if you can cite tangible evidence of people posting and complaining that has had an effect on the industry, then I welcome seeing it.
Every MS XBOne u-turn... There's definitely empirical evidence that grumbling has changed things, but it's probably not easy for someone to collate that info, and not worth the effort outside of a Masters thesis. ;)

Edit: others already posted more examples.
 
I mean, we even have examples of developers directly listening to consumers, and acting on it, about these issues. Psychonauts 2 initially had terrible shader compilation stuttering, some of us complained on the forums, and the developers said they'd look into it, and it was later fixed.

Same goes for The Ascent developers.

These are Unreal Engine 4 games btw. Complaining, posting, voicing yourself about these issues DOES get attention... eventually. Sites like Digital Foundry, and certain reviewers calling out this stuff, amplify it.

Look at Capcom and the RE Village stuttering DRM issue. Capcom didn't even respond to people calling out this issue initially... until hackers proved it was a solvable issue and that it was caused by DRM... sites like Digital Foundry called out Capcom.. and it was fixed.

Things get fixed PRECISELY at the moment when it makes X or Y look bad. And the stuttering in Unreal Engine games is beginning to make Epic look bad, because so many games use it and it's so widespread now. Thus the issue is getting attention.
 
What is it about DirectX that means Steam's option only works with Vulkan?

From the Shader Compilation on PC thread:

And now you just described what e.g. Steam is doing under the label "Shader Pre-Caching".

Except it's only functioning for Vulkan, and not DX12 API, since it's based entirely on layer injection, for which the used API needs to have a proper infrastructure in the first place.

Want that to happen for DX12 too? Start pestering Microsoft about adding a proper loader about 5 years back, because getting it to work with that black box after the fact is a little bit too late...

This suggests that there is perhaps an architectural issue with DX that makes inserting compiled caches more difficult than it should be.
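The layer-injection approach the quote describes can be illustrated abstractly: a caching layer exposes the same pipeline-creation interface as the driver, so it can be slotted between game and driver without the game changing, which is possible for Vulkan's layer infrastructure but not for DX12's. This is a toy Python sketch of the concept only, not real Vulkan, DX12, or Steam code; every name here is made up:

```python
class Driver:
    """Stand-in for the GPU driver; counts expensive compiles."""
    def __init__(self):
        self.compiles = 0

    def create_pipeline(self, desc):
        self.compiles += 1            # the expensive, stutter-causing step
        return f"pipeline({desc})"

class CachingLayer:
    """Wraps the driver with an identical interface, so the game
    calls it exactly as it would call the driver directly."""
    def __init__(self, driver):
        self.driver = driver
        self.cache = {}

    def create_pipeline(self, desc):
        if desc not in self.cache:
            self.cache[desc] = self.driver.create_pipeline(desc)
        return self.cache[desc]

    def warm(self, recorded_descs):
        # Replay recorded pipeline descriptions before gameplay starts,
        # so in-game calls become cache hits instead of compiles.
        for d in recorded_descs:
            self.create_pipeline(d)
```

The key property is that `CachingLayer.create_pipeline` has the same signature as `Driver.create_pipeline` — injection only works when the API gives you a sanctioned place to interpose, which is the quoted post's complaint about DX12.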

Dammit, forgot that Sackboy used UE4. Crap.

And then it also comes down to who holds the repository? Initially, you'd think Steam would be a logical point, but then not all gamers are on Steam. And even for games on Steam, most of the games that pre-compile shaders do so in their own launcher, so Steam may not even have direct access to them. Steam only has the legal right to redistribute the software package as provided to them by the developer.

Yeah, that's why I've never been too enthused about the possibility of Steam being the main avenue to deliver these potentially compiled caches.

I mean hell it's better than nothing, and if Steam's shader pre-caching could end up working with DX games it could at least be the driver to bring more attention to this (Oh your game is stuttering? Should have bought it on Steam!). But I think ultimately if/when this occurs for DX games, it's really going to have to be through Nvidia/AMD/Intel, or MS.

It doesn't look like we're close to this kind of solution regardless. Gotta first get the option to compile shaders at startup to be as expected as resolution settings.
 

Not to mention:

I was really looking for examples of this in the PC gaming space. Microsoft and Sony in the console space are in a very different situation than Studio X, Y and Z making PC games.

From what I've read this seems like a tricky problem to solve, and part of companies being willing to change plans, or fix problems, is there being a clear financial incentive to do so. I'm not seeing it here. Sure, it's annoying, but there would need to be compelling evidence that people are no longer buying games because of it.
 
The fact that Epic have built the PSO collection capability into UE5.1 right after scs has been getting a lot of media coverage (off the back of a lot of forum complaining) seems like a pretty solid example of it working.
I don't know what any of this means, sorry! Can you elaborate?
 
I don't know what any of this means, sorry! Can you elaborate?

I don't really understand the details of it myself but as far as I gather, Epic are planning to implement a new feature in the next iteration of Unreal Engine 5 which makes the process of creating a PSO cache during development much easier for developers (possibly automatic?). This in turn makes it much simpler to pre-compile shaders in the game, which will reduce or eliminate shader compilation stuttering (scs).

This is a change they've implemented seemingly in response to the recent media attention scs has been getting which as far as I can tell was started by our friend @Dictator on Digital Foundry (perhaps with some influence from particularly vocal forum members....) and picked up from there by other outlets like Linus Tech Tips.
 