Shader Compilation on PC: About to become a bigger bottleneck?

On the bright side, if it's really as heavy on the CPU as Alex has seen in the Matrix project, then it should help drive PC CPU sales forward and bring average CPU performance way up!

Obviously that's searching pretty hard for a silver lining. Personally, I'm much less worried about this. It works fine on the consoles, so with proper optimisation it should be fine on PC too. If you want 60fps, get a Zen 4!

Zen 2 and the 3700X have been around since 2019; people expecting to play 'next generation' games should probably have a CPU at least as capable as one of those, since they're older than the consoles. It's a good thing that it's not the GPU that's the problem in this UE5 sample, but the CPU.
And as John says, it's the PC of old again (the way it should be) :p
 
I think "as it should be" would be if it were actually maxing out the predominant architecture of the time to deliver an experience which has no peer - which in this case would mean heavily utilizing multicore CPUs.

From Alex's tests it's doing a rather poor job of that at the moment, still relying on single-threaded performance. A game engine that can't reach a sustained 60fps on the best PC hardware is anything but a positive for the PC, imo.

Again though, early days. The shader compilation issues are more of a concern for me as this is something that really should have been addressed in UE5, but hopefully the attention this issue has gotten will mean a partial solution will be in place when we get to actual shipping games.
 
I think "as it should be" would be if it were actually maxing out the predominant architecture of the time to deliver an experience which has no peer - which in this case would mean heavily utilizing multicore CPUs.

Crysis (2007) wasn't going for many/wide cores either; instead, clock speed/single-core performance mattered most, while GPUs scaled pretty well. Quite a bit like what we see in the UE5 developer sample, so John isn't really wrong in mentioning it.

From Alex's tests it's doing a rather poor job of that at the moment, still relying on single-threaded performance. A game engine that can't reach a sustained 60fps on the best PC hardware is anything but a positive for the PC, imo.

While that's true, there's also the fact that it doesn't really run all that well on consoles either: they top out at 30fps but dip way below that quite often (20fps or below; 15fps has been mentioned), at a native resolution we would have expected during the 7th generation of consoles (1080p). And that's for what Alex called the polished console version of the tech demo.
I see a lot of YT comments about this, and they definitely have a point; the view over there isn't that positive either.

Again though, early days. The shader compilation issues are more of a concern for me as this is something that really should have been addressed in UE5, but hopefully the attention this issue has gotten will mean a partial solution will be in place when we get to actual shipping games.

As Digital Foundry mentioned (I think it was John), they will have to, and they probably can (see Fortnite).
 
Crysis (2007) wasn't going for many/wide cores either; instead, clock speed/single-core performance mattered most,
Yes, as single-threaded performance was what x86 prioritized at the time of development, and it looked to stay that way going forward. That's not the case today and hasn't been for over a decade.
while GPUs scaled pretty well. Quite a bit like what we see in the UE5 developer sample, so John isn't really wrong in mentioning it.
Yes, in the context that it's bad. It should not be the case.
 
Do you think Crysis was developed in a year?

You've got a point. Still, when did dual cores come around? I had an A64 X2 in the latter half of 2005. It might not be a one-to-one comparison, but there are similarities for sure. Both UE5 and Crysis aim for clock speed instead of spreading workloads over more cores.
 
Both UE5 and Crysis aim for clock speed instead of spreading workloads over more cores.
But my point is that it's understandable that Crysis was limited by single-threaded performance, as the move to multicore only came about due to the failed scaling promises of the NetBurst architecture. Intel was claiming future Pentium CPUs would be running at 10GHz; only well into CryEngine's development was that potential future revealed as impossible. Multithreaded engines that could truly scale were still a relative rarity for years after Crysis debuted; it took quite a while for developers to really get their heads around the multicore era, and you can argue that some never have.

Unreal Engine 5, on the other hand, has been developed in an era where multithreaded performance has been paramount, with very meagre single-threaded gains for years and years and no indication that this will change substantially going forward. If it truly does end up being largely limited by single-threaded bottlenecks, that's not 'forward looking' - rather the opposite. It would be different if it were maxing out all the cores and producing something that had no equal on any other platform because of it.
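
To make "maxing out all the cores" concrete, here's a minimal sketch in plain standard C++ (nothing engine-specific; update_entity is a made-up stand-in for per-entity work like animation or physics) of the kind of work-splitting a modern engine is expected to do:

```cpp
#include <algorithm>
#include <cstddef>
#include <future>
#include <thread>
#include <vector>

// Made-up stand-in for per-entity work (animation, physics, AI...).
void update_entity(float& state) { state = state * 0.99f + 0.01f; }

// Split a frame's independent work across every hardware thread instead
// of walking the whole array on one core.
void parallel_update(std::vector<float>& entities)
{
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = (entities.size() + workers - 1) / workers;

    std::vector<std::future<void>> jobs;
    for (unsigned w = 0; w < workers; ++w) {
        const std::size_t begin = std::min<std::size_t>(w * chunk, entities.size());
        const std::size_t end   = std::min(begin + chunk, entities.size());
        jobs.push_back(std::async(std::launch::async, [&entities, begin, end] {
            for (std::size_t i = begin; i < end; ++i)
                update_entity(entities[i]);
        }));
    }
    for (auto& j : jobs) j.wait(); // the frame can't proceed until every chunk is done
}
```

Real engines use persistent job systems with work stealing rather than spawning futures every frame, but the principle is the same: a frame's independent work lands on every core, not just one.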

It bears mentioning as well that performance expectations have progressed since the original Crysis, with even console games often having 120fps options, and PC gamers complaining when a game is limited to 60fps, let alone struggling to reach it.

Again though, very early days in UE5 so who knows how actual games will perform, but I think you're taking Alex's comment that this is reflective of Crysis 1 as a positive, when I think he was just noting the irony.
 

OK well, let's hope Epic isn't this ineffective with their newest engine. And as you say, it could just be early days.
 
Holy shit! It's often drawing more than 400 watts for the GPU alone :mad:

EDIT: Though on the bright side, you probably don't need to turn on the heater during the winter; that's balanced out by your A/C needing to work overtime in summer.

That's when running a halo product like the 3090 Ti. A fast CPU and a mid-range GPU will draw much less than that card while delivering what the premium consoles put on screen.
 

Shader compilation discussion in this week's DF Direct...

I'll spare the forum another lengthy dissertation about this issue, which I normally write whenever new confirmation of games having it comes to light... It does sound like Alex is getting tired... much like me. However, now more than ever, voices need to get louder to get through to whoever needs to hear it.

Richard has the right idea... I think many of the big influencers and techtubers out there need to band together and start hammering this issue home. Regardless of whether the fault lies with developers, engine providers, or IHVs... they all play a part in this mess, and they all need to start working together to fix it.
 
NV/AMD should make some noise about it... and of course everyone else (even Linus Tech Tips). This could probably be fixed if everyone wanted to; it's that they don't care. People buy the games anyway. If that were to stop, this problem would suddenly become a priority.
 
NV/AMD should make some noise about it... and of course everyone else (even Linus Tech Tips).
Linus actually did do a short video on this recently with DF's help. It wasn't that in-depth, of course, but that's kind of my beef with so many of these PC 'gaming' channels: countless videos about cooling your 5%-overclocked CPU, and seemingly little understanding of, or interest in, the actual experience of PC gaming. With a decade+ of experience, why did Linus even need to consult DF on this?

It's great that DF is doing this of course, but so much of the analysis of this stuff falls on Alex's shoulders, when we've had established PC DIY channels around for years that so rarely delve into the nitty-gritty of it.
 

The general public doesn't seem to care, and that's the core problem. Once people stop buying software with these issues, perhaps the industry will prioritize the problem a bit sooner.
 
I think most don't even know the source of the problem, for one, and these kinds of problems don't usually arrive in such rapid succession - we've had so many releases in a short window that have had it. I remain hopeful, though, that Alex's diligence and his upcoming video focusing on shader compilation in general will continue to mount pressure on Epic to devote more attention to building better tools that highlight this potential pitfall for devs, and will push frametime consistency and the importance of the out-of-the-box experience with PC enthusiast youtubers. Just slapping some 1% lows on GPU graphs when you're doing a new card review isn't enough, imo. Part of the reason the PC gaming public as a whole doesn't care is that these popular channels don't educate their viewers on this stuff.
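
For what "more than 1% lows" might look like: a rough standalone C++ sketch (the 50 ms hitch threshold is an arbitrary number of my choosing) that summarises a captured frametime log with an average, a 1% low, and an explicit count of hitch frames, since a handful of long compilation stalls can hide behind decent-looking aggregate figures:

```cpp
#include <algorithm>
#include <cstdio>
#include <functional>
#include <vector>

// Summarise a captured frametime log (ms per frame) with an average, a
// "1% low", and an explicit hitch count. The 50 ms threshold is an
// arbitrary choice for illustration.
void summarise(std::vector<double> ft_ms, double hitch_ms = 50.0)
{
    if (ft_ms.empty()) return;
    std::sort(ft_ms.begin(), ft_ms.end(), std::greater<double>()); // slowest first

    double total = 0.0;
    std::size_t hitches = 0;
    for (double t : ft_ms) {
        total += t;
        if (t > hitch_ms) ++hitches; // each of these is a visible stutter
    }

    // "1% low" = average of the slowest 1% of frames, expressed as FPS.
    const std::size_t worst_n = std::max<std::size_t>(1, ft_ms.size() / 100);
    double worst_total = 0.0;
    for (std::size_t i = 0; i < worst_n; ++i) worst_total += ft_ms[i];

    std::printf("avg %.1f fps | 1%% low %.1f fps | frames over %.0f ms: %zu\n",
                1000.0 * ft_ms.size() / total,
                1000.0 * worst_n / worst_total,
                hitch_ms, hitches);
}
```

Two runs can share the same averages and still feel completely different; the hitch count is what separates them.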

I was told in another thread that DF doesn't have anywhere near the influence of publishers' in-house QA testers though, so I guess this is all moot. :)
 
Played a new game. I knew which engine it was as soon as it booted up.

I've decided I'm going to start *ahem* compiling clips of all the Unreal Engine games I have where you do anything for the first time and there are big hitches/stutters, and I'm going to make a YouTube video of it. Nothing but shader compilation hitching and loading stutter for 10 - 20 minutes (maybe more). Shoot a gun? Hitch. Barrel explodes? Hitch. Break a wooden crate? Hitch. Use a magic spell? Hitch. Character jumps? Hitch. Spin the camera? Hitch. Open the menu? Hitch. Checkpoint reached? Hitch. Cutscene starts? Hitch. Camera cut? Hitch. New character? Hitch... and on and on it will go... as it does.
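
(For anyone wondering why it's always the *first* time that hitches: conceptually it's lazy compile-on-miss. A toy sketch - Pipeline and compile_pipeline here are hypothetical stand-ins, not any real engine's API - but the shape is real:)

```cpp
#include <string>
#include <unordered_map>

// 'Pipeline' and 'compile_pipeline' are hypothetical stand-ins, but the
// pattern is real: state compiled lazily on first use stalls the render
// thread mid-frame.
struct Pipeline { /* compiled GPU state would live here */ };

Pipeline compile_pipeline(const std::string& /*key*/)
{
    // In a real renderer this is the driver compile: often tens or
    // hundreds of milliseconds the first time a state combo is seen.
    return Pipeline{};
}

std::unordered_map<std::string, Pipeline> g_cache;

const Pipeline& get_pipeline(const std::string& key)
{
    auto it = g_cache.find(key);
    if (it == g_cache.end()) {
        // Cache miss: the session's first grenade, spell, or camera cut
        // lands here, and the frame stalls until the compile returns.
        it = g_cache.emplace(key, compile_pipeline(key)).first;
    }
    return it->second;
}
```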

Every damn game is using this engine now too...

I just don't understand how developers can release games knowing they do this. And if developers don't know they do this, then I just don't understand how developers can not know they do this! I see developers on Twitter citing logistics and costs as reasons why it's hard to do properly... Well, if that's the case, then don't release your damn game on the platform. Something being "too much work" is not a valid excuse. Why should I give a damn about how much money it takes, or how hard it is, for a company to put out a quality product? Since when are games so different and special? If I bought ANY OTHER product which didn't work properly... would I consider how difficult it was for its developers or creators to build it, or how much it cost the company, before complaining about it? No. We don't consider stuff like that... because we expect things to work properly. We complain when they don't. The ironic thing is how often you see developers complain about products they use when those don't work properly, or up to their standards. They have no qualms about letting their issues be known. I don't know why we should be expected to be so understanding...

It's a complete failure of the chain at every level: engine-side, developer-side, QA, and publishing. I understand the plight of the small developers out there... but the industry can't remain at a point where it's as easy as it is right now for any developer to ship a game on such an established engine while these issues are just part and parcel of using it.

It's unacceptable. Digital Foundry, along with Linus, GamersNexus, IGN... whoever else has a voice and cares about PC gaming... need to start making examples of this issue at every opportunity. It's amazing how things can magically get fixed when enough negative press is stirred up. And it shouldn't just be them... PC gaming and tech review sites are a goddamn joke. Almost none of them say anything about performance, and when they do it's a single paragraph or sentence at best. They should be ashamed of themselves... It should be part of their responsibility to their readers to give insight into how games actually perform, not to gloss over technical issues.

We don't need games to be 100% perfect - never a single stutter - no. But we do need things to be a HELL of a lot better than they are. Pre-compiling everything possible is the first step. That has to start happening. It should just be an accepted norm that PC games pre-compile everything they possibly can at initial boot.
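
Something in the spirit of this sketch, say (standalone C++; load_pso_manifest and compile_and_cache are hypothetical stand-ins for a shipped list of known shader/state permutations, roughly the shape PSO-cache schemes take, and the expensive compile step): walk every known permutation across all cores behind a one-time "preparing shaders" screen:

```cpp
#include <algorithm>
#include <atomic>
#include <cstdio>
#include <string>
#include <thread>
#include <vector>

// Hypothetical: a manifest of every pipeline permutation recorded during
// QA playthroughs and shipped with the game (names are made up).
std::vector<std::string> load_pso_manifest()
{
    return {"ps_grenade", "ps_water", "ps_ui"}; // placeholder entries
}

void compile_and_cache(const std::string& /*key*/)
{
    // The expensive driver compile happens here, off the critical path.
}

void prewarm_shaders()
{
    const std::vector<std::string> keys = load_pso_manifest();
    std::atomic<std::size_t> next{0};

    // Burn every core once behind a "preparing shaders" screen instead of
    // hitching mid-gameplay for the whole first session.
    auto worker = [&] {
        for (std::size_t i = next++; i < keys.size(); i = next++)
            compile_and_cache(keys[i]);
    };

    std::vector<std::thread> pool;
    const unsigned n = std::max(1u, std::thread::hardware_concurrency());
    for (unsigned t = 0; t < n; ++t)
        pool.emplace_back(worker);
    for (auto& t : pool)
        t.join();

    std::printf("pre-compiled %zu pipelines\n", keys.size());
}
```

A one-off minute at first boot versus an entire first session of stutter; it shouldn't even be a debate.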

I know Alex is taking his time with his video, doing his research, taking a very cautious angle on how to approach this issue, because of course that's the right thing to do. But man... I REALLY hope he remains firm on this. Richard too. I'm sure he's behind the scenes asking around about what the hell is going on, why this is such an issue now, and what it's going to take to get to a better place. We need him pushing hard too, letting developers know that Digital Foundry is going to continue to call out this "phenomenon" whenever they see it and make examples of the games that exhibit it.

It's the only way things will start changing... of that I'm convinced.
 
People have always been apologetic to game developers for some odd reason. As if expecting a game to operate smoothly and do what it's advertised to do is somehow unreasonable.
 