CPU-Limited Games, When Are They Going to End?

Redfall seems to be another heavily CPU-limited UE4 title:




The difference is that the limitation happens at >120 fps, so it's less of an issue for the majority of players.
The game also supports DLSS 3, helping to hit even higher fps in CPU-limited areas.

And like Jedi, it barely uses a modern CPU! Oh, and it has traversal stutters! Yay! (Those fps graphs from Tom's Hardware are definitely not a reflection of the game's frame-times.) I think we need to move beyond frame-rate graphs; they are so uninformative.
 


Yeah, I think 1% and 0.1% lows are more helpful, though they don't show the entire picture either.

For example, in the X3D CPU reviews, I find average FPS isn't very useful because, to be honest, who buys a 7950X3D to play at 1080p medium quality? I'm more interested in how it performs in the worst situations. If a 7950X3D is able to reduce the occasional slowdowns even at 4K resolution, then it might be worth the hassle. 1% and 0.1% FPS numbers are more likely to reflect that.
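
To make that concrete, here is a minimal sketch (not from any post in this thread) of how average FPS and 1% / 0.1% lows can be computed from a frame-time capture. The data and the exact definition of a "low" (average of the slowest N% of frames) are illustrative assumptions; review outlets differ slightly in how they define these metrics.

```cpp
// Minimal sketch: derive average FPS and 1% / 0.1% low FPS from per-frame times.
// The capture below is synthetic; the "slowest N% of frames" definition is an assumption.
#include <algorithm>
#include <cstdio>
#include <functional>
#include <numeric>
#include <vector>

// FPS implied by the slowest `fraction` of frames (e.g. 0.01 for the 1% lows).
double low_fps(std::vector<double> frame_ms, double fraction) {
    std::sort(frame_ms.begin(), frame_ms.end(), std::greater<double>()); // slowest first
    size_t n = std::max<size_t>(1, static_cast<size_t>(frame_ms.size() * fraction));
    double worst_avg_ms = std::accumulate(frame_ms.begin(), frame_ms.begin() + n, 0.0) / n;
    return 1000.0 / worst_avg_ms;
}

int main() {
    // Hypothetical capture: mostly ~5 ms frames (200 fps) with an occasional 30 ms hitch.
    std::vector<double> frame_ms(1000, 5.0);
    for (size_t i = 0; i < frame_ms.size(); i += 97) frame_ms[i] = 30.0;

    double avg_ms = std::accumulate(frame_ms.begin(), frame_ms.end(), 0.0) / frame_ms.size();
    std::printf("average : %.0f fps\n", 1000.0 / avg_ms);
    std::printf("1%% low  : %.0f fps\n", low_fps(frame_ms, 0.01));
    std::printf("0.1%% low: %.0f fps\n", low_fps(frame_ms, 0.001));
}
```

With that synthetic capture the average stays around 190 fps while the 1% and 0.1% lows collapse to roughly 33 fps, which is exactly the gap between "looks fine on a bar chart" and "feels like it hitches."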
 
Intel and AMD are just as much to blame; they've gone down the "We added moar!!! cores!!!" route for years, and now it's biting gamers in the ass.

Last-generation consoles didn't help either: a 1st-generation Intel Core CPU with at least 4 cores offered more performance than the 8 cores in the consoles, meaning all you really needed as a PC gamer was a decently clocked quad-core CPU.

I have a Ryzen 5 7600 overclocked to 5.35 GHz, and countless times I've looked at the 7700, but aside from a small bump in clock speed it's a worthless upgrade.
 
Well, Intel was actually dog-f-ing around, not doing anything to push to higher core counts for a long time, because their strength was their single-core IPC advantage. AMD pivoted, as one does, and put more effort into bringing more cores to consumers so that its CPUs could better compete in multi-threaded workloads, expecting games to catch up eventually.

I think the PS3's Cell processor played a major role in helping acclimate (forcing) developers to somewhat multi-thread their code during that era. Some engines never got the memo and continued on, while other engines were set up to enter the new era of high core/thread counts more gracefully when it came. The problem is that a lot of those incredible pioneering developers also know they are worth more money than the games industry is willing to pay (I imagine), and they also want to contribute their talents to society in a more impactful way... so the games industry loses out on a lot of people who truly care about true optimization and using every bit of the power available to them.
 
Just picked up a 5800X3D for "cheap." Feels like a big upgrade from a 5600. I think the bar charts don't really convey how much smoother games feel. I've got a 240 Hz display, and playing Fortnite at low settings (GPU usage stays under 40%) I could pretty much hold a consistent 200 fps, with some drops to 170 around Mega City on the map. With the 5800X3D my frame times are way smoother, and even if I lock the game to 200 fps, it feels way smoother than before. I'd love to see games really focus on CPU performance so we can push GPUs harder and stay GPU-limited at very high framerates. Sounds like UE5 is doing the work to get there. It's pretty easy to be CPU-limited in Remnant 2, even with graphics turned up. I'd love to see that game integrate UE 5.4, but I doubt they will.
 
The X3D chips in general aren't properly evaluated by those averaged-result comparisons. There are some games/situations in which they provide a generational, if not multi-generational, uplift and basically a completely different experience. Then there are games/situations in which they do nothing. Averaging them out to x% doesn't properly showcase the value (or lack of it), which depends on the use case.

What I hoped for with this generation was that the CPU uplift in current-gen consoles would result in a push towards more simulation in games, which basically seemed to stall out in the 7th->8th generation transition due to the lack of CPU uplift. Mainstream game physics, for instance, seems to have peaked around the 2010s, and has arguably even regressed since then.
 


Here we are at the end of 2023 with titles using explicit APIs still losing hard to DX11, even in CPU-limited scenarios.

Such results make me wonder how all the recent releases with CPU issues would perform if they still had DX11 renderers. I have a feeling that, with the loss of the latter, the performance picture for what we have now in DX12 may have deteriorated instead of improved relative to a theoretical DX11 path.
 
I think by now the picture is quite clear: in the vast majority of cases DX11 is faster than DX12/Vulkan, and the best we can hope for is DX12/Vulkan being equal to DX11 in well-optimized games. Some games do indeed have a faster DX12 path than DX11, but they are so few and far between that they are not statistically significant.

We need to learn to live with the atrocious PSO stuttering, the frequent long compilation times, the bad memory management, and the reduced CPU performance for the rest of these APIs' lifetimes, until they are replaced by something more sane.
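
For anyone who hasn't touched the API: the PSO stutter comes from D3D12 requiring a full pipeline state object to be compiled before the first draw that needs it, and doing that lazily on the render thread is exactly what shows up as a hitch. Below is a rough sketch of the usual mitigation, compiling known permutations up front on worker threads during loading; the PsoRequest type and the abbreviated desc setup are illustrative assumptions, not any engine's actual implementation.

```cpp
// Rough sketch (Windows / D3D12): compile known pipeline permutations during loading
// instead of at first draw. Field setup is abbreviated; a real PSO desc needs the root
// signature, shaders, blend/raster/depth state, input layout, RT formats, etc.
#include <d3d12.h>
#include <wrl/client.h>
#include <future>
#include <vector>

using Microsoft::WRL::ComPtr;

// One entry per material/shader permutation the level is known to use (hypothetical type).
struct PsoRequest {
    D3D12_GRAPHICS_PIPELINE_STATE_DESC desc; // assumed to be fully filled out by the caller
    ComPtr<ID3D12PipelineState> pso;
};

// Doing this inside the draw loop blocks the render thread on driver shader compilation
// the first time each permutation appears, i.e. a visible stutter. Kicking it off during
// loading, off the render thread, hides the cost (ID3D12Device calls are free-threaded).
void PrecompilePsos(ID3D12Device* device, std::vector<PsoRequest>& requests) {
    std::vector<std::future<void>> jobs;
    for (auto& req : requests) {
        jobs.push_back(std::async(std::launch::async, [device, &req] {
            // CreateGraphicsPipelineState is where the driver's compile cost lives.
            device->CreateGraphicsPipelineState(&req.desc, IID_PPV_ARGS(&req.pso));
        }));
    }
    for (auto& job : jobs) job.wait(); // or poll and drive a loading bar
}
```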
 
I think we need to remember that Sebbi is a master of his craft, and while he could boss around any hardware he wants, a lot of other engineers wouldn't be able to.

And that's the problem we have: there are engineers like Sebbi who have the skill to manually manage hardware and those who can't, so who do you try to please?
 
Yep. And this proposal is about as radical as the switch to D3D12/VK was previously. I would expect this model, on average, to perform about the same relative to D3D12/VK as those do relative to D3D11 (where applicable), meaning that a couple of games would get some benefits while all the rest would suffer from worse performance and numerous compatibility issues.
 

This approach could work well for big engines like Unity and UE that basically handle the render back-end for you. From everything I've read, devs seem to like Metal best of the modern APIs. They should probably have a Metal-like or even higher-level API for simpler applications, but basically thinly veiled hardware access for the performance people. Almost everything I've read from devs suggests that DX12 and Vulkan didn't hit the mark: they didn't provide the low-level access developers wanted, or the lightweight drivers, and they added a lot of complexity.
 