Why do devs want more CPU cores? What work is there that's better on 16 cores than a GPU if it's that parallelisable?
As I see it, you have three workloads: massive float throughput, ML work, and everything that doesn't fit into those. The silicon budget should be distributed according to what yields the best returns.
Because some workloads run well on the CPU: parts of the physics engine, AI and gameplay code all run on the CPU.
Doom Eternal will run at up to 1,000fps if your PC is up to it
Id Tech 6 maxed out at 250fps, but id Tech 7 engine can quadruple this with the right PC.
hexus.net
Like I said, id Tech 7 scales between 4 and 16 cores. Decima's new physics engine, Jolt, scales to 16 cores. The same goes for any game engine well optimized with an ECS or a job/fiber-based system, like those of Bungie, Naughty Dog, Bluepoint or GG. All these engines are ready to scale to 16 cores, and I am sure they aren't the only ones...
Mostly since the PS4 generation of consoles, which have 8 very weak cores, architectures have evolved towards making sure all the cores are working on something useful. A lot of game engines have moved to a task-based system for that purpose. There, you don't dedicate one thread to one job; instead you split your work into small sections and have multiple threads work on those sections on their own, merging the results after the tasks finish. Unlike the fork-join approach of having one dedicated thread ship work off to helpers, you do almost everything on the helpers. Your main timeline of operations is created as a graph of tasks, and those are then distributed across cores. A task can't start until all of its predecessor tasks are finished. If a task system is used well, it grants really good scalability, as everything automatically distributes to however many cores are available. A great example of this is Doom Eternal, where you can see it smoothly scaling from PCs with 4 cores to PCs with 16 cores. Some great GDC talks about it are Naughty Dog's "Parallelizing the Naughty Dog Engine Using Fibers" and the two Destiny engine talks.
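To make the task-graph idea concrete, here is a minimal sketch of a dependency-aware job system in C++. This is my own illustration, not code from id Tech 7, Destiny's Tiger engine or Naughty Dog's fiber scheduler; real engines add fibers, lock-free queues, priorities and a proper way to wait on a whole frame graph. But the scaling mechanism is the same: a task becomes ready once its predecessors finish, and every worker pulls from the same ready queue, so more cores just means more workers.

```cpp
// Minimal sketch of a task system with dependencies (illustration only).
// Each Task tracks how many predecessors are still pending and which tasks
// it unblocks when it finishes. Worker threads all pull from one ready queue.
#include <atomic>
#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

struct Task {
    std::function<void()> work;          // the small section of work to run
    std::atomic<int>      pending{0};    // unfinished predecessor count
    std::vector<Task*>    successors;    // tasks unblocked when this finishes
};

class TaskSystem {
public:
    explicit TaskSystem(unsigned workers = std::thread::hardware_concurrency()) {
        for (unsigned i = 0; i < workers; ++i)
            threads_.emplace_back([this] { workerLoop(); });
    }
    ~TaskSystem() {
        { std::lock_guard<std::mutex> lock(mutex_); stopping_ = true; }
        ready_cv_.notify_all();
        for (auto& t : threads_) t.join();
    }
    // Declare that 'after' cannot start until 'before' has finished.
    static void addDependency(Task& before, Task& after) {
        before.successors.push_back(&after);
        after.pending.fetch_add(1);
    }
    // Root tasks (no predecessors) are submitted directly; dependent tasks
    // are enqueued automatically when their last predecessor completes.
    void submit(Task& task) {
        if (task.pending.load() == 0) enqueue(&task);
    }
private:
    void enqueue(Task* task) {
        { std::lock_guard<std::mutex> lock(mutex_); ready_.push(task); }
        ready_cv_.notify_one();
    }
    void workerLoop() {
        for (;;) {
            Task* task = nullptr;
            {
                std::unique_lock<std::mutex> lock(mutex_);
                ready_cv_.wait(lock, [this] { return stopping_ || !ready_.empty(); });
                if (ready_.empty()) return;   // stopping and nothing left to run
                task = ready_.front();
                ready_.pop();
            }
            task->work();
            // Finishing this task may make successor tasks runnable.
            for (Task* s : task->successors)
                if (s->pending.fetch_sub(1) == 1) enqueue(s);
        }
    }
    std::vector<std::thread> threads_;
    std::queue<Task*>        ready_;
    std::mutex               mutex_;
    std::condition_variable  ready_cv_;
    bool                     stopping_ = false;
};
```

In a real engine, physics, animation, culling, command-buffer generation and so on all become tasks in this graph, which is why a well-written engine scales from 4-core to 16-core PCs without any per-core-count code.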
Here is the code of physics engines scaling to 16 cores (and, below the links, a small sketch of why this kind of work splits across cores so well):
GitHub - NVIDIA-Omniverse/PhysX: NVIDIA PhysX SDK
NVIDIA PhysX SDK. Contribute to NVIDIA-Omniverse/PhysX development by creating an account on GitHub.
github.com
GitHub - jrouwe/JoltPhysics: A multi core friendly rigid body physics and collision detection library. Written in C++. Suitable for games and VR applications. Used by Horizon Forbidden West.
A multi core friendly rigid body physics and collision detection library. Written in C++. Suitable for games and VR applications. Used by Horizon Forbidden West. - jrouwe/JoltPhysics
github.com
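As a rough illustration of why a physics step scales with core count, here is a sketch of the "integrate all rigid bodies" phase chunked across all available cores. This is my own hypothetical example, not PhysX or Jolt code; both libraries drive this through their job systems and reuse worker threads rather than spawning them every step, but the data-parallel split is the same idea.

```cpp
// Hypothetical sketch: integrate every rigid body for one physics step,
// with the body array split into one chunk per hardware thread.
#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

struct RigidBody {
    float position[3];
    float velocity[3];
};

// Integrate one contiguous chunk of bodies; chunks never overlap, so no locks.
static void integrateRange(std::vector<RigidBody>& bodies,
                           std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i)
        for (int axis = 0; axis < 3; ++axis)
            bodies[i].position[axis] += bodies[i].velocity[axis] * dt;
}

// Split the body array into one chunk per hardware thread (16 chunks on a
// 16-core CPU) and integrate the chunks in parallel.
void integrateAll(std::vector<RigidBody>& bodies, float dt) {
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = (bodies.size() + workers - 1) / workers;

    std::vector<std::thread> threads;
    for (unsigned w = 0; w < workers; ++w) {
        const std::size_t begin = static_cast<std::size_t>(w) * chunk;
        const std::size_t end   = std::min(bodies.size(), begin + chunk);
        if (begin >= end) break;
        threads.emplace_back([&bodies, begin, end, dt] {
            integrateRange(bodies, begin, end, dt);
        });
    }
    for (auto& t : threads) t.join();
}
```

Collision detection and constraint solving are harder to split than this, because bodies interact; that is why multi-core-friendly libraries like Jolt break the simulation into independent islands of interacting bodies and solve them as separate jobs. That is exactly the kind of work that keeps 16 cores busy.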
Here is an example of Doom Eternal scaling to a 16-core CPU on Digital Foundry's PC, and this is maybe the only way to feed an Nvidia 4090. I would be curious to see the framerate of Doom Eternal with a 16-core CPU and a 4090, probably approaching the 1,000fps limit. And they can do something heavier than Doom Eternal in the future.
Doom Eternal analysis: how id Tech 7 pushes current-gen consoles to the limit
Doom 2016 revitalised the fortunes of both id software and the classic Doom franchise, delivering a phenomenal, exhilar…
www.eurogamer.net
Of course, the PC version is the way to go if you have the hardware as it supports much higher frame-rates all around. I'm running a rather powerful rig equipped with a 16-core Intel i9 7960X and an RTX 2080 Ti, so you'd expect greater performance and it's delivered without issue. In the marketing push up to launch, id promised astonishing performance and the team has delivered. In-game performance varied between 300 to 500 frames per second and Doom Eternal becomes one of the very few triple-A games that can actually deliver sustained, consistent performance for high refresh rate screens - up to and including the latest 360Hz displays.
I think it's way too early for a 16-core CPU: engines are not ready and don't need it, and it would be too expensive (APU size) for a console.
What they need to do is improve current efficiency: for starters they could use an 8-core CCX and more L3 cache, on top of the usual tech improvements. For consoles, with their specific requirement of ever-smaller APUs, stacked L3 cache would be ideal!
Some engines are already ready for 16 cores, like Bungie's Tiger engine, id Tech 7, Naughty Dog's engine, the Decima Engine, Bluepoint's engine and so on. id Tech 7 showed it in 2020 with Doom Eternal.
EDIT:
Naughty Dog's task-based engine has been ready since 2013/2014. What works for the 8 Jaguar cores works too for the 8 cores / 16 threads of the PS5/Xbox Series, and can work for the 16 cores / 32 threads of next-generation consoles.
Parallelizing the Naughty Dog Engine Using Fibers
This talk is a detailed walkthrough of the game engine modifications needed to make The Last of Us Remastered run at 60 fps on PlayStation 4. Topics covered will include the fiber-based job system Naughty Dog adopted for the game, the overall...
www.gdcvault.com
Bungie's engine has been ready since Destiny.
And the engines that are not ready have 6 years to do the job if next-generation consoles release in 2028; it will help those engines run better on current-gen consoles, and it would even have helped engines run better on PS4/Xbox One.