Digital Foundry Article Technical Discussion [2024]

The game stutters on more powerful CPUs.

So I'm not sure how you're concluding it's the 3600 that's the problem.
The developers could get stable frame times on the console while much better CPUs on PC (7800X3D) get stutters. Over the years there have been many examples like that; it's just that they're usually small parts of DF videos that I'm not going to search through for hours upon hours. Anyway, the CPU the PS5 and Series X get compared to by DF has been the 3600 for years, and those comparisons aren't really changing. Even something we can be pretty sure is CPU limited (Dragon's Dogma 2's cities) allegedly gets slightly better frame rates on PS5 than on a 3600.

All in all, physically, the PS5 CPU is less powerful than a 3600 because of the cache, but in games that hasn't really materialized, for a bunch of reasons. So saying that the consoles' CPU is more comparable to 1st-gen Ryzen tells someone who isn't well informed only half of the story.

PS: then again, people in this discussion are free to show me CPU-limited scenarios where the PS5 in particular performs like 1st-gen Ryzen in actual games. I'm ready to change my mind.
 
I am pretty sure the reason we see such things @Charlietus is improper memory/asset destruction and usage on the PC platform, by devs who treat PC performance profiling for such things as an afterthought. Either that, or they lack the time or technical knowledge to do it right.

A really great example is Wild Hearts. On PC, that game has a massive stutter every few meters, even on a 7800X3D or better. On console, it does not have these issues. The developers are doing incredibly bad things with the hardware and APIs on PC.
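
To make that concrete, here's a minimal sketch of the kind of pattern I mean. It's purely illustrative, not Wild Hearts' actual code: the Asset struct, the 40 ms load cost and the function names are all made up. Doing a blocking load or create/destroy on the thread that submits frames produces exactly this kind of periodic hitch, while handing the same work to a worker does not.

```cpp
#include <chrono>
#include <future>
#include <thread>
#include <vector>

struct Asset { std::vector<char> bytes; };

// Stand-in for disk I/O + decompression + driver-side object creation.
static Asset LoadAsset() {
    std::this_thread::sleep_for(std::chrono::milliseconds(40));  // > 2 frames at 60 fps
    return Asset{std::vector<char>(1 << 20)};
}

static void RenderFrame(const Asset*) { /* draw calls would go here */ }

int main() {
    // Stutter pattern: the thread that submits frames stalls for ~40 ms every
    // time the player crosses a streaming boundary.
    Asset blocking = LoadAsset();   // frame-time spike happens right here
    RenderFrame(&blocking);

    // Smoother pattern: kick the load to a worker and keep rendering with what
    // we already have until the result is ready.
    std::future<Asset> pending = std::async(std::launch::async, LoadAsset);
    while (pending.wait_for(std::chrono::milliseconds(0)) != std::future_status::ready) {
        RenderFrame(nullptr);  // keep presenting frames without the new asset
    }
    Asset streamed = pending.get();
    RenderFrame(&streamed);
    return 0;
}
```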
 
I am pretty sure the reason we see such things @Charlietus is improper memory/asset destruction and usage on the PC platform, by devs who treat PC performance profiling for such things as an afterthought. Either that, or they lack the time or technical knowledge to do it right.

A really great example is Wild Hearts. On PC, that game has a massive stutter every few meters, even on a 7800X3D or better. On console, it does not have these issues. The developers are doing incredibly bad things with the hardware and APIs on PC.
I understand that, but at the same time the end-user experience is what matters, and the consoles' CPUs (especially the PS5's, since Xbox probably doesn't get the same amount of care, or the API it uses is less efficient, or a combination of the two) are offering a better experience than 1st-gen Ryzen. Let's set aside all the whys; I'm concentrating on the end result.

A video testing this across a big variety of games would be good, but I understand that it's difficult to pinpoint when a console game is CPU limited, aside from a few titles.
Interviewing a bunch of developers about it would be cool too.

PS: I hate stutters and frametime spikes; I'll take framerate drops any day. Seeing Elden Ring stutter in cutscenes on PC (for example, the Rykard cutscene stutters so much that the audio desyncs) when it doesn't on PS5 is frustrating, and I have seen things like that plenty of times. I'll build a new PC for the RTX 5000 launch, but things like that are a bummer.
 
The developers could get stable frame times on the console while much better CPUs on PC (7800X3D) get stutters. Over the years there have been many examples like that; it's just that they're usually small parts of DF videos that I'm not going to search through for hours upon hours. Anyway, the CPU the PS5 and Series X get compared to by DF has been the 3600 for years, and those comparisons aren't really changing. Even something we can be pretty sure is CPU limited (Dragon's Dogma 2's cities) allegedly gets slightly better frame rates on PS5 than on a 3600.

All in all, physically, the PS5 CPU is less powerful than a 3600 because of the cache, but in games that hasn't really materialized, for a bunch of reasons. So saying that the consoles' CPU is more comparable to 1st-gen Ryzen tells someone who isn't well informed only half of the story.

PS: then again, people in this discussion are free to show me CPU-limited scenarios where the PS5 in particular performs like 1st-gen Ryzen in actual games. I'm ready to change my mind.
The problem is basically Windows being an archaic, unoptimized garbage piece of ………….
Same with file transfer speeds; good luck with that.
 
The problem is basically Windows being an archaic, unoptimized garbage piece of ………….
Same with file transfer speeds; good luck with that.
Windows likely has obsolete APIs, but most importantly the PC architecture is an ancient relic of the past. I remember DSoup patiently explaining to us all the problems and bottlenecks inherited from decades-old PC designs.
 
Windows likely has obsolete APIs, but most importantly the PC architecture is an ancient relic of the past. I remember DSoup patiently explaining to us all the problems and bottlenecks inherited from decades-old PC designs.
That is also true, but some modern Windows titles like Cyberpunk having better performance ‘emulated’ on Linux than on Windows shows that it is pure garbage.
 
The developers could get stable frame times on the console while much better CPUs on PC (7800X3D) get stutters. Over the years there have been many examples like that; it's just that they're usually small parts of DF videos that I'm not going to search through for hours upon hours. Anyway, the CPU the PS5 and Series X get compared to by DF has been the 3600 for years, and those comparisons aren't really changing. Even something we can be pretty sure is CPU limited (Dragon's Dogma 2's cities) allegedly gets slightly better frame rates on PS5 than on a 3600.

If the stutters exist on a 7800X3D and not on the console CPUs, then it should be pretty obvious this isn't a CPU performance issue. It's an issue with how the game operates on the PC: perhaps due to more complex or less flexible APIs, perhaps due to the wider array of hardware compatibility required, perhaps simply because the PC got less relative QA time. But whatever it is, it's not console efficiencies making their CPUs perform better than a 7800X3D, and I'm sure that could easily be demonstrated by looking at the relative performance outside of those stutters.

Essentially, whatever event is causing the stutters on the PC simply doesn't exist on the console; it's not a matter of the console CPUs powering through them to the point they are invisible.
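
For what it's worth, here's a rough sketch of the comparison suggested above, with everything about it assumed rather than taken from a real capture (the file name, the one-value-per-line format and the 33.3 ms spike threshold are arbitrary choices): take a per-frame frame-time log, drop the spike frames, and compare what's left.

```cpp
#include <fstream>
#include <iostream>
#include <numeric>
#include <vector>

int main() {
    // Hypothetical capture file: one frame time in milliseconds per line.
    std::ifstream in("frametimes_ms.txt");
    std::vector<double> frames;
    for (double ms; in >> ms;) frames.push_back(ms);
    if (frames.empty()) return 1;

    // Treat anything over ~2 frames at 60 fps as a stutter event.
    const double spike_threshold_ms = 33.3;
    std::vector<double> smooth;
    for (double ms : frames)
        if (ms < spike_threshold_ms) smooth.push_back(ms);

    auto avg = [](const std::vector<double>& v) {
        return v.empty() ? 0.0 : std::accumulate(v.begin(), v.end(), 0.0) / v.size();
    };
    std::cout << "Average over all frames:  " << avg(frames) << " ms\n";
    std::cout << "Average excluding spikes: " << avg(smooth) << " ms\n";
    std::cout << "Spike frames removed:     " << frames.size() - smooth.size() << "\n";
    // If the PC CPU is genuinely faster, its average outside the spikes should beat
    // the console's; the spikes are a separate event, not a lack of raw CPU grunt.
    return 0;
}
```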

Windows likely has obsolete APIs, but most importantly the PC architecture is an ancient relic of the past. I remember DSoup patiently explaining to us all the problems and bottlenecks inherited from decades-old PC designs.

This is complete rubbish and I had this debate with Dsoup more than once in response to exactly those posts you reference above. Feel free to elaborate on these problems and bottlenecks if you wish and we can have that debate again.
 
Seeing Elden Ring stutter in cutscenes on PC (for example, the Rykard cutscene stutters so much that the audio desyncs)
Hmmmm, I want to call bullshit on this but don't want to start a new game to check. I'm on NG+5, killed him in 4 of those playthroughs, and can't recall this happening. Are there any other boss cutscenes I can get to quickly from a new game to prove myself right or wrong?
 
Hmmmm, I want to call bullshit on this but don't want to start a new game to check. I'm on NG+5, killed him in 4 of those playthroughs, and can't recall this happening. Are there any other boss cutscenes I can get to quickly from a new game to prove myself right or wrong?
It's absolutely true. I have played Elden Ring on both PC and PS5 probably six times. On PC most of the boss cutscenes stutter on camera cuts, while on PS5 playback is flawless.

This is PS5 footage

This is PC footage

This still happens even in the latest patch.

Over the years I have also seen streamers having the same problem in their playthroughs.

PS: I forgot about where you can test it. Rykard and Godfrey have it 100%; for an early-game boss, Margit I think, but I'm not sure.
 
If the stutters exist on a 7800X3D and not on the console CPUs, then it should be pretty obvious this isn't a CPU performance issue. It's an issue with how the game operates on the PC: perhaps due to more complex or less flexible APIs, perhaps due to the wider array of hardware compatibility required, perhaps simply because the PC got less relative QA time. But whatever it is, it's not console efficiencies making their CPUs perform better than a 7800X3D, and I'm sure that could easily be demonstrated by looking at the relative performance outside of those stutters.

Essentially, whatever event is causing the stutters on the PC simply doesn't exist on the console; it's not a matter of the console CPUs powering through them to the point they are invisible.
A 3600 gets much bigger stutters than the 7800X3D, so it is a CPU issue. That's not to say that the console CPU is more powerful than a high-end CPU; it's just that it isn't getting those stutters. How or why that happens doesn't change that.
 
A 3600 gets much bigger stutters than the 7800X3D, so it is a CPU issue.
More accurately, it's a CPU-affected issue. A more powerful CPU can help alleviate the problem, which could be, say, some background OS process that takes less time on a faster CPU.
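
To put rough, completely made-up numbers on that (nothing here is measured; the work units and throughputs are illustrative assumptions): if the offending event costs a fixed amount of single-threaded work, a faster CPU shortens the hitch but doesn't remove it, while a platform where the event never fires shows no hitch at all.

```cpp
#include <cstdio>

int main() {
    const double frame_budget_ms = 16.7;   // 60 fps target
    const double event_work      = 100.0;  // arbitrary work units for the hitch-causing event
    // Hypothetical relative single-thread throughput, in work units per ms.
    const double ryzen_3600      = 2.0;
    const double ryzen_7800x3d   = 4.0;

    const double spike_3600 = event_work / ryzen_3600;       // 50 ms hitch
    const double spike_7800 = event_work / ryzen_7800x3d;    // 25 ms hitch

    std::printf("3600 spike:    %.1f ms (%.1fx the frame budget)\n",
                spike_3600, spike_3600 / frame_budget_ms);
    std::printf("7800X3D spike: %.1f ms (%.1fx the frame budget)\n",
                spike_7800, spike_7800 / frame_budget_ms);
    // Both blow well past 16.7 ms, so both stutter; the faster CPU only shortens
    // the hitch. If the event simply never fires on console, the console shows
    // no spike at all regardless of CPU speed.
    return 0;
}
```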
 
At some point in the video Rich says that the consoles' CPUs perform more like first-generation Ryzen
It's true; the Series X and PS5 CPU is basically a glorified older 8-core i7 (the i7-5960X). The low L3 cache (8 MB), the clock speed (3.6 GHz) and the cut-down FPUs really hurt these CPUs. Also, the consoles use GDDR6, which is higher latency but optimized for throughput. Here is a more extensive set of benchmarks.
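
As a toy illustration of why the L3 difference matters (a minimal sketch with arbitrary working-set sizes, not a benchmark of these exact chips): chase dependent pointers through working sets of different sizes and the time per load jumps once the set no longer fits in cache.

```cpp
#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

// Follow a single random cycle through `bytes` of memory and return the
// average time per dependent load.
static double chase_ns_per_hop(std::size_t bytes) {
    const std::size_t n = bytes / sizeof(std::size_t);
    std::vector<std::size_t> next(n);
    std::iota(next.begin(), next.end(), std::size_t{0});

    // Sattolo's algorithm: builds one cycle that visits every element.
    std::mt19937_64 rng{42};
    for (std::size_t i = n - 1; i > 0; --i) {
        std::uniform_int_distribution<std::size_t> pick(0, i - 1);
        std::swap(next[i], next[pick(rng)]);
    }

    volatile std::size_t idx = 0;
    const std::size_t hops = 10'000'000;
    const auto t0 = std::chrono::steady_clock::now();
    for (std::size_t i = 0; i < hops; ++i) idx = next[idx];
    const auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::nano>(t1 - t0).count() / hops;
}

int main() {
    // Arbitrary example sizes: 4 MB fits in an 8 MB console-style L3,
    // 16 MB fits a 32 MB desktop L3 but not 8 MB, 64 MB fits neither.
    for (std::size_t mb : {4, 16, 64}) {
        std::printf("%3zu MB working set: ~%.1f ns per dependent load\n",
                    mb, chase_ns_per_hop(mb * 1024 * 1024));
    }
    return 0;
}
```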

 
It's true; the Series X and PS5 CPU is basically a glorified older 8-core i7 (the i7-5960X). The low L3 cache (8 MB), the clock speed (3.6 GHz) and the cut-down FPUs really hurt these CPUs. Also, the consoles use GDDR6, which is higher latency but optimized for throughput. Here is a more extensive set of benchmarks.

I'm going to repeat myself: as I said, I'm not denying that those CPUs are physically weaker than their desktop counterparts, but the results in actual games running on the consoles reduce the gap significantly compared to an equivalent CPU inside a PC.
 
I'm going to repeat myself: as I said, I'm not denying that those CPUs are physically weaker than their desktop counterparts, but the results in actual games running on the consoles reduce the gap significantly compared to an equivalent CPU inside a PC.

Imagine you are racing on a circuit. You have a 10,000 HP monster truck that can accelerate to 100 mph in less than 2 seconds. The other guy has a Formula 1 car, and he is Max Verstappen.

Who is going to win the race?

So while power matters, design and how the power is used are more important, and who uses it might matter even more. PC is raw power on archaic designs with severe bottlenecks. It achieves results despite the PC architecture, not because of it.
 
Windows likely has obsolete APIs, but most importantly the PC architecture is an ancient relic of the past. I remember DSoup patiently explaining to us all the problems and bottlenecks inherited from decades-old PC designs.
What's wrong with the PC, according to you? It's an open architecture, so you have NUMA PCs, UMA PCs, x86 PCs, ARM PCs. Anybody is free to make the architecture of the PC flawless.

When it comes to gaming, there are flaws compared to consoles (and it's not the price, since you end up saving money over time compared to consoles); that's something MS should be working on now, and something things like the Steam Deck have started to solve. Current consoles share the same basic architecture: northbridge, southbridge, APU.
 
Imagine you are racing on a circuit. You have a 10,000 HP monster truck that can accelerate to 100 mph in less than 2 seconds. The other guy has a Formula 1 car, and he is Max Verstappen.

Who is going to win the race?

So while power matters, design and how the power is used are more important, and who uses it might matter even more. PC is raw power on archaic designs with severe bottlenecks. It achieves results despite the PC architecture, not because of it.
I don't think the PC architecture is that bad, but at the same time there is some weirdness that shouldn't be there when a console game can have better frame times while running on much less powerful hardware. It's probably a combination of Windows, DirectX and developer attention that causes these problems. But that's just a gut feeling, not an informed opinion.
 
Imagine you are racing on a circuit. You have a 10,000 HP monster truck that can accelerate to 100 mph in less than 2 seconds. The other guy has a Formula 1 car, and he is Max Verstappen.

Who is going to win the race?

So while power matters, design and how the power is used are more important, and who uses it might matter even more. PC is raw power on archaic designs with severe bottlenecks. It achieves results despite the PC architecture, not because of it.
Can you elaborate on what that has to do with the flaws of a PC? You have x86, you have ARM, and Windows has run on things like MIPS and Alpha AXP (RISC) architectures.
 
A 3600 gets much bigger stutters than the 7800X3D, so it is a CPU issue. That's not to say that the console CPU is more powerful than a high-end CPU; it's just that it isn't getting those stutters. How or why that happens doesn't change that.

No, it really isn't. It's some other issue that, as Shift says, impacts the CPU. A much more powerful CPU than those found in the consoles is unable to fully mitigate the issue. So, unlike originally presented, this is in no way evidence of the console CPUs performing above their specifications relative to PC CPUs. To be clear, they do perform above their specifications relative to PC CPUs, but this isn't evidence of that. It's evidence of something else entirely, i.e. some kind of event that causes a frame-time spike on PC which does not exist (or is at least far less severe) on console.
 