Digital Foundry Article Technical Discussion [2024]

Can we have a video showing matched settings and not from some random YouTube channel?
Sadly, I didn't find any. I don't think many people have a 4700S.
But even then, 80 fps is pretty far from 120 fps; even if the title had amazing CPU scaling, it wouldn't reach it.
It would be good to have more testing with matched settings, but we will have to make do.
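Just to put rough numbers on that gap (plain arithmetic, assuming a purely CPU-bound scenario; the 80 and 120 fps figures are the ones from the discussion above):

```cpp
// Going from 80 fps to 120 fps needs a 50% higher frame rate, i.e. the
// per-frame CPU budget has to drop from 12.5 ms to ~8.3 ms.
#include <cstdio>

int main() {
    const double fps_now = 80.0, fps_target = 120.0;
    std::printf("required speedup: %.0f%%\n", 100.0 * (fps_target / fps_now - 1.0));
    std::printf("frame budget: %.2f ms -> %.2f ms\n", 1000.0 / fps_now, 1000.0 / fps_target);
    return 0;
}
```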
 
Sadly, I didn't find any. I don't think many people have a 4700S.
So the only reputable channel we have is Digital Foundry.

Which shows the reduced L3 cache really cripples the console CPUs.

Console optimisation will only get you so far, but in your typical game, at matched settings, the 3600 will be considerably faster.

Over the last few years it has become apparent the console CPUs weren't the groundbreaking jump we all thought they would be.
 
So the only reputable channel we have is Digital Foundry.

Which shows the reduced L3 cache really cripples the console CPUs.

Console optimisation will only get you so far, but in your typical game, at matched settings, the 3600 will be considerably faster.

Over the last few years it has become apparent the console CPUs weren't the groundbreaking jump we all thought they would be.
Digital Foundry has said multiple times that the 3600 is more or less equivalent to the console CPU. Don't believe me? Believe them then :)
 
So the only reputable channel we have is Digital Foundry.

Which shows the reduced L3 cache really cripples the console CPUs.

Console optimisation will only get you so far, but in your typical game, at matched settings, the 3600 will be considerably faster.

Over the last few years it has become apparent the console CPUs weren't the groundbreaking jump we all thought they would be.
I haven't been able to find CPU-limited 4700S videos, but I found this video of Warzone 2 at low settings and CPU-limited.


PS5 isn't locked either (it drops frames when you drop from the plane at the start), but in this game it's surely more stable than this. And this is a 3600, not an 1800X.
 
I haven't been able to find CPU-limited 4700S videos, but I found this video of Warzone 2 at low settings and CPU-limited.


PS5 isn't locked either (it drops frames when you drop from the plane at the start), but in this game it's surely more stable than this. And this is a 3600, not an 1800X.

That's not proof of anything; these random videos on YouTube are worthless.

And you keep holding on to this one game... there will always be exceptions.

Digital Foundry used Cyberpunk 2077 and Metro Exodus to show how much faster the 3600 is.

So go and find evidence of what the PS5 runs at in CPU-limited scenarios in those games so we can actually attempt a half-decent comparison.
 
That's not proof of anything; these random videos on YouTube are worthless.

And you keep holding on to this one game... there will always be exceptions.

Digital Foundry used Cyberpunk 2077 and Metro Exodus to show how much faster the 3600 is.

So go and find evidence of what the PS5 runs at in CPU-limited scenarios in those games so we can actually attempt a half-decent comparison.
Neither Digital Foundry, you, nor I know whether Cyberpunk 2077 and Metro Exodus are CPU-limited on PS5 (especially 2077, where in recent videos the dynamic resolution seemed kind of bugged) unless we have debug data.

So we have a single Digital Foundry video of a CPU that is supposed to be the PS5's CPU installed in a PC, which tells us that a 3600 is 40-70% faster.
No other video is allowed, nothing else exists. Ok.

A 120 fps, 150-player, open-world battle royale is probably a better use case than any other game.

Also, we have a video that came out two days ago where the PS5 CPU performed like the 3600 (again, settings are similar, not exactly the same), and Digital Foundry tells us that the CPU is comparable. But you can continue ignoring that; I'm not repeating myself anymore.
Btw I'm curious, are you a member of the glorious PC master race? Can I see your badge?
 
Neither Digital Foundry, you, nor I know whether Cyberpunk 2077 and Metro Exodus are CPU-limited on PS5 (especially 2077, where in recent videos the dynamic resolution seemed kind of bugged) unless we have debug data.

We know it's not, but that's not the point.

The point is the console CPU Vs the 3600.

And there is evidence to suggest the 3600 is 40%+ faster than the console CPU in this specific game.

Also worth highlighting that so far you have failed to provide any evidence that contradicts Digital Foundry's video, other than using a completely different game shown in a random YouTube video.

So we have a single Digital Foundry video of a CPU that is supposed to be the PS5's CPU installed in a PC, which tells us that a 3600 is 40-70% faster.
No other video is allowed, nothing else exists. Ok.

You can use whatever video you like, as long as it:

- Uses matching settings for all tested CPUs
- Uses the 4700S

A 120 fps, 150-player, open-world battle royale is probably a better use case than any other game.

Not really, as you need to look at multiple games so you have more data points to form a conclusion from.

Also, we have a video that came out two days ago where the PS5 CPU performed like the 3600 (again, settings are similar, not exactly the same), and Digital Foundry tells us that the CPU is comparable.

As I have already said, there will always be exceptions.

But you can continue ignoring that; I'm not repeating myself anymore.

That's fine; unless you're going to provide actual evidence with good testing methodology, I'm not going to listen anyway.

Btw I'm curious, are you a member of the glorious PC master race? Can I see your badge?

What a childish comment.
 
Alright, you win. Even though the PS5 CPU is considered by most people, including Digital Foundry, to perform similarly to the 3600, I guess nobody noticed that it was actually 40-70% slower, so the CPU it should be compared to is probably the Ryzen 5 1600 instead.

The videos you are asking me for also don't exist, since they have to exactly match the console version, and no, low settings aren't right either, even if some of those settings aren't even set to low on the console.

Alright, bye bye.
 
The game has a mostly good RTGI implementation. There are image quality problems on Series X in resolving the checkerboarded output. Resolution is hard to determine, fps is between 30 and 40, Series X wins in GPU-limited scenarios, while PS5 wins in CPU-limited scenarios.

Dragon’s Dogma 2 DLSS 3 Mod Released for Free by PureDark – Native Implementation Is There, Just Hidden

 
Let DD2 be a lesson that bespoke engines aren't always good. For all the criticism, UE5 would run this game much better with a lot more geometry.

I think it's a good example of using an engine in a way it was never designed for, and there's no guarantee that UE5 would be any better, as it completely depends on this game's specific needs and whether UE5 already supports them or they would need adding.

For the Resident Evil games the RE Engine is performant as heck; they're just pushing it too far with this game.
 
One of their goals is pushing the engine in new directions, and updating the various components to facilitate that as they go along. They need more experience with programming these types of games... and well, that's what Dragon's Dogma 2 is. It's a different kind of technical challenge to the Resident Evil games. Hopefully it will lead to a better, more optimized and versatile engine overall.
 
So the only reputable channel we have is Digital Foundry.

Which shows the reduced L3 cache really cripples the console CPUs.

The cache reduction will no doubt hurt console CPUs as you say, but I think it's also possible that memory access hurts them. The memory controller is probably set up like a GPU's, to prioritise bandwidth over latency. Higher memory latency would only compound the cache reduction.

PC has much higher driver and API overheads than consoles do, which can eat into the 3600's performance advantage, but in a game where the CPU bottleneck is elsewhere, like AI or scripting, that's probably not going to help consoles, and the 3600 will pull comfortably ahead.
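For what it's worth, the bandwidth-versus-latency point is easy to illustrate with a pointer-chasing microbenchmark: a chain of dependent loads defeats prefetching, so the time per hop approximates raw memory latency rather than bandwidth. This is only a rough sketch (buffer size, the random-walk construction and timer overhead all matter), not a claim about what the 4700S or a 3600 actually measure:

```cpp
// Rough pointer-chasing latency sketch: each load depends on the previous one,
// so throughput is bounded by memory latency, not bandwidth.
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <numeric>
#include <random>
#include <utility>
#include <vector>

int main() {
    const std::size_t n = std::size_t{1} << 24;        // ~128 MB of indices, far larger than any L3
    std::vector<std::size_t> next(n);
    std::iota(next.begin(), next.end(), std::size_t{0});

    // Sattolo's algorithm: turns the identity into a single n-cycle, so the
    // chase visits every slot instead of getting stuck in a small (cacheable) loop.
    std::mt19937_64 rng{42};
    for (std::size_t i = n - 1; i > 0; --i) {
        std::uniform_int_distribution<std::size_t> dist(0, i - 1);
        std::swap(next[i], next[dist(rng)]);
    }

    const std::size_t hops = 10'000'000;
    std::size_t idx = 0;
    auto t0 = std::chrono::steady_clock::now();
    for (std::size_t i = 0; i < hops; ++i)
        idx = next[idx];                                // dependent load chain
    auto t1 = std::chrono::steady_clock::now();

    const double ns = std::chrono::duration<double, std::nano>(t1 - t0).count() / hops;
    std::printf("~%.1f ns per hop (idx=%zu)\n", ns, idx); // printing idx keeps the loop from being optimised away
    return 0;
}
```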
 
I think it's a good example of using an engine in a way it was never designed for, and there's no guarantee that UE5 would be any better, as it completely depends on this game's specific needs and whether UE5 already supports them or they would need adding.

For the Resident Evil games the RE Engine is performant as heck; they're just pushing it too far with this game.
The problem is that rumours say Resident Evil 9 is going to use the same approach as DD2 and go open world. I don't know how that is going to work tbh. In RE games I prefer the more controlled, calculated (speedrun-friendly too), and oppressive atmospheres of the RE1 mansion or the RE2 police station.
 
I don't imagine the 4700S would perform similarly to a 3600. There are too many variables preventing the 4700S from reaching its full performance on a PC when comparing it to a 3600. Yeah, it may have higher bandwidth, but it also has higher latency because of the GDDR. I'd be curious to know how the PS5 APU compares to other AMD APUs with similar graphics performance.

I did see a video where the 4700S can reach 4.1 GHz. And I believe its minimum wattage was 85 watts. I'll see if I can find the video.

Here's the link to the video on YouTube.

 
Just some mumblings regarding the CPU discussion.

The 3600 likely gets brought up as an analogy because it's more common in enthusiast circles, but the 4700G (Zen 2 APU, 8MB L3 cache) would likely be more akin to the console CPUs, aside from clock speed.

If you look at 4700G comparisons against the Zen 2 desktop CPUs in gaming, while it's slower, it's roughly a 5-10% deficit on average. While the cache is larger on the desktop CPUs, there is also some clawback in memory subsystem performance in the form of added latency from the L3 configuration and the chiplet layout.

It's worth noting that while we use the term CPU performance, we often don't really mean CPU performance in an isolated sense but multiple subsystems together. For instance, when we say GPU performance we really mean the entire graphics card; if we were to configure everything else on that graphics card differently, we'd get different gaming performance even with the same chip.

With that said, using those 4800S comparisons against typical CPU builds has an issue in that the memory and also the PCIe connection are much less performant than in typical DIY builds. DIY builds, for example, almost always use heavily overclocked memory (even the most budget builds, as there is no price premium). I think the 4800S might have 50% higher latency once you go out to system memory compared with a typical 3600 build? Slower PCIe performance as well, which we shouldn't discount the effects of if we're talking about very high fps.

I suspect the console development environment and system platform (e.g. shared memory) likely do a better job with some of the above issues (such as hiding memory latency) than the PC. Also, the games are likely optimized around a 30 fps or 60 fps target for the most part; this shouldn't be overlooked either, as the underpinnings might not be designed with scaling to very high fps in mind, at which point you could be running into various subcomponent bottlenecks if you translate it to an open platform like the PC.
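To make the "latency compounds the smaller cache" point a bit more concrete, here's a back-of-the-envelope average-memory-access-time comparison. Every figure in it is a placeholder chosen for illustration, not a measured number for the 4700S/4800S or a 3600 build:

```cpp
// Toy AMAT (average memory access time) comparison:
//   AMAT = L3 hit time + L3 miss rate * DRAM latency
// (levels above L3 ignored for simplicity; all numbers are illustrative assumptions).
#include <cstdio>

int main() {
    const double l3_hit_ns = 10.0;

    // Hypothetical desktop build: large 32 MB L3, low-latency DDR4.
    const double desktop_miss_rate = 0.05, desktop_dram_ns = 70.0;
    // Hypothetical console-style setup: 8 MB L3 (more misses), GDDR6 (~50% higher latency, per the post above).
    const double console_miss_rate = 0.10, console_dram_ns = 105.0;

    const double desktop_amat = l3_hit_ns + desktop_miss_rate * desktop_dram_ns;   // 13.5 ns
    const double console_amat = l3_hit_ns + console_miss_rate * console_dram_ns;   // 20.5 ns

    std::printf("desktop ~%.1f ns, console-style ~%.1f ns (+%.0f%%)\n",
                desktop_amat, console_amat, 100.0 * (console_amat / desktop_amat - 1.0));
    // The higher miss rate and the higher DRAM latency multiply together,
    // which is why the two effects compound rather than simply add.
    return 0;
}
```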
 
I don't imagine the 4700S would perform similarly to a 3600. There are too many variables preventing the 4700S from reaching its full performance on a PC when comparing it to a 3600.

I did see a video where the 4700S can reach 4.1 GHz. And I believe its minimum wattage was 85 watts. I'll see if I can find the video.

Also worth bearing in mind that the desktop 4700S has a noticeable clock speed advantage over the CPU inside the PS5 and Series X. So single-threaded performance is going to be even lower on the consoles; their strength definitely lies in multithreaded performance, but the increased L3 on the 3600 helps to mitigate being two cores down.
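Putting rough numbers on that clock gap (the 4.1 GHz figure is the one from the video mentioned above; the console clocks are the published maximums, and this assumes performance scales roughly with frequency on the same Zen 2 core, which is a simplification):

```cpp
// Rough clock-for-clock comparison between the desktop 4700S kit and the consoles.
#include <cstdio>

int main() {
    const double kit_4700s = 4.1;   // GHz, per the linked video
    const double ps5       = 3.5;   // GHz, PS5 (variable, up to)
    const double series_x  = 3.6;   // GHz, Series X with SMT enabled
    std::printf("4700S vs PS5:      +%.0f%% clock\n", 100.0 * (kit_4700s / ps5 - 1.0));
    std::printf("4700S vs Series X: +%.0f%% clock\n", 100.0 * (kit_4700s / series_x - 1.0));
    return 0;
}
```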
 
I wonder if the lack of cache on the CPU in the consoles is somewhat made up for by the larger main memory bandwidth.

It's also worth noting that console devs will be specifically coding for that reduced cache to avoid misses as much as possible, whereas PC code will generally be written assuming as much cache as possible. So seeing how poorly PC code runs on that CPU doesn't necessarily tell us how well console code that is optimised for the lesser cache would run on it, especially with much more restrained system memory bandwidth.
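As a trivial illustration of what "coding for the reduced cache" can look like in practice, a structure-of-arrays layout streams only the data a pass actually needs, so a small L3 sees far fewer capacity misses than with a fat array-of-structs. This is a generic sketch of the idea (the entity layout is made up), not anything specific to console SDKs:

```cpp
// Array-of-structs vs structure-of-arrays for a simple position update.
#include <cstddef>
#include <vector>

struct EntityAoS {
    float pos[3];
    float vel[3];
    float misc[26];   // illustrative cold data; 128 bytes per entity in total
};

void update_aos(std::vector<EntityAoS>& ents, float dt) {
    for (auto& e : ents)                       // drags all 128 bytes per entity through the cache
        for (int i = 0; i < 3; ++i)
            e.pos[i] += e.vel[i] * dt;
}

struct EntitiesSoA {
    std::vector<float> px, py, pz;
    std::vector<float> vx, vy, vz;             // cold data lives elsewhere and never pollutes this pass
};

void update_soa(EntitiesSoA& e, float dt) {
    const std::size_t n = e.px.size();
    for (std::size_t i = 0; i < n; ++i) {      // touches 24 bytes per entity instead of 128
        e.px[i] += e.vx[i] * dt;
        e.py[i] += e.vy[i] * dt;
        e.pz[i] += e.vz[i] * dt;
    }
}
```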
 
The game has a mostly good RTGI implementation. There are image quality problems on Series X in resolving the checkerboarded output. Resolution is hard to determine, fps is between 30 and 40, Series X wins in GPU-limited scenarios, while PS5 wins in CPU-limited scenarios.

I think the reason CPU pressure scales with resolution is down to the way RE Engine handles occlusion culling. At least in RE2 Remake, occlusion culling is handled via CPU-based software rasterization into a low-res buffer. It seems reasonable to assume that the occlusion test resolution gets increased slightly alongside the rendering resolution.
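For anyone unfamiliar with the technique, CPU-side occlusion culling of this sort boils down to rasterising big occluders into a small depth buffer and then testing object bounds against it, so if that buffer's size is tied (even loosely) to the output resolution, the CPU cost rises with resolution. A very stripped-down sketch of the test side, purely to illustrate the idea; the buffer size and the exact test are assumptions, not how RE Engine actually does it:

```cpp
// Minimal software occlusion test: occluders are rasterised into a small CPU-side
// depth buffer elsewhere; here we only test an object's screen-space bounds against it.
// The cost of both passes scales with width * height of this buffer.
#include <cstddef>
#include <vector>

struct DepthBuffer {
    int width = 320, height = 180;             // "low-res buffer" size is a made-up example
    std::vector<float> depth;                  // 1.0f = far plane
    DepthBuffer() : depth(static_cast<std::size_t>(width) * height, 1.0f) {}
};

// Returns true if the object might be visible, false if every covered pixel of the
// buffer is nearer than the object's closest point. Caller clamps x0..x1 / y0..y1 to bounds.
bool aabb_possibly_visible(const DepthBuffer& db,
                           int x0, int y0, int x1, int y1, float nearest_depth) {
    for (int y = y0; y <= y1; ++y)
        for (int x = x0; x <= x1; ++x)
            if (nearest_depth < db.depth[static_cast<std::size_t>(y) * db.width + x])
                return true;                   // an occluder didn't cover this pixel closely enough
    return false;                              // fully occluded: cull the object
}
```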
 