Digital Foundry Article Technical Discussion [2024]

It doesn't speak loudly about anything. We can safely assume that Digital Foundry used a CPU on the PC side that was sufficiently powerful to isolate GPU performance - which is what they were testing.



You don't need to control these elements. Game developers will control them for you. As I noted above, if the game is CPU limited on console, the developers would raise the resolution to take advantage of the excess GPU power - or DRS will do that automatically. It's a pretty safe assumption that console games generally run at, or close to, their GPU's limits, because it's so trivial (relatively speaking) to soak up excess GPU power with increased resolution.
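As a rough illustration of what "DRS will do that automatically" means (a minimal sketch with made-up numbers and names, not any particular engine's controller): the scaler watches GPU frame time against the frame budget and pushes resolution up whenever the GPU finishes early, which is why spare GPU headroom on console tends to get eaten.

```python
# Minimal sketch of a dynamic resolution scaling (DRS) controller.
# Hypothetical numbers and names - not any particular engine's implementation.

TARGET_FRAME_MS = 16.6   # 60 fps budget
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def update_resolution_scale(scale: float, gpu_frame_ms: float) -> float:
    """Nudge the render-resolution scale so GPU frame time tracks the budget.

    If the GPU finishes well under budget (CPU or a frame cap is the limiter),
    the controller raises resolution to soak up the spare GPU headroom;
    if the GPU is over budget, it drops resolution to protect the frame rate.
    """
    error = gpu_frame_ms / TARGET_FRAME_MS          # >1.0 means GPU over budget
    scale /= error ** 0.5                           # pixel count ~ scale^2, so adjust gently
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# Example: GPU only needs ~12 ms of a 16.6 ms budget -> scale climbs toward 1.0,
# which is why console games tend to sit at (or near) their GPU limit.
scale = 0.8
for gpu_ms in [12.0, 12.5, 13.8, 15.9, 16.4]:
    scale = update_resolution_scale(scale, gpu_ms)
    print(f"gpu={gpu_ms:>4} ms -> resolution scale {scale:.2f}")
```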
I don't agree but you're free to believe what you want. Making unchecked assumptions when attempting to perform a test/experiment is a recipe for disaster.
 
The burden of proof lies with @Flappy Pannus to prove that this statement is true for all games tested. The video I posted is simply meant to highlight that there are scenarios where you can be CPU limited even on a GPU as weak as the 6700. In Richard's video, we see Hitman 3 showing a 40% difference between two GPUs that differ by only ~10% in clocks, at a high resolution. That result seems rather interesting. It's then followed by Monster Hunter Rise, where we see a 37% difference from a 10% GPU clock advantage. If that doesn't scream CPU limited, I don't know what does.
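For what it's worth, here is the arithmetic behind that reading, using only the percentages quoted above (nothing measured here). It only shows how much of the gap clock speed alone can't account for; whether the remainder is down to CPU, memory, API, or plain optimization is exactly what gets argued over below.

```python
# Back-of-the-envelope check on the scaling argument above.
# Inputs are just the percentages quoted in this thread, nothing measured here.

clock_advantage = 0.10       # 6700 clocked ~10% higher than the PS5 GPU
observed_gaps = {"Hitman 3": 0.40, "Monster Hunter Rise": 0.37}

# If both platforms were purely GPU-clock-bound, the fps gap should roughly
# track the clock gap (~10%). A 37-40% gap leaves a residual that clocks
# alone can't explain - CPU, memory, API, or optimization.
for game, gap in observed_gaps.items():
    residual = (1 + gap) / (1 + clock_advantage) - 1
    print(f"{game}: ~{residual:.0%} residual gap beyond what a "
          f"{clock_advantage:.0%} clock delta explains")
```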

You could always just check.


[Image: Hitman 3 CPU benchmark chart]

If an ancient 6-core 4930K can average 122fps in a CPU limited scenario in this game, do you really think the PS5 is CPU limited to a mere 55fps, which is the framerate displayed in Richard's video? Or is it more likely that the devs have simply ramped up the resolution to hit a 60fps target?
 
I don't agree but you're free to believe what you want. Making unchecked assumptions when attempting to perform a test/experiment is a recipe for disaster.

Surely assuming that the games are CPU limited on PS5 is an "unchecked assumption"?

There is no way to get an exact CPU performance match between the platforms for numerous reasons. So any test performed has to come with a set of assumptions (as, in fact, is the case with most scientific research). But if the assumptions are reasonable and based on some kind of logical reasoning, then the final results can still be valid. The reasoning here is likely a combination of the fact that console devs will generally increase graphics settings to maximise GPU usage (thus eliminating or minimising CPU bottlenecks), combined with DF's likely extensive knowledge of where CPU bottlenecks come into play in the games they tested.
 
The Last of Us on PC is a known broken port. Again, my point of contention is not the results but the test methodology. Using a known broken port like The Last of Us highlights poor test methodology.

And yet, you assume that when it goes the other way, it somehow self-evidently supports your theory. Why is Hitman 3 not also just sub-optimal on the PS5, rather than necessarily indicative of a hardware CPU bottleneck? Even Rich alludes to this in mentioning that it was one of the earlier PS5 ports and that its drops were potentially down to a lack of optimization.

Speaking of Hitman 3, here's how "CPU limited" it is. Here's the scene Rich used where it dropped below 60fps:

[Image: Hitman 3 scene from Rich's video where the PS5 drops below 60fps]


Same area, my system - 12400f, DDR4 32GB, 3060. Max details, no RT, 720p.


[Image: the same Hitman 3 scene on the 12400f/3060 at 720p, max details]


720p is the lowest res the game will go, but even at 177fps, I'm still not CPU limited! (At least in this scene - moving around I will hit GPU usage in the 80s.)

Now, is my 12400f faster than the PS5's CPU by a fair amount, at least in single-threaded performance? Sure, probably - especially if they were on the same platform with the same API. But over 3 times as fast? No. At the very least, the 6700 here is not reaching 80fps in Rich's video because of an overpowered CPU - a Core i3 could get past that.
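A trivial ratio check, using only the frame rates already quoted in this thread (55fps for the PS5 in that scene, 177fps on the 12400f, which per the above isn't even fully CPU bound yet):

```python
# Rough ratio check using only numbers already quoted in this thread.
ps5_fps_in_scene = 55      # PS5 frame rate in the scene from Rich's video
pc_fps_same_scene = 177    # 12400f/3060 at 720p, same scene, not yet CPU bound

required_cpu_deficit = pc_fps_same_scene / ps5_fps_in_scene
print(f"For the PS5's 55 fps to be a CPU ceiling, its CPU would have to be "
      f"~{required_cpu_deficit:.1f}x slower than a 12400f in this scene.")
# And since the 177 fps figure is itself still partly GPU bound, the real CPU
# ceiling is higher, making the required deficit even larger.
```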

So I believe we can safely excise Hitman 3 as any example of a CPU bottleneck on the PS5, at least I sure as fuck hope so.


The cpu was never mentioned I believe and the absence of that information speaks loudly....

Expand on this. What's the implication here? Are you saying Rich purposefully obfuscated the CPU results? It speaks loudly of...what? Spell it out.
 
@Flappy Pannus Your whole argument is centered around proving your opinion is "correct" rather than examining how you could have missed something. Putting the actual framerates aside and just looking at the testing methodology, it's very easy to see how and why this approach to testing is wrong. If you're attempting to compare two things, it's paramount that you control all other variables so that you can ensure that only those two things are being tested. There are a multitude of ways to do this, but it must be done. Failure to do so will guarantee that you're not necessarily testing the things you want to test.

Bringing frame rates back into the discussion, Rich's assertion is that the 6700 is similar to the PS5's GPU. The test is structured to ensure that the Infinity Cache hit rate is poor due to the higher resolutions, but it fails to control for all other variables. Take the Hitman example that shows a 40% discrepancy. If you've eliminated the cache differences from the equation, then, if your theory were correct, downclocking the GPU by 10% should deliver performance similar to the PS5. However, that line of reasoning was never explored. In fact, I'm sure if Richard had tested for that, it wouldn't be the case. So now you have to explain the discrepancies; however, Richard cannot. Why? Because all other variables were not controlled in the test. Was it just a bad port? Does the PS5 GPU not hit its peak clocks? Does it come down to higher CPU memory latency from using GDDR instead of DDR4? Is it due to general underperformance of the CPU? Is it due to the memory subsystem in general? You cannot definitively answer these questions for each game. All you can do is speculate, but you can't even speculate intelligently because there are so many uncontrolled variables in your test.

I think you're too focused on numbers and not on the soundness of the approach, which is what gives you the ability to draw meaningful conclusions. Regardless, if lots of people are pointing it out, there might be some merit to the criticism. Perhaps they see something you don't.
 
@Flappy Pannus Your whole argument is centered around proving your opinion is "correct"

Perhaps not as common as I would like, but that's generally the approach in an argument. Highly recommended!


All you can do is speculate, but you can't even speculate intelligently because there are so many uncontrolled variables in your test.

So then there is no point to the kind of speculation you wanted Rich to engage in. There is no point in Rich speculating about the PS5's "core clocks" when he has no way to measure them. All he can do is comment on potential optimization (or lack thereof), or on clear code bottlenecks like Cyberpunk on the PS5 in raster mode, where the dynamic resolution fails to adapt fast enough to stabilize the framerate.

Since we don't have trace dumps of the actual games running on each platform, there is going to be some level of speculation involved, yes. The kind of speculation matters, though, which is why I'm posting actual CPU-limited scenarios from a budget PC, in the exact same scenes Rich tested, to illustrate why your speculation that there is a CPU bottleneck on the PS5 is highly unlikely to be the case.

The degree of the speculation matters. If my Hitman 3 result showed a CPU bottleneck reaching say, 70fps in the same scene, then yeah - I would posit that it's possible the PS5's CPU is the bottleneck here. But of course, it wasn't remotely close. Speaking of which:

DF:

[Image: DF's result in the same area]

My system, same area:

[Image: the same area on my system]

Nearly double in a CPU limited scenario.

You can argue this doesn't definitively 'disprove' anything, and sure - it doesn't; as you readily admit, speculation is impossible to avoid when we can't see what the code is doing. I'm using these results to point out why I believe your speculation is untethered from likelihood based on the actual evidence. The only actual data you've come anywhere close to providing is a YouTube video showing that a 3600X can be a bottleneck in games and settings that Rich didn't test.

So there are only two results in the games Rich tested that could potentially indicate a CPU bottleneck on the PS5, and I've tested those exact areas on a PC with a budget CPU. The same CPU as in the PS5? No, of course not - I can't test that. So then you are left to argue that the PS5 CPU is so underpowered relative to a $140 6-core CPU that it is this much of a bottleneck - to where my 12400f delivers 2 to 3 times the performance, even with the overhead of its API.

It's a possibility, yes. My argument, though, is that it's about as likely as DF's video being flawed because Rich didn't tell us the polling rate of the mouse he was using. You're free to make these arguments, and I'm free to test the actual scenarios in question to demonstrate their validity.

(Also bear in mind Rich tested the supersampling mode in MHR as well - clearly not CPU limited - and it showed pretty much the exact same performance gap. So this is clearly not down to the CPU; you don't even really need my tests to determine that.)

I think you're too focused on numbers

"Scientific methodology!"

<provides hard data casting significant doubt on hypothesis that CPU is a bottleneck in this game>

"No, not like that!"

Expand on this. What's the implication here? Are you saying Rich purposefully obfuscated the CPU results? It speaks loudly of...what? Spell it out.

Again, circling back. What did you mean by this?

It's one thing to speculate that an established tech channel, which has demonstrated a meticulous approach to detail for years, made a snafu in not recognizing that the results in a video might have been skewed by other components; it's another to imply it 'speaks loudly' about... something.
 
I would say the biggest takeaway is just how ineffective games can be at properly utilizing hardware. This makes it very difficult for consumers to draw hard conclusions when this very unreliable data is all we have. For example, is TLOU really well optimized for PlayStation, or just very poorly optimized for PC? You could look at the breadth of games and say the latter, but maybe those games are just not well optimized for anything in particular.
 
The CPU was never mentioned, I believe, and the absence of that information speaks loudly.... What you're saying is correct, but it still has to adhere to the scientific method of testing. Unfortunately, you can't actually verify whether the console is CPU limited or GPU limited at all. You can guess, but that's it. Because console CPUs are so weak, the API is so different, and the memory subsystem is different, you just cannot say at all. Those are three potential variables you cannot control. Not only can you not control them, you can't even properly monitor them to know when they're affecting the experiment. The test methodology was not good, and I was surprised because DF usually nails these things.
From memory, the only PS5 DF benchmark we actually know (from a known devkit owner/hacker) to be highly GPU limited is Control during photo mode with an uncapped framerate (which makes total sense - during photo mode the CPU isn't being used for game logic). So the benchmark DF made is actually a great way to test the memory bandwidth + GPU against the XSX or PC.
First thing I noticed in the DF pic is that the PS5 is triple buffered vs vsync off (I think?) on PC. That difference alone can create a lot of performance difference (I think up to a 10-20% gap, based on DF vs NXGamer comparisons). Sure, finding identical hardware is hard, but you can at least try comparing using similar settings/hardware (including CPU).
 
First thing I noticed in the DF pic is that the PS5 is triple buffered vs vsync off (I think?) on PC. That difference alone can create a lot of performance difference (I think up to a 10-20% gap, based on DF vs NXGamer comparisons),

We've talked about relying on NXGamer's assumptions before; in fact, I'm pretty sure it was this exact flawed example. That huge 'gap' was based on games with faulty vsync on the PC, such as A Plague Tale: Innocence - when he was talking about that gap, he was showing his GPU being severely underutilized in the very area he was referencing!

There is no rational reason why simply using a proper triple buffered vsync setup would cost you 10-20% performance; it makes no sense. This is trivially easy to test, as you can force triple buffering in DX9/11 games on the PC through Nvidia's Fast Sync (or AMD's Enhanced Sync).

Lo and behold:

No Vsync: 106 fps

[Image: A Plague Tale: Innocence, vsync off, 106 fps]


Fast Sync (triple buffered vsync): 105 fps.


[Image: A Plague Tale: Innocence, Fast Sync, 105 fps]
 
First thing I noticed in the DF pic is that the PS5 is triple buffered vs vsync off (I think?) on PC. That difference alone can create a lot of performance difference (I think up to a 10-20% gap, based on DF vs NXGamer comparisons). Sure, finding identical hardware is hard, but you can at least try comparing using similar settings/hardware (including CPU).
There's no obvious reason that triple-buffering would have reduced average framerate compared with no-vsync, unless you allow the no-vsync case to tear above the refresh rate and count that as "more frames." Typically, the difference between them is that turning vsync off trades screen tearing for reduced average latency.
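To put a toy model behind that (hypothetical frame times, mailbox-style triple buffering as with Fast Sync - not a claim about how any particular game's swapchain works): the renderer never has to wait for a vblank, so the average rendered frame rate matches vsync off; all that changes is that frames above the refresh rate get dropped at the flip instead of being shown as tearing.

```python
# Toy frame-pacing model: with mailbox-style triple buffering the renderer
# never blocks, so average *rendered* frame rate matches vsync off; only
# frames above the refresh rate stop being shown. Hypothetical timings.

def simulate(render_ms: float, refresh_hz: float, frames: int = 1000):
    refresh_ms = 1000.0 / refresh_hz
    t = 0.0
    completed = []                      # completion time of each rendered frame
    for _ in range(frames):
        t += render_ms                  # renderer never waits: vsync off presents
        completed.append(t)             # immediately; with 3 buffers there is
                                        # always a spare back buffer to render into
    rendered_fps = 1000.0 * frames / t

    # Triple buffered: at each vblank the display flips to the newest finished
    # frame; older unshown frames are simply discarded.
    shown = set()
    vblank, i = refresh_ms, 0
    while vblank <= t:
        newest = None
        while i < len(completed) and completed[i] <= vblank:
            newest = i
            i += 1
        if newest is not None:
            shown.add(newest)
        vblank += refresh_ms
    displayed_fps = 1000.0 * len(shown) / t

    print(f"render {render_ms} ms @ {refresh_hz} Hz: "
          f"rendered {rendered_fps:.0f} fps (same as vsync off), "
          f"displayed {displayed_fps:.0f} fps")

simulate(render_ms=9.4, refresh_hz=60)   # ~106 fps GPU: display caps at 60
simulate(render_ms=20.0, refresh_hz=60)  # 50 fps GPU: nothing is dropped at all
```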
 

00:00:00 Introduction
00:00:43 The Good - Frame Rates and PC UX
00:04:05 The Good - Sky and Water Simulation
00:05:22 The Bad - Water Reflections
00:08:13 The Bad - Animations, Textures, and LOD
00:11:12 The Bad - Lighting, Particles and Post-Processing
00:13:00 The Ugly - Xbox Series X/S and PS5 Performance Mode Image Quality
00:16:25 The Ugly - Quality Mode Controls
00:17:50 The Incomprehensible - AAAA Game Design and Bugs
00:21:34 Video avast!
 

11 years and 100s of millions of dollars.
 

yikes.. lol.. damn.
GJ Alex
 