Digital Foundry Article Technical Discussion [2021]

Status
Not open for further replies.
I wonder if DF can find out the actual truth behind the file size differences between XSX and PS5, rather than the wishful conjecture that's been going around on gaming boards.

Any reason to believe it's not the deduplication of assets?
 
I'm actually starting to think the XSX is the better-designed console, as it seems to catch up and excel the deeper we get into the generation, and it's only been a few months. Especially since ray tracing isn't going anywhere anytime soon.
 
Still, during real gameplay it has more drops than the PS5 in Control.
 
Most (but not all) gameplay comparisons (i.e. when the CPU has much more work to do) with similar settings show a performance advantage on PS5 (or near-identical performance).
I've watched all of DF's PS5 vs. XSX comparisons and my main takeaway was that the PS5 outperformed the XSX with alpha effects (Valhalla smoke, Hitman 3 grass). How much is the CPU involved in alpha blending?
It's unintuitive, but unless the CPU is using specific power-hungry instructions, max power on PS5 should be reached during cutscenes and other scenes that don't tax the CPU (particularly if uncapped).

On PS4 and Pro, for instance, the fan often spins fastest during non-gameplay scenes: in cutscenes, or when the GPU is not limited by CPU logic, as in start screens.
Don't fans spin up during cutscenes because they contain a ton of post-processing that maxes out the GPU? Map screens may also inadvertently max out the GPU by rendering something semi-transparent on top of the game, or somehow losing adaptive vsync in doing so.

Max overall APU power may be different from hotspotting a part of the die with a specific load, right?

So what DF tested with Hitman 3, and here with Control, is likely the worst case possible for PS5: in those scenes (notably since they are uncapped) the GPU is more likely to be downclocked than in gameplay scenes, because they aren't stalled by CPU logic that would make the GPU wait for some work to be done.
Can the PS5 APU sustain max clocks if only one part (CPU or GPU) is being stressed? Similarly, what’s more likely to make the XSX’s APU throttle or its fan max out, a balanced load or slamming just the CPU or GPU?
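The power-sharing question above can be illustrated with a toy model of a shared APU power budget (in the spirit of AMD's SmartShift, which PS5 uses). To be clear, the budget and wattage figures below are made-up numbers for illustration, not Sony or AMD specs:

```python
# Toy model of a shared APU power budget (SmartShift-style). The 200 W
# budget and per-unit demands are invented numbers, purely illustrative.

def allocate_power(total_budget_w, cpu_demand_w, gpu_demand_w):
    """Give the CPU what it asks for (up to the budget), then hand the
    remainder to the GPU. Returns (cpu_w, gpu_w, gpu_starved)."""
    cpu_w = min(cpu_demand_w, total_budget_w)
    gpu_w = min(gpu_demand_w, total_budget_w - cpu_w)
    return cpu_w, gpu_w, gpu_demand_w > gpu_w

# Cutscene/photo mode: CPU nearly idle, GPU uncapped.
print(allocate_power(200, 20, 180))   # -> (20, 180, False): GPU fully fed

# Heavy gameplay: CPU busy, so the GPU may have to downclock.
print(allocate_power(200, 60, 180))   # -> (60, 140, True): GPU power-capped
```

Under this simple model, an uncapped cutscene is where the GPU can draw the most power, and a CPU-heavy gameplay scene is where it is most likely to be forced down, matching the scenario described above.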
 
That's a good question! I wish I knew.

Moving a main thread around seems to have a cost, but AFAIK PCs often do it many times a second to balance the thermal load of the most demanding thread so they can boost optimally, so unless something is going wrong (maybe with the scheduler?) I don't think it should be causing hitches like these.
I agree with you.
But there's also no reason for this to happen anyway, as unlike on PC, the game can lock its threads to cores.
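For anyone curious what "locking to cores" looks like in code: consoles expose their own proprietary affinity APIs, but the Linux equivalent is a one-liner. A minimal sketch (Linux-only, since `os.sched_setaffinity` is a Linux call):

```python
import os

# Pinning a process/thread to one core avoids the cost of the scheduler
# migrating it between cores mid-frame. Linux-only: sched_setaffinity is
# not available on Windows/macOS; consoles have analogous private APIs.
def pin_to_core(core_id: int) -> set:
    os.sched_setaffinity(0, {core_id})   # 0 = the calling process
    return os.sched_getaffinity(0)       # report the resulting affinity mask

print(pin_to_core(0))  # -> {0}
```

Game engines typically do this once at startup for the render and main threads, which is why PC-style thread migration shouldn't be a factor on console.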
 
Why did Remedy cut ray tracing from the XSS? Memory issues already? Maybe that's why 60 fps ray tracing at 1080p doesn't work in gameplay on the higher-end consoles either.
 
But in Control there are also plenty of alphas in some photo mode scenes, and the XSX still performs a bit better (though in some scenes the performance is indeed identical). I think it's when there is a mix of GPU and CPU work involved that the PS5 has the advantage, thanks to its design focused on low latency, which favors those scenes: the CPU cache, the GPU caches, and maybe the specific GDDR6 chips they are using. But we don't have any more details about those, only that they are somehow better (at the same clocks) than the others.

Usually the fan spins at max on PS4 when the GPU can be maxed out (in very different kinds of scenes: polygon display, shader work, even 2D stuff in indie games) without being restrained by CPU game logic. In those scenes the GPU is being used like in a FurMark test (and can also use 100% of the bandwidth without contention from CPU jobs). But those scenes are obviously not typical gameplay scenes (except maybe in a walking simulator).
 
There are some small differences in some comparisons, but I don't know if they can explain the FPS delta. In other cases they both seem identical and yet the XSX performs better.
This is the kind of delta I would expect given the relative differences in GPU compute and memory bandwidth. A few bits of missing geometry or texture detail are not going to boost the frame rate by 16% or more! :nope:
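For reference, the raw spec ratios line up with that kind of delta. Using the well-publicized peak figures (XSX: 12.15 TFLOPS, 560 GB/s on its fast 10 GB pool; PS5: 10.28 TFLOPS at max boost clock, 448 GB/s):

```python
# Public peak specs; the PS5 TFLOPS figure assumes its max boost clock,
# which it does not necessarily sustain in every workload.
xsx_tf, ps5_tf = 12.15, 10.28
xsx_bw, ps5_bw = 560, 448

print(f"compute delta:   {xsx_tf / ps5_tf - 1:.1%}")   # ~18.2%
print(f"bandwidth delta: {xsx_bw / ps5_bw - 1:.1%}")   # 25.0%
```

So a 16%+ frame-rate gap in a purely GPU-bound scene is roughly what the paper specs predict, without needing to invoke settings differences.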

I do wish devs would just include a 'NOT RECOMMENDED' unlocked fps mode. Let the user and their TV's VRR implementation determine if they like it or not.
 
Or just adjust the rendering load by scene, if they're hitting 40-50fps in one area, then add a few graphical features to that area, while dropping them in The Corridor of Doom.
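A hedged sketch of the mechanism most engines actually use for this: a dynamic resolution controller that nudges the render scale toward the frame-time budget instead of shipping a fixed "unlocked" mode. The step size, clamps, and hysteresis threshold below are invented values, not any engine's real tuning:

```python
# Minimal dynamic resolution scaling (DRS) controller sketch.
# Constants (budget, step, clamps, 0.9 hysteresis factor) are assumptions.

def adjust_render_scale(scale, frame_ms, budget_ms=16.7, step=0.05):
    """Drop resolution when over budget; creep back up only when clearly under."""
    if frame_ms > budget_ms:
        scale -= step
    elif frame_ms < budget_ms * 0.9:   # hysteresis avoids oscillation at the edge
        scale += step
    return max(0.5, min(1.0, scale))   # clamp between 50% and 100% resolution

scale = 1.0
for ms in [20.0, 19.0, 17.0, 14.0, 14.0]:   # a heavy scene easing off
    scale = adjust_render_scale(scale, ms)
print(round(scale, 2))  # -> 0.95
```

This is effectively the "adjust the rendering load by scene" idea, done automatically per frame rather than hand-tuned per area.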
 

Well, yes, if we count those big stutters. If we look at normal FPS drops they perform the same. Plus the XSX seems to perform much better in the unlocked ray-tracing photo mode.

The funny thing is, not so long ago all games looked and performed much better on PS5: higher resolution, better details, better framerates. In just two months or so we've reached a situation where games perform similarly, with the same or better settings on XSX. I don't know if this is the tools getting better, or maybe not all cross-gen titles work so well on the PS5 architecture. The PS4 was the lead platform for many titles, so maybe the code was already much better optimized for Sony's platform. Maybe the PS5 has a better design for early titles but the XSX design will take over in upcoming titles. Or maybe all those hitches and freezes across multiple titles on XSX are a sign that there is something really wrong with the XSX architecture and it will last the whole gen.

For me the next interesting comparison will be Outriders. The demo should be available February 25th, so I'm hoping for a DF test.
 
Using the same settings in the same scene, this game performs better on PS5, but that's during gameplay. The XSX usually has the performance advantage in cutscenes or photo mode, rarely during gameplay. I think we'll see this pattern for the whole gen.
Surely because OS message processing caused by input draws too much performance on Windows. :D
 
I don't see much difference in these two months: some games or some modes play better on PS5, others on XSX, most very close. Hitman 3 could be one outlier with a clear XSX advantage, but then the Mendoza level ;)
 
@Globalisateur, one spec we're never given is RAM timings. It can make a huge difference in CPU performance. I would imagine PS5 and Series X are in the same ballpark, but latency to memory matters a lot.
Exactly. Interestingly, we have already learned the PS5 uses custom GDDR6 chips called MT61K512M32KPA-14C:B. They are labelled 14C instead of 14. The only thing we know about them is that they should perform better than the other 14 Gbps models (because they have a better speed-grade mark).

[images: photos of the PS5's Micron GDDR6 chips and their markings]
 
You could call it “framerate go VRR” if you want to get that coveted internet meme demographic.

Apparently, at least on Xbox, there is an API call to check whether the TV supports VRR, so this would not even be such a risk to the user experience.
 

It can indeed. As a PC gamer with a lot of OCs over the years: memory latency can have an impact on CPU performance, but it greatly depends on the workload. For BF4 it made a rather large difference, but in other games it didn't matter at all. Though I doubt the XSX's stutter in Control is due to memory timings.
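The latency-vs-bandwidth point can be demonstrated with a classic pointer-chase: walking memory in a random order defeats the hardware prefetcher and pays the full access latency on every step, while a sequential walk streams through the caches. A rough sketch (in Python the interpreter overhead mutes the effect considerably compared to native code, so treat the numbers as directional only):

```python
import random
import time

# Build two "next index" tables over the same data: one sequential,
# one a random permutation. Chasing them touches identical memory,
# but in very different access patterns.
N = 1_000_000
seq = list(range(1, N)) + [0]               # next[i] = i + 1, wrapping
rnd = list(range(N))
random.shuffle(rnd)                         # next[i] = a random slot

def chase(nxt, steps=N):
    """Follow the index chain for `steps` hops; return elapsed seconds."""
    i = 0
    t0 = time.perf_counter()
    for _ in range(steps):
        i = nxt[i]
    return time.perf_counter() - t0

print(f"sequential: {chase(seq):.3f}s  random: {chase(rnd):.3f}s")
```

This is exactly why some workloads (lots of dependent, scattered loads) care about memory timings while others (streaming, compute-dense) barely notice them.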
 
It's unintuitive, but unless the CPU is using specific power-hungry instructions, max power on PS5 should be reached during cutscenes and other scenes that don't tax the CPU (particularly if uncapped).

On PS4 and Pro, for instance, the fan often spins fastest during non-gameplay scenes: in cutscenes, or when the GPU is not limited by CPU logic, as in start screens. The start screen of God of War is actually used by DF to measure the max power consumption of the Pro. As stated by Cerny, the map of Horizon makes the fan go into hyperdrive. Another example would be MGS5: one of the most technically impressive games on PS4 is at its noisiest during the start screen or cutscenes, while the game is usually quite silent during gameplay. There are plenty of other examples on PS4, and we know Cerny and co. studied tons of PS4 games when they designed the PS5's dynamic clocks.

So what DF tested with Hitman 3, and here with Control, is likely the worst case possible for PS5: in those scenes (notably since they are uncapped) the GPU is more likely to be downclocked than in gameplay scenes, because they aren't stalled by CPU logic that would make the GPU wait for some work to be done.
Imo, the CPU sharing power with the GPU is the worst-case scenario. In areas where no CPU work is required, all the power draw can go towards the GPU. The CPU is unlikely to draw additional power away from the GPU in these scenes because it has nothing to do; this is in fact the best-case scenario for the PS5 if you want the absolute maximum frame rate. Per Cerny's words, the PS5 handles its clocking based on the code, so if it detects it's going to spin up for no reason and fly it to the moon, it will clock down to conserve that power.

I don't want to have to go back through the many posts that tried to tell me for a long time that both the CPU and GPU would run at their maximum clocks on PS5 and that there's no way the CPU would 'ever' dip the GPU down enough to drop below 10 TF. And now, in this perfect scenario (no CPU work, all GPU), it's supposedly drawing more power? That seems counterintuitive to a great deal of the discussion we had with multiple members before release.

I've seen this crop up several times: cache sizes, joined caches, latency, etc. Frankly I don't know what to make of it; it seems too big a deal for two consoles apparently both using Zen 2 + RDNA 2. But all the games in which we've seen a 'PS5 win' are basically a discussion around who's dropping below the 60 fps target more. You don't actually see how far either console goes above it. Is it possible that the XSX could both be outperforming the PS5 in unlocked scenarios and, under stress, also underperforming it? Sure it can. Do we know if it is? No. But we are assuming it can't because we haven't seen many unlocked-framerate games, and not the recent ones either.

The PS5 has been, as far as I can see, very consistent in its performance since launch (BC aside), and the XSX not so much. So I'd say the XSX has a problem, rather than the PS5 having secret sauce.

Imo, if you want to know what's up with the dipping on XSX, stop looking for things on PS5 (it's not dipping) and look at where the XSX could be doing things differently.
 
"In areas where no CPU work is required, all the power draw can go towards the GPU. The CPU is unlikely to draw additional power away from the GPU in these scenes because it has nothing to do; this is in fact the best-case scenario for the PS5 if you want the absolute maximum frame rate."

This "Photo Mode Benchmark" is using RT, right?
I understand that the bottleneck in this comparison is the RT power of each GPU, and because the PS5 has fewer CUs, and so less RT hardware, the FPS hits a limit earlier in this benchmark. So it doesn't matter that this is a best-case scenario for the GPU just because the CPU may be almost idle. I thought this "benchmark" was useful precisely for finally seeing in practice the difference in RT potential between the two machines, not other strengths and weaknesses.
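A back-of-the-envelope version of that RT gap: each RDNA 2 CU contains one ray accelerator, so if peak ray-intersection throughput simply scales with CU count times clock (a simplifying assumption that ignores bandwidth, caches, and how well the clocks are sustained), the ratio follows directly from the public specs:

```python
# Public specs: XSX has 52 CUs at a fixed 1.825 GHz; PS5 has 36 CUs
# at up to 2.23 GHz (variable). One ray accelerator per RDNA 2 CU.
xsx_rt = 52 * 1.825   # relative peak intersection rate
ps5_rt = 36 * 2.23

print(f"XSX peak RT advantage: {xsx_rt / ps5_rt - 1:.0%}")   # ~18%
```

So an XSX lead on the order of 15-20% in a purely RT-bound photo mode scene would be consistent with the hardware counts, without needing any other explanation.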

What this comparison got me thinking is that maybe Remedy could have enabled contact shadows on the Series X.
 