Digital Foundry Article Technical Discussion [2021]

Interesting is that in some cases the difference can be 36%, which is quite close to Hitman 3's performance differential, not to mention the XSX spending much of its time in the 50s and the PS5 in the 40s fps range.
I think you're cherry picking to fit an agenda that doesn't exist. The average is 16% - which is very close to the actual paper spec differences; that alone is significantly more telling than a weird outlier - much like using the corridor of doom example to prove how good the PS5 is - it's highly flawed.

As an example - the texture streamer actually stops streaming in photo mode the moment you freeze the game. So in that scene, for example, I started photo mode on XSX immediately after the elevator door opened. On PS5, Jesse had already been sitting there for a while. So textures may perhaps not be fully streamed to the highest mips on XSX. I have no idea.
Also, there could be z-fighting differences between versions for decals.
There are thousands of reasons. IMO, load the game up on both yourself and make some recordings. None of the version differences have anything to do with the game's settings, which have been confirmed to be the same by... the developer.
This is interesting and in my uneducated mind might explain the gap, which is non-existent in the game modes... with no texture streaming the XSX has the ‘expected’ paper advantage, but once in game mode they are toe-to-toe.

I guess I’m putting it out there that the XSX’s ‘issues’ regarding performance are down to texture streaming?
 
We are in the scenario of a typical tech demo, where only graphics processing is involved in a fixed environment.
Just like when people say the UE5 tech demo does not count because it's only a tech demo and not real game conditions.
So here the superior raw power of the XSX GPU shines.
 
Here we go - to soothe some conspiracy theories. I just went back to the Maintenance Access Corridor after restarting the Directorial Override mission on both consoles, waited a full minute, and made screenshots. With the HUD so people can see which console it is, and without the HUD so people can see the image in full. You can see that this time the texture actually looks different on the PS5 version. It looks like the game's texture streamer is imperfect. A whole lot of fuss over absolutely nothing. The versions are the same.
PS5:
PS51.png


PS5 with UI:
PS52.png


XSX:
XSX1.png


XSX with UI:
XSX2.png


The PS5 renders all games in an HDR container and then applies its own tonemapping down to SDR. So all games in SDR on PS5 will look different from normal SDR as designed by the developer on PS4 or PS4 Pro.

It is something that could possibly be producing a black-crush look in back-compat games displayed in SDR.
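As a toy illustration only (nothing to do with Sony's or Remedy's actual pipelines), here is a sketch of why two different HDR-to-SDR mappings applied to the same linear scene values end up at different shadow and midtone levels; the curves and numbers are invented for the example:

def sdr_gamma(linear, gamma=2.2):
    # Developer-authored SDR path: clamp and gamma-encode directly.
    return min(linear, 1.0) ** (1.0 / gamma)

def hdr_container_then_sdr(linear, gamma=2.2):
    # System-level path: tonemap the HDR value first, then gamma-encode.
    tonemapped = linear / (1.0 + linear)      # simple Reinhard operator
    return tonemapped ** (1.0 / gamma)

for scene_value in (0.01, 0.05, 0.2, 0.8, 2.0):
    direct = sdr_gamma(scene_value)
    via_hdr = hdr_container_then_sdr(scene_value)
    print(f"linear {scene_value:>4}: direct SDR {direct:.3f} vs via HDR container {via_hdr:.3f}")

The darker output of the second path through the low and mid range is the kind of thing that reads as crushed blacks on an SDR screen.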

This will prove that the XSX has smaller puddles ;)

And yes, the engine really has some strange bugs, as I mentioned earlier. E.g. (on PC) the lighting comes in a few seconds after a scene starts, so you can spot differences if you time your screenshots right. There's also the constant short stutter I get when something begins or ends (first enemy appears, last enemy beaten). It seems the engine is just behind its time when loading something (at least in the Windows Store version).
What I really dislike is the constant noise in the image, where pixels shimmer all over the screen.

The corridor of doom should be purely RT limited. It is a scene where the glass reflects (almost over the whole screen) light sources ("windows") behind you + it is transparent + you can see the light source behind that. It just seems to be too much for current RT hardware. And as I wrote before, Control is one of the games that is almost purely RT limited if you turn RT on. The rasterizer options have almost no effect on performance (well, they might if you lower resolution).
 
My fave RT 'testing' space is the Jukebox enclosure. You're basically standing in a glass box with reflections on the ceiling, three sides, and the floor.
 
DF written Article @ https://www.eurogamer.net/articles/...ction-all-graphics-modes-tested-on-both-games

The Nioh Collection on PlayStation 5 - all graphics modes tested on both games
And how it improves over PS4 Pro.

PlayStation 5's Nioh Collection sees Tecmo Koei bring together both of its PS4 epics into one single package complete with all downloadable expansions, offering up a huge amount of content. On top of that, the publisher also promises enhancements to the existing PS4/PS4 Pro releases, with both titles delivering allegedly native 4K modes, as well as support for 120Hz gaming. On top of that is a new 'PlayStation 5 standard mode' - effectively a quality mode targeting the capabilities of the new hardware for a significantly upgraded experience. So how does it all pan out, and what kind of improvement are we looking at compared to the existing PS4 Pro version - which offers 4K, high frame-rate support of its own? It's an interesting question to answer because fundamentally, the older renditions of Nioh on PS4 Pro offered a great degree of flexibility with their own performance and quality modes and as they leaned heavily into unlocked frame-rates and dynamic resolution scaling, you can already get an upgraded experience from the existing games simply by running them under backwards compatibility on the new Sony platform.

...
 
Then the majority of this particular benchmark is RT limited and not alpha transparency limited.
Can't it be both? I haven't really read deep into exactly how AMD hardware handles ray tracing, but from what I have read the RT is all handled using the same shader hardware that handles "regular" compute. If you are using a shader to handle even basic stuff like color math (adding 2 colors from a transparency effect) that would occupy some of those shader units, and if RT is using them also... Do we know how bandwidth heavy RT is on AMD hardware? Could the extra BW from a lot of alpha hamper RT performance?

To add to the screen tearing conversation, it is possible to effectively have screen tearing without technically having screen tearing. If you are outputting a vsync'd signal but are compositing your final buffer from internal buffers that aren't synced, you can get tearing within the contents of those buffers. And if those buffers are full-screen effects, then you would have what looks like traditional torn frames even if the full buffer is being refreshed on the output. I used to see stuff like this pretty often when I watched TV on my PC back in the Windows XP and Vista days. My TV card would output to a buffer (either a DX surface or overlay) and show tearing even though my output was vsync'd to my display. I can't say this is what is happening with Control, only that "the game is double-buffer vsync'd" and "I see torn frames" can both be true at the same time.
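A minimal sketch of that idea (toy code, not Control's or any real compositor): a writer fills an internal buffer row by row with no synchronisation while a "vsync'd" output samples it at a fixed cadence. The output buffer always refreshes whole, yet individual output frames can mix rows from two different source frames:

import threading
import time

HEIGHT = 8
internal = [0] * HEIGHT      # shared internal buffer, one "row" per entry
done = False

def writer():
    # Renders source frames 1..5 into the internal buffer row by row,
    # deliberately without any synchronisation against the output side.
    global done
    for frame in range(1, 6):
        for row in range(HEIGHT):
            internal[row] = frame
            time.sleep(0.001)
    done = True

def vsynced_output():
    # The "display" path: every tick it composites whatever is currently
    # in the internal buffer. The final buffer never tears on scan-out,
    # but its contents can still mix rows from two source frames.
    while not done:
        snapshot = list(internal)
        torn = len(set(snapshot)) > 1
        print(("TORN  " if torn else "clean ") + str(snapshot))
        time.sleep(0.004)

threading.Thread(target=writer).start()
vsynced_output()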
 
We are in the scenario of a typical tech demo, where only graphics processing is involved in a fixed environment.
Just like when people say the UE5 tech demo does not count because it's only a tech demo and not real game conditions.
So here the superior raw power of the XSX GPU shines.

This is basically a benchmark of graphics processing power, to measure what the systems' GPUs are capable of. A very good one at that (as DF said in their video).
UE5 isn't even a game; it's a showcase demo optimized for one platform. It's not on XSX, so it can't be compared.

I think you're cherry picking to fit an agenda that doesn't exist. The average is 16% - which is very close to the actual paper spec differences; that alone is significantly more telling than a weird outlier - much like using the corridor of doom example to prove how good the PS5 is - it's highly flawed.

Oh and there we go again, claims of agendas, conspiracies and other crap that certainly doesn't belong here. Read the first post in the thread and please obey the rules as created for this one.

The calculated average is 16%, yes, as far as they have tested it. But they also state that the XSX spends much of the time in the 50s, whereas the PS5 is in the 40fps range. That's a fact, and if frame-rate locks were needed, the XSX could be locked to 50fps whereas the PS5 would be locked to 40fps. Watch the video again.

Hitman 3 showed a 44% difference too. Also, the paper difference is around 18% in pure GPU capabilities, yes. But there's also the bandwidth feeding that GPU, which happens to be rather important for ray tracing and higher resolutions (and more). Not to forget that one of the platforms has to contend with CPU/GPU power balancing, which does and will have an impact when things get tight.
 
The problem is that a real benchmark runs the same code on the machines being benchmarked.
Here the game is coded for each respective platform, and still has old code from the last-gen versions; it has not been rewritten from scratch.
It still gives an indication, but we should not take it as an absolute benchmark. Otherwise the XSX would always have the same degree of advantage in every scene.
 
Can't it be both? I haven't really read deep into exactly how AMD hardware handles ray tracing, but from what I have read the RT is all handled using the same shader hardware that handles "regular" compute. If you are using a shader to handle even basic stuff like color math (adding 2 colors from a transparency effect) that would occupy some of those shader units, and if RT is using them also... Do we know how bandwidth heavy RT is on AMD hardware? Could the extra BW from a lot of alpha hamper RT performance?
I sometimes like to imagine there being a formula for every game and every engine, where
F(x, y, z) = frames per second,
with x being resolution and y and z being other factors.

When I look at any polynomial, I'm always reminded of its highest power, because that power is going to largely determine the shape of the graph. So x^5y + x^3y + x^2y + x is still largely going to look like an x^5 graph despite all the terms trailing behind it. So in my mind, if RT is going to have this daunting impact on the GPUs, everything else, relatively speaking, is diminished in its impact on frame rate. I.e., the rest of the render pipeline is stalled at RT, and after the stall it can go as fast as it can. So it doesn't really matter what comes before or even after the stall, because the game can only render as fast as that stall point. This idea that you can somehow 'catch up' after the stall doesn't make a lot of sense when you think about it. The 'catch up' part would have to be long and relevant enough to make up for the RT stall, and we just don't see that, because in all the benchmarks the XSX beats the PS5 in the camera modes. If there were such a discrepancy in alpha/ROP performance, the PS5 would blow by the XSX in non-RT-challenging scenes.

Both consoles are being rocked by RT. And we see that happening with the 6800 XT.
The reason the XSX is barely better than the PS5 in the corridor of death is that ultimately it only has marginally better RT. And we see that with the 6800 XT: even with 80 CUs and tons of bandwidth, 2x the ray count dropped its performance back to 34fps with console settings. So it's RT that is the limiter in all these benchmarks.
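To make the dominant-term point concrete, here is a back-of-the-envelope sketch with invented per-stage timings (purely illustrative numbers, treating the stages as if they ran back to back): when one stage eats most of the frame budget, sizeable swings in the other stages barely move the frame rate.

def fps(stages):
    # Treat the frame as the serial sum of its stage costs (in ms).
    frame_ms = sum(stages.values())
    return 1000.0 / frame_ms

baseline = {"geometry": 2.0, "shading": 4.0, "rt_reflections": 16.0, "post": 1.5}
faster_raster = dict(baseline, geometry=1.5, shading=3.0)   # 25% faster raster work
faster_rt = dict(baseline, rt_reflections=13.0)             # ~19% faster RT

print(f"baseline:      {fps(baseline):.1f} fps")
print(f"faster raster: {fps(faster_raster):.1f} fps")       # small gain
print(f"faster RT:     {fps(faster_rt):.1f} fps")           # much bigger gain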
 
The problem is that a real benchmark runs the same code on the machines being benchmarked.
Here the game is coded for each respective platform, and still has old code from the last-gen versions; it has not been rewritten from scratch.
It still gives an indication, but we should not take it as an absolute benchmark. Otherwise the XSX would always have the same degree of advantage in every scene.

While true, it's highly unlikely we will ever see such benchmarks. On PC, to gauge performance, we have 3DMark etc., and those aren't written for each specific hardware vendor either. Ideal would be having some sort of benchmark tool written specifically for both the XSX and the PS5. Though that wouldn't really be a good measure of multiplatform game performance on these systems either, as multiplatform games rarely go that far.

This Control benchmark is the best we've got so far. Its main code isn't true next gen perhaps, but that applies to both systems. The 16% average differential is very close to what the paper specs indicate for the GPUs, at least, which almost perfectly aligns just by looking at the TF difference. I can imagine that when the CUs are saturated and bandwidth becomes a more limiting factor, together with the CPU being hammered, the gap might widen and perhaps explain the 36% difference for Control and 44% for Hitman 3. The future is compute (UE5 etc.), so it's not that unlikely either.
 
The problem is that a real benchmark runs the same code on the machines being benchmarked.
Here the game is coded for each respective platform, and still has old code from the last-gen versions; it has not been rewritten from scratch.
It still gives an indication, but we should not take it as an absolute benchmark. Otherwise the XSX would always have the same degree of advantage in every scene.
All the benchmarks have been bad if we want to be honest about it; in due time, however, enough of these samples may provide a good picture. But people have largely been impatient to make claims, so they make them with each sample despite it being so early in the generation, with so little to work with.

All benchmarks have a capped upper limit, so we've only been comparing dips all this time, without any knowledge of how high the consoles can actually ride.

I think aside from one game, no game has run completely unlocked frame rates at the same locked resolution. And it's clear that game was a disaster trying to make it out of the gates for launch during COVID.

We'd likely have to wait some time for a real benchmark game to be released, likely when PS5 starts supporting VRR more; perhaps then more developers would be willing to release a VRR-unlocked mode.
 
It does not align every time; that's why it's weird and should be taken with caution, because if, as some say here, in photo mode only GPU power talks, why is it that in the corridor scene the difference is negligible?
 
Oh and there we go again, claims of agendas, conspiracies and other crap that certainly doesnt belong here. Read the first post in the thread and please obey the rules as created for this one.
You can cut out that BS - I haven't done anything wrong...mods feel free to ban me if I have!

The calculated average is 16%, yes, as far as they have tested it. But they also state that the XSX spends much of the time in the 50s, whereas the PS5 is in the 40fps range. That's a fact, and if frame-rate locks were needed, the XSX could be locked to 50fps whereas the PS5 would be locked to 40fps. Watch the video again.
I wonder what % difference that is - not only that, it was a very loose interpretation... and no, the game couldn't be locked to that; maybe you should watch the video again?

Hitman 3 showed a 44% difference too. Also, the paper difference is around 18% in pure GPU capabilities, yes. But there's also the bandwidth feeding that GPU, which happens to be rather important for ray tracing and higher resolutions (and more). Not to forget that one of the platforms has to contend with CPU/GPU power balancing, which does and will have an impact when things get tight.
No, everyone likes to roll out the outlier Hitman 3 to prove a point! Hitman was running 44% more pixels on XSX, but we don't know what PS5 could run at because it was running at a locked 60fps. There were times when the PS5 outperformed the XSX, therefore we can be pretty certain the performance delta is less than 44% - who knows how much less.

SMH
 
Both consoles are being rocked by RT. And we see that happening with the 6800 XT.
The reason the XSX is barely better than the PS5 in the corridor of death is that ultimately it only has marginally better RT. And we see that with the 6800 XT: even with 80 CUs and tons of bandwidth, 2x the ray count dropped its performance back to 34fps with console settings. So it's RT that is the limiter in all these benchmarks.

Wait what? It has a lot more CUs - surely it has more than 'marginally better' RT?
 
It does not align every time; that's why it's weird and should be taken with caution, because if, as some say here, in photo mode only GPU power talks, why is it that in the corridor scene the difference is negligible?
And that's fine that it doesn't; there are loads of factors in play that will change with each title.

A quick story: a coin has two sides. If I asked you what the probability of getting heads is, you'd say 50%. This is a reasonable assumption; why would anyone assume that it's greater than 50%?

If I told you I will flip it 20 times, do you think you will get 10 heads and 10 tails? Probably, but it's not exact either. What's the probability that you get 15 heads and 5 tails? Probably not very high, but that probability is still present. Eventually, if it continues to stay lopsided, you need to check for bias, that bias being that heads has a >50% chance of landing.

This is about all we can do. Right now, people are making calls based on 10 coin flips. We're a long way from anything statistically significant. You need to flip 100-200 times to really see that 50% rate emerge. With 20 coin flips there's still roughly a 5.8% chance you'll get 14 or more heads. Which is relevant if you had three girls and want a boy, or vice versa: the chance you'll get one more girl/boy is still quite high.
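For anyone who wants to check that figure, the chance of 14 or more heads in 20 fair flips falls straight out of the binomial distribution (it comes out at around 5.8%):

from math import comb

def prob_at_least(k, n, p):
    # P(X >= k) for X ~ Binomial(n, p)
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

print(f"{prob_at_least(14, 20, 0.5):.2%}")   # ~5.77%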

When people made claims on XSX outperforming PS5, this was a debate on the null hypothesis really; it's like saying heads should be 50% chance of landing. That's the null hypothesis, until we start getting real data from flipping.
12 > 10, and 560 > 448. 52 CUs > 36 CUs for RT workloads. That's the null hypothesis: the XSX should be biased towards being better. The data right now is not reflecting that; sometimes it does, here and there, but most of the time it seems much closer to a draw based on how low the consoles dip (without knowing how high the consoles can ride). A good example is that everyone assumed, based on the first video of Control, that the PS5 was outperforming the XSX just by looking at stutters, and with camera mode we see the opposite with the frame rates unlocked. Now a clear winner is present. So are we really basing claims on good information? That being said, if you want to be real about it: if people are comparing PS5 and XSX, sample testing should deviate the two by at least 5% in performance to separate them. Otherwise we'd consider it a draw. And those who have been advocating for 'draw' have been correct all this time. Though there are some actual 5% wins earlier on for PS5, and the Hitman and Control camera modes are more than 5% wins.

It's clear that there are issues on the XSX side of things; what those issues are is still relatively unclear. Many are quick to point to alpha transparency, ROP performance, fixed-function performance, etc., but in this academic benchmark the XSX wins clearly in all scenarios. And while we don't play synthetic benchmarks, it does tell us that our null hypothesis still seems worth keeping, and that the reasons the XSX is performing at a draw with the PS5 need to be examined. Yes, it's entirely possible that because the PS5 is the lead console, developers will always choose a performance profile in which the PS5 never dips, and cap the higher end so that the XSX cannot pull further ahead, meaning our only measure of a win is based on dropped frames. This is reality and that is console business. However, it works both ways. We saw Hitman choose a performance profile that catered to maximizing the XSX, and the PS5 ultimately had its performance ceiling lowered. It had fewer dips, sure, but in the grand scheme of things it continually rendered fewer pixels (the XSX pushing 44% more) for the entirety of the game.

And the corridor of doom? That's because it's RT limited, and RT is really weak on consoles.

So if we take the bias that the XSX is supposed to be, on paper, 20% better than the PS5, pure math here then gives:
Probability of XSX beating PS5 = 54.55%
In reverse,
Probability of PS5 beating XSX = 45.45%

They are really much closer than people think: out of 100 flips the XSX should win 54.55% of them and the PS5 should win 45.45% of them. I know it doesn't actually work like that in practice, but it's something to consider while we are lacking information.

Let's assume that out of 20 titles, the PS5 wins 14 of them, for instance:
(20 choose 14) * 0.4545^14 * 0.5455^6 ≈ 1.6% chance that this would occur. So if this does happen, we'd have to look into reformulating our null hypothesis.
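The same arithmetic spelled out (still under the toy assumption that a 20% paper advantage maps to 1.2:1 odds per title):

from math import comb

p_xsx = 1.2 / 2.2            # ~54.55% per title if a 20% edge means 1.2:1 odds
p_ps5 = 1.0 / 2.2            # ~45.45% per title

# Probability that the PS5 wins exactly 14 of 20 independent "flips".
p_14_of_20 = comb(20, 14) * p_ps5**14 * p_xsx**6
print(f"P(XSX wins a title) = {p_xsx:.2%}, P(PS5 wins a title) = {p_ps5:.2%}")
print(f"P(PS5 wins exactly 14 of 20) = {p_14_of_20:.1%}")   # ~1.6%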

Hope this helps.
 
RT capabilities also scale with clocks, so the XSX should still have around a 20% advantage theoretically.

For some reason 18% is a number etched in my memory (12.155 vs 10.28), which is indeed around 20%. Perhaps take into account system resource reservations and that's certainly pretty darn close to the academic 16% average finding.
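For reference, those paper figures fall straight out of CUs × clock, since each RDNA 2 CU does 64 FP32 FMAs (128 flops) per clock; note the PS5 number uses its "up to" variable clock:

def tflops(cus, clock_ghz):
    # 64 FP32 lanes per CU, 2 flops per FMA
    return cus * 64 * 2 * clock_ghz / 1000.0

xsx = tflops(52, 1.825)   # 52 CUs at a fixed 1.825 GHz
ps5 = tflops(36, 2.23)    # 36 CUs at up to 2.23 GHz (variable clock)
print(f"XSX ~{xsx:.2f} TF, PS5 ~{ps5:.2f} TF, paper gap {xsx / ps5 - 1:.1%}")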
 
You can cut out that BS - I haven't done anything wrong...mods feel free to ban me if I have!

You're claiming agendas and whatever. The first post in this thread clearly shows this isn't welcome. Attacking people escalates things. Keep it to yourself, not here.

No, everyone likes to roll out the outlier Hitman 3 to prove a point! Hitman was running 44% more pixels on XSX, but we don't know what PS5 could run at because it was running at a locked 60fps. There were times when the PS5 outperformed the XSX, therefore we can be pretty certain the performance delta is less than 44% - who knows how much less.

Clearly, Hitman 3 isn't an outlier. Hence the discussion now being mainly about Control. The difference is 44% for the ones playing and using the game as a benchmark.
 