Current Generation Hardware Speculation with a Technical Spin [post launch 2021] [XBSX, PS5]

You are wrong here. With delta color compression applied everywhere in RDNA 2, the bandwidth should be enough to make proper use of the PS5's pixel fill rate in many cases.

I meant XSX has more bandwidth, yes, but DCC is allowing PS5 to actually use its 20% advantage in many cases. And it proves out in most comparisons with heavy use of alpha, where the PS5 almost always has the edge (XSX having the edge in compute-heavy scenes with RT). No surprises there. The fact that you want to fabricate another reason for the PS5 advantage (allegedly caused by bad tools on XSX, is that it in a nutshell?) doesn't change the reality. PS5 performs better when there are plenty of alphas.
Are you assuming XSX does not use DCC? I haven't heard anything of the sort, considering AMD GPUs have had it for so long now.
 
I can't recall exactly, but didn't the MS Hot Chips presentation and slides include DCC and also different data formats for additional bandwidth savings?
 
Why are we still discussing Hitman's PS5/Series X performance like it's apples to apples? Isn't the resolution different between them (2160p vs 1800p)? That's 20% more pixels that Series X is rendering, which would, assuming all else is equal, require 20% more bandwidth and 20% more performance from the ROPs specifically. Series X has a 25% bandwidth advantage (from the 10GB), which aligns rather well with the resolution difference. Couldn't the game just be bandwidth limited during most scenes on both consoles, but fillrate limited on Series X during the scenes with lots of alpha?

Regarding culling, unless they have an exotic solution to handle it, you usually can't cull geometry or even surfaces that are obstructed by alpha effects, even if parts of those alpha effects resolve to an opaque color. In the case of the grassy fields, assuming the grass is rendered as flat alpha-blended sheets (I haven't played the game), the hardware has to calculate and fill all of the geometry behind the grass, including grass obstructed by other grass. It might simply be the case that you reach the fillrate limit of Series X when doing this, but not on PS5, given that the bandwidth relative to resolution is so similar; a rough sketch of the cost follows below.
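To put rough numbers on that, here's a back-of-envelope sketch. The coverage and layer counts are made up purely for illustration, and it assumes one blend per ROP per clock:

```python
# Rough overdraw cost for stacked alpha-blended grass cards. Depth testing
# can't reject layers hidden behind transparent geometry the way it can
# behind opaque geometry, so every layer gets shaded and blended.
width, height = 3840, 2160   # 2160p target
coverage = 0.5               # hypothetical fraction of screen under grass
layers = 6                   # hypothetical average alpha layers per pixel

blended = width * height * coverage * layers   # pixels actually blended
xsx_fill = 64 * 1.825e9      # 64 ROPs x 1.825 GHz, assuming 1 blend/ROP/clock
print(f"{blended / 1e6:.1f} Mpix blended, "
      f"{blended / xsx_fill * 1e3:.2f} ms of pure fill per frame")
```

Even at peak rate the raw fill time is small; in practice each blended layer is also a read-modify-write of the render target, so stacked alpha tends to go bandwidth-bound before the ROPs saturate, which fits the bandwidth-relative-to-resolution framing above.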
 
That's 44% more pixels, isn't it?
Yeah, I was doing pre-coffee math. 44% more pixels. But still, it's hard to compare relative performance in this title because the rendering load is different. So if Series X is dropping performance because of fill, or because of bandwidth, it's not indicative of an issue with the hardware's performance. It's rendering more per frame, and it's rendering fewer frames.
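For anyone who wants to check the arithmetic, a quick sketch (assuming plain 16:9 render targets with no dynamic resolution or checkerboarding):

```python
# Pixel counts for the two reported render targets.
xsx_pixels = 3840 * 2160   # 2160p -> 8,294,400
ps5_pixels = 3200 * 1800   # 1800p -> 5,760,000
print(xsx_pixels / ps5_pixels)   # ~1.44, i.e. XSX renders ~44% more pixels
```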
 
This is my understanding, and why Hitman 3 should be considered to be running badly/not properly optimised on PS5 rather than ‘showing the true power of XSX’.

Good; the same thing can be said whenever the PS5 has had a slight advantage, as opposed to the huge advantages reported for the XSX.
 
You are wrong here. With delta color compression applied everywhere in RDNA 2, the bandwidth should be enough to make proper use of the PS5's pixel fill rate in many cases.

So, I've looked at alpha as a possible explanation for PS5 holding an advantage over XSX for some time now. The general issue for me is that this argument does not hold consistently enough for the type of performance dips we're seeing, and more importantly, a 20% fill rate advantage shouldn't equate to a 100% frame time advantage. So while I would normally agree that there are probably times when fill rate is an issue and we may see small dips in the one-to-five FPS range, sometimes larger, I do not find it a particularly encompassing explanation for the larger drops.

Last generation, we looked at how fill rate was plaguing both consoles. What we found over time was that many of the blending and full-screen alpha effects were eventually covered by compute shaders. Moving things to compute shaders bypasses the need for ROPs, and therefore fill rate, and makes this entirely an exercise around bandwidth. We saw this move to compute shaders as a big part of the improvement in graphics over last generation: developers overcame huge fillrate and culling problems using compute shaders. We see this in first-party Sony titles in which grass and vegetation is numerous and moving; typically that kills fill rate and creates culling problems. We also see it with newer compute methods like Dreams, where they can draw a nearly unlimited amount of detail with a minor memory footprint while bypassing the ROPs entirely. Thus, I find it unlikely that fill rate, as some universal metric across all titles, is the culprit for some of these larger frame drops, especially considering that fill rate only affects one part of the fixed-function pipeline: an important part for sure, but not necessarily enough to double the frame rate.
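As a rough illustration of the "exercise around bandwidth" point, and of why folding passes into compute helps, here's a sketch; the layer count and target format are assumptions, not measurements:

```python
# Rough memory traffic for compositing N full-screen layers.
# ROP blending: each pass reads the source layer, then the ROPs
# read-modify-write the destination -> ~3 touches per pixel per layer.
# One compute pass can read each layer once, accumulate in registers,
# and write the result once -> layers + 1 touches per pixel.
width, height = 3840, 2160
bpp = 8                      # assuming FP16 RGBA targets
layers = 4                   # hypothetical number of full-screen layers

rop_traffic = width * height * bpp * 3 * layers
cs_traffic = width * height * bpp * (layers + 1)
print(f"ROP blend: {rop_traffic / 1e6:.0f} MB/frame vs "
      f"one compute pass: {cs_traffic / 1e6:.0f} MB/frame")
```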

In one of your earlier posts, you wrote about how XSX was outperforming PS5 in camera photo modes (Control) in which alpha was present, yet in that case, a pure GPU benchmark, we should have seen PS5 outperform Series X. Your counterpoint, against the GPU having as much clock as possible with none of it going to the CPU, was that the CPU now acts as some sort of power hog pulling power from the GPU, and that is why it was underperforming. This does not align, because we can see with DMC5 that in its 120 FPS mode the PS5 was able to hold frame rates up to 20% higher than Xbox Series X. So the unlocked-frame-rate situation you are referring to clearly does not apply to this case. Once again I find your counterpoints, while I do not dismiss them entirely, are not strong enough to account for everything we're seeing here.

All in all, I find that your argument around alpha does not fit well enough with the data provided, and so I look elsewhere to explain some of the data points that we see. I think it is OK to invoke from time to time, where it is obvious the situation is a fill rate problem, but it is not applicable to many of the issues we see, and I think you have attributed significantly more impact to the fill rate advantage than it can actually have.

I meant XSX has more bandwidth, yes, but DCC is allowing PS5 to actually use its 20% advantage in many cases. And it proves out in most comparisons with heavy use of alpha, where the PS5 almost always has the edge (XSX having the edge in compute-heavy scenes with RT). No surprises there. The fact that you want to fabricate another reason for the PS5 advantage (allegedly caused by bad tools on XSX, is that it in a nutshell?) doesn't change the reality. PS5 performs better when there are plenty of alphas.
And so my challenge to you is to answer the following question: why was the PS4 Pro, with double the fill rate of the Xbox One X and with DCC, completely unable to keep up with the resolution difference? My second challenge to you is to prove the dips occur as a result of alpha, and alpha alone. I would like to see the evidence. I've seen situations where alpha effects are occurring and the frame rate is tanking, but when those alpha effects are gone, the frame rate is still tanked on XSX. So it leads me to question whether the tanking was alpha related. Yes, Hitman 3 is a prime example, but if you could find more, please do.

So once again I must ask if you are over-attributing this 20% differential as a bigger deal than it is. However, when we look at benchmarks that have to do with triangle culling and ultimately primitive drawing, the improvement can be up to 1600%. You claimed I was fabricating this response to prove there is a toolkit issue with Xbox; my rebuttal is that what I have tried to do is show that there may not actually be a toolkit issue with Xbox, but rather that there is something the PlayStation 5 is doing better.

When I look specifically at where the PS5 could be outperforming the competition in this area, I look at situations in which RDNA 2 cards are also outperforming their equivalent NVIDIA counterparts. Looking at the launch titles, this group of cards was outperforming the NVIDIA cards they normally line up with in a couple of cases. NVIDIA only has mesh shaders; they have no access to primitive shaders. And if Sony forces primitive shaders on PS5, then by default all RDNA 2 cards would also have access to that path. I think this could explain why we see some RDNA 2 cards outperforming NVIDIA in AC Valhalla, Dirt 5 and possibly other titles (no more come to mind, however).

The counter-rebuttal should look at why RDNA 2 cards don't win in every scenario, then; that's because the better engines have likely all switched to compute-based culling and use a lot more compute shaders to do their work, and that separation from the fixed-function pipeline is what ultimately allows those cards to make up the differentials. Recall that Unreal Engine 5 doesn't need primitive shaders or mesh shaders to perform its task; it does it entirely through compute shaders. And the triangle draw rate of NVIDIA cards is insanely high, so all they needed was help on the culling side of things. On the console side, PS5 continues to hold a 20% primitive rate advantage over XSX. And if PS5 is using primitive shaders while XSX is using the standard pipeline, it's likely generating geometry at a rate much more than 20% faster than XSX.
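For what it's worth, a quick sketch of the raw numbers behind that 20% primitive-rate figure, assuming the commonly cited 4 primitives per clock for the RDNA 2 front end:

```python
# Peak primitive rate scales with the front-end clock; both consoles are
# assumed to rasterize the same number of primitives per clock.
prims_per_clock = 4             # commonly cited RDNA 2 figure (an assumption)
ps5 = prims_per_clock * 2.233   # GHz -> ~8.9 Gprim/s
xsx = prims_per_clock * 1.825   # GHz -> ~7.3 Gprim/s
print(f"PS5 {ps5:.1f} vs XSX {xsx:.1f} Gprim/s "
      f"({(ps5 / xsx - 1) * 100:.0f}% gap)")
```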

I'm not saying it's not an alpha issue, but the alpha argument shouldn't be applied liberally everywhere. There are bigger bottlenecks that need to be observed, and I have a hard time believing a 20% fill rate advantage is going to double the frame rate over its competition. Even under typical GPU benchmarking, take the same card and subtract 23% clock rate (and therefore TF with it), and you will not receive a -50% frame rate penalty.
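One way to frame it: a fill-rate deficit can only stretch the fill-bound slice of the frame. A minimal sketch, with the fill-bound fraction made up for illustration:

```python
# Amdahl-style frame-time model: only the fill-bound share of the frame
# scales with the fill-rate gap. The fraction below is purely illustrative.
ps5_fill = 64 * 2.233        # GPix/s (64 ROPs x 2.233 GHz) -> ~142.9
xsx_fill = 64 * 1.825        # GPix/s (64 ROPs x 1.825 GHz) -> ~116.8

fill_fraction = 0.25         # hypothetical fill-bound share of frame time
frame_scale = (1 - fill_fraction) + fill_fraction * (ps5_fill / xsx_fill)
print(f"{(frame_scale - 1) * 100:.1f}% longer frame time")  # ~5.6%, not 100%
```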

TLDR; If XSX has a fill rate problem, you wouldn't need PS5 benchmarks to prove it. Someone could look at XSX performance in isolation and see it suffering under transparency/blending problems.
 
They both use DCC, but the L1 & L2 caches on PS5 are clocked faster, and both are used by the RBs.
I can tank my 1070's clock by 23% (2000 MHz vs 1632 MHz), subtracting 23% of its compute alongside it, and only lose at most 13% frame rate. That's a very long way from -50%. At higher resolutions the difference shrinks even further, as the bottleneck slides toward bandwidth.
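That result is consistent with the frame being only partly core-bound, since the memory clock (and therefore bandwidth) doesn't move with a core underclock. A sketch, with the core-bound share guessed rather than measured:

```python
# Why a core underclock doesn't map 1:1 to frame rate: the bandwidth-bound
# share of the frame is untouched because memory clocks stay put.
core_scale = 2000 / 1632     # core-bound work takes ~1.23x longer
core_bound = 0.55            # hypothetical core-bound share of the frame

frame_scale = core_bound * core_scale + (1 - core_bound)
print(f"{(1 - 1 / frame_scale) * 100:.1f}% fps lost")  # ~11%, near the 13% seen
```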
 
This is my understanding, and why Hitman 3 should be considered to be running badly/not properly optimised on PS5 rather than ‘showing the true power of XSX’.
Shift the bottleneck away from geometry and it becomes a pure compute/fillrate/bandwidth bottleneck. I don't know what the geometry load looks like for Hitman 3 (or for Halo Infinite, btw), but it does come across as simpler than other titles we've seen. Perhaps that's an unfair POV given the situation. TLDR; if you can remove the geometry bottleneck, the comparisons become more apt. We just don't usually consider geometry to be part of the equation.
 
I mean, the main point here is that it's actually very unlikely all of these games are dropping frames for the same reason. These are launch-period ports of sometimes several-year-old games, with wildly different types of renderers, and Microsoft has made a clean break with their old tools/processes in favor of something that probably barely works in places. We just don't have enough data to even try to claim it's one cause -- maybe DMC can't keep 120fps because of bad synchronization of DX12 resources, and Hitman because of an extremely heavy alpha workload, looking at something like 80 bush planes at once (behind each other) at a much higher resolution.

The difference between a fill rate "problem" and "slightly lower fillrate but still plenty" is engine and implementation.

(And, re: Hitman -- two points:
1. Yeah, I don't think it's very likely that the PS5 just straight up performs 44% worse. In whatever they tested, it performed badly enough that they dropped resolution way down to the next logical step. It could be that at 1800p uncapped the PS5 would run at 80fps and the Xbox at 85 -- all we know for sure is that IO thought it needed to be dropped down.
2. I bet it has a decent amount of geometry (and skinning, either on GPU or CPU) compared to other games we've seen. Look at those crowds!)
 
One question about download speeds on consoles while apps are running:

Xbox One and PS4 reduced download speeds when a game was running. This happened not only to save resources, but also to make sure downloads would not hog the internet connection, allowing games to run without problems.
Now the Xbox Series consoles do the same.
In the following video you can see a Series S reducing download speed when a game is in the background. The author thinks the problem has to do with Quick Resume, but it is not. Quick Resume is just a save state, and there are videos on the internet showing the speed slowing down with just one game and no Quick Resume, like the one in this link.
Here is the video: [embedded video]

Now the question is about the PS5... does the PS5 also reduce download speeds when a game is running?
Every time I checked download speeds, with a game in the background or not, the speed was at max... But maybe the PS5 is improved and aware the game is in the background and not active, so it brings the speed back to maximum.
To check that out, a test was made... using a router with OpenWrt + iftop, the download speed was checked on a 100 Mbit connection while a game was in use.
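For anyone who wants to repeat the measurement without iftop, here is a minimal sketch that polls the Linux interface counters once per second; the interface name is a guess, so check yours with ip link first:

```python
#!/usr/bin/env python3
# Sample a router interface's receive counter once a second and print
# Mbit/s. Works on any Linux box (including OpenWrt) that exposes sysfs.
import time

IFACE = "eth0"   # hypothetical WAN interface name; check with `ip link`

def rx_bytes():
    with open(f"/sys/class/net/{IFACE}/statistics/rx_bytes") as f:
        return int(f.read())

prev = rx_bytes()
while True:
    time.sleep(1)
    cur = rx_bytes()
    print(f"{(cur - prev) * 8 / 1e6:.1f} Mbit/s")
    prev = cur
```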
So... Spider-Man downloading, and no game running, this was the result.

[Image: bandwidth graph, Spider-Man downloading with no game running]


As you guys can see, very close to the maximum of 100 Mbit/s.

Now, with Demon's Souls online co-op running.

[Image: bandwidth graph with Demon's Souls online co-op running]


There was really no noticeable decrease in download speed!

Now... this leaves me with questions. Is this good, or is this bad?

As stated, the download speed decrease allows for some QoS, making sure the downloads do not hog the internet bandwidth... But in this case, although the game did not suffer at all, the bandwidth was almost at maximum. As such, the risk of the game lagging due to the download is present.

Anyone else tried this? Did the PS5 decrease download speeds? Did the games have lag?

Thank you for any feedback.
 
Interesting, never thought of that. I think my PS5 downloads at max even when the kids are playing Destruction AllStars, at least according to my TP-Link router (Archer C5400, wired). I didn't play myself, but the game was smooth as ever from a quick look.
Also, Steam does download at max speed even when playing BF4 MP conquest matches, without any lag (neither internet nor local SSD related).
Modern systems seem to have no trouble doing those things at the same time.

The old consoles probably couldn't handle downloading at max speed and reading game files at the same time due to hardware limitations everywhere, be it CPU, IO/HDD etc.
 
On the Xbox consoles, to get maximum download speeds you need to make sure there isn't an active game. That means you have to use the menu and "Quit" the program. If you want the game you're playing to stay in Quick Resume state, one trick is to launch a different game and then quit from that title.

When no games are running I see 450-500 Mbit/s saving to an external USB SSD (not the NVMe storage card). When a game is active, I think the speed is anywhere from 35-75 Mbit/s.
 
Interesting, never thought of that. I think my PS5 downloads at max even when the kids are playing Destruction AllStars, at least according to my TP-Link router (Archer C5400, wired). I didn't play myself, but the game was smooth as ever from a quick look.
Also, Steam does download at max speed even when playing BF4 MP conquest matches, without any lag (neither internet nor local SSD related).
Modern systems seem to have no trouble doing those things at the same time.

The old consoles probably couldn't handle downloading at max speed and reading game files at the same time due to hardware limitations everywhere, be it CPU, IO/HDD etc.

The speed limit acts as QoS, making sure the game can have free access to the internet with no bandwidth limitation.
So this is good on one side, but can present problems on the other. That's why I brought the matter up for discussion. Anyone ever experienced any problems with lag on PS5?
I have not...
 
So, the first die shots of the PS5 APU are available
Fritzchens Fritz auf Twitter: "A first quick and dirty die-shot of the PS5 APU (better SWIR image will follow). It looks like some Zen 2 FPU parts are missing. https://t.co/PefXCxc3G1" / Twitter

[Die-shot images from the tweet]
Maybe someone can point out which part is which: GPU, CPU, ...

Edit:
Seems like the caches are still split, so no Infinity Cache (never really expected that).
Still nice looking, I guess; the lines are nice. I can only make out the CPU (left side, 8 cores in a 4x2 arrangement) and the GPU in the middle, though separating the individual CUs is a bit tougher.
 