You are wrong here. With delta color compression applied everywhere in RDNA 2 the bandwidth should be enough to properly use the pixel performance on PS5 in many cases.
So, I've looked at alpha as a possible area where PS5 holds an advantage over XSX for some time now. The general issue for me is that this argument is not consistent enough with the type of performance dips we're seeing, and more importantly, a 20% fill rate advantage shouldn't equate to a 100% frame time advantage. While I would agree there are probably times when fill rate is an issue and we may see small dips, in the one-to-five frame range and sometimes larger, I do not find it to be a particularly encompassing explanation for the larger drops.
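To put rough numbers on that, here's a quick back-of-the-envelope model (all figures hypothetical, just to show the shape of the math): even if you assume an entire frame is fill rate bound, a 20% fill rate advantage buys roughly 20% more frames, not double.

```python
# Illustrative model only: scale just the fill-rate-bound slice of a frame
# by a 20% fill rate advantage and see what it does to frame time.
def frame_time_ms(base_ms, fill_bound_fraction, fill_rate_advantage):
    """Only the fill-rate-bound portion of the frame gets faster."""
    fill_part = base_ms * fill_bound_fraction / (1.0 + fill_rate_advantage)
    other_part = base_ms * (1.0 - fill_bound_fraction)
    return fill_part + other_part

base = 33.3  # hypothetical 30 fps frame on the slower console, in ms
for fraction in (0.25, 0.5, 1.0):
    t = frame_time_ms(base, fraction, 0.20)
    print(f"fill-bound {fraction:>4.0%}: {base:.1f} ms -> {t:.1f} ms, "
          f"frame rate up {base / t - 1:.1%}")
# Even the unrealistic 100% fill-bound case only yields ~20% more frames,
# nowhere near the 2x gaps being attributed to fill rate.
```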
Last generation, we looked at how fill rate was plaguing both consoles. What we found over time was that many of the blending and full-screen alpha effects were eventually handled by compute shaders. Moving things to compute shaders bypasses the need for ROPs, and therefore fill rate, and turns the problem entirely into an exercise in bandwidth. That shift to compute was a big part of how graphics improved over the course of last generation: developers overcame huge fill rate and culling problems with compute shaders. We see this in first-party Sony titles where grass and vegetation are numerous and constantly moving, which typically kills fill rate and creates culling problems. We also see it with newer compute-driven methods like Dreams, which can draw out a near-infinite amount of detail with a minor memory footprint while bypassing ROPs entirely. So I find it unlikely that fill rate is the universal culprit for some of these larger frame drops, especially considering that fill rate only affects one part of the fixed-function pipeline, an important part for sure, but not one that should double the frame rate.
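As a toy illustration of what "moving blending to compute" means (a NumPy stand-in for a compute pass, not anybody's actual engine code), the "over" blend that ROPs do in fixed-function hardware is just per-pixel arithmetic that a compute shader can perform through a storage buffer/UAV instead:

```python
import numpy as np

# Sketch only: the classic "over" blend is a read-modify-write per pixel.
# A compute shader can do the same arithmetic against a storage buffer,
# which is why compute-based particle and full-screen effects sidestep
# ROP fill rate entirely and make it a bandwidth/ALU problem instead.
def blend_over(dst_rgb, src_rgb, src_alpha):
    """dst = src * a + dst * (1 - a); one GPU thread would handle one pixel."""
    a = src_alpha[..., None]          # broadcast alpha across RGB channels
    return src_rgb * a + dst_rgb * (1.0 - a)

h, w = 4, 4                           # toy framebuffer
framebuffer = np.zeros((h, w, 3))     # existing color (the "read")
particles = np.full((h, w, 3), 0.9)   # incoming transparent layer
alpha = np.full((h, w), 0.5)          # 50% transparency

framebuffer = blend_over(framebuffer, particles, alpha)  # the "modify-write"
print(framebuffer[0, 0])              # -> [0.45 0.45 0.45]
```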
In one of your earlier posts, you wrote about how XSX was outperforming PS5 in camera photo modes (Control) in which alpha was present, yet in that case, a pure GPU benchmark, we should have seen PS5 outperform Series X. Your counterpoint to the argument that the GPU should be getting as much clock rate as possible, with none of the power budget going to the CPU, was that the CPU still acts as some sort of power hog pulling power from the GPU, and that this is why it was underperforming. That does not align with DMC5: in its 120 FPS mode, PS5 was able to hold frame rates up to 20% higher than Xbox Series X. So the unlocked frame rate situation you are referring to clearly does not apply here. Once again, while I do not dismiss your counterpoints entirely, I find they are not strong enough to account for everything we're seeing.
All in all, I find that your argument around alpha does not fit well enough with the data provided, so I look elsewhere to explain some of the data points we see. It is fine to use from time to time, when the situation is obviously a fill rate problem, but it does not apply to many of the issues we see, and I think you have attributed significantly more impact to the fill rate advantage than it can actually have.
I meant XSX has more bandwidth, yes, but the DCC is allowing PS5 to actually use its 20% advantage in many cases. And it is proving it in most comparisons with heavy use of alphas where the PS5 has almost always the edge (XSX having the edge in compute heavy scenes with RT). No surprises there. The fact that you want to fabricate another reason for PS5 advantage (allegedly caused by bad tools on XSX, is that it in a nutshell?) doesn't change the reality. PS5 performs better when there are plenty of alphas.
And so my challenge to you is to answer the following question: why was PS4 Pro, with double the ROP count of Xbox One X and DCC available to it, completely unable to keep up with the resolution difference despite that fill rate advantage? My second challenge to you is to prove the dips only occur as a result of alpha, and alpha alone. I would like to see the evidence. I've seen situations where alpha effects are occurring and the frame rate is tanking, but when those alpha effects are gone, the frame rate is still tanked on XSX. That leads me to question whether the tanking was alpha related at all. Yes, Hitman 3 is a prime example, but if you could find more, please do.
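For scale, here's the rough arithmetic behind that first challenge, using the commonly cited ROP counts and clock speeds for both consoles (taken at face value here, not re-measured), against the pixel counts those consoles were often asked to render:

```python
# Commonly cited specs (assumed, not measured): PS4 Pro ~64 ROPs at ~911 MHz,
# Xbox One X ~32 ROPs at ~1172 MHz. If raw pixel fill rate were the deciding
# factor, the Pro's edge should have shown up, yet the One X often pushed
# far more pixels (e.g. native 4K against ~1440p-1800p on Pro).
pro_fill  = 64 * 0.911   # ~58.3 Gpixels/s
onex_fill = 32 * 1.172   # ~37.5 Gpixels/s
print(f"PS4 Pro raw fill advantage: {pro_fill / onex_fill - 1:.0%}")   # ~55%

pixels_4k    = 3840 * 2160
pixels_1440p = 2560 * 1440
print(f"One X pixel count advantage at 4K vs 1440p: "
      f"{pixels_4k / pixels_1440p - 1:.0%}")                           # ~125%
```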
So once again I must ask whether you are over attributing this 20% differential and making it out to be a bigger deal than it is. When we look at benchmarks that deal with triangle culling and, ultimately, primitive drawing, the improvement can be up to 1600%. You claimed I was fabricating this response to prove that there is a toolkit issue with Xbox; my rebuttal is that what I have tried to show is that there may not actually be a toolkit issue with Xbox, but rather that there is something PlayStation 5 is doing better.
When I look specifically at where PS5 could be outperforming the competition in this area, I look at situations in which RDNA 2 cards are also outperforming their equivalent NVIDIA counterparts. If I look at launch titles, this group of cards was beating the NVIDIA cards they normally line up with, and it happened in a couple of them. NVIDIA only has mesh shaders and has no access to primitive shaders. If Sony forces primitive shaders to run on PS5, then by default all RDNA 2 cards would also have access to this path. And I think this could explain why we see some RDNA 2 cards outperforming NVIDIA in AC Valhalla, Dirt 5 and possibly other titles (none more coming to mind, however).
The counter-rebuttal should then ask why RDNA 2 cards don't win in every scenario, and that's because the better engines have likely all switched to compute-based culling and use a lot more compute shaders to do their work; that separation from the fixed-function pipeline is what ultimately lets the NVIDIA cards make up those differentials. Recall that Unreal Engine 5 doesn't need primitive shaders or mesh shaders to do its job; it handles geometry entirely through compute shaders. And the triangle draw rate of NVIDIA cards is insanely high, so all they needed was help on the culling side of things. On the console side, PS5 continues to hold a 20% primitive rate advantage over XSX, and if PS5 is using primitive shaders while XSX is using the standard pipeline, it is likely generating geometry at a rate well beyond 20% faster than XSX.
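As a toy sketch of what I mean by compute-based culling (my own NumPy illustration under simple assumptions, not UE5's or any shipping engine's code), the cull itself is just vector math that never has to touch the primitive or mesh shader path:

```python
import numpy as np

# Sketch: cull back-facing triangles with plain arithmetic before anything
# reaches the fixed-function front end. A real engine would run this as a
# compute shader that compacts the surviving indices into an indirect draw
# buffer, so the hardware's primitive path only ever sees visible triangles.
def backface_cull(vertices, triangles, view_dir):
    """Return only the triangles whose face normal points toward the camera."""
    v0, v1, v2 = (vertices[triangles[:, i]] for i in range(3))
    normals = np.cross(v1 - v0, v2 - v0)   # per-triangle face normal
    facing = normals @ view_dir < 0.0      # front-facing under this convention
    return triangles[facing]

vertices = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0],
                     [0, 0, 1], [1, 0, 1], [0, 1, 1]], dtype=float)
triangles = np.array([[0, 1, 2],    # winds one way
                      [3, 5, 4]])   # winds the other way
view_dir = np.array([0.0, 0.0, 1.0])  # camera looking down +Z

print(backface_cull(vertices, triangles, view_dir))  # only one survives
```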
I'm not saying it's not an alpha issue, but the alpha argument shouldn't be applied liberally everywhere. There are bigger bottlenecks that need to be considered, and I have a hard time believing a 20% fill rate advantage is going to double the frame rate over its competition. Even in typical GPU benchmarking, if you take the same card and subtract 23% of its clock rate (and therefore its TF with it), you will not see a 50% frame rate penalty.
TL;DR: if XSX had a fill rate problem, you wouldn't need PS5 benchmarks to prove it. Someone could look at XSX performance in isolation and see it suffering under transparency/blending loads.