Nah, you are clearly mixing contradictory information and omitting important facts.
From DF:
"However, even basic ports which barely use any of the Series X's new features are delivering impressive results. The Coalition's Mike Rayner and Colin Penty showed us a Series X conversion of Gears 5, produced in just two weeks.
The developers worked with Epic Games in getting UE4 operating on Series X, then simply upped all of the internal quality presets to the equivalent of PC's ultra, adding improved contact shadows and UE4's brand-new (software-based) ray traced screen-space global illumination. On top of that, Gears 5's cutscenes - running at 30fps on Xbox One X - were upped to a flawless 60fps. We'll be covering more on this soon, but there was one startling takeaway - we were shown benchmark results that, on this two-week-old, unoptimised port, already deliver very, very similar performance to an RTX 2080."
The 2080 as a benchmark is not relevant to this discussion. I just need to know that it's running at 60fps on ultra+ settings, and that gives me a fairly good idea of how well it's going to perform once optimized.
Also, that 5700XT is RDNA1 and has lower clocks than the PS5, so there is that as well; your 50% difference is pretty much gone.
That 5700XT is the Red Devil, which runs at 2010MHz. That's about 10% less than the PS5's clock, but it also has about 10% more CUs at a full 40. A 10% clock advantage isn't going to give the PS5 a 50% performance boost.
Going from RDNA1 to RDNA2, there's been no indication that we should expect more performance for the same TF. The 50% performance-per-watt improvement results in higher clocks, but the TF count is nearly the same here.
40 CUs at ~2000MHz vs 36 CUs at 2230MHz. I'm not saying this translates directly into expected performance, but any clustering/sorting algorithm would group the PS5 and a 2000MHz 5700XT together.
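To put a number on the "TF count is nearly the same" point, here's a quick back-of-the-envelope using the standard RDNA FP32 formula (CUs × 64 shaders × 2 FLOPs per clock × clock), with the Red Devil's 2010MHz boost clock:

```python
# Back-of-the-envelope FP32 throughput using the usual RDNA formula:
# TFLOPS = CUs * 64 stream processors * 2 FLOPs per clock * clock
def tflops(cus, clock_mhz):
    return cus * 64 * 2 * clock_mhz / 1e6

print(f"5700 XT Red Devil (40 CU @ 2010 MHz): {tflops(40, 2010):.2f} TF")  # ~10.29
print(f"PS5               (36 CU @ 2230 MHz): {tflops(36, 2230):.2f} TF")  # ~10.28
```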
From a vector perspective, these are essentially equivalent GPUs (a rough distance check follows the list below):
- They are both RDNA
- They are both ~10.3 TF
- They are both 448 GB/s bandwidth
- They have the same bus
- They have nearly the same CU count
- They have nearly the same clock speed (MHz)
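As a toy illustration of that clustering point (not a performance model), here's a crude normalized-distance check over those spec vectors; the Series X numbers (52 CUs at 1825MHz, 12.15 TF, 560GB/s on the fast memory pool) are the published ones:

```python
import math

# Spec "vectors": (CUs, clock in GHz, TF, memory bandwidth in GB/s).
# Series X figures are the published ones; this is a toy similarity check,
# not a performance model.
gpus = {
    "5700 XT @ 2.01 GHz": (40, 2.010, 10.29, 448),
    "PS5":                (36, 2.230, 10.28, 448),
    "Series X":           (52, 1.825, 12.15, 560),
}

# Normalize each column by its max so no single spec dominates the distance.
maxes = [max(spec[i] for spec in gpus.values()) for i in range(4)]

def dist(a, b):
    return math.sqrt(sum(((a[i] - b[i]) / maxes[i]) ** 2 for i in range(4)))

for name, spec in gpus.items():
    if name != "PS5":
        print(f"PS5 vs {name}: {dist(gpus['PS5'], spec):.3f}")
# Prints ~0.13 for the 5700 XT and ~0.44 for the Series X:
# the PS5 sits far closer to a 2GHz 5700 XT than to the Series X.
```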
RDNA 2 would have to be one hell of an architecture to throw out all RDNA 1 results at equivalent settings.
Here's another benchmark: a 40CU card overclocked to 2100MHz, with its memory OC'd to 484GB/s, so even higher on both counts.
https://www.techpowerup.com/review/xfx-radeon-rx-5700-xt-thicc-iii-ultra/16.html
Average 44 fps on ultra @ 4K
Under these circumstances, the PS5 would not be able to achieve 4K60.
The XSX will likely get there with optimization.
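As a rough sanity check, assuming naive linear scaling with TF (an assumption; real games don't scale that cleanly):

```python
# How far is that 44fps result from 4K60, and how much of the gap
# does raw compute alone close? (Naive linear-with-TF scaling assumed.)
oc_5700xt_tf = 40 * 64 * 2 * 2100 / 1e6    # ~10.75 TF (the overclocked card in the link)
xsx_tf       = 52 * 64 * 2 * 1825 / 1e6    # ~12.15 TF (published Series X figure)

needed  = 60 / 44                # ~1.36x uplift to hit 4K60
compute = xsx_tf / oc_5700xt_tf  # ~1.13x from raw TF alone

print(f"Uplift needed for 4K60: {needed:.2f}x")
print(f"Series X raw TF advantage over that card: {compute:.2f}x")
print(f"Left to come from optimization / RDNA2 gains: {needed / compute:.2f}x")
```

Raw compute alone doesn't close the 60/44 gap, which is exactly why optimization has to do the rest.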
A final caveat: I’m not saying this should be the expected norm. I’m just pointing out the realistic possibility that there could be exceptions to the rule that there should only be an 18-20% performance differential. Tech talk aside, it doesn’t matter, and most people won’t notice it. But if we are sticking to shop talk, then we have to be open to the realities here.