As you would expect, there are a lot of factors that contribute to this: the configuration of the GPU, the functional computation units (regardless of architecture), clock speeds, cache, memory, bandwidth, CPU and APIs. None of these things are equal between PS5 and Series X, so why are people focusing on just one metric and expecting the higher number to result in more performance?
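For context, that one metric comes straight from CU count and clock speed. Here's a back-of-the-envelope sketch in Python using the publicly quoted figures (PS5's clock is an "up to" peak because of its variable frequency):

```python
# Back-of-the-envelope FP32 throughput from the publicly quoted specs.
# TFLOPS = CUs * 64 lanes per CU * 2 FLOPs per lane per clock (FMA) * clock (GHz) / 1000
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

ps5 = tflops(36, 2.23)    # ~10.28 TF (peak, variable clock)
xsx = tflops(52, 1.825)   # ~12.15 TF (fixed clock)
print(f"PS5 ~{ps5:.2f} TF, XSX ~{xsx:.2f} TF, gap ~{(xsx / ps5 - 1) * 100:.0f}%")
# -> PS5 ~10.28 TF, XSX ~12.15 TF, gap ~18%
```

That roughly 18% gap in peak FP32 throughput is the single number being argued over, and nothing in it accounts for any of the other factors above.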
There's a lot more than the TF number that makes the situation regarding XSX and PS5 interesting. For a start, the base architecture is the same, and it's likely the L0 cache arrangement per CU is similar if not identical. So in hardware terms there's possibly something going on with respect to L1 or L2 bandwidth, or perhaps the fixed-function units and their ability to feed the CUs.
Whatever the cause, the large drop in effective IPC observed in current games is interesting, and there will be a reason for it. I don't understand the desire to shut down speculation about this.
My observation over the years is that the amount of compute per pixel and per primitive moves in only one direction. It's unstoppable. This will necessarily change the way rendering pipelines are stressed. The interesting question is how this will be reflected in console performance as we move through the generation. I think it's more likely to favour XSX (relatively), especially with PS5's dynamic clocks in the mix, though I also think it's likely that XSX will never match PS5's IPC and show a difference that fully reflects the "TF difference".
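To put a rough number on what "not fully reflecting the TF difference" means: if both machines delivered similar results on the same workload, the implied per-TF throughput of the narrower, faster-clocked GPU would be higher by roughly the TF ratio. A hypothetical sketch (the frame rates below are made-up placeholders, not measurements from any game):

```python
# Hypothetical sketch: implied relative per-TF efficiency if observed frame
# rates come out similar. The fps figures below are made-up placeholders,
# NOT measurements from any real game.
tf_ps5, tf_xsx = 10.28, 12.15     # theoretical FP32 TF from the public specs
fps_ps5, fps_xsx = 60.0, 60.0     # hypothetical matched performance

# Work done per theoretical TF, normalised so XSX = 1.0
relative_efficiency_ps5 = (fps_ps5 / tf_ps5) / (fps_xsx / tf_xsx)
print(f"implied PS5 per-TF efficiency: {relative_efficiency_ps5:.2f}x XSX")
# -> 1.18x: matched results would imply the smaller GPU is doing ~18% more
#    work per theoretical FLOP (clocks, caches, fixed-function, tools...)
```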
Nobody knows how optimized PS5's tools are either. The thing about optimization is that effective techniques only come with experience, and both consoles are brand new. Which techniques work better than others, and how the tools will adapt to help developers exploit them, is something that will take a while to shake out. What we do know, because Dirt 5's technical director said so, is that the Xbox tools are easy to use and mature.
We also know from DF that not everyone has found the transition to the GDK straightforward or pleasant, and that some developers are unhappy with its current state. IMO it's best not to take a single opinion as being representative of industry-wide experience.
There's no doubt that things will improve on both consoles as time passes though.
I think we're all a bit prone to only accepting the information that conforms to our individual biases.
We do have practical evidence though; so far the PS5 is performing beyond what you'd expect if you look at the TFLOP numbers alone.
Until contrary evidence is provided it's difficult to accept that there will be a significant swing in favour of the Xbox. I find it hard to believe that performance will increase for one console because of a change in tools and that similar increases wouldn't happen for the other.
Both are likely to get performance increases over the months, and unless a miracle happens, a 20-30% swing in favour of Xbox Series X is unlikely.
I'm just happy that both are performing broadly in line with one another.
I can't see the XSX ever achieving a 20-30% performance gain over the PS5 at this point. Even VRS isn't likely to make that happen.
I'm pretty keen to see how game workloads change over time though. We're about 7 years into games being targeted at PS4-level resolutions, geometry and shaders. Things like mesh shaders, ray tracing and more complex pixel shaders are likely to increase the amount of compute needed relative to some of the fixed-function stuff. I guess we'll see how that changes things, if at all.