The relative performance is inconclusive for me, mainly because both are capped at 60 fps v-sync. I can only trust that IOI tuned each release for the best possible settings on its console. There's not really much to debate; DF offers additional data points for those of us interested in the technical underpinnings of the consoles. But overall, without a rendering pipeline graph, nothing here will ever be 'conclusive'.
The 44% differential far surpasses the 22% difference in both compute and bandwidth, and to many that may seem incredibly unfair/suboptimal. But this has been the case for consoles for a long time; we saw this exact thing play out mid-gen, when at times the X1X ran 100% more resolution than the 4Pro, while PS4 and 4Pro shared 1080p in many titles. Games are not so regulated and perfect that every ounce of power is extracted perfectly by every team. Not every game is designed the same, and different games will hit different bottlenecks on different setups. Without unlocked framerates and equal settings, there's just no way to tell, unfortunately. That is to be expected, but when I wrote about this phenomenon, I was attacked pretty badly.

This could be one of those cases where, given the lack of DRS, 4K was just too unstable for IOI's standards, and 1800p was going to be the best experience for players considering how close it looks to 4K for most people. Perhaps they would have chosen differently if PS5 already had a VRR solution in place; I don't know.
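For anyone who wants the arithmetic behind those percentages, here's the quick version (a sketch using the public nominal peak specs, not measured game throughput; the paper gaps land around 18-25%, bracketing the ~22% figure):

```python
# Pixel-count gap between native 4K and 1800p vs the paper spec gap.
# All figures are nominal peaks, not measured performance.

xsx_pixels = 3840 * 2160      # native 4K: 8,294,400 px
ps5_pixels = 3200 * 1800      # 1800p:     5,760,000 px
print(f"Resolution gap: {xsx_pixels / ps5_pixels - 1:.0%}")   # -> 44%

xsx_tflops, ps5_tflops = 12.15, 10.28   # 52 CUs @ 1.825 GHz vs 36 CUs @ up to 2.23 GHz
xsx_bw, ps5_bw = 560, 448               # GB/s (XSX figure is its fast 10 GB pool)
print(f"Compute gap:    {xsx_tflops / ps5_tflops - 1:.0%}")   # -> ~18%
print(f"Bandwidth gap:  {xsx_bw / ps5_bw - 1:.0%}")           # -> 25%
```

Whichever paper gap you prefer, the resolution gap clearly outruns it, which is the whole point above.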
The most important takeaway for me is that if there was a devkit issue, that problem is clearly gone. Honestly, I was going to give XSX six months of being out of place, but after Hitman 3 I suspect there are maybe two months left where we see a rogue title or so where XSX underperforms the 5700. After that it should be clear. If it's still performing below a 5700 past the six-month mark, I would start digging deeper for architectural issues.
PS5 is performing very well; it's performing where I thought it would, at least for these cross-generational titles.
Yep, that's the Sega Saturn timescale if there ever was one xD. Although early optimizations with that system were...odd. IIRC VF Remix was already nearly done by that May but they still released the terribly buggy VF1 port anyway.
I don't think the 44% resolution difference is too odd considering XSX has 44% more CUs, but OTOH maybe it is kind of odd considering it would suggest PS5's version is running well below 2.23 GHz... yet from what Cerny has said, the only reason that would happen is if the power budget were being taxed and extra power couldn't be shifted over from the CPU's budget to the GPU.
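To make that implication concrete, here's a hedged back-of-envelope (the "purely compute-bound" scaling assumption is entirely mine; real games rarely scale that cleanly):

```python
# If the 44% pixel gap in Hitman 3 were purely compute-bound,
# what GPU clock would it imply for PS5? Nominal public specs only.

XSX_CUS, XSX_CLOCK_GHZ = 52, 1.825
PS5_CUS, PS5_MAX_CLOCK_GHZ = 36, 2.23
FLOPS_PER_CU_PER_CLOCK = 64 * 2          # 64 shaders per CU, 2 FLOPs/clock (FMA)

xsx_tf = XSX_CUS * FLOPS_PER_CU_PER_CLOCK * XSX_CLOCK_GHZ / 1000   # ~12.15 TF
pixel_ratio = (3840 * 2160) / (3200 * 1800)                        # 1.44

# Compute PS5 would need to match XSX's per-pixel budget at 1800p:
ps5_implied_tf = xsx_tf / pixel_ratio                              # ~8.44 TF
ps5_implied_ghz = ps5_implied_tf * 1000 / (PS5_CUS * FLOPS_PER_CU_PER_CLOCK)

print(f"Implied PS5 clock: {ps5_implied_ghz:.2f} GHz (max is {PS5_MAX_CLOCK_GHZ} GHz)")
# -> ~1.83 GHz, far below 2.23 GHz
```

An implied ~1.83 GHz is so far below anything Cerny has described that pure compute scaling seems like the wrong model here; some other bottleneck, or a conservative settings call by IOI, looks more plausible.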
Thing is, I don't think Hitman 3 is particularly taxing, but I also remember reading that a program doesn't need to actually be pushing technical limits to use up most of a processor's resources (not to say Hitman 3 isn't visually impressive; I think it is, outside of maybe some of the character models). Again, I'm thinking of what Mark Cerny said about menus in select PS4 games oddly kicking the fans into high gear despite being just a simple menu screen.
So I don't know if this hints at an unoptimized PS5 port or not; maybe Hitman 3's code structure is just more suited to Series X's design (fixed clocks, wider GPU, etc.) than to PS5's variable frequency. So there's a chance the Series X version is more optimized, though again I don't want to give the impression that a game being optimized means it's "pushing" the hardware to its limits. I don't think we'll see those games on either system until probably early 2023, or a bit later; it just depends on when devs (3P and 1P) decide to ditch 8th-gen for good.
Even then, I think "pushing" these new consoles to their limit is more an issue of budgets/time/dev workforces than of needing to "learn" the hardware; these systems aren't complex, exotic designs like consoles from the '90s or 7th-gen units like PS3. They're mostly straightforward, but we can clearly tie the most ambitious and technically impressive games of 8th-gen to those with the largest budgets, largest teams, and/or most time in development (RDR2, TLOUII, HZD, etc.).
WRT the devkit stuff, I think that's a good way to approach it. People like BRiT have posted updates clearly showing that parts of the GDK are, or at least were until fairly recently, a work in progress. I think even if Series X starts to take the lead in the majority of 3P game performance going forward, there will be outliers where PS5 has the advantage. But if that longer-term Series X lead in 3P performance doesn't in fact start to regularly manifest within a few months (or certainly heading into fall), then yeah, there will need to be some serious questions asked about architectural limitations having a negative impact. The two things I think people would gravitate to most are the split fast/slow RAM pools and the lack of GPU cache scrubbers.
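On the RAM pools specifically, a toy illustration of why people bring them up (the 560/336 GB/s split is public; the traffic mix is a made-up assumption of mine, and real contention behavior is messier than a weighted average):

```python
# XSX's 16 GB isn't uniform speed: 10 GB @ 560 GB/s (GPU-optimal),
# 6 GB @ 336 GB/s. The 60/40 traffic mix below is purely hypothetical.

fast_bw, slow_bw = 560, 336    # GB/s
fast_share = 0.6               # hypothetical fraction of traffic hitting the fast pool

blended_bw = fast_share * fast_bw + (1 - fast_share) * slow_bw
print(f"Blended bandwidth at a 60/40 mix: {blended_bw:.0f} GB/s")
# -> ~470 GB/s, closer to PS5's uniform 448 GB/s than the headline 560 GB/s
```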
But that is a worst-case scenario, and from everything we know about both systems so far I don't expect it to ever actually manifest. I just hope that by this fall, the discussions on hardware specifications, trying to peg who's bottlenecked where, disagreeing on where the systems fall in relation to one another, etc., cease, especially if Sony continues to be coy about some of its own hardware specifics (basically allowing some rather ridiculous rumors to fly on their behalf, the majority of which probably aren't even true).