We also don't know what game it is, what type of game it even is, or what 'max settings' even means.
I can't see any obvious means by which the Pro could increase both res and framerate. Perhaps the base PS5 was averaging the lower end of its dynamic res window, nearer 1600p, whereas the Pro averages nearer the top, 1800p, and so gets only a marginal res increase but a notable framerate increase? Or maybe the GPU arch is enough better that the "up to 45%" faster figure is a conservative paper metric, or an 'observed in benchmarks' metric, and not an 'attained in real games' metric? If you're compute bound, then 67% more compute units is going to offer more than a 45% performance improvement, and if these are RDNA 3/4-derived CUs you'll also get some performance improvement per FLOP.
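For what it's worth, here's a rough back-of-envelope sketch of that compute-bound case. It assumes 16:9 render targets, 36 vs 60 CUs (just to match the '67% more compute units' figure), and perfectly ideal scaling, which real games won't hit:

```python
# Back-of-envelope sketch: marginal res bump + notable framerate gain,
# assuming a fully compute-bound game and ideal scaling (hypothetical numbers).

def pixels(height, aspect=16 / 9):
    """Pixel count for a 16:9 render target of the given height."""
    return round(height * aspect) * height

base_res, pro_res = pixels(1600), pixels(1800)   # ~4.55 MP vs ~5.76 MP
res_cost = pro_res / base_res                    # ~1.27x more pixels to shade

compute_gain = 60 / 36                           # ~1.67x raw compute (67% more CUs)

# If frame time scales with pixels shaded and inversely with compute throughput,
# whatever compute isn't spent on the res bump becomes framerate headroom.
fps_gain = compute_gain / res_cost               # ~1.32x

print(f"Resolution cost: {res_cost:.2f}x pixels")
print(f"Compute gain:    {compute_gain:.2f}x")
print(f"Framerate headroom after the res bump: {fps_gain:.2f}x")
```

So on paper, even a modest 1600p-to-1800p bump could leave roughly 30% of framerate headroom in a purely compute-bound scenario, before counting any per-FLOP architectural gains.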
Or maybe it's all made up. ¯\_(ツ)_/¯