I know from my own endeavours in overclocking my trusty 4790K that I hit a point where going to a bigger heatsink and ramping the fans up to annoying noise levels yielded increasingly small benefits as I clocked higher. Eventually it made almost no difference to CPU-reported temps.
The reason with my 4790K seems to be that Intel used shite thermal compound between the chip and the heatspreader. Delidders seem to be able to get vastly lower temps. So I can lower my fan speeds, get higher exhaust temps, and up to a point it makes almost no difference to reported temps even under load. Similarly, I can swap back to a smaller (but still very capable) cooler and lose only a couple of degrees in stress tests. My silly big Noctua 2 x 14 cm fan thing is definitely better than my single-fan Hyper 212 was, but in all honesty it probably wasn't worth the cost.
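To put rough numbers on that intuition: the die-to-IHS paste and the cooler sit in series, so a better cooler can only shrink its own slice of the total thermal resistance. A toy model of that (every figure below is a made-up assumption for illustration, not a measured 4790K value):

```python
# Toy series-resistance model: die -> TIM -> heatspreader/cooler -> air.
# Every number here is an illustrative guess, not a measured 4790K value.

def die_temp(power_w, r_tim, r_cooler, ambient_c=25.0):
    """Steady state: each thermal resistance adds power * R degrees."""
    return ambient_c + power_w * (r_tim + r_cooler)

POWER = 120.0        # watts under a heavy overclocked load (assumed)
R_BAD_TIM = 0.45     # C/W for poor die-to-IHS paste (assumed)
R_GOOD_TIM = 0.05    # C/W after a delid and liquid metal (assumed)

for name, r_cooler in [("single-fan Hyper 212", 0.15),
                       ("big dual-140mm Noctua", 0.12)]:
    print(f"{name}: stock paste {die_temp(POWER, R_BAD_TIM, r_cooler):.0f} C, "
          f"delidded {die_temp(POWER, R_GOOD_TIM, r_cooler):.0f} C")
```

With numbers in that ballpark, the bigger cooler buys you the same ~4 degrees whatever the paste, while delidding attacks the term that actually dominates, which lines up with what delidders report.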
So the PS5 is using a "worse" cooler. Okay. And it might be pushing less air, and so have higher exhaust temps. And that might be making very little difference to processor temps, and everything is probably still in spec and completely fine.
Also, remember the PS5 boosts using predetermined tables based on activity levels, not on the measured temperature of the individual chip. The PS5's cooling is sized for the heat those activity tables would generate in the worst (leakiest) chips that pass binning. If Sony are now finding that the worst case across the chips they're currently binning allows for something less expensive, then they're going to do that.
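As a sketch of that sizing logic (the table structure, wattages, and leakage bins below are all invented for illustration; the real figures are Sony's internal data):

```python
# Toy sketch of fixed activity -> power tables plus worst-case cooler sizing.
# Activity levels, wattages, and leakage factors are all invented numbers.

ACTIVITY_POWER_TABLE = {        # power budget per activity level, identical
    "menus": 60,                # for every console regardless of chip quality
    "typical_gameplay": 150,
    "worst_case_workload": 200,
}

def cooler_budget_watts(leakage_factors):
    """Size the cooler for the leakiest chip that passes binning:
    actual dissipation = table power * that chip's leakage factor."""
    return ACTIVITY_POWER_TABLE["worst_case_workload"] * max(leakage_factors)

launch_bins = [1.00, 1.05, 1.12, 1.18]   # wide early-yield spread (assumed)
mature_bins = [1.00, 1.03, 1.08]         # tighter spread later on (assumed)

print(cooler_budget_watts(launch_bins))  # ~236 W -> size the launch cooler
print(cooler_budget_watts(mature_bins))  # ~216 W -> a cheaper cooler will do
```

Because the tables are fixed for every console, the cooler only ever has to cover the leakiest bin Sony actually ships; tighten that bin and a cheaper cooler clears the same bar.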
I think that to some extent the PS5 (and all consoles) launched using a cooler designed to accommodate the predicted characteristics of the chips they'd be getting. I would also expect that design to be a little conservative. Over time, it might become apparent that they didn't need to be so conservative, and/or the characteristics of the chips might improve a little. And if you could pay e.g. $5 more per chip for a tighter bin but save $15 on the cooler, that's a net $10 per console, which is something you'd probably do (supply permitting).