If the PS5's GPU spends most of its time at 9.2 TF, that would clearly contradict what Cerny is promising and make the marketed 10.3 TF look very disingenuous. I can only imagine the thundering uproar from the core community and media alike; it could give Sony or PlayStation a bad name for next gen, which the company obviously doesn't want. But if it hovers around 9.8-10 TF under heavy load then it's gonna be totally fine. Also, CPU speed is gonna be a non-issue at 4K or close to 4K res, so a slight downclock would go completely unnoticed during gameplay.
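Rough napkin math on what that would mean for clocks, assuming the confirmed 36 CU layout and the usual RDNA2 FP32 formula (CUs × 64 shaders × 2 ops per clock × clock); just a sketch, not insider info:

```python
CUS = 36  # PS5's confirmed CU count

def tflops(clock_ghz, cus=CUS):
    # FP32 throughput = CUs * 64 shaders * 2 ops/clock * clock (GHz)
    return cus * 64 * 2 * clock_ghz / 1000

print(tflops(2.23))                  # ~10.28 TF at the advertised max clock
print(9.2 * 1000 / (CUS * 64 * 2))   # ~2.0 GHz is the clock that lands at 9.2 TF
```

So "mostly 9.2 TF" would basically mean the GPU sitting around 2.0 GHz instead of the 2.23 GHz Cerny quoted.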
In a multiplatform comparison where XSX renders at native 4K, if the PS5 version stays at 1800p most of the time then the GPU clock must be heavily dropped and 9.2 TF is most likely the real-world number, since there's a 44% pixel difference between 2160p and 1800p. If it stays at ~2000p then the pixel difference would be less than 20% and Cerny would be correct after all. 2000p vs 2160p would be virtually indistinguishable at a normal viewing distance, or even with your face to the screen lol, it would take some hardcore magic from DF to tell the story.
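Quick pixel-count check behind those percentages, assuming 16:9 for the 1800p/2000p figures:

```python
def pixels(height, aspect=16/9):
    # Total pixels for a 16:9 frame at the given vertical resolution
    return round(height * aspect) * height

native_4k = pixels(2160)   # 3840 x 2160 = 8,294,400
p1800 = pixels(1800)       # 3200 x 1800 = 5,760,000
p2000 = pixels(2000)       # ~3556 x 2000 = ~7,112,000

print(native_4k / p1800 - 1)   # ~0.44 -> 2160p pushes ~44% more pixels than 1800p
print(native_4k / p2000 - 1)   # ~0.17 -> under 20% more than 2000p
```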
This might become more complicated if the PS5 lacks VRS - it wouldn't be possible to know what was down to clocks, and what was down to efficiency.
Same for bandwidth. The gap in bandwidth available to the GPU is larger than the difference in teraflops.
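Using the published numbers (XSX's fast 10 GB pool at 560 GB/s and 12.15 TF vs PS5's 448 GB/s and 10.28 TF), the gaps look roughly like this:

```python
# Published specs: XSX 560 GB/s (10 GB fast pool), 12.15 TF; PS5 448 GB/s, 10.28 TF
bw_gap = 560 / 448 - 1       # ~0.25 -> ~25% more bandwidth for XSX
tf_gap = 12.15 / 10.28 - 1   # ~0.18 -> ~18% more compute for XSX
print(bw_gap, tf_gap)
```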
I think we might need developers to leak us info on how the hardware behaves when they're profiling games.
Edit: Sony still haven't confirmed if they have VRS or sampler feedback, right?