While this is true, without knowing which games are actually CPU- or GPU-bound, we're largely left in the dark here. Devs often don't say why a game runs at 30fps; sometimes the GPU is the limiter, sometimes the CPU is.
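For what it's worth, the basic way to tell which side is the limiter is to compare how long the CPU spends building a frame against how long the GPU spends rendering it; whichever takes longer sets the frame time. Here's a rough sketch of that idea with made-up numbers (a real engine would pull the GPU figure from a timer query on the graphics API, not a hardcoded constant):

```cpp
#include <cstdio>

// Hypothetical per-frame timings in milliseconds. In practice the CPU number
// comes from wall-clock timing around simulation + draw-call submission, and
// the GPU number from a GPU timer query.
struct FrameTimings {
    double cpu_ms; // simulation, AI, draw-call submission
    double gpu_ms; // time the GPU spends rendering the frame
};

int main() {
    FrameTimings frame{14.0, 31.0}; // made-up example: GPU-bound, ~30fps

    double frame_ms = frame.cpu_ms > frame.gpu_ms ? frame.cpu_ms : frame.gpu_ms;
    const char* limiter = frame.cpu_ms > frame.gpu_ms ? "CPU" : "GPU";

    std::printf("%s-bound, %.1f ms/frame (~%.0f fps)\n",
                limiter, frame_ms, 1000.0 / frame_ms);
    return 0;
}
```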
Every game that targeted 60fps hit that target; for consoles at least, that's decided up front. The idea that the X1X or 4Pro could take any 30fps game and immediately boost it to 60fps doesn't necessarily mean the game would work correctly; that's where console optimization takes the forefront: the developers know what frame rate they targeted and cut whatever they can to get the most out of 30fps. Asking them to scale to 60 using settings made for 30... well, it's not exactly a simple job. For many developers it's easier to just raise the rendering resolution and add more effects than to go back and rework everything else.
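The budget math shows why: 30fps gives you roughly 33.3 ms per frame, 60fps only about 16.7 ms, and everything (CPU and GPU work alike) has to fit in that smaller window. Raising resolution mostly changes the GPU side of the ledger; the CPU side tuned for a 33 ms frame doesn't shrink on its own. A tiny sketch with hypothetical numbers, just to illustrate the arithmetic:

```cpp
#include <cstdio>

int main() {
    // Per-frame budgets: 1000 ms divided by the target frame rate.
    double budget_30 = 1000.0 / 30.0; // ~33.3 ms
    double budget_60 = 1000.0 / 60.0; // ~16.7 ms

    // Hypothetical split for a game tuned to 30fps: both sides fit under
    // 33.3 ms, but neither fits under 16.7 ms. Dropping resolution can pull
    // the GPU number down; the CPU work stays where it is.
    double cpu_ms = 20.0, gpu_ms = 30.0;

    std::printf("30fps budget %.1f ms: CPU %s, GPU %s\n", budget_30,
                cpu_ms <= budget_30 ? "fits" : "over",
                gpu_ms <= budget_30 ? "fits" : "over");
    std::printf("60fps budget %.1f ms: CPU %s, GPU %s\n", budget_60,
                cpu_ms <= budget_60 ? "fits" : "over",
                gpu_ms <= budget_60 ? "fits" : "over");
    return 0;
}
```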
So while I get where you're going with this, and the desire for 60fps everywhere (I'd like 120+), pointing fingers at the CPU isn't necessarily the reason 60fps isn't being targeted. E.g., ROTR on PS4 Pro: they wanted a 60fps mode, so the devs went back in and made one. Ark is 60fps on Xbox One X vs. 30fps on Xbox One. Those choices are ultimately up to the developer.
When a developer says "hey, we can't do it, not enough CPU": well, yeah, that's because they didn't try to make the game 60fps in the first place and cranked scene complexity up the wazoo. They'd have to go back to the drawing board to make 60fps work, and most developers aren't interested in that.