Maybe I'm misunderstanding what variable clocks are. Isn't it just the MHz of the GPU ramping up and down based on workload?
The difference is the stated base performance. Let's say a 2080 Ti (I have one) claims 13.5 TF in a given performance metric; that's at the base clock, the performance you're always getting if all things are equal (correct PSU, airflow, a matching CPU and other components, of course). The GPU uses its game clock to boost upwards if thermals, load etc. allow for it, resulting in higher performance,
not lower whenever the CPU is eating too much of the load.
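To make that concrete, here's a minimal toy model of PC-style clocking, where the base clock is a guaranteed floor and boost is opportunistic on top of it. The clock figures, headroom scaling, and function name are all illustrative assumptions, not real 2080 Ti behavior:

```python
# Toy model of PC-style GPU clocking: the base clock is guaranteed,
# and the GPU boosts upward only when thermal/power headroom allows.
# All numbers are illustrative, not real 2080 Ti figures.

BASE_CLOCK_MHZ = 1350.0    # stated spec: performance you always get
MAX_BOOST_MHZ = 1635.0     # opportunistic boost ceiling

def pc_effective_clock(thermal_headroom_c: float, power_headroom_w: float) -> float:
    """Effective clock: never below base, boosting when headroom allows."""
    if thermal_headroom_c <= 0 or power_headroom_w <= 0:
        return BASE_CLOCK_MHZ  # no headroom: fall back to base, never below it
    # scale the boost by the tighter of the two headrooms (toy linear model)
    scale = min(thermal_headroom_c / 20.0, power_headroom_w / 50.0, 1.0)
    return BASE_CLOCK_MHZ + scale * (MAX_BOOST_MHZ - BASE_CLOCK_MHZ)
```

The key property is the floor: no matter how bad thermals get, the function never returns less than the stated base clock, which is exactly what the advertised spec promises.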
The PS5, on the other hand, gives you a spec of 10.2 TF at its max, when the CPU isn't eating into its power budget. The tech is much the same (especially as in laptops with SmartShift and Max-Q), but reversed: it starts from the max performance metric and is allowed to go down, with the CPU and GPU trading power load with each other. That's not exactly what's happening in the PC space, and probably won't be for gaming GPUs either; it never has.
I can't imagine Nvidia or AMD stating 13 TF at the 'game clock' (max metric) and never stating what the lowest clocks are. In the PC space we talk base clocks, and game clocks (boost) for whenever the situation allows for it. What Sony has done is claim the 'game clock' (boost), i.e. the max potential, and downclock from there when the CPU needs the power, or the other way around. Also, downclocking the CPU because the GPU needs more juice isn't what's happening in my Zen 2 system.
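The reversed model above can be sketched the same way: a fixed shared power budget, a GPU specced at its ceiling, and a clock that comes down when the CPU eats into the budget. The budget, draw figures, and linear scaling are assumptions for illustration; real SmartShift-style behavior is not this simple:

```python
# Toy model of a PS5/SmartShift-style shared power budget: the stated
# GPU spec is the ceiling, not the floor, and the GPU downclocks when
# the CPU draws more of the shared budget. All numbers are illustrative.

TOTAL_BUDGET_W = 200.0
GPU_MAX_CLOCK_MHZ = 2230.0   # the advertised spec is the maximum

def shared_budget_gpu_clock(cpu_draw_w: float, gpu_full_draw_w: float = 150.0) -> float:
    """GPU clock under a shared budget: at max when the CPU leaves enough
    power, scaled down (toy linear model) when the CPU eats into its share."""
    gpu_available_w = TOTAL_BUDGET_W - cpu_draw_w
    if gpu_available_w >= gpu_full_draw_w:
        return GPU_MAX_CLOCK_MHZ
    # clock falls with available power; real hardware curves are non-linear
    return GPU_MAX_CLOCK_MHZ * max(gpu_available_w, 0.0) / gpu_full_draw_w
```

Note the inversion compared to the PC sketch: here the stated number is what you get at best, and there is no guaranteed floor below it.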
Again, not exactly the same thing, as the two have different starting points: one guarantees a floor and boosts up, the other advertises a ceiling and comes down.