Agree, SmartShift is probably the foundation of the tech, though.
My understanding is that SmartShift is half of the solution.
The bespoke technology that they built was the algorithm that detects workloads and controls the frequencies of the CPU and GPU.
So the workloads will dictate the frequencies on both.
The power to the SoC stays constant, however (though the individual cores on both the CPU and GPU have their own limits, and thus clock limits).
So while their tech controls the frequency on both the CPU and GPU, it does not shift power between the two units.
SmartShift is the technology that transfers any residual power back and forth between the two.
So Sony's tech without SmartShift would have just meant downclocking under heavy loads.
With SmartShift they can handle larger loads than they otherwise could, because power can be shifted from the CPU to the GPU.
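To make the split between the two halves concrete, here's a toy numerical sketch. Everything in it is an assumption for illustration: the 200 W total budget, the 60 W / 140 W nominal split, and the "clock scales with the fraction of demanded power you actually get" rule are invented; only the 3.5 GHz and 2.23 GHz caps are the quoted PS5 figures. The real logic lives in hardware/firmware and is certainly not this simple.

```python
# Toy sketch of the two-part idea described above -- NOT Sony's or AMD's actual
# algorithm. The wattages, the base split, and the proportional-downclock rule
# are all invented to make the concept concrete.

TOTAL_SOC_POWER_W = 200.0    # hypothetical fixed SoC budget (the real number isn't public)
CPU_BASE_SHARE_W  = 60.0     # hypothetical nominal CPU/GPU split of that budget
GPU_BASE_SHARE_W  = 140.0
CPU_MAX_GHZ, GPU_MAX_GHZ = 3.5, 2.23   # the quoted PS5 frequency caps

def clock(granted_w: float, demanded_w: float, max_ghz: float) -> float:
    """Part 1: the workload dictates the clock. If a unit gets less power than
    its workload demands, it downclocks proportionally (toy rule)."""
    if demanded_w <= 0:
        return max_ghz
    return max_ghz * min(1.0, granted_w / demanded_w)

def split_power(cpu_demand_w: float, gpu_demand_w: float, use_smartshift: bool):
    """Part 2: SmartShift moves residual power from the under-used unit to the
    busy one. The total SoC draw never changes; only the split does."""
    cpu_w = min(cpu_demand_w, CPU_BASE_SHARE_W)
    gpu_w = min(gpu_demand_w, GPU_BASE_SHARE_W)
    if use_smartshift:
        residual = TOTAL_SOC_POWER_W - (cpu_w + gpu_w)   # unused headroom
        gpu_w += min(residual, max(0.0, gpu_demand_w - gpu_w))
        residual = TOTAL_SOC_POWER_W - (cpu_w + gpu_w)
        cpu_w += min(residual, max(0.0, cpu_demand_w - cpu_w))
    return (clock(cpu_w, cpu_demand_w, CPU_MAX_GHZ),
            clock(gpu_w, gpu_demand_w, GPU_MAX_GHZ))

# Light CPU frame, heavy GPU frame: without SmartShift the GPU is stuck at its
# own share and downclocks; with it, the GPU borrows the CPU's unused headroom.
print(split_power(35.0, 165.0, use_smartshift=False))  # GPU ~1.89 GHz
print(split_power(35.0, 165.0, use_smartshift=True))   # GPU holds 2.23 GHz
```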
So the easiest way to look at it:
The GPU sits at 2.23 GHz because the GPU cores are not heavily burdened, so more power goes to fewer active cores and they can run faster.
When a workload comes along that requires everything to fire, it will need to downclock to, say, 2.0 GHz, because the power is now divided equally back among all the cores.
There may also be some additional sharing of power back and forth between the cores to keep the frequency up or the TDP down.
SmartShift comes into play when those loads utilize everything on the GPU and it still needs more power to avoid a downclock: SmartShift will pull available CPU power over to the GPU.
In essence it is a reinforcement.
If that limit is still exceeded, then depending on the profile the devs picked, it could downclock the CPU further to sustain the GPU.
Beyond that, the GPU will need to downclock.
All of this happens very quickly, of course.
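Restating that sequence as explicit steps (again purely illustrative: the wattages, the 10 W "profile" allowance, and the proportional-downclock fallback are invented, not how the silicon actually decides):

```python
# Toy walkthrough of the priority order described above -- invented numbers,
# invented "profile" knob; the real decision happens in hardware/firmware.
from dataclasses import dataclass

@dataclass
class Profile:
    # Hypothetical developer-chosen policy: how much extra CPU power (watts)
    # may be sacrificed to keep the GPU clock up once normal SmartShift
    # headroom is exhausted.
    max_extra_cpu_sacrifice_w: float

def resolve_gpu_clock(gpu_needed_w, gpu_share_w, cpu_spare_w, profile,
                      gpu_max_ghz=2.23):
    """Step through the post's sequence: use the GPU's own share, then spare
    CPU power (SmartShift), then the profile-allowed CPU sacrifice; if the
    budget still falls short, the GPU downclocks in proportion (toy rule)."""
    available = gpu_share_w                              # step 1: GPU's own share
    if available < gpu_needed_w:
        available += cpu_spare_w                         # step 2: pull spare CPU power
    if available < gpu_needed_w:
        available += profile.max_extra_cpu_sacrifice_w   # step 3: downclock the CPU further
    if available >= gpu_needed_w:
        return gpu_max_ghz                               # budget met: hold the cap
    return gpu_max_ghz * available / gpu_needed_w        # step 4: GPU has to downclock

# Heavy GPU frame: needs 170 W, its own share covers 140 W, the CPU has 20 W
# spare, and the (hypothetical) profile allows sacrificing another 10 W.
print(resolve_gpu_clock(170.0, 140.0, 20.0, Profile(10.0)))  # holds 2.23 GHz
# An even heavier frame blows past every reserve, so the GPU itself gives way,
# landing near the ~2.0 GHz ballpark mentioned above.
print(resolve_gpu_clock(190.0, 140.0, 20.0, Profile(10.0)))  # ~2.0 GHz
```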