If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU. That's what AMD calls SmartShift.
My take on this: SmartShift can hand unused CPU power to the GPU while the CPU keeps running at 3.5 GHz.
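To make that idea concrete, here's a minimal sketch of budget shifting. This is only my illustration: the real SmartShift logic lives in firmware and isn't public, and the wattage numbers and function names here are made up.

```python
# Illustrative sketch only -- hypothetical numbers, not the real SmartShift logic.

TOTAL_BUDGET_W = 200.0   # hypothetical shared CPU+GPU power budget
CPU_BUDGET_W = 60.0      # hypothetical nominal CPU share of that budget

def allocate_power(cpu_power_needed_w):
    """Give the CPU what it actually needs (up to its share) and hand the
    leftover headroom to the GPU -- the basic idea behind SmartShift."""
    cpu_alloc = min(cpu_power_needed_w, CPU_BUDGET_W)
    gpu_alloc = TOTAL_BUDGET_W - cpu_alloc   # GPU gets whatever the CPU left unused
    return cpu_alloc, gpu_alloc

# Example: a CPU capped at 3.5 GHz only draws 45 W, so the GPU's share grows.
cpu_w, gpu_w = allocate_power(45.0)
print(f"CPU: {cpu_w} W, GPU: {gpu_w} W")   # CPU: 45.0 W, GPU: 155.0 W
```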
Another interesting power-saving behaviour: when the CPU or GPU is idle for a short stretch (for example waiting for the next vsync, where there is no point in raising the clocks), the frequency is reduced (or at least not increased?) during that window. He calls this 'race to idle':
There's another phenomenon here, which is called 'race to idle'...the GPU is not doing much for that five milliseconds - and conclude that the frequency should be increased. But that's a pointless bump in frequency...So, when I made the statement that the GPU will spend most of its time at or near its top frequency, that is with 'race to idle' taken out of the equation
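To make the quote concrete, here's a toy sketch of the two behaviours it contrasts. This is my own illustration, not Sony's logic: the frame budget, thresholds, and decision rules are all assumptions.

```python
# Toy illustration of 'race to idle' -- all numbers and rules are assumptions.

FRAME_BUDGET_MS = 33.3   # 30 Hz frame budget, as in Cerny's example

def naive_governor(busy_ms, current_clock_mhz):
    """Sees low average activity over the frame and bumps the clock,
    even though the GPU was simply idle waiting for vsync."""
    utilization = busy_ms / FRAME_BUDGET_MS
    if utilization < 0.9:                  # looks like there is headroom...
        return current_clock_mhz * 1.05    # ...so raise the clock (pointlessly)
    return current_clock_mhz

def race_to_idle_aware_governor(busy_ms, current_clock_mhz):
    """Ignores the idle tail of the frame: if the work already fits in the
    budget, a higher clock gains nothing and only burns power."""
    if busy_ms < FRAME_BUDGET_MS:
        return current_clock_mhz           # work finished early; leave the clock alone
    return current_clock_mhz * 1.05        # genuinely out of budget; raise the clock

print(naive_governor(28.0, 2000.0))              # 2100.0 -- the pointless bump
print(race_to_idle_aware_governor(28.0, 2000.0)) # 2000.0 -- clock left alone
```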
So, with all those power-saving measures (I count three different ones):
- Variable frequency for both CPU and GPU based on the current workload (100% deterministic; he confirms the silicon lottery and room temperature have no impact on the variable clocks) - see the sketch after this list,
- SmartShift (unused CPU power given to the GPU),
- 'Race to idle' (reducing - or at least not increasing? - the frequency of the CPU or GPU when it is scheduled to do nothing for a few milliseconds).
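Here is a minimal sketch of the first point, the deterministic, workload-based clock selection. Again this is my own illustration under stated assumptions (the power model, the GPU power cap, and the step size are invented); the only property it tries to capture is that the chosen clock depends solely on the workload, never on temperature or on how good an individual chip is, so every console behaves identically.

```python
# Minimal sketch of a deterministic, activity-based clock model -- an
# illustration with assumed numbers, not Sony's algorithm.

MAX_GPU_CLOCK_MHZ = 2230.0   # PS5's stated GPU clock cap
GPU_POWER_CAP_W = 180.0      # hypothetical GPU share of the power budget

def model_power_w(activity, clock_mhz):
    """Hypothetical power model: power grows with workload activity (0..1)
    and roughly with the cube of frequency."""
    return 200.0 * activity * (clock_mhz / MAX_GPU_CLOCK_MHZ) ** 3

def pick_gpu_clock(activity):
    """Highest clock whose modelled power fits the cap -- the same answer on
    every console for the same workload, regardless of silicon or room temp."""
    clock = MAX_GPU_CLOCK_MHZ
    while model_power_w(activity, clock) > GPU_POWER_CAP_W and clock > 0:
        clock -= 10.0   # step the clock down until the workload fits the budget
    return clock

print(pick_gpu_clock(0.6))   # light workload: stays at the 2230 MHz cap
print(pick_gpu_clock(1.0))   # worst-case workload: drops a few percent below the cap
```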
Cerny expects the CPU and GPU to run at max frequency most of the time (that is, most of the time it is actually useful to run at max clocks). So it's totally normal for the CPU to be downclocked, but that usually shouldn't make the game run slower: in many cases the CPU and GPU can be downclocked without impacting game performance.
This is really some innovative stuff. I mean, they wouldn't have needed this if they had 52 CUs in the first place.