I'm definitely not wondering the things I'm wondering because of an NDA or embargo.
My puzzlement is probably best described as wondering about a lot of unknowns.
What were the load scenarios that made 2.0 GHz on the GPU and 3.0 GHz on the CPU with static power "hard to attain"? What are the load scenarios that make 2.23 GHz/3.5 GHz with dynamic power apparently more stable?
I wonder about certain types of games we know exist, like those whose dynamic resolution floats freely below the top-end bound nearly all the time (Call of Duty: Modern Warfare and many other late-gen games). What does a GPU load like that do under this system? In my experience it maxes the GPU the entire time and generates a lot of heat, for example when using "Insane" settings targeting 4K in Gears of War 5 on PC. I imagine GPU power draw/GPU utilisation there would genuinely sit near 98-100% the whole time, in spite of something like a 60 fps cap, and the variability of load would then come from what is happening in the game on the CPU (which will differ from frame to frame).
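To make concrete what I mean by a resolution that "floats freely" under a frame-time budget, here is a minimal sketch of how such a controller might work. Everything here is illustrative and hypothetical (the function name, the scale bounds, the proportional step of 0.5); it is not the logic of any actual engine, just the general shape of the technique:

```python
# Hypothetical sketch of a dynamic-resolution controller: the GPU stays
# near-saturated while the resolution scale floats below a top-end bound
# to hold a frame-time budget (16.6 ms here, matching a 60 fps cap).
# All constants and names are assumptions for illustration.

TARGET_FRAME_MS = 16.6           # frame-time budget for 60 fps
MIN_SCALE, MAX_SCALE = 0.6, 1.0  # resolution-scale bounds

def next_scale(current_scale: float, last_frame_ms: float) -> float:
    """Nudge the resolution scale so GPU frame time tracks the budget."""
    # Proportional step: over budget -> shrink, headroom -> grow.
    error = (TARGET_FRAME_MS - last_frame_ms) / TARGET_FRAME_MS
    proposed = current_scale * (1.0 + 0.5 * error)
    return max(MIN_SCALE, min(MAX_SCALE, proposed))
```

The point of the sketch: the controller never lets the GPU idle, so a game like this keeps the GPU pegged regardless of the fps cap, and the frame-to-frame variability ends up driven by whatever the CPU side is doing.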
Or I wonder about games that genuinely max out both resources really well; Ubisoft games are notorious for this (they use dynamic reconstruction on the GPU and tend to be CPU monsters).
I wonder what happens with the performance profiles we see in certain games, and not just those with static resolutions or vsync, or those that are cross-gen.