I think there is some misconception about PS5 power consumption, and that's actually Cerny's fault. Cerny was talking about the worst case, when the PS5 is pushed to its max in the most demanding scenes. But it can actually consume much less in many other apps or games, and it consumes less than the Pro in the same conditions. Here is some data taken by NXGamer.
- Dashboard: it usually consumes 50W (60W on Pro).
- Shadow of the Colossus: ~100W on PS5 using Pro BC mode (~150W on Pro)
- Demanding native PS5 games: the max is about 200W, but it usually consumes 175W to 195W during gameplay (Astro's Playroom and Spider-Man, those being the two most demanding known games on PS5).
Interestingly, we can see the PS5 usually consumes the most during cutscenes, and that's usually where the max power consumption measurements have been taken. For instance, DF found (in Spider-Man) that it consistently consumes the most, 195-205W, during cutscenes (even when not much is displayed) or in the main game menu (exactly like God of War on Pro), while it usually hovers between 175W and 195W during normal gameplay (in both Astro's Playroom and Spider-Man). So that should mean the PS5 CPU or GPU are most likely not downclocked (or only very rarely) during gameplay in those games, as they never reach the max known power consumption (205W, apparently reached during a cutscene in Spider-Man).
This data about the PS5 consuming the most during non-gameplay scenes (cutscenes or the main menu) is interesting because it shows the PS5 is consuming the most when the CPU is actually not being used very much. The benchmarks done by DF in the photo mode of Control and during a cutscene in Hitman 3 are scenes where the PS5 could be at its max power consumption and could potentially downclock. But they are not representative of gameplay scenes, where the known data (same thing on Pro) suggests that even the most demanding games should not downclock, or only very rarely.
Here is one of those moments: a cutscene consuming 203W, taken at 6:07 in the DF video. There is barely anything displayed on screen, because this is probably similar to a FurMark test where the GPU, not restricted by any CPU logic or vsync limitation, is most probably uselessly rendering frames as fast as it can.
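To illustrate that vsync point, here's a minimal sketch (all numbers made up, this is just the idea) of why an uncapped cutscene or menu can load the GPU like a stress test while capped gameplay leaves it partly idle:

```python
from typing import Optional

# Minimal sketch with made-up numbers: an uncapped scene renders frames
# back-to-back like a stress test, a vsync-capped scene leaves the GPU idle time.
def gpu_utilisation(frame_render_ms: float, frame_cap_ms: Optional[float]) -> float:
    """Fraction of time the GPU spends rendering.

    frame_render_ms: GPU time needed to draw one frame.
    frame_cap_ms:    frame budget imposed by vsync / game logic, or None if uncapped.
    """
    if frame_cap_ms is None or frame_render_ms >= frame_cap_ms:
        return 1.0                             # back-to-back frames, GPU never idles
    return frame_render_ms / frame_cap_ms      # GPU idles for the rest of each frame

# Gameplay: ~10 ms of GPU work per frame, locked to 60 fps (16.7 ms budget).
print(gpu_utilisation(10.0, 16.7))  # ~0.60 -> idle time, lower power draw
# Cutscene / menu: same work per frame but no cap -> FurMark-like sustained load.
print(gpu_utilisation(10.0, None))  # 1.0
```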
Good discussion.
Looking back at my statement, it was not well thought out and quite generic. There are all sorts of reasons why the PS5 can dip in power from its maximum power draw, despite having a boost clock system.
Though I still disagree with the idea that the CPU is acting as a power virus during cutscenes (hence the higher power draw), and that gameplay wattage below the maximum should be interpreted as the GPU operating at its maximum clock rate because there is still more power to give.
With some actual thought: while it's true that boost systems aim to maximize the amount of power available, one aspect is that the GPU shares power with the CPU, and I don't think this is being properly accounted for. The challenge for the PS5, or this type of setup, is that the power-shifting mechanism still has latency. Due to the latency of shifting power from GPU to CPU, we are unlikely to see a situation where the GPU feeds the CPU just a little bit more power exactly as the CPU requires it. That is fundamentally too fine-grained a control for a situation in which the CPU could burst on all eight cores at any time. On top of that, every console has to behave the same way running the same code, as per the PS5 specification, regardless of whatever environmental conditions are in place. So there has to be a form of conservatism in how the console can draw its power, across all consoles, which means it's likely not to be as finely tuned as a thermal boost, where the CPU and GPU each rely on their own always-available power and control it with boost based on thermals.
It is likely that the transfer of power between the GPU and CPU is done through large steps. At step 0, the CPU has enough power to operate and the GPU can boost to a maximum of 2230MHz. At the next step (step 1), the GPU will likely only be allowed to boost to a maximum of 95% of 2230MHz. At the step after that (step 2), the GPU will likely only be allowed to boost up to 90% of 2230MHz, and so forth.
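To be clear, these steps and percentages are purely my hypothetical model, nothing Sony has confirmed. As a mapping it would look something like this:

```python
# Hypothetical step model from this post: each step caps the GPU boost clock a
# further 5% below the 2230 MHz maximum. Not confirmed PS5 behaviour.
GPU_MAX_MHZ = 2230

def gpu_clock_cap(step: int) -> float:
    """Max GPU clock allowed at a given power-sharing step (step 0 = full boost)."""
    return GPU_MAX_MHZ * (1.0 - 0.05 * step)

for step in range(3):
    print(f"step {step}: GPU capped at {gpu_clock_cap(step):.1f} MHz")
# step 0: 2230.0 MHz, step 1: 2118.5 MHz, step 2: 2007.0 MHz
```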
From this perspective, whenever the CPU exceeds its power bracket, the GPU will drop its power level significantly. Recall that voltage is directly correlated with frequency, which gives a simplistic model of power as a function of frequency cubed. A 10% reduction in GPU frequency frees up a dramatic amount of power for the CPU. However, the CPU is not required to use all of the power made available by that 10% reduction. I provide an example and calculations below.
Assume a 200W maximum power draw for ease of calculation, though in reality the final wall number will also include the fan, SSD, and memory chips drawing power. For the sake of simplicity, 200W. A simple DVFS calculation here: (2230 × 0.9 / 2230)^3 × 200W = 145.8W (step 2).
If we assume step 0 is enough to power both CPU and GPU at 100%, then by moving to step 1, where we take 5% frequency off the top of the GPU, the GPU's share of the budget drops to about 171.5W. That is too tight against the 175W you said 'heavy action gameplay can drop to', so drop to the next step at 10% (step 2). Now the reduction brings the GPU down to 145.8W. There is now significant room for the CPU to work with in terms of wattage: it can draw another 25-30W and the total still lands around 175W. Now, my calculations are rough, as they mix some things together that shouldn't be mixed, etc. But the point is to showcase that by throttling the GPU to feed the CPU, as long as the CPU doesn't fully consume 100% of the power given to it, the total power draw will be lower.
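For reference, here's that back-of-the-envelope calculation as a small sketch using the same simplifications (200W ceiling for the whole budget, power scaling with the cube of GPU frequency, hypothetical 5% steps); it's not a model of the real SoC:

```python
# Back-of-the-envelope version of the calculation above. Assumptions (not real
# PS5 data): 200 W total budget, GPU power ~ frequency^3, 5%-per-step clock caps.
TOTAL_BUDGET_W = 200.0

def gpu_power_at_step(step: int) -> float:
    """GPU power draw if it runs at its full clock cap for this step."""
    freq_ratio = 1.0 - 0.05 * step            # 1.00, 0.95, 0.90, ...
    return TOTAL_BUDGET_W * freq_ratio ** 3   # cubic frequency->power approximation

for step in range(3):
    gpu_w = gpu_power_at_step(step)
    headroom_w = TOTAL_BUDGET_W - gpu_w       # wattage freed up for the CPU
    print(f"step {step}: GPU ~{gpu_w:.1f} W, ~{headroom_w:.1f} W freed for the CPU")
# step 0: ~200.0 W GPU,  ~0.0 W freed
# step 1: ~171.5 W GPU, ~28.5 W freed
# step 2: ~145.8 W GPU, ~54.2 W freed
# If the CPU only uses, say, 25-30 W of that freed budget, the total lands around
# 170-175 W -- below the 200 W worst case, which is the whole point above.
```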
It is likely that step 0, when power draw is at its maximum, is when the GPU is at the full 2230MHz, as this aligns with the highest frequency for both CPU and GPU (and thus the lowest amount of activity needed, from a profiling perspective, to hit the cap), and with reduced performance per watt as frequency increases.
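One last illustrative point on that perf/watt remark, using the same cubic simplification as above (so purely qualitative): performance scales roughly with frequency while power scales roughly with its cube, so efficiency falls as the clock climbs and the last few percent of boost are the most expensive.

```python
# Illustration only: with performance ~ f and power ~ f^3, perf/watt ~ 1/f^2,
# so running at the top of the boost range is the least efficient place to be.
for freq_ratio in (0.90, 0.95, 1.00):
    perf = freq_ratio          # performance roughly proportional to frequency
    power = freq_ratio ** 3    # cubic frequency->power approximation
    print(f"{freq_ratio:.2f}x clock: perf/watt = {perf / power:.2f}x of full-clock efficiency")
```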