Sony is lying about their specs. 10.3 TFLOPS and 3.5 GHz on their GPU and CPU are boost clocks. What are the base clocks where the CPU and GPU can operate without any oscillation between the two? Those are the real clocks, not these fake max clocks.
Well... I heard the same presentation as you guys, but I did not understand it the same way.
What I heard was an explanation of how GPU speeds and a thermal solution are chosen, and that the choice is made for the worst-case scenario expected over the life of a console.
What Mark Cerny explained was that the worst-case scenario is, more often than not, badly estimated, and so, when it happens, the fan spins up, the system overheats, and it may crash or even reboot.
Sony's solution for that problem (and this was the only context in which he talked about downclocking) is to downclock the system. But not by much: a couple of percentage points off the clock speed is enough for a 10% decrease in power consumption, so he didn't even expect a big drop.
This was to avoid overheating, crashes and reboots. That was the context, and the only time a downclock was mentioned.
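That "couple of percent of clock for 10% of power" figure is plausible if you assume the usual back-of-envelope model: dynamic power scales roughly with f * V^2, and near the top of the voltage/frequency curve V scales roughly with f, so power goes roughly as the cube of frequency. A quick sketch (my own assumption, not Cerny's exact model):

```python
# Back-of-envelope sketch (my assumption, not Cerny's exact model):
# dynamic power ~ f * V^2, and near the top of the voltage/frequency
# curve V scales roughly with f, so power ~ f^3.
def power_fraction(clock_fraction: float) -> float:
    """Estimated power draw as a fraction of max, for a given clock fraction."""
    return clock_fraction ** 3

for cut in (0.01, 0.02, 0.03):
    f = 1.0 - cut
    print(f"{cut:.0%} clock cut -> {1 - power_fraction(f):.1%} power cut")
# 1% clock cut -> 3.0% power cut
# 2% clock cut -> 5.9% power cut
# 3% clock cut -> 8.7% power cut
```

So under this cubic assumption, shaving about 3% off the clock already recovers close to 10% of the power, which lines up with what was said in the presentation.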
The rest of the explanation was about how they could reach 2.23 GHz, since with fixed clocks even 2 GHz was already a problem.
The continuous boost mode on both processors allows the system to reduce clock speeds according to actual usage. During a game, CPU/GPU usage varies frame by frame, and locked speeds impose a high minimum power draw, heating the system as a consequence.
This way the system reduces speeds according to need, nothing different from what current GPUs already do.
This makes the system run a lot cooler, and allows it to go higher than it could with constant clocks. This is why Cerny states the CPU and GPU will be at 3.5 GHz and 2.23 GHz most of the time. Not all the time, since full power is not needed all the time, and this system exists precisely to allow reductions and avoid overheating.
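To illustrate the idea (a toy model of my own, not Sony's actual algorithm; the budget value and the cubic power assumption are mine), the chip can simply run at the highest clock whose estimated power, given the current workload activity, still fits a fixed power budget:

```python
# Toy model of activity-driven clocking (my sketch, not Sony's algorithm).
POWER_BUDGET = 1.0      # normalized power cap (assumed)
MAX_CLOCK_GHZ = 2.23    # PS5 GPU max clock

def estimated_power(clock_fraction: float, activity: float) -> float:
    # power ~ activity * f^3, same cubic assumption as usual
    return activity * clock_fraction ** 3

def pick_clock(activity: float, steps: int = 100) -> float:
    """Highest clock (GHz) that keeps estimated power within the budget."""
    for i in range(steps, 0, -1):
        f = i / steps
        if estimated_power(f, activity) <= POWER_BUDGET:
            return f * MAX_CLOCK_GHZ
    return 0.0

print(pick_clock(activity=0.8))  # light frame: full 2.23 GHz
print(pick_clock(activity=1.1))  # heavy frame: small downclock (~4%)
```

Note how even an activity 10% over budget only costs a few percent of clock, because of the cubic relationship between clock and power.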
But since both the CPU and GPU can reach full clock, there is no doubt the power and thermal envelope can accommodate the end result!
So why the need to shift power from the CPU to the GPU?
Well, Cerny never stated the CPU would suffer from this. The moved power is just unused power. Since CPU power usage also fluctuates according to the needs of each moment, not all of its budget will be used all the time.
This means that, if the GPU approaches the worst-case scenario, instead of going over the power budget it will check whether the CPU has unused power, and draw on it if available to stay within the budget.
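The borrowing idea can be sketched in a few lines (my illustration, not AMD's or Sony's actual SmartShift implementation; the 40/60 budget split is an assumption for the example):

```python
# Toy sketch of power sharing between CPU and GPU (my illustration).
# Unused CPU budget tops up the GPU's allowance, so the combined draw
# never exceeds the fixed total system budget.
CPU_BUDGET = 0.4   # assumed nominal split of a normalized total of 1.0
GPU_BUDGET = 0.6

def gpu_allowance(cpu_demand: float) -> float:
    """GPU power allowance after borrowing whatever the CPU isn't using."""
    cpu_used = min(cpu_demand, CPU_BUDGET)
    spare = CPU_BUDGET - cpu_used
    return GPU_BUDGET + spare

print(gpu_allowance(cpu_demand=0.25))  # CPU mostly idle: GPU borrows the spare 0.15
print(gpu_allowance(cpu_demand=0.40))  # CPU maxed out: GPU capped at its own 0.6
```

The key property is that the GPU never takes anything the CPU is actually using; it only picks up slack, so the total stays inside the envelope.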
There are two points I need to make:
1 - If the CPU and GPU can both reach maximum speed whenever needed (and Cerny stated they will be there most of the time), then the console has a power budget sufficient for that. This seems to contradict the doomsday theories that the PS5 will have to downclock at the slightest provocation.
2 - This was a presentation showing the good stuff about the PS5, not its flaws! Why would Cerny explain a bad thing in such detail?
So it is my opinion that the combination of high speeds, boost clocks, and downclocking all being discussed in the same presentation (although in clearly different contexts) is what is causing all of this mess.
From this understanding, it is my firm belief the PS5 will be stable at its full 10.28 TFLOPS.