Digital Foundry Article Technical Discussion [2021]

Depends on how fast developers will abandon the PS4. You don't want to abandon 100m users as long as the install base of the next-gen consoles isn't large enough. And as it looks right now, there is really nothing in the pipeline that wouldn't run on the last-gen consoles with lower details/resolution and 30 instead of 60fps.
The jump from the PS360 gen was bigger, and the jump from PS2 to the PS360 gen was bigger still. The jumps get smaller and smaller, and it is increasingly hard to get better graphics (diminishing returns). Gameplay-wise, almost nothing has changed since the PS2. The CPU can become a bottleneck quite quickly, but on the other hand, we see games running on the Switch. So I guess the last gen will be supported longer than the gen before, especially when the hardware market currently just can't deliver new hardware in big numbers.

So devs would hopefully get used to the PS5 dev kits by then?

I think you're right though. Even current-gen games can technically run on mobile Atom chips.

Now I'd like to see a game that's built around next-gen CPUs and SSDs, where there's a crapton of advanced AI, GPU physics and teleporting segments, and the gameplay balance depends on them so you can't remove or reduce them.
 
Moving forward, wouldn't this be less and less of a concern as devs get to know the PS5 dev kits and phase out last-gen hardware?

If they made PS5-specific releases it wouldn't be an issue. But then the devs would have to roll their own solution for cross-progression so that game saves from the PS4/PS4 Pro can be used with the PS5-specific version. I think most of the early titles resolved this with their own cloud migration service. At least I haven't heard of Sony resolving the issue of not having a genuine Smart Delivery-style system in place.
 
If they made PS5-specific releases it wouldn't be an issue. But then the devs would have to roll their own solution for cross-progression so that game saves from the PS4/PS4 Pro can be used with the PS5-specific version. I think most of the early titles resolved this with their own cloud migration service. At least I haven't heard of Sony resolving the issue of not having a genuine Smart Delivery-style system in place.

I guess that's what they'll do over time then?

I'm really not sure about all the specifics with recent comparisons, but I simply cannot imagine developers only making PS4 Pro profiles as time goes by. Am I crazy?
 
I guess that's what they'll do over time then?

I'm really not sure about all the specifics with recent comparisons, but I simply cannot imagine developers only making PS4 Pro profiles as time goes by. Am I crazy?
Sony seem to have wanted a fast transition. Games should get native PS5 ports that are incompatible with the PS4 but still get a separate PS4 release. Save-game transfer would then still not be possible (without an extra server sitting in between). At least that was Sony's plan. Now with COVID-19 the last-gen consoles are even stronger than initially thought, so they might change their strategy a bit.
But the current gen launched just three months ago; normally this would solve itself with enough time and availability of the new hardware.


And now for something completely different, and back to topic:
I really don't know if this video is correctly placed here. It is a "retro-like" game, but on Switch ^^
 
DF Article @ https://www.eurogamer.net/articles/digitalfoundry-2021-ghost-recon-breakpoint-60fps-upgrade-tested

Ghost Recon Breakpoint doubles performance on next-gen consoles
But there's a big resolution divide between Xbox Series X and PS5.

The new console generation has so far been amazing for players who love smooth, 60 frames per second gaming. Ubisoft, especially, leads the charge with the likes of Assassin's Creed Valhalla, The Division 2, and Immortals Fenyx Rising - all come fully equipped with at least a performance option, or run at 60fps by default on Xbox Series X, Series S and PS5. We can add another to the list here too: Ghost Recon Breakpoint. It was patched around next-gen's launch, and while running in backwards compatibility mode, the doubling of frame-rate is a game-changing experience. The Division 2 tried the same trick but came unstuck somewhat on PS5, missing some visual flourishes found in every other version - even PS4 Pro. Speaking of which, there's good news with The Division 2 here worth touching on before we get into Breakpoint properly.

The Division 2's Patch 1.31 (as it appears on the PS5 front-end) came out hot on the heels of our coverage and essentially sorts out all of the issues we had with it - namely, screen-space reflections and volumetric fog are back. Looking back, this was likely an oversight from developer Massive Entertainment at the time; a simple flag for these settings that went unchecked. However, it's clear that legacy limitations from PS4 Pro are still in place - there's a vanishingly small performance advantage here opposite Xbox Series X, but it comes at the cost of a lower resolution on PS5. All is well, where the big success remains that next-gen can now achieve 60fps - a similar story to Ghost Recon Breakpoint.

...

Hmmm, strange. So they added a new performance mode just for the next-gen consoles, and the PS5 is limited to 1080p while the XBS-X is limited to 1440p. Considering it's a new mode added specifically for the new consoles, I'm surprised they didn't support higher than 1080p on the PS5.

Also surprising that the XBS-S basically matches the PS5's resolution mode ... in both resolution and FPS. This is likely just a limitation carried over from the PS4 Pro code path, unlike the new performance mode. But it's still odd seeing it in effect.

Regards,
SB
 
Power consumption grows much faster than linearly with frequency, since dynamic power scales with voltage squared times frequency, and making transistors switch quicker needs more voltage. As a result, the PS5's GPU running at a higher frequency causes it to draw a lot more power.
This is why I am interested in this: in The Road to PS5 presentation, Mark Cerny said "we run at essentially constant power and let the frequency band vary based on the workload". And I'm really curious about this, because where is the excess power going when the PS5 is just sitting mostly idle in a menu? I assume they mean there is a power envelope, but that wasn't quite how it was explained.
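To put rough numbers on the scaling claim quoted above, here is a minimal sketch using the textbook dynamic power model (P ≈ C·V²·f), assuming voltage scales roughly linearly with frequency; the base frequency and wattage are invented purely for illustration.

Code:
# Toy illustration of dynamic power scaling: P ~ C * V^2 * f.
# Assumption (not from the thread): voltage rises roughly linearly with
# frequency over the range of interest, so power scales roughly cubically.

def dynamic_power(freq_ghz, base_freq_ghz=2.0, base_power_w=150.0):
    """Scale power from a known (frequency, power) point, assuming V ~ f."""
    ratio = freq_ghz / base_freq_ghz
    return base_power_w * ratio ** 3  # V^2 * f with V proportional to f

for f in (1.8, 2.0, 2.23):
    print(f"{f:.2f} GHz -> ~{dynamic_power(f):.0f} W")
# 1.80 GHz -> ~109 W, 2.00 GHz -> ~150 W, 2.23 GHz -> ~208 W

Under that assumption a ~10% clock increase costs roughly a third more power, which is why even small clock drops free up a lot of headroom.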
 
This is why I am interested in this: in The Road to PS5 presentation, Mark Cerny said "we run at essentially constant power and let the frequency band vary based on the workload". And I'm really curious about this, because where is the excess power going when the PS5 is just sitting mostly idle in a menu? I assume they mean there is a power envelope, but that wasn't quite how it was explained.
I would say Mark's remark was (over)simplified to better explain the design philosophy.
 
This is why I am interested in this: in The Road to PS5 presentation, Mark Cerny said "we run at essentially constant power and let the frequency band vary based on the workload". And I'm really curious about this, because where is the excess power going when the PS5 is just sitting mostly idle in a menu? I assume they mean there is a power envelope, but that wasn't quite how it was explained.

Secretly bitcoin mining and sending the proceeds to Sony ...
 
Going to take a stab at this: considering how it's set up to boost and share power between the CPU and GPU, the expected mean should be very consistent between titles. Always around 200W, plus or minus regional differences in hardware, or fan speeds adjusting for air temperature.

If the PS5 isn't pushing the maximum draw of its SoC at all times, I would suspect that either something is wrong or there is simply no work to do at all.
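To make that expectation concrete, here is a hypothetical sketch of a "constant power, variable frequency" governor. This is not Sony's actual algorithm; the budget, worst-case wattage and activity numbers are invented for illustration.

Code:
# Hypothetical sketch (not Sony's documented algorithm) of "run at constant
# power and let frequency vary with the workload": estimate power from the
# current activity level and shed clock until the fixed budget is met.

BUDGET_W = 200.0     # assumed SoC power budget, illustrative only
F_MAX_MHZ = 2230.0   # PS5 GPU frequency cap from the Road to PS5 talk
P_FULL_W = 230.0     # assumed draw at max clock under a worst-case workload

def power_estimate(freq_mhz, activity):
    """activity in (0, 1]: how hard the workload toggles the silicon."""
    return P_FULL_W * activity * (freq_mhz / F_MAX_MHZ) ** 3  # V ~ f assumption

def pick_clock(activity, step_mhz=10.0):
    freq = F_MAX_MHZ
    while freq > 0 and power_estimate(freq, activity) > BUDGET_W:
        freq -= step_mhz          # drop in small steps until we fit the budget
    return freq

print(pick_clock(activity=0.85))  # typical heavy game: stays at 2230.0 MHz
print(pick_clock(activity=1.00))  # power-virus scene: ~2120 MHz, a ~5% drop

Under a model like this, ordinary game workloads keep the clocks pinned at the cap and only a pathological, power-virus style scene forces a modest drop, which lines up with the "constant power" framing.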
This is why I am interested in this: in The Road to PS5 presentation, Mark Cerny said "we run at essentially constant power and let the frequency band vary based on the workload". And I'm really curious about this, because where is the excess power going when the PS5 is just sitting mostly idle in a menu? I assume they mean there is a power envelope, but that wasn't quite how it was explained.
I think there is some misconception about PS5 power consumption, and that's actually Cerny's fault. Cerny was talking about the worst case, when the PS5 is pushed to its max in the most demanding scenes. But it can actually consume much less in many other apps or games, and consume less than the Pro in the same conditions. Here is some data taken by NXGamer:

- Dashboard: it usually consumes 50W (60W on Pro).
- Shadow of the Colossus: ~100W on PS5 using the Pro BC mode (~150W on Pro).
- Demanding native PS5 games: the max is about 200W, but they usually consume 175W to 195W during gameplay (Astro's Playroom and Spider-Man being the two most demanding known games on PS5).



Interestingly, we can see the PS5 usually consumes the most during cutscenes, and that's usually where the maximum power consumption measurements have been taken. For instance, DF found (in Spider-Man) that it usually draws the most, 195-205W, during cutscenes (consistently, even when not much is being displayed) or in the main game menu (exactly like God of War on Pro), while it usually hovers between 175W and 195W during normal gameplay (in both Astro's Playroom and Spider-Man). So that should mean the PS5's CPU and GPU are most likely not downclocked (or only very rarely) during gameplay in those games, as they never reach the maximum known power consumption (205W, apparently reached during a cutscene in Spider-Man).


This data showing the PS5 consuming the most during non-gameplay scenes (cutscenes or the main menu) is interesting because it shows the PS5 draws the most power when the CPU is actually not being used very much. The benchmarks done by DF in the photo mode of Control and during a cutscene in Hitman 3 are exactly the kind of scenes where the PS5 could be at its maximum power consumption and could potentially downclock. But they aren't representative of gameplay scenes, where the data we have (same thing on Pro) suggests that even the most demanding games should downclock very rarely, if at all.

Here is one of those moments in a cutscene consuming 203W, taken at 6:07 in the DF video. There is barely anything displayed on screen; this is probably similar to a FurMark test, where the GPU, not restricted by any CPU logic or vsync limitation, is most likely just rendering frames as fast as it can.
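As a rough illustration of that last point (a generic sketch, not code from any actual engine): with a frame-rate cap the GPU spends part of every frame idle, while an uncapped menu or cutscene loop keeps it busy back to back.

Code:
# Illustrative sketch of why an uncapped menu or cutscene can draw more GPU
# power than capped gameplay: with no frame-rate cap there is no idle time
# between frames, so the GPU works flat out, much like a synthetic stress test.

import time

def simulate(frames, frame_cap_s, gpu_work_s=0.005):
    """Return the fraction of wall time the simulated 'GPU' spends busy."""
    busy = 0.0
    start = time.perf_counter()
    for _ in range(frames):
        time.sleep(gpu_work_s)       # pretend render work (5 ms per frame)
        busy += gpu_work_s
        leftover = frame_cap_s - gpu_work_s
        if leftover > 0:
            time.sleep(leftover)     # capped loop: the GPU idles here
    return busy / (time.perf_counter() - start)

print(f"60fps-capped gameplay loop:  ~{simulate(60, 1 / 60):.0%} busy")
print(f"uncapped cutscene/menu loop: ~{simulate(60, 0.0):.0%} busy")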
 
Sony seem to have wanted a fast transition. Games should get native PS5 ports that are incompatible with the PS4 but still get a separate PS4 release. Save-game transfer would then still not be possible (without an extra server sitting in between). At least that was Sony's plan. Now with COVID-19 the last-gen consoles are even stronger than initially thought, so they might change their strategy a bit.
But the current gen launched just three months ago; normally this would solve itself with enough time and availability of the new hardware.


And now for something completely different, and back to topic:
I really don't know if this video is correctly placed here. It is a "retro-like" game, but on Switch ^^
I hope this becomes multiplatform
 
Sony seem to have wanted a fast transition.
Yes, I think they did, but events have conspired against them and there's a worldwide chip shortage. Not just the consoles, but CPUs and GPUs too.
I'm not 100% sure why this is.
Some say COVID, some say cryptocurrency; it's prolly a mixture.

So they will prolly have to rely on the PS4 for longer.
 
I think there is some misconception about PS5 power consumption, and that's actually Cerny's fault. Cerny was talking about the worst case, when the PS5 is pushed to its max in the most demanding scenes.
...
Interestingly, we can see the PS5 usually consumes the most during cutscenes, and that's usually where the maximum power consumption measurements have been taken. For instance, DF found (in Spider-Man) that it usually draws the most, 195-205W, during cutscenes (consistently, even when not much is being displayed) or in the main game menu (exactly like God of War on Pro), while it usually hovers between 175W and 195W during normal gameplay (in both Astro's Playroom and Spider-Man). So that should mean the PS5's CPU and GPU are most likely not downclocked (or only very rarely) during gameplay in those games, as they never reach the maximum known power consumption (205W, apparently reached during a cutscene in Spider-Man).
Yes, I'm unsure why, but this is how I understood it. They designed everything around the worst possible scenario, so (in theory) the PS5 should mostly be working within that maximum... except in things like the Horizon map screen (which I think he used as an example).

This data showing the PS5 consuming the most during non-gameplay scenes (cutscenes or the main menu) is interesting because it shows the PS5 draws the most power when the CPU is actually not being used very much. The benchmarks done by DF in the photo mode of Control and during a cutscene in Hitman 3 are exactly the kind of scenes where the PS5 could be at its maximum power consumption and could potentially downclock. But they aren't representative of gameplay scenes, where the data we have (same thing on Pro) suggests that even the most demanding games should downclock very rarely, if at all.
I hadn't thought about that aspect regarding the Control photo mode - it makes a lot of sense though.
 
Yes, I think they did, but events have conspired against them and there's a worldwide chip shortage. Not just the consoles, but CPUs and GPUs too.
I'm not 100% sure why this is.
Some say COVID, some say cryptocurrency; it's prolly a mixture.

So they will prolly have to rely on the PS4 for longer.

Even car manufacturing is being affected by the slow pace of chip production, and carmakers are asking the US government for help. The US government got involved through an executive order yesterday, I think.
 
Especially because they were more conservative than most others about keeping stock and placing orders due to COVID.
 
I think there is some misconception about PS5 power consumption, and that's actually Cerny's fault. Cerny was talking about the worst case, when the PS5 is pushed to its max in the most demanding scenes. But it can actually consume much less in many other apps or games, and consume less than the Pro in the same conditions. Here is some data taken by NXGamer:

- Dashboard: it usually consumes 50W (60W on Pro).
- Shadow of the Colossus: ~100W on PS5 using the Pro BC mode (~150W on Pro).
- Demanding native PS5 games: the max is about 200W, but they usually consume 175W to 195W during gameplay (Astro's Playroom and Spider-Man being the two most demanding known games on PS5).

Interestingly, we can see the PS5 usually consumes the most during cutscenes, and that's usually where the maximum power consumption measurements have been taken. For instance, DF found (in Spider-Man) that it usually draws the most, 195-205W, during cutscenes (consistently, even when not much is being displayed) or in the main game menu (exactly like God of War on Pro), while it usually hovers between 175W and 195W during normal gameplay (in both Astro's Playroom and Spider-Man). So that should mean the PS5's CPU and GPU are most likely not downclocked (or only very rarely) during gameplay in those games, as they never reach the maximum known power consumption (205W, apparently reached during a cutscene in Spider-Man).


This data showing the PS5 consuming the most during non-gameplay scenes (cutscenes or the main menu) is interesting because it shows the PS5 draws the most power when the CPU is actually not being used very much. The benchmarks done by DF in the photo mode of Control and during a cutscene in Hitman 3 are exactly the kind of scenes where the PS5 could be at its maximum power consumption and could potentially downclock. But they aren't representative of gameplay scenes, where the data we have (same thing on Pro) suggests that even the most demanding games should downclock very rarely, if at all.

Here is one of those moments in a cutscene consuming 203W, taken at 6:07 in the DF video. There is barely anything displayed on screen; this is probably similar to a FurMark test, where the GPU, not restricted by any CPU logic or vsync limitation, is most likely just rendering frames as fast as it can.

Good discussion.

Looking back at my statement, it was not well thought out and quite generic. There are all sorts of reasons why the PS5 can dip below its maximum power draw, despite having a boost clock system.

Though I do still disagree with the idea that the CPU is acting as a power virus during cutscenes (hence the higher power draw), and that gameplay wattage being less than the maximum should be read as the GPU operating at its maximum clock rate because there is still more power to give.

With some actual thought: while it's true that boost systems aim to maximize the amount of power available, one aspect is that the GPU shares power with the CPU, and I don't think this is being properly accounted for. The challenge for the PS5, or this type of setup, is that the power-shifting mechanism still has latency. Due to the latency of shifting power from the GPU to the CPU, we are unlikely to see a situation where the GPU feeds the CPU just a little bit more power exactly as the CPU requires it. That is fundamentally too fine-grained a control for a situation in which the CPU could burst across all eight cores at any time. On top of that, every console has to behave the same when running the same code, as per the PS5's specifications, regardless of whatever environmental controls are in place. So there has to be a form of conservatism in how the console draws its power, across all consoles, which means it's likely not as highly tuned as a thermal boost, where the CPU and GPU each rely on their own always-available power and control it with a boost based on thermals.

It is likely that the transfer of power between the GPU and CPU is done in large steps. At step 0 the CPU has enough power to operate and the GPU can boost up to its maximum of 2230MHz. At the next step (step 1), the GPU would only be allowed to boost to 95% of 2230MHz; at the next (step 2), to 90% of 2230MHz, and so forth.

From this perspective, whenever the CPU exceeds its power bracket, the GPU will drop its power level significantly. Recall that voltage is directly correlated with frequency, which gives the simplistic model of power being a function of frequency cubed. A 10% reduction in GPU frequency frees up a dramatic amount of power for the CPU. However, the CPU is not required to use all of the power made available by that 10% reduction. I provide an example and calculations below.

Assume a maximum power draw of 200W for ease of calculation, though in reality the final at-the-wall number will also include the fan, SSD, and memory chips. For the sake of simplicity, 200W. A simple DVFS calculation for step 2: (2230*0.9 / 2230)^3 * 200W = 145.8W.

If we assume step 0 is enough to power both the CPU and GPU at 100%, then moving to step 1, taking 5% frequency off the top of the GPU, drops the headroom to about 171.5W. This is too close to the 175W that you said heavy action gameplay can drop to, and may well still be too tight, so drop to the next step at 10% (step 2): the reductions now bring the total wattage to 145.8W. There is now significant wattage headroom for the CPU to work with; it can draw close to an additional 30W and still land at 175W. Now, my calculations are rough and mix some things together that shouldn't be mixed, etc. But the point is to show that by reducing the GPU to feed the CPU, as long as the CPU doesn't fully consume 100% of the power given to it, the power draw will be lower.

It is likely that step 0, when power draw is at its maximum, is when the GPU is at the full 2230MHz, as this aligns with the highest frequency for both the CPU and GPU (and thus the lowest amount of activity from a profiling perspective), and thus with the reduced performance-per-watt as frequency increases.
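For what it's worth, here are the same back-of-the-envelope numbers as a runnable snippet, using the cubic DVFS scaling on the assumed 200W figure; the stepped caps are the hypothetical ones above, not anything confirmed.

Code:
# Reproducing the rough numbers above: a cubic DVFS model (V ~ f) applied to
# an assumed 200 W GPU budget, with the hypothetical stepped caps of 100%,
# 95% and 90% of the 2230 MHz maximum. Fan, SSD and RAM are ignored.

GPU_BUDGET_W = 200.0

def gpu_power_at_cap(cap_fraction):
    """Power at a frequency cap, assuming power scales with frequency cubed."""
    return GPU_BUDGET_W * cap_fraction ** 3

for step, cap in enumerate((1.00, 0.95, 0.90)):
    gpu_w = gpu_power_at_cap(cap)
    print(f"step {step}: GPU capped at {cap:.0%} of 2230 MHz "
          f"-> ~{gpu_w:.1f} W, ~{GPU_BUDGET_W - gpu_w:.1f} W freed for the CPU")
# step 0: 100% -> ~200.0 W,  ~0.0 W freed
# step 1:  95% -> ~171.5 W, ~28.5 W freed
# step 2:  90% -> ~145.8 W, ~54.2 W freed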
 
Good discussion.

Looking back at my statement, it was not well thought out and quite generic. There are all sorts of reasons why the PS5 can dip below its maximum power draw, despite having a boost clock system.

Though I do still disagree with the idea that the CPU is acting as a power virus during cutscenes (hence the higher power draw), and that gameplay wattage being less than the maximum should be read as the GPU operating at its maximum clock rate because there is still more power to give.

With some actual thought: while it's true that boost systems aim to maximize the amount of power available, one aspect is that the GPU shares power with the CPU, and I don't think this is being properly accounted for. The challenge for the PS5, or this type of setup, is that the power-shifting mechanism still has latency. Due to the latency of shifting power from the GPU to the CPU, we are unlikely to see a situation where the GPU feeds the CPU just a little bit more power exactly as the CPU requires it. That is fundamentally too fine-grained a control for a situation in which the CPU could burst across all eight cores at any time. On top of that, every console has to behave the same when running the same code, as per the PS5's specifications, regardless of whatever environmental controls are in place. So there has to be a form of conservatism in how the console draws its power, across all consoles, which means it's likely not as highly tuned as a thermal boost, where the CPU and GPU each rely on their own always-available power and control it with a boost based on thermals.

It is likely that the transfer of power between the GPU and CPU is done in large steps. At step 0 the CPU has enough power to operate and the GPU can boost up to its maximum of 2230MHz. At the next step (step 1), the GPU would only be allowed to boost to 95% of 2230MHz; at the next (step 2), to 90% of 2230MHz, and so forth.

From this perspective, whenever the CPU exceeds its power bracket, the GPU will drop its power level significantly. Recall that voltage is directly correlated with frequency, which gives the simplistic model of power being a function of frequency cubed. A 10% reduction in GPU frequency frees up a dramatic amount of power for the CPU. However, the CPU is not required to use all of the power made available by that 10% reduction. I provide an example and calculations below.

Assume a maximum power draw of 200W for ease of calculation, though in reality the final at-the-wall number will also include the fan, SSD, and memory chips. For the sake of simplicity, 200W. A simple DVFS calculation for step 2: (2230*0.9 / 2230)^3 * 200W = 145.8W.

If we assume step 0 is enough to power both the CPU and GPU at 100%, then moving to step 1, taking 5% frequency off the top of the GPU, drops the headroom to about 171.5W. This is too close to the 175W that you said heavy action gameplay can drop to, and may well still be too tight, so drop to the next step at 10% (step 2): the reductions now bring the total wattage to 145.8W. There is now significant wattage headroom for the CPU to work with; it can draw close to an additional 30W and still land at 175W. Now, my calculations are rough and mix some things together that shouldn't be mixed, etc. But the point is to show that by reducing the GPU to feed the CPU, as long as the CPU doesn't fully consume 100% of the power given to it, the power draw will be lower.

It is likely that step 0, when power draw is at its maximum, is when the GPU is at the full 2230MHz, as this aligns with the highest frequency for both the CPU and GPU (and thus the lowest amount of activity from a profiling perspective), and thus with the reduced performance-per-watt as frequency increases.
It's interesting, but it's all wild conjecture without any kind of actual data. First, I have heard of a latency of about ~1ms for the clock adjustments. Second, Cerny told us that when there is any kind of downclock, it's usually going to be by 2 or 3%, not 5% or 10%. And as we have seen in current games (and as Cerny said), the downclocks will happen in unusual conditions and won't last long, thanks to the low-latency power management and the different systems in place to save power.

Also, you are wrong to think the clocks alone directly determine the power consumption. It's way more complex; there are plenty of other factors in play, such as what kind of instructions are being executed and how many (a FurMark test issues about as many instructions as possible, but they're useless).

But I think the biggest evidence is the actual power consumption during different scenes. It's extremely rare to even reach 200W during gameplay (I haven't seen it yet in any analysis), while many cutscenes are consistently at that level.
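To illustrate the instruction-mix point with the usual dynamic power formula, P = α·C·V²·f, here is a toy calculation; the capacitance, voltage and activity numbers are invented, and only the shape of the result matters.

Code:
# Sketch of why clocks alone don't determine power: in P = alpha * C * V^2 * f
# the activity factor alpha (what the code is actually doing) matters as much
# as the clock. All numbers below are made up for illustration.

V = 1.0          # volts, arbitrary
F_HZ = 2.23e9    # 2.23 GHz, held constant in all three cases
C_EFF = 9.0e-8   # effective switched capacitance in farads, illustrative

def dynamic_power(alpha):
    return alpha * C_EFF * V ** 2 * F_HZ

print(f"light menu logic      (alpha=0.4): ~{dynamic_power(0.4):.0f} W")
print(f"typical gameplay load (alpha=0.8): ~{dynamic_power(0.8):.0f} W")
print(f"FurMark-style stress  (alpha=1.0): ~{dynamic_power(1.0):.0f} W")
# Same clock every time, very different power draw.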
 
It's interesting, but it's all wild conjecture without any kind of actual data. First, I have heard of a latency of about ~1ms for the clock adjustments. Second, Cerny told us that when there is any kind of downclock, it's usually going to be by 2 or 3%, not 5% or 10%. And as we have seen in current games (and as Cerny said), the downclocks will happen in unusual conditions and won't last long, thanks to the low-latency power management and the different systems in place to save power.

Also, you are wrong to think the clocks alone directly determine the power consumption. It's way more complex; there are plenty of other factors in play, such as what kind of instructions are being executed and how many (a FurMark test issues about as many instructions as possible, but they're useless).

But I think the biggest evidence is the actual power consumption during different scenes. It's extremely rare to even reach 200W during gameplay (I haven't seen it yet in any analysis), while many cutscenes are consistently at that level.
It is best to just look at the late PS4 games. The later the first-party titles came, the louder the console got. The PS5 should not get any louder; instead it will reduce its clocks over time as games push it more and more to its limits.
Over time the PS5 might no longer reach its highest clock rate in newer titles, whereas it can boost earlier titles.
That is what I meant when I wrote "auto-optimization". If you write non-optimal code with lots of "latencies" in it, you get a boost in clock rate. If you optimize the code so you have as good as no latencies, you get a clock reduction, but you still have performant code.
But you can also look at it and think that it punishes developers who optimize their code ("too much"). :D

It will take a while to find the optimal mix between optimized code and clock rate.
 
It is best to just look at the late PS4 games. The later the first-party titles came, the louder the console got. The PS5 should not get any louder; instead it will reduce its clocks over time as games push it more and more to its limits.

PS4s also got louder as they got older because a lot of people never cleaned their console or replaced the thermal compound. There is also the fact that optimising for any hardware gets better over time: the APIs will be tuned and the tools will improve, so it becomes easier to push the hardware harder, and conversely some code also becomes more efficient so that it doesn't push the hardware as much.

I cleaned my PS4 and then my PS4 Pro every year and they always got noticeably quieter - especially when changing the thermal compound. The PS5 should be much better in this regard because the liquid metal layer is precisely applied, rather than a varying quantity of paste randomly splashed onto the chip. ;-)
 