Digital Foundry Article Technical Discussion [2021]

Discussion in 'Console Technology' started by BRiT, Jan 1, 2021.

  1. Vega86

    Newcomer

    Joined:
    Sep 25, 2018
    Messages:
    182
    Likes Received:
    123
    So devs would hopefully get used to the ps5 dev kits by then?

    I think you're right though. Even current-gen games can technically run on mobile Atoms.

    Now I'd like to see a game that's built around next-gen CPUs and SSDs, where there's a crapton of advanced AI, GPU physics and teleporting segments, and the gameplay balancing is built around those so you can't remove or reduce them.
     
  2. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    18,768
    Likes Received:
    21,044
    If they made PS5-specific releases it wouldn't be an issue. But then the devs would have to roll their own solution for cross-progression so game saves from the PS4/4 Pro can be used on the PS5-specific version. I think most of the early releases resolved this with their own cloud migration service. At least I haven't heard of Sony resolving this issue by putting a genuine smart-delivery system in place.
     
    RagnarokFF likes this.
  3. Vega86

    Newcomer

    Joined:
    Sep 25, 2018
    Messages:
    182
    Likes Received:
    123
    I guess that's what they'll do over time then?

    I'm really not sure about all the specifics with recent comparisons, but I simply cannot imagine developers only making PS4 Pro profiles as time goes by. Am I crazy?
     
  4. Allandor

    Regular Newcomer

    Joined:
    Oct 6, 2013
    Messages:
    587
    Likes Received:
    520
    Sony seems to have wanted a fast transition. Games would get a native PS5 port that is incompatible with the PS4, but still get a separate PS4 release. Savegame transfer would then still not be possible (without an extra server sitting in between). At least that was Sony's plan. Now with COVID-19 the last-gen consoles are even stronger than initially thought, so they might change their strategy a bit.
    But the current gen just launched three months ago; normally this would solve itself with enough time and availability of the new hardware.


    And now for something completely different, and back on topic.

    I really don't know if this video is correctly placed here. It is a "retro-like" game, but on Switch ^^
     
    London Geezer, BRiT and Vega86 like this.
  5. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    17,687
    Likes Received:
    7,681
    Hmmm, strange. So, they added a new performance mode just for the next-gen consoles, and the PS5 is limited to 1080p while the XBS-X is limited to 1440p. Considering it's a new mode specifically added for the new consoles, I'm surprised they didn't support higher than 1080p on the PS5.

    Also surprising that the XBS-S basically matches the PS5's resolution mode ... in both resolution and FPS. This is likely just a code limitation carried over from the PS4 Pro, unlike the new performance mode. But it's still odd seeing it in effect.

    Regards,
    SB
     
    PSman1700 likes this.
  6. Kugai Calo

    Newcomer

    Joined:
    Mar 6, 2020
    Messages:
    184
    Likes Received:
    181
    Power consumption grows faster than linearly with frequency (roughly with its cube), since making transistors switch quicker needs more voltage, and dynamic power scales with voltage squared times frequency. As a result, the PS5's GPU running at a higher frequency draws a lot more power.
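    A quick sketch of that scaling, using the standard dynamic-power relation P = C·V²·f. The capacitance and the voltage/frequency pairs below are made-up illustrative numbers, not PS5 figures:

```python
# Illustrative dynamic-power model: P_dyn = C * V^2 * f.
# All numbers here are hypothetical, chosen only to show why a
# higher-clocked part draws disproportionately more power.

def dynamic_power(c_farads: float, voltage: float, freq_hz: float) -> float:
    """Dynamic switching power of a CMOS circuit (watts)."""
    return c_farads * voltage ** 2 * freq_hz

C = 1e-9  # hypothetical switched capacitance
base = dynamic_power(C, 1.0, 1.8e9)      # lower clock, lower voltage
boosted = dynamic_power(C, 1.15, 2.2e9)  # higher clock needs more voltage

# The power ratio (~1.62x) is much larger than the clock ratio
# alone (2.2/1.8 ~ 1.22x), because voltage enters squared.
print(boosted / base)
```

    So a ~22% clock bump can cost ~60% more power once the required voltage increase is included, which is the effect being described above.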
     
    PSman1700 and DSoup like this.
  7. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    14,870
    Likes Received:
    10,985
    Location:
    London, UK
    This is why I am interested, because in The Road to PS5 presentation Mark Cerny said "we run at essentially constant power and let the frequency band vary based on the workload". And I'm really curious: where is the excess constant power going when the PS5 is just sitting mostly inactive in a menu? I assume they mean there is an envelope, but that wasn't quite how it was explained.
     
  8. Kugai Calo

    Newcomer

    Joined:
    Mar 6, 2020
    Messages:
    184
    Likes Received:
    181
    I would say Mark's remark is (over)simplified to better explain the design philosophy.
     
    goonergaz, DSoup and PSman1700 like this.
  9. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    18,768
    Likes Received:
    21,044
    Secretly bitcoin mining and sending the proceeds to Sony ...
     
    RagnarokFF, function, Picao84 and 9 others like this.
  10. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    14,870
    Likes Received:
    10,985
    Location:
    London, UK
    The original PS3 shipped with a folding-at-home client. :yep2:
     
    w0lfram, function, VitaminB6 and 2 others like this.
  11. Globalisateur

    Globalisateur Globby
    Veteran Regular Subscriber

    Joined:
    Nov 6, 2013
    Messages:
    4,107
    Likes Received:
    3,027
    Location:
    France
    I think there is some misconception about PS5 power consumption, and that's actually Cerny's fault. Cerny was talking about the worst case, when the PS5 is pushed to its max in the most demanding scenes. But it can actually consume much less in many other apps or games, and consume less than the Pro in the same conditions. Here's some data taken by NXGamer:

    - Dashboard: it usually consumes 50W (60W on Pro)
    - Shadow of the Colossus: ~100W on PS5 using the Pro BC mode (~150W on Pro)
    - PS5 native demanding games: the max is about 200W, but it usually consumes 175W to 195W during gameplay (Astro's Playroom and Spiderman, the two most demanding known games on PS5)


    Interestingly, we can see the PS5 usually consumes the most during cutscenes, and that's usually where the max power consumption measurements have been taken. For instance, DF found (in Spiderman) that it usually consumes the most, at 195-205W, during cutscenes (consistently, even when not much is displayed) or in the main game menu (exactly like God of War on Pro), while it usually hovers between 175W and 195W during normal gameplay (in both Astro's Playroom and Spiderman). So that should mean the PS5 CPU and GPU are most likely not downclocked (or only very rarely) during gameplay in those games, as they never reach the max known power consumption (205W, apparently reached during a cutscene in Spiderman).



    This data about the PS5 consuming the most during non-gameplay scenes (cutscenes or the main menu) is interesting because it shows the PS5 consuming the most when the CPU is actually not used very much. The benchmarks done by DF in the photo mode of Control and during a cutscene in Hitman 3 are scenes where the PS5 could be at its max power consumption and could potentially downclock. But that's not representative of gameplay scenes, where we know from the available data (same thing on Pro) that even the most demanding games should not downclock, or only very rarely.

    Here is one of those moments in a cutscene consuming 203W, taken at 6:07 in the DF video. There is barely anything displayed on screen; this is probably similar to a FurMark test where the GPU, not restricted by any CPU logic or vsync limitation, is most probably uselessly rendering some stuff as fast as it can.
     
    #971 Globalisateur, Feb 24, 2021
    Last edited: Feb 25, 2021
    w0lfram, thicc_gaf, iroboto and 6 others like this.
  12. Nesh

    Nesh Double Agent
    Legend

    Joined:
    Oct 2, 2005
    Messages:
    12,906
    Likes Received:
    3,072
    I hope this becomes multiplatform
     
  13. zed

    zed
    Legend Veteran

    Joined:
    Dec 16, 2005
    Messages:
    5,323
    Likes Received:
    1,361
    Yes I think they did, but events have conspired against them and there's a worldwide chip shortage. Not just the consoles but CPUs and GPUs too.
    I'm not 100% sure why this is.
    Some say covid, some say cryptocurrency, it's prolly a mixture.

    So they will prolly have to rely on the PS4 for longer.
     
    PSman1700 and thicc_gaf like this.
  14. goonergaz

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    4,186
    Likes Received:
    1,502
    Yes, I'm unsure why, but this is how I understood it. They designed everything around the worst possible scenario, so (in theory) the PS5 should mostly be working within that maximum... except in things like the Horizon map (which I think he used as an example).

    I hadn't thought about that aspect regarding the Control photo mode - makes a lot of sense though
     
  15. Vega86

    Newcomer

    Joined:
    Sep 25, 2018
    Messages:
    182
    Likes Received:
    123
    Even car manufacturing is being affected by the poor chip supply and is asking the US government for help. The US government is now getting involved through an executive order signed yesterday, I think.
     
  16. Arwin

    Arwin Now Officially a Top 10 Poster
    Moderator Legend

    Joined:
    May 17, 2006
    Messages:
    18,350
    Likes Received:
    1,998
    Location:
    Maastricht, The Netherlands
    Especially because, due to covid, they were more conservative than most others in keeping stock and placing orders.
     
  17. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    12,986
    Likes Received:
    15,717
    Location:
    The North
    Good discussion.

    Looking back at my statement, it was not well thought out and quite a generic statement. There are all sorts of reasons why the PS5 can dip in power from its maximum power draw, despite having a boost clock system.

    Though I do still disagree with the ideas that the CPU is acting as a power virus during cutscenes, hence the higher power draw, and that less-than-maximum wattage during gameplay should be read as the GPU operating at its maximum clock rate because there is still power to spare.

    With some actual thought: while it's true that boost systems aim to maximize the amount of power available, one aspect is that the GPU shares power with the CPU, and I don't think this is being properly accounted for. The challenge for the PS5, or any setup of this type, is that the power-shifting mechanism still has latency. Due to the latency of shifting power from GPU to CPU, we are unlikely to see a situation where the GPU feeds the CPU just a little bit more power, exactly as much as the CPU requires. That is fundamentally too fine-grained a control for a situation in which the CPU could burst on all eight cores at any time. On top of that, every console has to behave identically when running the same code, as per PS5 specifications, regardless of environmental conditions. So there has to be a degree of conservatism in how the console can draw its power, for all consoles, which means it's likely not to be as highly tuned as a thermal boost, in which the CPU and GPU each rely on their own always-available power and control it with boost based on thermals.

    It is likely that the transfer of power between the GPU and CPU is done in large steps. At step 0 the CPU has enough power to operate and the GPU can boost up to the maximum of 2230MHz. At the next step (step 1), the GPU will likely only be allowed to boost to 95% of 2230MHz. At the next step (step 2), the GPU will likely only be allowed to boost to 90% of 2230MHz, and so forth.

    From this perspective, whenever the CPU exceeds its power bracket, the GPU drops its power level significantly. Recall that voltage is directly correlated with frequency, which in a simplistic model makes power a function of frequency cubed. A 10% reduction in GPU frequency frees a dramatic amount of power for the CPU. However, the CPU is not required to use all of the power freed up by that 10% reduction. I provide an example and calculations below.

    Assume a maximum power draw of 200W for ease of calculation, though in reality the final wall number is a combination that also includes the fan, SSD, and memory chips. For the sake of simplicity, 200W. A simple DVFS calculation here: (2230*0.9 / 2230)^3 * 200W = 145.8W (step 2).

    If we assume step 0 is enough to power both CPU and GPU at 100%, then by moving to step 1, where we take 5% frequency off the top of the GPU, the wattage headroom drops to 171.5W. This is too close to the 175W you said heavy action gameplay can drop to. So drop to the next step at 10% (step 2). Now the reduction brings the total wattage down to 145.8W. There is now significant wattage headroom for the CPU to work with: it can draw up to an additional ~29W of power to reach 175W. My calculations are rough, as they mix together some things that shouldn't be, etc. But the point is to show that by throttling the GPU to feed the CPU, as long as the CPU doesn't fully consume the power given to it, the total power draw will be lower.

    It is likely that step 0, when power draw is at its maximum, is when the GPU is at the full 2230MHz, as this aligns with the highest frequency for both CPU and GPU (and thus the lowest amount of activity from a profile perspective), and with the reduced performance-per-watt as frequency increases.
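    The stepped model above can be sketched numerically. The 5% step size, the 200W budget, and the pure cubic power law are all assumptions from this post, not measured PS5 behaviour:

```python
# Hypothetical stepped GPU power-sharing model, assuming power scales
# with frequency cubed (P ~ f^3) and a simplified 200W total budget.
MAX_CLOCK_MHZ = 2230
MAX_POWER_W = 200.0

def gpu_budget(step: int, step_size: float = 0.05) -> float:
    """Power budget left to the GPU after `step` frequency reductions."""
    freq_scale = 1.0 - step * step_size
    return (freq_scale ** 3) * MAX_POWER_W

for step in range(3):
    clock = MAX_CLOCK_MHZ * (1.0 - step * 0.05)
    headroom = MAX_POWER_W - gpu_budget(step)
    print(f"step {step}: GPU at {clock:.1f} MHz, "
          f"GPU budget {gpu_budget(step):.1f} W, "
          f"CPU headroom {headroom:.1f} W")
```

    Step 0 leaves the GPU the full 200W, step 1 about 171.5W, and step 2 about 145.8W, matching the figures in the calculation above.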
     
    #977 iroboto, Feb 25, 2021
    Last edited: Feb 25, 2021
    w0lfram, PSman1700 and thicc_gaf like this.
  18. Globalisateur

    Globalisateur Globby
    Veteran Regular Subscriber

    Joined:
    Nov 6, 2013
    Messages:
    4,107
    Likes Received:
    3,027
    Location:
    France
    It's interesting, but it's all wild conjecture without any actual data. First, I have heard of a latency of only about ~1ms for the clock adjustments. Second, Cerny told us that when there is any kind of downclock it will usually be by 2 or 3%, not 5% or 10%. Then, as we have seen in current games (and as Cerny said), downclocks will happen in unusual conditions, and they won't last at all thanks to the low-latency power management and the various systems in place to save power.

    Also, you are wrong to think the clocks alone directly determine power consumption. It's way more complex; there are plenty of other factors in play: what kind of instructions, and how many (a FurMark-style test issues as many instructions as possible, but it's useless work).

    But I think the biggest proof is the actual power consumption during different scenes. It's actually extremely rare to even reach 200W during gameplay (I haven't seen it yet in any analysis), while many cutscenes sit consistently at that level.
     
  19. Allandor

    Regular Newcomer

    Joined:
    Oct 6, 2013
    Messages:
    587
    Likes Received:
    520
    It is best to just look at the late PS4 games: the later the first-party titles came, the louder the console got. Now the PS5 should not get any louder, but instead reduce its clocks over time as games push it more and more to the limit.
    Over time the PS5 might no longer reach its highest clock rate in newer titles; it therefore effectively boosts earlier titles.
    That is what I meant when I wrote "auto-optimization". If you write non-optimal code with lots of latencies in it, you get a boost in clock rate. If you optimize the code so that you have almost no latencies, you get a clock reduction, but you still have performant code.
    But you can also look at it and think that it punishes developers who optimize their code ("too much"). :D

    It will take a while to find the optimal mix between optimized code and clock rate.
     
    #979 Allandor, Feb 25, 2021
    Last edited: Feb 25, 2021
    thicc_gaf and PSman1700 like this.
  20. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    14,870
    Likes Received:
    10,985
    Location:
    London, UK
    PS4s also got louder as they got older because a lot of people never cleaned their console or replaced the thermal compound. There is also the issue that optimising for any hardware gets better over time: the APIs will be tuned and the tools will improve, making it easier to push the hardware harder and, conversely, also making some code more efficient so that it doesn't push the hardware as much.

    I cleaned my PS4 and then PS4 Pro every year and it always got noticeably quieter, especially when changing the thermal compound. The PS5 should be much better in this regard because the liquid metal layer is precisely positioned, rather than a varying quantity of paste randomly splashed onto the chip. ;-)
     
    thicc_gaf and Globalisateur like this.