PlayStation 5 [PS5] [Release Holiday 2020]

Discussion in 'Console Technology' started by BRiT, Mar 17, 2020.

  1. Globalisateur

    Globalisateur Globby
    Veteran Regular Subscriber

    Joined:
    Nov 6, 2013
    Messages:
    3,496
    Likes Received:
    2,190
    Location:
    France
    My take on this: SmartShift can allocate unused CPU power to the GPU while the CPU keeps running at 3.5GHz.

    Another interesting power-saving measure: when the CPU or GPU is idle for some time (and there is no point in increasing the clocks because it is just waiting for the next vsync), the frequency will be reduced (or not increased?) during that short window. He calls this 'race to idle':

    So there are all those power-saving measures (I counted three different ones):
    - Variable frequency of both CPU and GPU based on the current load (100% deterministic; he confirms the silicon lottery and room temperature won't have an impact on the variable clocks),
    - SmartShift (unused CPU power given to the GPU),
    - 'Race to idle' (decreasing, or not increasing?, the frequency of either CPU or GPU when it is scheduled to do nothing for a few ms).

    Cerny expects the CPU and GPU to run at max frequency most of the time (that is, the time when running at max clocks is actually useful). So it's totally normal for the CPU to be downclocked, but that should usually not mean the downclock makes the game run slower. In many cases the CPU and GPU should be downclocked without impacting game performance (rough sketch of how these measures might combine below).

    This is really some innovative stuff. I mean, they wouldn't have needed this if they had 52 CUs in the first place.
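    Here's that rough sketch. Every number, the cube-law assumption, and the function itself are my invention purely for illustration, not Sony's actual implementation:

    ```python
    # Hypothetical sketch combining the three measures. All numbers are made up.

    TOTAL_BUDGET_W = 200.0                  # fixed total SoC power budget (invented figure)
    CPU_MAX_GHZ, GPU_MAX_GHZ = 3.5, 2.23

    def clocks_for_frame(cpu_power_needed_w: float,
                         gpu_power_needed_w: float,
                         finishes_before_vsync: bool):
        # 3) "Race to idle": if the remaining work already fits before the next vsync,
        #    don't raise the clocks for that stretch (or even lower them a little).
        if finishes_before_vsync:
            return CPU_MAX_GHZ * 0.9, GPU_MAX_GHZ * 0.9

        # 2) SmartShift: whatever the CPU doesn't need is handed to the GPU.
        gpu_budget_w = TOTAL_BUDGET_W - cpu_power_needed_w

        # 1) Variable frequency: deterministic, based purely on estimated activity
        #    (power demand), never on temperature or silicon quality.
        if gpu_power_needed_w <= gpu_budget_w:
            return CPU_MAX_GHZ, GPU_MAX_GHZ
        # Demand exceeds the budget -> scale the GPU clock back slightly
        # (power ~ roughly f^3, so a small clock drop buys a large power saving).
        scale = (gpu_budget_w / gpu_power_needed_w) ** (1 / 3)
        return CPU_MAX_GHZ, round(GPU_MAX_GHZ * scale, 2)

    print(clocks_for_frame(40.0, 150.0, False))   # light CPU load -> both stay at max
    print(clocks_for_frame(70.0, 150.0, False))   # GPU over its share -> slight GPU downclock
    ```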
     
  2. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,947
    Likes Received:
    2,690
    I don't think your representation is quite right. The CPU and GPU can run at their maximum clocks when both are being utilized in a way that allows them to remain within their power allocations. It is possible, though, for a developer to let utilization of one or both exceed its allocation, and this will cause some combination of power shifting between the parts and throttling of one or the other in order to bring the total power back within limits. In this sense, developers would absolutely be choosing to run either the CPU or the GPU slower in order to allow excessive (beyond-budget) utilization of the other.
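    To restate that as a toy example (the wattages and the priority flag are invented by me; the point is just that nothing throttles until combined demand exceeds the budget, and the developer decides which side gives way when it does):

    ```python
    TOTAL_BUDGET_W = 200.0   # invented total power budget

    def who_throttles(cpu_demand_w: float, gpu_demand_w: float, prioritize: str) -> str:
        """Both parts hold max clocks while their combined demand fits the budget;
        once it doesn't, whichever side the developer deprioritizes gets clocked down."""
        if cpu_demand_w + gpu_demand_w <= TOTAL_BUDGET_W:
            return "neither: both stay at max clocks"
        return "GPU throttles" if prioritize == "cpu" else "CPU throttles"

    print(who_throttles(60.0, 130.0, prioritize="gpu"))   # within budget -> neither
    print(who_throttles(80.0, 150.0, prioritize="gpu"))   # over budget  -> CPU throttles
    ```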
     
    DSoup, Scott_Arm, Silenti and 2 others like this.
  3. mpg1

    Veteran Newcomer

    Joined:
    Mar 5, 2015
    Messages:
    1,984
    Likes Received:
    1,565
    Who knows what it will be like with RDNA2, but the most damning thing about that DF video is that performance scales much better with additional CUs than with higher clocks... the gap between XSX and PS5 could be larger than it looks on paper if that's the case.
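    For reference, the on-paper numbers behind that comparison, using the publicly stated specs (Series X: 52 CUs at a fixed 1.825GHz; PS5: 36 CUs at up to 2.23GHz):

    ```python
    # FP32 TFLOPS = CUs * 64 shaders * 2 ops per clock * clock
    def tflops(cus: int, ghz: float) -> float:
        return cus * 64 * 2 * ghz * 1e9 / 1e12

    xsx = tflops(52, 1.825)     # ~12.15 TF
    ps5 = tflops(36, 2.23)      # ~10.28 TF at the frequency cap
    print(xsx, ps5, xsx / ps5)  # paper gap ~18%; wider if clocks scale worse than CUs do
    ```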
     
    PSman1700 and BRiT like this.
  4. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,947
    Likes Received:
    2,690
    I don't think this was clear. He never said the CPU or GPU would clock down when not loaded. He just said that clocking the CPU/GPU up under low loads is pointless (in a console) and therefore wasn't factored into their estimate that the CPU and GPU will stay at max clocks most of the time.

    Literally, the summation of the section was:

    Everything else is your speculation.
     
    BRiT and PSman1700 like this.
  5. anexanhume

    Veteran Regular

    Joined:
    Dec 5, 2011
    Messages:
    1,939
    Likes Received:
    1,281
    That’s not accurate, and actually the opposite of what he said. He stated that developers can choose which to prioritize, which is completely in keeping with Richard’s characterization of “profiles” for the developers to choose from.
     
    ToTTenTranz, disco_, Picao84 and 4 others like this.
  6. PSman1700

    Veteran Newcomer

    Joined:
    Mar 22, 2019
    Messages:
    2,711
    Likes Received:
    904
    Yes, my fault, that's what I meant with the profiling. It's not like devs don't have to choose, or that the system will do it for them, most likely.

    @mpg1 besides the VRS support, it seems unknown? (Hate writing on phones....)
     
  7. Rootax

    Veteran Newcomer

    Joined:
    Jan 2, 2006
    Messages:
    1,518
    Likes Received:
    878
    Location:
    France
    So no variable rate shading for PS5? WTF? I smell RDNA1 + RT.
     
    PSman1700 likes this.
  8. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,947
    Likes Received:
    2,690
    It's probably there. They just haven't confirmed it yet. Of course, until they do, it's still an open question.
     
    Scott_Arm likes this.
  9. Rockster

    Regular

    Joined:
    Nov 5, 2003
    Messages:
    973
    Likes Received:
    129
    Location:
    On my rock
    Unfortunately he completely dodges the base clock question. Probably never going to get an answer on that directly from Sony. I still love the implementation, it's very innovative.
     
    PSman1700 likes this.
  10. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,577
    Likes Received:
    16,028
    Location:
    Under my bridge
    The base clock is the max clock. How low it drops in use, and how much that actually impacts game performance, will have to wait until the console is out and running games. Cerny can't dodge a question he can't know the answer to: how will devs be using the processors in five years' time, how will the power budget be distributed, and will the processing of jobs run slower as a result?
     
    ToTTenTranz and zupallinere like this.
  11. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,829
    Likes Received:
    10,870
    Location:
    The North
    I mean, it actually does sound like they are throttling, because always running both at full speed would suggest the opposite entirely. Chips have a red line for how much voltage they can receive, and the cores themselves have a max voltage as well. Once they exceed that voltage the chip dies or stops working properly, so there are hard limits on voltage and frequency that are based entirely on the physical properties of the chip.

    In the case of _boosting_, some cores can give up power so that other cores get more voltage and can clock higher. Even then there is probably power left over; with the CPU already at max frequency that power has nowhere to go, so where does it go? It goes to the GPU via SmartShift.

    So this is how PS5 supports both at their maximum cap. But then you run into the issue that if you want to guarantee the load on the GPU, then as the workload increases and the operations per clock cycle increase, you need more power to feed the cores individually. Normally you'd pull back frequency and spread that power back over the other cores to do more work. In this case, it grabs more power from the CPU, forcing it to throttle, in order to feed the GPU.

    That was at least my interpretation of what devs are dealing with. Cerny talked about hoping developers learn to optimize for power, meaning learning how to code in a way that uses less power, uses less power-hungry operations, or is simply more efficient with operations per cycle. I believe this was also in reference to that.

    Indeed, I will say it's probably no coincidence that Cerny picked exactly those numbers: a 10% frequency drop = a 27% reduction in power. Because a 10% frequency boost from 2.0GHz is nearly 2.223GHz. How that works out in terms of what the chip can actually handle is unknown. But it's an interesting remark to make; it's not like Richard asked him whether the chip was 9.2 TF.

    He also did not want to answer what the worst-case scenario for downclocks is.
    Though I'm sure both MS and Sony know the base clocks for their chips, below which no amount of load would force them any further.
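    For what it's worth, those figures line up with the usual rule of thumb that dynamic power scales roughly with the cube of frequency (voltage tracks frequency near the top of the curve, and power goes with f·V²). That scaling is my assumption, not something Cerny spelled out; a quick sanity check:

    ```python
    # Rough sanity check of the "10% clock drop = 27% less power" figure, assuming
    # dynamic power scales roughly with f^3 (P ~ C * V^2 * f, with V tracking f).
    def relative_power(freq_ratio: float) -> float:
        return freq_ratio ** 3

    print(1 - relative_power(0.90))        # ~0.271 -> about 27% less power for a 10% drop

    # And the GHz figures people keep circling back to:
    print(2.0 * 1.10)                      # 2.2   -> a ~10% boost over 2.0GHz lands near the 2.23GHz cap
    print(36 * 64 * 2 * 2.0e9 / 1e12)      # 9.216 -> 36 CUs at 2.0GHz is the "9.2 TF" figure
    ```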
     
    Pete, pharma and PSman1700 like this.
  12. Silenti

    Regular

    Joined:
    May 25, 2005
    Messages:
    519
    Likes Received:
    110
    2 things - in jest (it's April, it still counts)

    People have bitched endlessly for years on end about a V8 being replaced with something with fewer cylinders and or smaller capacity. Right or wrong, they do. How often have you heard the derisive term "4-banger"?

    In the further interest of playfully batting the analogy around: the 2.0 six-cylinder would be a better comparison if you had said it was turbocharged. Then we can argue about turbo lag and "real horsepower". Seems to be where the discussion is headed for the most part.
     
  13. mpg1

    Veteran Newcomer

    Joined:
    Mar 5, 2015
    Messages:
    1,984
    Likes Received:
    1,565
    With that GPU clock speed, IS it more efficient though?...
     
    #1093 mpg1, Apr 2, 2020
    Last edited: Apr 2, 2020
    PSman1700 likes this.
  14. MrFox

    MrFox Deludedly Fantastic
    Legend Veteran

    Joined:
    Jan 7, 2012
    Messages:
    6,488
    Likes Received:
    5,994
    The part about managing power based on the deadlines of operations is remarkable (race to idle); it was in one of Cerny's patents about clock control. The circuitry they added to profile operations, to predict and meet a deadline for macro sections of code, might be as much about BC as it is about this power scheme.
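    Something like this, as a toy example; the frequency steps, cycle estimate, and selection logic are all made up by me to illustrate deadline-driven clocking, not what Cerny's patent actually describes:

    ```python
    # Hypothetical illustration of deadline-driven clock selection: given an estimate
    # of how many cycles a chunk of work needs and how long remains until the next
    # vsync, pick the lowest clock step that still finishes in time.

    FREQ_STEPS_GHZ = [1.8, 2.0, 2.1, 2.23]   # made-up frequency steps

    def pick_clock(estimated_cycles: float, time_to_vsync_s: float) -> float:
        """Return the lowest frequency (GHz) that meets the deadline, else the max."""
        for f in FREQ_STEPS_GHZ:
            if estimated_cycles / (f * 1e9) <= time_to_vsync_s:
                return f                      # finishes before vsync; no point clocking higher
        return FREQ_STEPS_GHZ[-1]             # can't make the deadline even at max clock

    # e.g. ~20M cycles of work with 14ms left before vsync: the lowest step already suffices
    print(pick_clock(20e6, 0.014))            # -> 1.8
    ```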
     
    ToTTenTranz, Xbat and zupallinere like this.
  15. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    14,209
    Likes Received:
    5,634
    Really comes down to what "most of the time" means, and how low the clocks can drop. Is most of the time 51% or 99%? And then, in practice, how far will the clocks actually drop? Seems like it'll be less than 10%, so essentially, for the console warriors who want to compare numbers, the Series X GPU is already faster; it's just a matter of whether it's faster by 20% or 30%. Ultimately, I don't think it matters at all unless you're just interested in fighting over the specs.
     
  16. MrFox

    MrFox Deludedly Fantastic
    Legend Veteran

    Joined:
    Jan 7, 2012
    Messages:
    6,488
    Likes Received:
    5,994
    I believe the infatuation with clocks and percentages, and the attempt to map this onto what we already know from PC GPU/CPU boost methods, is making everyone miss what is happening here and why it's made that way.
     
    ToTTenTranz likes this.
  17. Rockster

    Regular

    Joined:
    Nov 5, 2003
    Messages:
    973
    Likes Received:
    129
    Location:
    On my rock
    Do you really think so? Seems implausible. The system clocks are pre-determined based on activity counters, using their model SoC. Which means they've already defined a curve that establishes the clock frequency for a given activity level. It's the same curve for all machines and not dynamic, and therefore known. They've shared the curve's upper bound, but aren't sharing the lower bound. It must exist, even if you truly believe an application could never reach that condition. Which is admitted in the part of the comment describing dealing with that situation more gracefully. I don't think anyone can, or is, asserting how many games, if any, ever reach that point. But there is a frequency at which the system could be 100% busy that keeps it within the power, thermal, and acoustic envelope. They couldn't have simply said "no one's ever going to get here, so let's not bother defining a frequency for it."
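    As a toy illustration of that kind of fixed, deterministic curve (the activity levels and frequencies below are invented; only the shape of the idea, profiled once on the model SoC and identical on every unit, comes from what's been described):

    ```python
    import bisect

    # Hypothetical activity -> frequency table, identical for every console and
    # looked up deterministically from activity counters (never from temperature).
    ACTIVITY = [0.0, 0.6, 0.8, 0.9, 1.0]       # fraction of worst-case GPU activity (made up)
    FREQ_GHZ = [2.23, 2.23, 2.15, 2.05, 1.95]  # last entry stands in for the undisclosed lower bound

    def gpu_clock(activity: float) -> float:
        """Same activity level -> same clock on every unit, by construction."""
        a = min(max(activity, 0.0), 1.0)
        return FREQ_GHZ[bisect.bisect_right(ACTIVITY, a) - 1]

    print(gpu_clock(0.5))   # light load             -> 2.23 (max clock)
    print(gpu_clock(1.0))   # pathological 100% busy -> the lower bound of the curve
    ```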
     
    Pete, PSman1700 and Scott_Arm like this.
  18. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    12,755
    Likes Received:
    8,146
    Location:
    London, UK
    This is the big question, but I can't help but think that if it was in any way a problem, it would have leaked by now. Developers would have spoken out, like they have expressed doubts about Lockhart's capabilities to Jason Schreier. These kinds of issues just do not stay contained.
     
  19. mpg1

    Veteran Newcomer

    Joined:
    Mar 5, 2015
    Messages:
    1,984
    Likes Received:
    1,565
    I feel like whether it maintains a high clock is irrelevant if there are significant diminishing returns on performance with higher clocks...

    If the difference between 1.7GHz and 2.1GHz is an extra 5fps like DF showed, who gives a shit about whether or not it is reaching its max clock?
     
  20. Strange

    Veteran

    Joined:
    May 16, 2007
    Messages:
    1,536
    Likes Received:
    196
    Location:
    Somewhere out there
    Come to think of it, he can't tell you an exact number, because it actually depends on the workload.
    If we find some "lazy devs" (how long has it been since we've seen that phrase!) who choose to run a very power-heavy workload somewhere it isn't needed *cough* the Horizon overworld map *cough*, then you'd most probably see a downclock.
    You could then say "it downclocks most of the time".
    But is this a typical workload? No.
    Does running the game at the capped frequency help? Probably not.
    Is there more incentive for the devs to fix the code? It sure looks like it.
    Should the devs fix their code? I hope they're competent enough to do at least that.


    Then, if the devs optimize their code so it isn't power-intensive, we'd probably see the clocks locked at max.

    It does appear to provide good feedback to "coax" devs into running more "energy efficient" code.

    Anyway, at the very least it's probably clear that we won't hear our PS5s run like jet engines unless we clog the vents up or put the machine in an oven.
     