Next Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

Discussion in 'Console Technology' started by Proelite, Mar 16, 2020.

  1. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,477
    Likes Received:
    10,164
    Location:
    The North
Yeah, that's fine; I'm not saying hitting max frequency results in throttling.
Cerny provided:
a) The optimal clock rates for both CPU and GPU (we anticipate these apply to workloads that aren't heavily saturating the chip)
b) The use cases in which CPU load could have a downclocking effect on the GPU. Here he brought up AVX instructions as being able to drop GPU frequency, but only barely. This makes sense because the GPU draws roughly 4x more power than the CPU in modern systems. So if the CPU needs some power, the GPU only has to give up frequency worth 1/4 of that amount.
c) Cerny _did not_ provide the GPU loads that will cause downclocking on the CPU. The GPU would need to pull 4x the amount in the opposite direction.
d) Cerny _did not_ provide what the frequencies would be when both CPU and GPU are sufficiently loaded.
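A toy sketch of the shared-power-budget idea in (b) and (c). All wattages below are invented for illustration, not figures from Cerny's talk; they just encode the "GPU draws ~4x the CPU" ratio:

```python
# Toy model of a shared CPU+GPU power budget: the CPU gets what it
# asks for, and the GPU gets whatever budget remains. Because the GPU
# nominally draws ~4x the CPU's power, the same wattage shift is a
# much smaller relative change on the GPU side.
# All numbers are hypothetical.

TOTAL_BUDGET_W = 200.0  # hypothetical total SoC power budget

def rebalance(cpu_demand_w, gpu_demand_w, budget_w=TOTAL_BUDGET_W):
    """Give the CPU what it asks for; the GPU gets the remainder."""
    cpu_w = min(cpu_demand_w, budget_w)
    gpu_w = min(gpu_demand_w, budget_w - cpu_w)
    return cpu_w, gpu_w

# Nominal split: GPU uses ~4x the CPU's power.
cpu_w, gpu_w = rebalance(cpu_demand_w=40.0, gpu_demand_w=160.0)

# A heavy AVX workload asks for 10 W more on the CPU side:
cpu2_w, gpu2_w = rebalance(cpu_demand_w=50.0, gpu_demand_w=160.0)

# The GPU loses those 10 W, but that is only ~6% of its 160 W draw,
# while the same 10 W is 25% of the CPU's budget -- the asymmetry
# described in points (b) and (c).
print(cpu_w, gpu_w)    # 40.0 160.0
print(cpu2_w, gpu2_w)  # 50.0 150.0
```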
     
    PSman1700 likes this.
  2. manux

    Veteran Regular

    Joined:
    Sep 7, 2002
    Messages:
    2,096
    Likes Received:
    957
    Location:
    Earth
Agreed. And to me that is why we cannot really draw any good conclusions based on the data we have now. The best we can do is either parrot Cerny and then claim he has no credibility or is lying, or say we believe Cerny, even though he left some considerable wiggle room in his talk.
     
  3. manux

    Veteran Regular

    Joined:
    Sep 7, 2002
    Messages:
    2,096
    Likes Received:
    957
    Location:
    Earth
My gut feeling is that a lot of this has to do with PC-side experience. Boost on PC is just... not that good. It starts with chips not being binned to console standards and hardware being built to different budgets (a bad cooler on one board, a great cooler on another), and ends with thermal limits, as some users have poorly ventilated cases. There are also bad power supplies on the PC side.

This is different for Sony, as they explicitly said they only adjust clocks around power draw and designed the system accordingly. Every console chip should show the same behaviour, i.e. limited by the worst chip. As with Sony, I don't doubt Microsoft has done solid engineering, as we can see from the teardown.
     
  4. pTmdfx

    Regular Newcomer

    Joined:
    May 27, 2014
    Messages:
    280
    Likes Received:
    177
Even in the shipping version, it is fairly clear that usage peaks are a rarity across all 6 cores, even after they have overlapped rendering and game logic. A wild guess by eye says <50% utilisation on all cores.

Many here have a romanticised view of hardware utilisation and software development in general. The reality is that it is tough to keep general-purpose hardware utilisation high all the time, let alone at a naive 100%. This applies even to software that has been heavily optimised for consoles, and yet many focus on theorising about practically unattainable scenarios like continuous, simultaneous 100% CPU use and 100% GPU use.

(For sure, you can attain that if you write a "power virus" like FurMark that does gibberish specifically to achieve this, but it would not be a reflection of reality.)

Modern processor designs exploit this exact reality by introducing race-to-idle with dynamic power allocation in the name of "turbo" or "boost", so that power left unused by underutilised blocks can be redirected to get the hot paths done quicker (raising the running frequency temporarily). In the context of CPUs, that's the basic idea behind the different tiers of boost clocks (single-core, two-core, all-core, etc.).
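The boost-tier idea in that last paragraph can be sketched as a simple lookup from active-core count to clock. The tier values below are invented, and real CPUs also gate boost on power, current and temperature limits:

```python
# Toy illustration of tiered CPU boost clocks: the fewer cores that
# are active, the more of the shared power budget each can spend, so
# the higher they may clock. Tier values are invented, not taken from
# any real CPU.

BOOST_TIERS_MHZ = {
    1: 4700,  # single-core boost
    2: 4600,
    4: 4400,
    8: 4100,  # all-core boost
}

def boost_clock_mhz(active_cores):
    """Pick the tier for the smallest core count covering the load."""
    for cores in sorted(BOOST_TIERS_MHZ):
        if active_cores <= cores:
            return BOOST_TIERS_MHZ[cores]
    return BOOST_TIERS_MHZ[max(BOOST_TIERS_MHZ)]  # fall back to all-core

print(boost_clock_mhz(1))  # 4700
print(boost_clock_mhz(3))  # 4400
print(boost_clock_mhz(8))  # 4100
```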
     
    Xbat, Mitchings, manux and 3 others like this.
  5. RobertR1

    RobertR1 Pro
    Legend

    Joined:
    Nov 2, 2005
    Messages:
    5,735
    Likes Received:
    926
    lol what!?
     
    Scott_Arm and PSman1700 like this.
  6. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,477
    Likes Received:
    10,164
    Location:
    The North
Agreed. I assumed as much even as I wrote it; it looked like there was more to give. I also assumed that this was why Cerny was confident that CPU workloads would be unlikely to dip GPU clock speeds in PS5, with the exception of AVX2 use cases.

There are all sorts of bottlenecks limiting things, and higher clock speeds may have a positive effect in frame-rate-limited scenarios. In some situations, if I'm frame-rate limited, a higher clock speed is going to yield more FPS than spreading the work over more cores (which requires programmer intervention, while a higher clock speed does not).

In this case, a super high boost is what you want for 120-240 fps on your CPU.

edit:
The graphs we were looking at show jobs, not CPU utilisation, btw. So I'm not sure what the actual utilisation/saturation is. Not sure what I posted was a good example anymore.
     
    #1446 iroboto, Mar 30, 2020
    Last edited: Mar 30, 2020
    pharma and megre like this.
  7. zupallinere

    Regular Subscriber

    Joined:
    Sep 8, 2006
    Messages:
    736
    Likes Received:
    86
    Game Over Man!!!
     
  8. manux

    Veteran Regular

    Joined:
    Sep 7, 2002
    Messages:
    2,096
    Likes Received:
    957
    Location:
    Earth
On the PC side, boost is something that is not guaranteed. People project PC boost behaviour onto the PS5, and that leads to all kinds of wrong conclusions, starting from assuming something about thermals and ending with expecting clocks only achievable with excellent (rare) chips.
     
  9. Dictator

    Newcomer

    Joined:
    Feb 11, 2011
    Messages:
    236
    Likes Received:
    851
As per the sampler feedback video above from the DX team, Claire Andrews says "it is a GPU hardware feature," and it is neatly printed on the slide. About 20 seconds into that video.

WRT the whole phraseology of the Sony PS5 presentation - I also have trouble following the logic of how they went from having trouble maintaining 2.0 GHz (GPU) and 3.0 GHz (CPU) under their old fixed-clock paradigm to 2.23 GHz and 3.5 GHz with power transferring between the parts. What loads were they testing that made 2.0/3.0 hard to reach - and what loads are they testing where 2.23 and 3.5 happen "most of the time"? Wouldn't the loads that made 2.0 and 3.0 hard to maintain under a fixed power budget have the same effect in a situation where power transfers?
     
  10. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,527
    Likes Received:
    15,981
    Location:
    Under my bridge
Is the PS5 GPU generating so much more heat than the XBSX's, such that the XBSX can run both GPU and CPU at full speed without overheating but the PS5 can't? If so, PS5's design really is poor by going with only 36 CUs.
     
    ToTTenTranz and disco_ like this.
  11. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,477
    Likes Received:
    10,164
    Location:
    The North
I suspect that the XSX sits lower on the power/frequency curve - just thinking about the exponential-looking graph; it's 400 MHz down. But the two will have different power/frequency curves, so I don't know for sure.
     
    disco_ likes this.
  12. RobertR1

    RobertR1 Pro
    Legend

    Joined:
    Nov 2, 2005
    Messages:
    5,735
    Likes Received:
    926
There's plenty of data on how boost works on the PC from people using high-end cooling, who know how the different parameters impact boost and have tested it across a wide range of cooling setups.

Boost behaviour noted using an oscilloscope
Boost behaviour whiteboard session

I'm linking Buildzoid as the 2 videos line up. There's a lot more content out there from others (including AMD) that corroborates this.
     
    #1452 RobertR1, Mar 30, 2020
    Last edited: Mar 30, 2020
    PSman1700 and iroboto like this.
  13. Love_In_Rio

    Veteran

    Joined:
    Apr 21, 2004
    Messages:
    1,579
    Likes Received:
    197
Well, the PS5's APU is the smallest Sony has made, and they had to do something to catch up. So, I would say yes, poor design. By the time new PC cards come out, PS5's GPU will be at the lower end. At least the PS4 was above a Radeon 5850.
     
    PSman1700 likes this.
  14. jayco

    Veteran Regular

    Joined:
    Nov 18, 2006
    Messages:
    1,268
    Likes Received:
    570
    #1454 jayco, Mar 30, 2020
    Last edited: Mar 30, 2020
  15. RobertR1

    RobertR1 Pro
    Legend

    Joined:
    Nov 2, 2005
    Messages:
    5,735
    Likes Received:
    926
You need to know the efficiency scaling of the node. Once you tip over the knee, the power-draw/heat/frequency relationship goes through the roof.

You also need to cross-reference that against the size of the die to see its surface area and heat-dissipation capabilities.
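A back-of-the-envelope sketch of both points: dynamic power scales roughly as C·V²·f, voltage must rise with frequency past the node's efficiency knee (so power grows far faster than clocks), and dividing power by die area gives the density the cooler has to handle. All constants below are hypothetical:

```python
# Back-of-envelope: dynamic power ~ C * V^2 * f. Past the efficiency
# "knee" of a node, voltage must rise with frequency, so power grows
# far faster than linearly in clock. All constants are invented.

def dynamic_power_w(freq_ghz, c_eff=10.0):
    # Hypothetical voltage curve: flat 0.9 V up to a 1.8 GHz knee,
    # then rising with frequency past it.
    volts = 0.9 if freq_ghz <= 1.8 else 0.9 + 0.35 * (freq_ghz - 1.8)
    return c_eff * volts**2 * freq_ghz

def power_density_w_per_mm2(freq_ghz, die_area_mm2):
    # Cross-reference power against die area for heat dissipation.
    return dynamic_power_w(freq_ghz) / die_area_mm2

# A ~21% clock bump (1.85 -> 2.23 GHz) costs far more than 21% power:
p_low = dynamic_power_w(1.85)
p_high = dynamic_power_w(2.23)
print(round(p_high / p_low, 2))  # 1.58 -- vs a clock ratio of ~1.21

# Same draw on a smaller die means a higher power density:
print(round(power_density_w_per_mm2(2.23, 308.0), 2))  # hypothetical die
```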
     
    Silent_Buddha, PSman1700 and BRiT like this.
  16. BRiT

    BRiT Verified (╯°□°)╯
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    15,573
    Likes Received:
    14,169
    Location:
    Cleveland
    Hard to tell unless you run the PS5 GPU at 1.85 GHz instead of 2.23 GHz and take measurements. Even more difficult because we don't have PC GPU parts to test with to know where on the knee-curve the RDNA2 GPUs normally sit, then cross reference that with next-gen console clocks and voltages.
     
    PSman1700 likes this.
  17. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    9,088
    Likes Received:
    2,955
    Location:
    Finland
No. Again, for about the millionth time: consoles have Zen 2-based CPU cores, but they do not adhere to any AMD CPU/APU specifications, nor do they reuse them in any way. They are built using the same elements, like the Zen 2 core, but that's it; each chip, console or not, is built from scratch using the blocks AMD has.
The CPU cores themselves should be identical, but the amount of cache hasn't been confirmed by either Sony or MS; it could be the same as Renoir (that 4900HS), it could be more, heck, it could even be less in theory. But regardless of its exact configuration, it's not the same chip even if it has the same amount of cache; it's built to fit the console APU.
     
  18. psorcerer

    Regular

    Joined:
    Aug 9, 2004
    Messages:
    732
    Likes Received:
    134
Again. It's so easy to understand.
If only you knew why current-gen games were struggling to hit 60 fps.

Nope. AFAIK no API has that yet, although theoretically you could do it.
     
    #1458 psorcerer, Mar 30, 2020
    Last edited by a moderator: Mar 30, 2020
  19. Esrever

    Regular Newcomer

    Joined:
    Feb 6, 2013
    Messages:
    766
    Likes Received:
    527
If you take the "couple % drop in clocks can save 10% in power" quote and map it onto any power-scaling curve, you should be able to estimate where on the curve the PS5 is sitting.
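Making that estimate concrete: if power scales locally as f^n, then n = log(P2/P1) / log(f2/f1). Plugging in the quote's figures (taking "a couple %" as 2%) gives a local exponent around 5, i.e. well past the linear part of the curve. A sketch, with only those two ratios assumed:

```python
import math

# If P ~ f^n near the operating point, then log(P2/P1) = n * log(f2/f1).
# Cerny's quote: a couple-percent clock drop saves ~10% power.
freq_ratio = 0.98    # ~2% drop in clock
power_ratio = 0.90   # ~10% drop in power

n = math.log(power_ratio) / math.log(freq_ratio)
print(round(n, 1))  # 5.2: steeper than the cubic often assumed for
                    # C*V^2*f with V proportional to f, suggesting the
                    # chip sits well up the knee of its curve
```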
     
  20. Metal_Spirit

    Regular Newcomer

    Joined:
    Jan 3, 2007
    Messages:
    550
    Likes Received:
    338
This has nothing to do with boost on PC! Completely different technology.

    Quoting from Eurogamer:

    "It's really important to clarify the PlayStation 5's use of variable frequencies. It's called 'boost' but it should not be compared with similarly named technologies found in smartphones, or even PC components like CPUs and GPUs."

As such, if it is only a gut feeling, just spit it out ;) (just kidding)
     
    #1460 Metal_Spirit, Mar 30, 2020
    Last edited: Mar 30, 2020