Next Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

Discussion in 'Console Technology' started by Proelite, Mar 16, 2020.

  1. dskneo

    Regular

    Joined:
    Jul 25, 2005
    Messages:
    530
    Likes Received:
    35
    That is true.

    All in all, every question around PS5's boost limitations comes down to their budget for the cooling solution, which we don't know. But we can infer from Cerny's words that it's not a whole lot, or else they would not have had trouble hitting the fixed frequencies he mentioned (3.0/2.0 GHz).
     
  2. function

    function None functional
    Legend Veteran

    Joined:
    Mar 27, 2003
    Messages:
    5,268
    Likes Received:
    2,595
    Location:
    Wrong thread
    Aside from the comment @mrcorbo made, don't you think this interpretation might be a bit uncharitable?

    DX12U *is* Microsoft's API after all, and the first AMD GPU [edit: generation] to support it fully just happens to be the one Microsoft's own Xbox GPU is built on.

    It's quite reasonable that MS should want DX12U and their console hardware to mirror each other as much as possible.
     
    RagnarokFF, tinokun, Silenti and 2 others like this.
  3. pTmdfx

    Regular Newcomer

    Joined:
    May 27, 2014
    Messages:
    278
    Likes Received:
    176
    Have you considered that TDP is a pointless engineering metric if it is a moving goalpost?

    Are you aware that a rendering pipeline, eh, comprises multiple stages that stress different graphics subsystems over the course of a frame? Throwing async compute into the picture makes the load mix even more application-dependent, and even async compute can't be the perfect gap filler all the time, soaking up every "under-utilised" resource. A blanket generalisation like this can hold true only under naive/impractical assumptions like "all software uses all hardware resources at 100% all the time".
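    The varying load mix over a frame can be put into a toy model. All the stage names and utilisation numbers below are invented for illustration; the point is only that different stages stress different units, so no subsystem sits at 100% for the whole frame:

```python
# Toy model of per-stage utilisation over one frame. All numbers are
# invented: each stage stresses different subsystems, so no single
# unit runs at 100% for the whole frame.
frame_stages = {            # stage -> (ALU utilisation, bandwidth utilisation)
    "shadow maps":  (0.30, 0.90),
    "g-buffer":     (0.55, 0.85),
    "lighting":     (0.95, 0.60),
    "post-process": (0.70, 0.80),
}

avg_alu = sum(a for a, _ in frame_stages.values()) / len(frame_stages)
avg_bw = sum(b for _, b in frame_stages.values()) / len(frame_stages)

# Even if async compute back-fills some of the ALU gaps, the per-frame
# averages stay well below the per-stage peaks.
print(f"avg ALU {avg_alu:.1%}, avg bandwidth {avg_bw:.1%}")
```

    Even in this generous sketch the averages land well under the peaks, which is why a fixed worst-case TDP leaves power on the table most of the frame.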
     
    #1423 pTmdfx, Mar 30, 2020
    Last edited: Mar 30, 2020
    KeanuReeves and psorcerer like this.
  4. jayco

    Veteran Regular

    Joined:
    Nov 18, 2006
    Messages:
    1,263
    Likes Received:
    561
    We will never have 60fps games on PS5. :runaway:
     
    disco_, Silenti and zupallinere like this.
  5. Love_In_Rio

    Veteran

    Joined:
    Apr 21, 2004
    Messages:
    1,579
    Likes Received:
    197
    The claim that SFS contains a dedicated hardware block only appears in the DF article, not in MS's own spec sheet.
     
  6. Mitchings

    Newcomer

    Joined:
    Mar 13, 2013
    Messages:
    113
    Likes Received:
    172
    Running at high clocks doesn't mean you're maxing out power draw. It's the nature of the instructions you're using and the operations you're performing that matters most.
     
    jayco likes this.
  7. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,353
    Likes Received:
    9,986
    Location:
    The North
    Filling all 8 cores with work does drive up power draw though. He's right. Unless you intend to sit with a single core running at the max clock, there needs to be a reduction as the loaded core count increases.
    Look at how quickly they filled all the cores on PS4 with the TLOU remaster to reach 60fps.
    [image: TLOU Remastered CPU core utilisation graph]
     
    PSman1700 and BRiT like this.
  8. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,361
    Likes Received:
    3,940
    Location:
    Well within 3d
    Thermal guidelines for CPUs can be fluid. While not directly comparable, I did look at some values that were apparently sourced from AMD documentation in https://www.gamersnexus.net/guides/...lained-deep-dive-cooler-manufacturer-opinions.

    If it were just the CPU, something along the lines of a Wraith Stealth might work for the CPU portion, assuming we can treat it as a combination of the Ryzen 3600 and 3700. Sony's target clock is a notch below the 3700's, although it has 8 cores versus the 65W 3600's six. Odds are one core in the PS5 is OS-reserved, so Sony may be able to be more certain about what sort of instructions that one runs.
    The main value I'm curious about is the P0 value (non-boost all-core base power), which can be unexpectedly high for the desktop processors. We can also see how widely power can vary with a modest change in clocks once you get into the boost ranges.

    I think a console could get away with similar temp values and a cooler with capabilities in the same range as the Series X without going with double-sided cooling. I'm still not sure what the gain is here, and I think Sony might be going for less power.

    Random speculation:
    The 3600 seems to be the one that behaves itself, but how should we interpret the fact that the PS5's top-clock is below the base clock of a 65W CPU?
    I don't recall seeing a PS4 power breakdown, but there was some speculation based on existing Jaguar chips that there might have been ~30W or so for 8 cores. How much lower than 65W is the ceiling for the PS5's CPU, and what does that mean for either the GPU or the total console power?
    Microsoft could conceivably be allocating a value close to the desktop P0 rating for its CPU, if the power ratings from Digital Foundry are representative.
    Is there a good reference for the GPU-only consumption of Navi 10? I thought I saw a reference for 180W for the 5700XT. Let's say we tossed on a 50% gain in efficiency for RDNA2, and could get ~120W for the same performance. If there were a guaranteed 90W for the GPU, 30W for the CPU, and 30W being sloshed about, then that gives a 150W APU consumption and 30-50W for the rest of the console for a Pro-like power budget. Maybe that's too conservative in terms of power savings?
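    The speculation above can be laid out as a back-of-envelope sketch. Every figure here is an assumption carried over from the post (the 180W 5700 XT reference, the 50% efficiency gain, the 90/30/30W split), not a measured value:

```python
# Back-of-envelope power budget, carrying over the figures speculated
# above. None of these are measured values.
navi10_gpu_w = 180                      # cited 5700 XT GPU-only draw
rdna2_perf_per_watt = 1.5               # assumed 50% efficiency gain
rdna2_gpu_w = navi10_gpu_w / rdna2_perf_per_watt   # ~120 W, same performance

guaranteed_gpu_w = 90
cpu_w = 30
sloshed_w = 30                          # budget shifted between CPU and GPU
apu_w = guaranteed_gpu_w + cpu_w + sloshed_w       # 150 W APU

rest_of_console_w = (30, 50)            # drives, RAM, fan, PSU losses, etc.
total_w = tuple(apu_w + r for r in rest_of_console_w)   # (180, 200) W
```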


    I think they'd profile their code for performance and see less performance than the raw numbers would suggest. That already happens since real life doesn't give perfect scaling, but it would be a more complex thing to profile and opens up a new class of interactions between workloads and threads as far as adverse effects are concerned.
     
    PSman1700 likes this.
  9. manux

    Veteran Regular

    Joined:
    Sep 7, 2002
    Messages:
    2,060
    Likes Received:
    926
    Location:
    Earth
    But you would have to factor in that Cerny specifically mentioned expected heavy(ier) use of AVX2 instructions as a reason for having to reduce clock speed.

    This is all so speculative. We don't even know how much the clocks will be reduced. Could be half, could be 200MHz. And it could be that it really isn't that usual for it to happen, or it could happen all the time.
     
  10. see colon

    see colon All Ham & No Potatos
    Veteran

    Joined:
    Oct 22, 2003
    Messages:
    1,620
    Likes Received:
    467
    We don't actually know that PS5 doesn't have a different max clock speed when SMT is enabled either. All we really know is that it will adjust clocks based on load/power requirements, and having SMT enabled changes the power and load. That's the issue many of us are having with Sony's reveal: it only provided best-case performance figures, with a guarantee that those numbers are fluid and only an "at or near, most of the time" assurance that they can be reached. Things would be clearer if they gave us a range of performance. If they said the CPU will never drop below 3GHz and the GPU never below 2GHz, then we would know there isn't going to be a ton of range. But if the GPU is clocking down to 800MHz, that would be a really noticeable drop, I think.
     
    tinokun, disco_ and PSman1700 like this.
  11. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,353
    Likes Received:
    9,986
    Location:
    The North
    You are right that we don't know, because the characteristics of the chip aren't known.
    But we do know that the power/frequency relationship looks nearly exponential for a lot of chips, while some chips may appear more linear in nature, meaning significantly more power draw to marginally increase frequency.

    Everyone thought 2.0 GHz was high; they said it wasn't achievable using normal methods. But using boost they got it to 2.23 GHz. On an exponential curve, that's a significant amount of additional power required to hold that frequency. If the workload increases while maintaining that clock, the power draw must grow. Once the remaining power capacity is eaten up by the GPU and it requires even more than that, it must withdraw power from the CPU or downclock itself.
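    One way to put a rough number on this: dynamic power scales with f·V², and near the top of the curve voltage itself has to rise roughly with frequency, so power grows roughly with the cube of the clock. A sketch under that simplified assumption (not actual silicon data):

```python
# Simplified model: near the top of the V/f curve, voltage must rise
# roughly linearly with frequency, so dynamic power (~ f * V^2)
# grows roughly as f^3. This is an approximation, not measured data.
def relative_power(f_new_ghz, f_base_ghz):
    return (f_new_ghz / f_base_ghz) ** 3

clock_gain = 2.23 / 2.0 - 1                    # ~11.5% more clock
power_gain = relative_power(2.23, 2.0) - 1     # ~39% more power in this model
```

    Under this model the last 230 MHz cost over three times their proportional share of power, which is why shedding a small slice of frequency can free up a large slice of the power budget.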
     
    dskneo and PSman1700 like this.
  12. pTmdfx

    Regular Newcomer

    Joined:
    May 27, 2014
    Messages:
    278
    Likes Received:
    176
    I really wouldn't back "immediately filled all the cores" with this graph, which pretty much depicts the opposite of the story, i.e. the varying workload mix in reality that advanced power management techniques are designed to take advantage of. That is on top of the fact that they had already been actively seeking out major parallelisation wins (this graph in particular: rendering a frame ahead of game logic).
     
  13. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,381
    Likes Received:
    15,835
    Location:
    Under my bridge
    If MS can run AVX on 16 threads at 3.66 GHz without lowering clocks, why can't Sony run AVX on 16 threads at 3.5 GHz?
     
    disco_ likes this.
  14. manux

    Veteran Regular

    Joined:
    Sep 7, 2002
    Messages:
    2,060
    Likes Received:
    926
    Location:
    Earth
    Max clock is not max power draw. FurMark is a good example of this: it heats up a GPU in a way no game does. In the console world it's probably possible to find FurMark-like cases, but not in all the games, all the time. We would really need some developer to break NDA and tell us how easy it is to create virus-like power loads, or whether it's difficult and the PS5 coasts at max clock most of the time.

    edit: Cerny was specific in saying the max GPU clock is not limited by power/heat. There are parts of the GPU that just refuse to work at a higher clock no matter what.
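    The point can be sketched with the standard dynamic-power relation, power ≈ activity × C × V² × f: a power virus and a typical game can run at exactly the same clock and voltage yet draw very different power, because the activity factor is set by what the code makes the chip do. All constants below are arbitrary illustration values:

```python
# Dynamic power ~ activity * C * V^2 * f. At a fixed clock and
# voltage, power draw is governed by the activity factor, i.e. how
# much of the chip the code actually toggles. Constants are arbitrary.
def dynamic_power(activity, cap, volts, freq_ghz):
    return activity * cap * volts**2 * freq_ghz

CAP, VOLTS, FREQ = 10.0, 1.0, 2.23      # same clock/voltage in both cases
game_w = dynamic_power(0.6, CAP, VOLTS, FREQ)    # typical mixed game workload
virus_w = dynamic_power(1.0, CAP, VOLTS, FREQ)   # FurMark-like saturation
```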
     
  15. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,353
    Likes Received:
    9,986
    Location:
    The North
    You're right, it took them a while to get to this point, so it wasn't immediate in that sense. Their job functions had all sorts of gaps everywhere on all cores.
    Their original graphs looked to me like they had tons of breathing room for the CPU to sit idle. This is the type of load where the boost frequency will stay rather high.
    [image: core utilisation graph]
    The original graphs were a lot less filled, and they couldn't even get to a 33ms frame time. They had to overlap rendering with the following frame's game logic.
     
    Shifty Geezer and PSman1700 like this.
  16. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,353
    Likes Received:
    9,986
    Location:
    The North
    Is the assumption that developers can't stress a GPU and CPU unless it's a stress test?
    I don't think that's reflective of the many current-generation consoles that are running full tilt on their fans.
     
    Silenti and PSman1700 like this.
  17. BRiT

    BRiT Verified (╯°□°)╯
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    15,426
    Likes Received:
    13,893
    Location:
    Cleveland
    How far do you want to drop the PS5 GPU?
     
    PSman1700 likes this.
  18. manux

    Veteran Regular

    Joined:
    Sep 7, 2002
    Messages:
    2,060
    Likes Received:
    926
    Location:
    Earth
    My best guesses would be the power supply and/or cooling subsystem. Or it could be that MS did something specific in their chip design to allow for higher clocks.
     
  19. manux

    Veteran Regular

    Joined:
    Sep 7, 2002
    Messages:
    2,060
    Likes Received:
    926
    Location:
    Earth
    This is in no way contradicting what I claimed. I just said that, per Cerny, max clock speed doesn't imply max power draw. Very well-optimised games would find the power-draw limits, and we really don't know what that means. Cerny was very specific in mentioning AVX2 on the CPU side. And Cerny was very specific in saying the GPU clock speed is not limited by thermals/power draw but by some internal implementation detail of the GPU. It just doesn't clock any higher no matter what.

    Taking what Cerny said, it means that to hit those power-draw limits one would have to have sufficiently well-optimised code. Again, what that really means we will not know until some developer spills the beans. Still, just hitting max clocks is not enough to cause throttling.
     
  20. Metal_Spirit

    Regular Newcomer

    Joined:
    Jan 3, 2007
    Messages:
    550
    Likes Received:
    337
    I find it strange that so many theories appear about power problems with a 2.23 GHz, 36 CU APU and lower CPU clocks, while no one finds power problems with an APU with higher CPU clocks (+3%) and 52 CUs (+44%) at 1825 MHz (-18%), with extra logic to route more data and wider memory controllers.

    Why is that? What data do you have for that line of thought? Based only on these things, I would guess Microsoft's APU will consume more power than the PS5's.
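    For concreteness, the peak shader throughput of the two configurations being compared, assuming RDNA's 64 ALUs per CU and 2 FLOPs (one FMA) per ALU per clock:

```python
# Peak FP32 throughput of the two GPU configurations discussed above.
# Assumes 64 ALUs per CU and 2 FLOPs (FMA) per ALU per clock, as in RDNA.
def tflops(cus, clock_ghz, alus_per_cu=64, flops_per_clock=2):
    return cus * alus_per_cu * flops_per_clock * clock_ghz / 1000

ps5_tf = tflops(36, 2.23)     # ~10.28 TF at the top boost clock
xsx_tf = tflops(52, 1.825)    # ~12.15 TF at a fixed clock
```

    Which chip draws more at the wall then hinges on where each clock sits on its voltage/frequency curve, which neither vendor has published.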
     
    egoless, Pete and Mitchings like this.