Next Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

Discussion in 'Console Technology' started by Proelite, Mar 16, 2020.

  1. VitaminB6

    Newcomer

    Joined:
    Mar 22, 2017
    Messages:
    169
    Likes Received:
    217
    That's not what I got from his statement. Wanting the same performance is based on not varying the base and boost clock speeds for individual chips. All chips will meet the same standard of performance regardless of the fact that some chips may be able to achieve higher frequencies. I could be wrong, but that's how I remember interpreting it.
     
  2. jayco

    Veteran Regular

    Joined:
    Nov 18, 2006
    Messages:
    1,508
    Likes Received:
    879
    No.

    His actual quote: "It wouldn't make sense to run the console slower because it was in a hot room, so rather than looking at the actual temp, we look at the activities that the CPU and GPU are performing and set the frequencies on that basis, which makes everything deterministic and repeatable".

    It's at 36:40 of the Road to PS5 video. If a developer designs a game to run at 2.23GHz all the time, the console will hit those frequencies all the time.
     
    blakjedi, KeanuReeves and jgp like this.
  3. anexanhume

    Veteran Regular

    Joined:
    Dec 5, 2011
    Messages:
    2,019
    Likes Received:
    1,402
    Yes, this reinforces my point. Very muddy waters, yet we’re claiming the PS5 will only hit these clocks “under light loads” as if the technology has already been fully disclosed to us. We don’t know. We’re going to have to wait.
     
    goonergaz, KeanuReeves and VitaminB6 like this.
  4. PSman1700

    Veteran Newcomer

    Joined:
    Mar 22, 2019
    Messages:
    3,241
    Likes Received:
    1,251
    Maybe they're wondering what the CPU will do when developers opt for full GPU clocks all the time?
     
    blakjedi, VitaminB6 and BRiT like this.
  5. AbsoluteBeginner

    Regular Newcomer

    Joined:
    Jun 13, 2019
    Messages:
    811
    Likes Received:
    1,038
    This is irrespective of architecture; other cards (Nvidia and AMD) all show similar results. Makes sense, as there is little reason why peak wattage of RDNA2 would suddenly be 2x/3x higher than RDNA1 (or Nvidia cards, for that matter).

    But he didn't say that. At all.
     
    PSman1700 likes this.
  6. AbsoluteBeginner

    Regular Newcomer

    Joined:
    Jun 13, 2019
    Messages:
    811
    Likes Received:
    1,038
    No, this is only in reference to the idea of boost being based on TDP/heat, which it is not. It's based on a power draw limit.

    For activities which hit the power draw ceiling, the frequency will drop. If the GPU runs harder and consumes more energy, the frequency will drop. Simple as that.

    Whether developers will have to manage it themselves, or will pick a GPU/CPU profile, we don't know. My point about the 5700XT (or take the 2070S) is that average power draw and peak power draw are only about 9-10W apart.
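    The mechanism described here (clocks dropping deterministically when a workload hits the power ceiling) can be sketched roughly as follows. The wattages and the cubic power-to-frequency approximation are illustrative assumptions, not disclosed PS5 figures:

```python
# Toy model of a deterministic, power-capped clock governor.
# POWER_CAP_W and the P ~ f^3 relation are assumptions for
# illustration only, not disclosed PS5 figures.

POWER_CAP_W = 200.0   # hypothetical SoC power ceiling
F_MAX_GHZ = 2.23      # advertised maximum GPU clock

def gpu_clock_for_load(power_at_fmax_w: float) -> float:
    """Clock (GHz) chosen for a workload whose estimated draw at
    F_MAX_GHZ is power_at_fmax_w watts. Assumes P ~ f^3, so the
    frequency is reduced just enough to fit under the cap."""
    if power_at_fmax_w <= POWER_CAP_W:
        return F_MAX_GHZ  # typical load: full clocks, regardless of room temp
    scale = (POWER_CAP_W / power_at_fmax_w) ** (1.0 / 3.0)
    return F_MAX_GHZ * scale

print(gpu_clock_for_load(180.0))            # under the cap -> 2.23
print(round(gpu_clock_for_load(220.0), 2))  # 10% over the cap -> ~2.16
```

    The key property is that the same workload always yields the same clock, which is the "deterministic and repeatable" part of Cerny's quote.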
     
    PSman1700, BRiT and VitaminB6 like this.
  7. jayco

    Veteran Regular

    Joined:
    Nov 18, 2006
    Messages:
    1,508
    Likes Received:
    879
    "Deterministic and repeatable" based on "CPU and GPU activities". I think it is very clear, a developer can design a game to hit 2.23 GHz the whole time.

    And again, Cerny also said that a "2% frequency drop, allows for a 10% in power consumption".

    These are statements that some people want to dismiss over and over again.

    There is not going to be a substantial drop in performance, doesn't matter what anyone wishes are.
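    For what it's worth, the "2% frequency for 10% power" trade can be sanity-checked against the textbook dynamic-power model P = C·V²·f. The figures below are illustrative only, and the 10% claim only works out if voltage falls faster than frequency near the top of the V/f curve:

```python
# Sanity check of the "2% frequency drop -> 10% power saving" quote using
# the textbook dynamic-power model P = C * V^2 * f (illustrative only).

f_scale = 0.98  # a 2% frequency reduction

# If voltage scaled 1:1 with frequency, power would scale with f^3:
p_scale_cubic = f_scale ** 3
print(f"{1 - p_scale_cubic:.1%}")  # ~5.9% saved, short of the claimed 10%

# For a full 10% saving at a 2% clock drop, voltage must fall faster than
# frequency near the top of the V/f curve; the implied voltage reduction:
v_scale = (0.90 / f_scale) ** 0.5
print(f"{1 - v_scale:.1%}")  # ~4.2% lower voltage would be needed
```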
     
    KeanuReeves likes this.
  8. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    9,255
    Likes Received:
    3,204
    Location:
    Finland
    It will be achieved if the developer limits the game to loads at which 10.28TF can be achieved.
    It's not a case where the developer just gets to decide what clocks they want to run at; it's a case where they need to make sure the load they're putting on the system stays within certain power limits at all times to get certain clocks at all times.
     
    PSman1700, BRiT and AbsoluteBeginner like this.
  9. VitaminB6

    Newcomer

    Joined:
    Mar 22, 2017
    Messages:
    169
    Likes Received:
    217
    Is there a transcript available for the Road to PS5 video? I found this at the 37 minute mark. Cerny: "running a GPU at a fixed 2Ghz target was looking unreachable with old fixed frequency." He's talking about AMD SmartShift here. It seems pretty clear that in order for the GPU to maintain 2.23GHz, that power will have to come from decreasing the clock of the CPU.

    So again, as a lot of people have been saying, it depends on what balance developers want between CPU and GPU performance. The console may be able to maintain that GPU clock speed, but we don't know what that's going to do to the CPU clock, since we don't know the base frequencies. Sony only says "up to" for both CPU and GPU frequencies.

    Cerny also mentions "running the CPU at 3Ghz was causing headaches with the old strategy", meaning fixed clocks. So it sounds to me like Sony was unable to run the GPU at a fixed 2.0GHz while also having a fixed CPU clock of 3.0GHz. That means that for the GPU to reach its max frequency, the CPU will be running somewhere below 3GHz.
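    The shared-budget behaviour being described can be illustrated with a toy model. Every wattage below is invented purely for the sake of the example; Sony has published none of these numbers:

```python
# Toy SmartShift-style shared power budget. Every number here is invented
# for illustration; Sony has not published any of these wattages.

TOTAL_BUDGET_W = 200.0  # hypothetical shared CPU+GPU power pot
GPU_PEAK_W = 160.0      # hypothetical GPU draw at 2.23 GHz, heavy load
CPU_PEAK_W = 55.0       # hypothetical CPU draw at 3.5 GHz, heavy load

def cpu_watts_when_gpu_pinned() -> float:
    """Power left for the CPU if the GPU is held at its peak clock."""
    return TOTAL_BUDGET_W - GPU_PEAK_W

leftover_w = cpu_watts_when_gpu_pinned()
print(leftover_w)               # 40.0
print(leftover_w < CPU_PEAK_W)  # True: the CPU cannot also run flat out
```

    Under these made-up numbers, pinning the GPU at peak leaves the CPU short of its own peak budget, which is the trade-off being speculated about.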
     
    #1249 VitaminB6, Mar 28, 2020
    Last edited: Mar 28, 2020
    AzBat, blakjedi, goonergaz and 4 others like this.
  10. AbsoluteBeginner

    Regular Newcomer

    Joined:
    Jun 13, 2019
    Messages:
    811
    Likes Received:
    1,038
    No, again, you are misquoting him. If developers could design a game locked at 2.23GHz, and by Cerny's own admission a couple of percent (2-3%, I guess) saves 10% of TDP, why did he say they couldn't hit the 2.0GHz target the old way of doing things (i.e. without variable frequency)?

    And note that 2.0GHz is more than a 10% reduction in clocks, not 2%, so the gain in TDP should be rather substantial.

    He said:

    "We expect GPU to spend most of its time at or close to that frequency"

    He further said:

    "Similarly running CPU at 3.0GHz was causing headaches with the old strategy, but now we can run it as high as 3.5GHz. in fact it spends most of its time at that frequency"

    Most - how much is it?
    Close to- how close is close?
    Expect - ?

    This is not "PS5 has HW RT in GPU" kind of thing. Cerny has been vague with this and made no comittment on actual numbers, bar max frequency.

    Its not about what people wish, its about people discussing and speculating on actual results once consoles are out. As it stands, no hard data has been given therefore we will have to wait and see.
     
  11. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    11,669
    Likes Received:
    12,655
    Location:
    The North
    There's no actual transcribing application you can just readily use on YouTube; most of the time you have to do it manually. AI transcribers that go from audio to text cost money per minute, and if you have your own trained model that can do it, then you're in luck. *Correction: there is a transcript option.

    He did say that. Yes, it wouldn't be able to reach 2.0GHz without SmartShift.

    And he's right, MS did not pass 1.8GHz.
     
    #1251 iroboto, Mar 28, 2020
    Last edited: Mar 30, 2020
    AzBat, blakjedi, DavidGraham and 2 others like this.
  12. goonergaz

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    3,839
    Likes Received:
    1,218
    I'm unsure what you mean here, he said (and I quote it below) 'a couple percent reduction in frequency reduces power by 10%'

    This is a genuine question; why do you not quote the bit where he said "When that worst case game arrives, it will run at a lower clock speed. But not too much lower, to reduce power by 10 per cent it only takes a couple of percent reduction in frequency, so I'd expect any downclocking to be pretty minor,"
     
    #1252 goonergaz, Mar 28, 2020
    Last edited: Mar 28, 2020
    blakjedi and VitaminB6 like this.
  13. VitaminB6

    Newcomer

    Joined:
    Mar 22, 2017
    Messages:
    169
    Likes Received:
    217

    It looks like Sony was originally targeting fixed clocks for both CPU and GPU (2GHz GPU / 3GHz CPU) and it was not attainable, or at least was causing some kind of issues. This leads me to believe that if developers target anywhere near max frequencies for PS5's GPU, the XBSX is going to have a pretty massive CPU advantage. This also explains why MS went with a larger enclosure than normal for the XBSX. And it leads me to believe that if Sony wants to run their CPU at 3.5GHz, their GPU clock is going to have to run at a frequency very close to the XBSX's GPU clock speed.
     
    AzBat, blakjedi, scently and 2 others like this.
  14. zupallinere

    Regular Subscriber

    Joined:
    Sep 8, 2006
    Messages:
    763
    Likes Received:
    103
    Well, to the right of the SHARE and SAVE buttons under the video itself, there is a 3-dot overflow menu with an Open Transcript option, which puts an oddly shaped but functional transcript to the right of the video.
     
    blakjedi and Shifty Geezer like this.
  15. MrFox

    MrFox Deludedly Fantastic
    Legend Veteran

    Joined:
    Jan 7, 2012
    Messages:
    6,488
    Likes Received:
    5,995
    Cerny explains concepts and uses examples and real-world cases. Some of you seem to take the examples as if they were the claims, but they are just ways to explain why they think a deterministic variable clock is better than a fixed clock. The reasons given are ultimately all about cost/performance of the entire system. The limitation of fixed clocks, as presented, is that the engineering margins end up either unused or borderline, because real-world data doesn't arrive until it's too late. The example of the launch revision of the Pro was about predicting the required engineering margins, which can only be guesswork. Guessing in engineering is really bad and has caused a lot of problems in game consoles for over a decade, ever since we've had 150W-200W consoles.

    The cooling is going to indicate what average power we can expect, and the games real world benchmarks will tell us how it performs.

    I expect the cooling system size will be counter-intuitive, and I hope they will show it without giving any wattage figures so we can have some fun. ;)
     
    Globalisateur likes this.
  16. Nesh

    Nesh Double Agent
    Legend

    Joined:
    Oct 2, 2005
    Messages:
    12,577
    Likes Received:
    2,852
    That sounds like, most of the time, it is hitting those clocks (or near them) on both CPU and GPU simultaneously.
     
  17. one

    one Unruly Member
    Veteran

    Joined:
    Jul 26, 2004
    Messages:
    4,835
    Likes Received:
    156
    Location:
    Minato-ku, Tokyo
    Is it typical for a modern console game to always maintain 100% CPU and GPU load? Given scene complexity and the difficulty of testing all possible cases, even if they are deterministic, I would have thought it wise to leave some margin, except for things like in-engine cut scenes. There are cases where frame drops happen, but even then it might be rare that both CPU and GPU are 100% utilized.
     
  18. VitaminB6

    Newcomer

    Joined:
    Mar 22, 2017
    Messages:
    169
    Likes Received:
    217
    That's all we have at this point, but he mentions those particular examples for a reason, in my opinion.
     
    blakjedi and PSman1700 like this.
  19. ultragpu

    Legend Veteran

    Joined:
    Apr 21, 2004
    Messages:
    6,242
    Likes Received:
    2,300
    Location:
    Australia
    If PS5's GPU spends most of its time at 9.2TF then it would clearly contradict what Cerny is promising and be very disingenuous relative to the marketed 10.3TF. I can only imagine the thundering uproar from the core community and media alike; it could give Sony or PlayStation a bad name for next gen, which is undesirable for the company. But if it hovers around 9.8-10TF under heavy load then it's gonna be totally fine. Also, CPU speed is gonna be a non-issue at 4K or close to 4K res, so a slight downclock would literally go unnoticed during gameplay.
    In a multiplatform comparison with XSX rendering at native 4K, if PS5 stays at 1800p most of the time then it must mean the GPU clock is heavily dropped and 9.2TF is most likely the standard number, since there's a 44% pixel difference between 2160p and 1800p. If it stays at ~2000p then it would be less than a 20% pixel difference and Cerny would be correct after all. 2000p vs 2160p would be virtually indiscernible at a normal viewing distance, or even face to the screen lol; it would require hardcore magic from DF to tell the story.
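    The pixel arithmetic in this post checks out; a quick script (assuming plain 16:9 frames) confirms both ratios:

```python
# Quick check of the pixel ratios quoted above, assuming 16:9 frames.

def pixels(height: int) -> int:
    width = height * 16 // 9  # 16:9 aspect ratio, rounded down
    return width * height

print(pixels(2160))                               # 3840 x 2160 = 8294400
print(pixels(1800))                               # 3200 x 1800 = 5760000
print(round(pixels(2160) / pixels(1800) - 1, 2))  # 0.44 -> 44% more pixels
print(round(pixels(2160) / pixels(2000) - 1, 2))  # 0.17 -> under 20%
```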
     
    blakjedi, goonergaz and VitaminB6 like this.
  20. PSman1700

    Veteran Newcomer

    Joined:
    Mar 22, 2019
    Messages:
    3,241
    Likes Received:
    1,251
    Depends entirely on what devs do; maybe they use the extra power for higher settings, or have the option to, like on PC. We are already seeing more modes this gen.

    Talking about magic, they must have done something very special (SmartShift?) to go from not being able to sustain 2GHz/3GHz to 2.23/3.5 basically all the time. If SmartShift is that good, all other manufacturers, including MS, have missed a great opportunity here; they must have totally looked the other way even when they had the tech available.
     
    egoless, Silenti and VitaminB6 like this.