Digital Foundry Article Technical Discussion [2022]

Discussion in 'Console Technology' started by BRiT, Jan 1, 2022.

  1. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,418
    Likes Received:
    10,311
    Sure, you could in theory run AI/ML based upscaling on a first-generation Atom-based CPU, or an Intel 80286; after all, it's just a "software problem". Is it going to run fast enough to be desirable?

    AI/ML based upscaling without hardware to assist it can be prohibitively expensive, especially when compared to more traditional compute-based upscaling like what developers have been using on consoles. However, with hardware assists (Tensor cores, for example), AI/ML based upscaling can offer higher quality with faster performance. That's why it becomes a "hardware problem".

    These implementations don't exist in a vacuum. Everything has to be weighed for cost (performance) versus benefit (quality). If the cost is too high on X hardware then there is no benefit to using it. Thus, for that particular implementation, it's a hardware problem: the hardware isn't fast enough to use it in a way that is beneficial for the product (in this case games) that potentially wants to use it.

    AMD's FSR 2.0 isn't using AI/ML based upscaling because the cost (performance) versus quality without dedicated hardware assist wasn't as good as going with a more traditional compute-based approach. However, they claim that their implementation can approach the quality of AI/ML based solutions with acceptable performance. We'll see how those claims hold up once we can compare it to other solutions in games.

    BTW - I'm one of the ones that doesn't particularly like the Tensor-based approach of DLSS just because it isn't universally applicable across multiple vendors' hardware. Even if other IHVs also had Tensor cores, DLSS would still be limited to NV hardware, which makes it the less desirable option to me. In that respect I like the promise of Intel's approach, although, like FSR 2.0, we'll have to wait and see not only how good it is, but how well it performs.

    Regards,
    SB
     
    #801 Silent_Buddha, Mar 31, 2022
    Last edited: Mar 31, 2022
    Pete and PSman1700 like this.
  2. Dampf

    Regular

    Joined:
    Nov 21, 2020
    Messages:
    283
    Likes Received:
    473
    Well, I assume no game currently on PS5 comes even close to maxing the Ryzen CPU out, so with every game thus far the PS5 should have been running at max GPU clocks.

    But it will be a more interesting discussion in the future.
     
    Pete and PSman1700 like this.
  3. davis.anthony

    Regular

    Joined:
    Aug 22, 2021
    Messages:
    423
    Likes Received:
    147
    According to Mark Cerny there should always be enough juice.
     
    Globalisateur likes this.
  4. Globalisateur

    Globalisateur Globby
    Veteran Subscriber

    Joined:
    Nov 6, 2013
    Messages:
    4,592
    Likes Received:
    3,411
    Location:
    France
    The thing is, as usual, Alex is doubly wrong here.

    - First, we know from dev insiders that the PS5 rarely, if ever, downclocks and that it has more than enough power to run both CPU and GPU at full clocks. This was also said by Cerny from the beginning.
    - But more importantly, what we gathered from consumption tests is that the console actually consumes less when a scene is CPU demanding. All tests showed this. And we already knew this from PS4 and Pro fan noise and from the DF tests that use cutscenes in various games to look for the maximum consumption. The Matrix demo showed this, as the maximum was actually found in one cutscene (~225W). During "gameplay" the PS5 often draws up to ~200-205W, but even then we know from UE5 developers that this demo is actually light on the CPU. Plenty of other consumption tests with other games showed the same.

    I like Alex's thorough comparisons, but those kinds of remarks are totally disconnected from the facts. Ironically, based on what we know, the scenes he used in his comparisons are actually very light on the CPU (cutscenes), in a game already heavily limited by the GPU. It's in those kinds of scenes that the PS5 could actually downclock. But the thing is, it should be very easy to check, as they just need to measure the consumption during the scene.

    We know the PS5's dynamic clock is deterministic and that it will downclock when a threshold of instruction activity is exceeded. That activity correlates directly with actual power consumption (because that's the whole point). And we know this power limit: it's about ~225W. Any game that consumes less than ~215W is very probably not triggering a downclock; otherwise the whole system would not be fully deterministic.

    And that's assuming the power limit is actually 225W. Maybe it's higher!
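The deterministic downclocking scheme described above can be sketched as a toy model. Everything here (the wattages, the activity-to-power mapping, the scaling rule) is a hypothetical illustration, not Sony's actual algorithm:

```python
# Toy model of a deterministic, power-based boost scheme (all numbers
# hypothetical; this is NOT Sony's actual algorithm). Clocks drop only
# when modelled activity pushes estimated power past a fixed budget.

POWER_BUDGET_W = 225.0   # assumed system power limit
MAX_CLOCK_GHZ = 2.23     # PS5's advertised max GPU clock

def estimated_power(activity: float) -> float:
    """Map a normalised activity counter (0..1) to estimated watts.
    Deterministic: the same workload yields the same estimate on
    every console, regardless of chip quality or temperature."""
    BASE_W = 80.0        # hypothetical fixed draw
    DYNAMIC_W = 180.0    # hypothetical max activity-dependent draw
    return BASE_W + DYNAMIC_W * activity

def gpu_clock(activity: float) -> float:
    power = estimated_power(activity)
    if power <= POWER_BUDGET_W:
        return MAX_CLOCK_GHZ  # within budget: full clocks
    # Over budget: shed clocks proportionally to get back under it.
    return MAX_CLOCK_GHZ * POWER_BUDGET_W / power

print(gpu_clock(0.70))           # ~206 W estimated -> full 2.23 GHz
print(round(gpu_clock(1.0), 2))  # ~260 W estimated -> downclocked
```

Under this toy model, any game whose estimated draw stays below the budget never sees a downclock, which is the shape of the "consumes less than ~215W means full clocks" reasoning above.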
     
    #804 Globalisateur, Mar 31, 2022
    Last edited: Mar 31, 2022
  5. davis.anthony

    Regular

    Joined:
    Aug 22, 2021
    Messages:
    423
    Likes Received:
    147
    It's even simpler to demonstrate than that: if the PS5 were having clock issues it wouldn't be competing with or beating the XSX in comparisons like it does.

    The only way PS5 matches or beats XSX is if it's running at full clocks.
     
    Globalisateur likes this.
  6. PSman1700

    Legend

    Joined:
    Mar 22, 2019
    Messages:
    7,118
    Likes Received:
    3,088
    The tensor cores probably are doing something to aid performance. Of course it can be done without them, but as you say, we don't know how much performance would be lost.

    Of course AMD GPUs can do ML reconstruction, but the question is how much of a performance impact that would have, as they lack the separate cores that NV GPUs contain. Nvidia states these help performance, but you (a random forum poster) claiming 'muh, it's just software' is what it is; in your view they are lying.

    Absolutely, but with (probably) a higher performance impact. And that is where I am going with this. Some don't understand this.

    Nvidia lying, DF 'reaching and being fanboys'. What the hell is your problem? The PS5 isn't 'outperforming' a 2080 either; it's generally a notch below that. You can't just go by one title, since it doesn't outperform it in other titles.

    Someone that understands.
     
    RootKit likes this.
  7. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,418
    Likes Received:
    10,311
    What clock a CPU is running at, or even how capable a CPU is, has almost zero bearing on whether a game on one hardware platform is competitive with a game on another platform with powerful modern CPUs.

    As evidence I'll point back to when AMD's Phenom was competitive in games with Intel's Core lineup despite being a significantly worse CPU performer. Or how a lower clocked Intel Core CPU would perform exactly the same in many games as a higher clocked Intel Core CPU of the same generation. Why? Because games are more often than not GPU bound, not CPU bound.

    It's why on PC you have to drastically lower the resolution you render a game at (720p or 1080p), combined with using the most powerful GPU you can install, in order to see any significant differences between CPUs. Benchmarks at 1440p or 4K in most games will show virtually no differences between CPUs of drastically different clocks or even drastically different architectures.

    And what resolution are most games targeting on PS5 and XBS-X? 1440p to 2160p.

    Basically, current gen games up to this point on PS5 are going to be GPU bound the vast majority of the time. Thus, you'll never know if the CPU is downclocked or not.

    A game would be more likely to hit CPU performance boundaries on the previous generation with the Jaguar cores.
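The GPU-bound argument can be illustrated with a toy frame-time model, where a frame costs roughly max(CPU time, GPU time). All numbers here are hypothetical:

```python
# Toy frame-time model: when CPU and GPU work overlap, a frame takes
# roughly max(cpu_time, gpu_time). All numbers are hypothetical.

def frame_ms(cpu_work_ms_at_3ghz: float, cpu_clock_ghz: float,
             gpu_ms: float) -> float:
    # CPU time scales inversely with clock; GPU time is independent of it.
    cpu_ms = cpu_work_ms_at_3ghz * (3.0 / cpu_clock_ghz)
    return max(cpu_ms, gpu_ms)

# GPU-bound scene: 6 ms of CPU work against 16 ms of GPU work.
print(frame_ms(6.0, 3.0, 16.0))  # 16.0 ms (62.5 fps)
print(frame_ms(6.0, 2.0, 16.0))  # still 16.0 ms: a 33% CPU downclock is invisible
```

The GPU time dominates in both cases, which is why identical benchmark results tell you nothing about the CPU clock.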

    Regards,
    SB
     
    RootKit and PSman1700 like this.
  8. Flappy Pannus

    Regular

    Joined:
    Jul 4, 2016
    Messages:
    329
    Likes Received:
    567
    From the written version:
    If his goal was to show the PC as a platform in the best possible light as well, then why no DLSS comparisons? The focus of this video seems to be simply comparing the two versions of the same game on the same engine to see what PC hardware is needed to match the PS5's performance, and it certainly paints the PS5 in a good light. The reasoning may be flawed, but by and large, a game actually requiring a 2080 Super/Ti on the PC to match the PS5 version at the same quality settings is not the norm. Hell, the majority of Alex's videos lately have been focusing on how awful most recent PC ports have been, with their stuttering issues!

    Alex's platform is obviously the PC first and foremost, and his reasoning for why the PS5 is performing exceptionally well in this game may indeed be suspect, but there's no reason for this kind of platform-warring nonsense, though that seems to be your thing here.
     
    #808 Flappy Pannus, Mar 31, 2022
    Last edited: Mar 31, 2022
    RootKit, DavidGraham, Pete and 3 others like this.
  9. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,400
    Location:
    Wrong thread
    Cerny said PS5 would spend "most of its time at, or close to, that performance" (meaning highest clocks). I think drops will become more likely as the generation goes on, but even then they won't usually be significant.

    While we're seeing cross gen games I'd guess that this won't even be an issue. Even when Zen 2 is the baseline, PS5's half width SIMD units should shield it from the kind of huge power loads that AVX 256 can demand (it's the biggest power stressor for the cores). And if PS5 isn't relying on full rate AVX 256, it's hard to see any game being made to really rely on that kind of SIMD performance.

    Cerny obviously thinks the payoff of limited CPU SIMD vs higher GPU clocks is worth it, and I'm not going to say he's wrong.

    Most interesting thing in the DF video for me was seeing that the extra width of the 5700XT is almost entirely without benefit in this game. This lines up perfectly with the old school wisdom about width and overheads and extra work and all that.

    I'm of the opinion that performance matters most earlier in the generation. I think Cerny/Sony made some good choices about how to get the most out of the die area and power budget, especially during the all important transitional period.
     
  10. davis.anthony

    Regular

    Joined:
    Aug 22, 2021
    Messages:
    423
    Likes Received:
    147
    What is your problem?? Others have validated that there is something wrong/fishy with those Nvidia slides; you seem to be the only one in denial over it, even when other members see it and have also pointed it out.

    There's nothing wrong with highlighting things that don't look 'right' - it's a good thing to challenge those kinds of things.

    And what do you mean, I can't 'go after one title'? I can assure you that for the context of the discussion I can go after that one title, as it's pretty clear I'm talking about the latest video and not games in general.
     
  11. davis.anthony

    Regular

    Joined:
    Aug 22, 2021
    Messages:
    423
    Likes Received:
    147
    I'm struggling to see what that has to do with my comment regarding the PS5's clock speed vs its performance in relation to the XSX.
     
  12. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,418
    Likes Received:
    10,311
    Re-read your post that I replied to. I was basically dispelling the myth that how a game runs is evidence of the clock speed the PS5 CPU is running at.

    The PS5's CPU clocks have almost zero bearing on how well it competes with the XBS-X, because current-gen games are very rarely CPU limited at the resolutions that PS5 and XBS-X games target with the GPUs they have in them.

    The PS5 CPU could be clocked at 2.0 GHz and games would more than likely still perform about the same as they do at 3.0 GHz. The same could be said for the XBS-X. It's why, for most games, it would likely have been beneficial if the XBS-X also had dynamic clocks, as it wouldn't need to run the CPU at full power all of the time.

    Regards,
    SB
     
    RootKit and PSman1700 like this.
  13. davis.anthony

    Regular

    Joined:
    Aug 22, 2021
    Messages:
    423
    Likes Received:
    147
    But in known CPU heavy scenes (Corridor of death in Control with RT enabled) PS5 is basically neck and neck with XSX.

    Same with 120fps modes which should be more CPU heavy and equally GPU heavy - PS5 does very well here too.
     
  14. iroboto

    iroboto Daft Funk
    Legend Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    14,833
    Likes Received:
    18,632
    Location:
    The North
    No. This is not the right way to make the comparison.
    You're comparing 2 different chips, with different memory configurations, running 2 different variants/graphical settings of the same software, with the whole thing obscured by DRS.
    According to your statement here:
    A boost-clocked 2.23 GHz, 200W-max PS5 would be able to perform identically to a fixed-clock 2.23 GHz PS5 with no power limit.

    Pretty sure that's not likely to be true at all. It's a claim that has probably been tested before, honestly.
    Some GPU utilities let us cap a GPU's maximum power draw, and also let us lock clock speeds, which increases power draw. In such tests, the results should not be the same.
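A toy model shows why a power-capped design and an uncapped fixed-clock design shouldn't produce identical results under heavy load. The cubic power curve and every constant here are assumptions for illustration only:

```python
# Toy comparison of a power-capped GPU versus one with no power limit.
# Assumes dynamic power grows roughly with frequency cubed (voltage
# scales with clock); every constant here is hypothetical.

CAP_W = 200.0   # assumed board power cap
F_MAX = 2.23    # GHz, assumed max clock

def power(freq_ghz: float, load: float) -> float:
    return 60.0 + 150.0 * load * (freq_ghz / F_MAX) ** 3

def capped_clock(load: float) -> float:
    """Highest clock whose estimated power fits under the cap."""
    f = F_MAX
    while power(f, load) > CAP_W and f > 0.5:
        f -= 0.01  # step clocks down until under budget
    return round(f, 2)

# Light load: the cap never binds, both designs run identically.
print(capped_clock(0.6))            # 2.23
# Heavy load: the capped design must shed clocks, the uncapped one doesn't.
print(capped_clock(1.0) < F_MAX)    # True
```

The two designs only diverge once the workload is heavy enough to hit the cap, which is exactly the regime such utility-based tests would probe.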
     
    #814 iroboto, Mar 31, 2022
    Last edited: Mar 31, 2022
    RootKit, PSman1700 and function like this.
  15. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,400
    Location:
    Wrong thread
    I feel there are a number of misconceptions and unsound certainties in the feedback loop, and I'm going to use my limited ability (and tiny mind) to try to introduce a bit of "whoa there, let's not be so definite about this" into the thread.

    Downclocking will depend on the workload. Just because early games rarely do it, doesn't mean later games won't. Cerny was intentionally non-committal about specifics because, as he said, he can't be certain about how software will use the hardware in the future (and he's not a bullshitter). He specifically said:

    "We expect the GPU to spend most or all of its time at, or close to, that frequency".

    He made sure not to say "always at", and didn't even say that it "always had to be close to".

    I think it's important that if you're going to name-drop Cerny, you actually say what he said. If he knows enough to leave room for uncertainty and less common circumstances, I think we should pay attention to that.

    But that could be because CPU bottlenecks are causing GPU underutilisation; it doesn't necessarily mean that high CPU and GPU workloads can't result in GPU clocks dropping. If your CPU is limiting GPU activity then your total power consumption can drop dramatically even while CPU power consumption is high.

    Fan noise is not always directly linked to power consumption. Chips have a number of thermal sensors on them, normally at potential hotspots, and a particular sensor could cause fan noise to spike even though average temperature across all sensors and power consumption are not at a level that would generally cause lots of fan noise.

    "Fan noise = power consumption" is not guaranteed. That's not a certainty.
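A minimal sketch of hotspot-driven fan control illustrates the point; the sensor layout and thresholds are invented for illustration:

```python
# Toy fan controller: fan speed follows the HOTTEST on-die sensor, not
# the average, so noise can spike even when average temperature (and
# total power) is modest. All thresholds are hypothetical.

def fan_pwm(sensor_temps_c: list) -> int:
    """Return fan duty cycle (percent) from the hottest sensor."""
    hotspot = max(sensor_temps_c)
    if hotspot < 70.0:
        return 30
    if hotspot < 85.0:
        return 55
    return 100  # a single hot sensor is enough to ramp the fan

# Average is only ~66 C, but one 88 C hotspot maxes the fan out:
print(fan_pwm([60.0, 62.0, 55.0, 88.0]))  # 100
# A uniformly warmer die (average ~70.5 C) is actually quieter:
print(fan_pwm([72.0, 70.0, 71.0, 69.0]))  # 55
```

The second reading has a higher average temperature (and plausibly higher power draw) yet a quieter fan, which is why fan noise is a poor proxy for consumption.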

    Yeah, cut scenes can be a great place to optimise quality and increase power consumption even if CPU use is limited. Turn up the number of lights, turn up the DoF quality, use your highest quality back scattering skin shader etc....

    It's a bit more complicated than that. Certain instructions can require more power to execute, so it's not just the number of instructions. Sony / AMD will have taken this into account, of course.
     
    #815 function, Mar 31, 2022
    Last edited: Mar 31, 2022
  16. PSman1700

    Legend

    Joined:
    Mar 22, 2019
    Messages:
    7,118
    Likes Received:
    3,088
    For a more complete analysis one could compare to AMD GPUs in the dGPU space, as this game could be more optimized towards that architecture. Generally the PS5 hovers around a 2070/2070S, sometimes better, sometimes worse.

    I am not only talking about some slides, I am talking about the whole presentation and the technical explanations surrounding the RTX-IO/DS implementation on modern hardware. As other members have pointed out to you, yes, some details are missing, but that doesn't mean everything was just to fool us.

    'Muh, PS5 performs on 2080 levels' is more of a general statement than 'PS5 performs like an RTX 2080 in this title, but so does a 6600XT/RX6700'; there's context in the latter. In general, it's closer to a 2070/S than a 2080S.

    Yeah.
     
    RootKit likes this.
  17. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,400
    Location:
    Wrong thread
    That could easily be because the PS5 fixed function GPU hardware advantage is eating away the XSX shader (and B/W) advantage.

    PS5 / XSX / XSS CPU advantage is about 4x over last gen Jaguar cores (about 8x for XSX / XSS if you include SIMD). It's unlikely any cross gen game is really pushing the limits of these Zen 2 CPUs.

    Even 120hz modes are unlikely to radically increase the power hungry SIMD loads as you'd normally see these utilised to handle some aspect of simulation, which should be frame rate independent.
     
    RootKit, Pete and PSman1700 like this.
  18. PSman1700

    Legend

    Joined:
    Mar 22, 2019
    Messages:
    7,118
    Likes Received:
    3,088
    And in that scene the PS5 is performing 12% slower than a 2080, being very close to an RTX 2070S (a 1% differential). Taking into account that this game favours AMD hardware, as DF notes in this video, we are back at RTX 2070 levels somewhere again.

    I think the RX 6600XT (a narrow, high-clocked RDNA2 GPU at 10.6TF) is the closest you'd get to the PS5, and then some.

     
    pharma and RootKit like this.
  19. snc

    snc
    Veteran

    Joined:
    Mar 6, 2013
    Messages:
    2,115
    Likes Received:
    1,745
    Good to see the vsync perf penalty was taken into account. The PS5 GPU shows really good perf here, a nice advantage over the 5700XT.
     
  20. Dictator

    Regular

    Joined:
    Feb 11, 2011
    Messages:
    681
    Likes Received:
    3,969
    To answer the critique put forward here: we have heard from devs making games that target heavy CPU usage that the GPU in the PS5 does in fact downclock. But whether an end user notices this in a game with TAA, post-processing and DRS is a whole other question. That is the point of the PS5 design.

    For example, think of a game with a framerate unlocked to 120 and DRS targeting a high output res. How exactly does that fit into a fixed and shared power budget? The obvious answer is that it stresses both CPU and GPU to their max, and the power adjusts.
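A DRS loop of the kind described could look roughly like this toy sketch; the target, gain and clamp values are all hypothetical:

```python
# Toy dynamic-resolution-scaling (DRS) loop: shrink the render scale
# when GPU frame time blows the 120 fps budget, grow it back when there
# is headroom. Target, gain and clamp values are all hypothetical.

TARGET_MS = 1000.0 / 120.0  # ~8.33 ms per frame at 120 fps

def next_scale(scale: float, gpu_ms: float) -> float:
    """One controller step: nudge the resolution scale toward the budget."""
    error = (TARGET_MS - gpu_ms) / TARGET_MS
    scale += 0.25 * error              # small proportional step
    return min(1.0, max(0.5, scale))   # clamp between 50% and 100% res

print(round(next_scale(1.0, 10.0), 2))  # over budget -> 0.95
print(round(next_scale(0.7, 6.0), 2))   # headroom -> 0.77
```

With a shared power budget, a loop like this and the clock governor both react every frame, which is why unlocked 120 fps DRS content is exactly where power (and clock) adjustment would show up.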
     
    #820 Dictator, Apr 1, 2022
    Last edited: Apr 1, 2022