Current Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

I also find it amusing that people talk about the GPU curve as if it's fixed forever. Sony can always push out firmware updates to adjust the curve.
This won't happen. Even for BC they want to deactivate CUs, downclock, etc. Somehow Sony is not able to achieve BC on "standard" x86 hardware. How is this still possible in 2020 (or even on PS4 Pro)?
 
Could you give a little more context about the "might not want" or "will not" part of your question? Road to PS5 is interesting but I sadly lack the time to keep going through it!

The reason MS talked about 3.8 being constant is because they were asked specifically about 3.8.

(Near the bottom) https://www.anandtech.com/show/1599...ft-xbox-series-x-system-architecture-600pm-pt

"09:35PM EDT - Q: Is link between CPU and GPU clocks? A: Hardware is independent.

09:36PM EDT - Q: Is the CPU 3.8 GHz clock a continual or turbo? A: Continual.

09:36PM EDT - Continual to minimize variance"

The clocks are continual. If AVX affected that, they wouldn't be. But we can demonstrate this! Earlier in the presentation (same link) MS stated that "AVX256 gives 972 GFLOP over CPU" (quoted from Dr Ian Cutress' transcription).

32 FLOPs/cycle × 8 cores × 3.8 GHz = 972.8 GFLOPS.

So we can say that:
- 972 GFLOPS is at 3.8 GHz
- The 3.8 GHz is continual to minimise variance. It is not a boost, and it is not affected by the GPU.

And this presentation wasn't from MS PR people, it was prepared by two members of the Azure silicon team. Legit experts. These people aren't clowns, and are every bit as professional as Cerny.

Btw, at 3.6 GHz, peak AVX2 output would be lower at 921.6 GFLOPS. That's two 256-bit FMAs per cycle, per core (the same 32 FP32 FLOPs/cycle as above). Which is not to say you couldn't squeeze in a tiny bit more work with HT enabled, but if you're hammering AVX256 in a tight loop like a PC torture test there's not much time for anything else. And if you want to really burn in your chip and test the thermals and cooling, you use some kind of AVX torture test.
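
Just to make the arithmetic behind both numbers explicit, here's a quick sanity check. It's a minimal sketch assuming 32 FP32 FLOPs per core per cycle (two 256-bit FMA pipes, 8 fp32 lanes each, 2 ops per fused multiply-add), which is the rate the 972.8 figure implies:

# Sanity check of the peak AVX256 throughput figures quoted above.
# Assumption: 32 FP32 FLOPs per core per cycle (2x 256-bit FMA pipes,
# 8 fp32 lanes each, 2 ops per fused multiply-add).

def peak_gflops(cores, ghz, flops_per_cycle=32):
    """Theoretical peak FP32 throughput in GFLOPS."""
    return cores * ghz * flops_per_cycle

print(peak_gflops(8, 3.8))  # 972.8 -> matches the Hot Chips "972 GFLOP" figure
print(peak_gflops(8, 3.6))  # 921.6 -> the hypothetical 3.6 GHz case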



There are no MS changes to Zen 2, and none are necessary to run fixed clocks.

What this article is describing is running to the limits of the server package based on temperature, voltage, power, whatever. You alter clocks to stay within limits.

If you have a fixed clock regime where all operations (including AVX256) stay below the power, temperature, voltage limits etc. that the chip/system has, then you know you will never need to downclock.

For example: the 3700X has a base clock of 3.6 GHz (16 threads) and a boost clock of 4.4 GHz. Even with AVX256, if set up correctly, it won't drop below its base clock of 3.6. Now imagine if you turned boost off. You could run at 3.6 all day long, whatever you threw at it, and it'd be solid at 3.6.

That's basically what Xbox series X is doing.
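
To make the contrast concrete, here's a toy sketch of the two strategies being argued about in this thread: a fixed clock provisioned for the worst-case load versus a power-capped boost that picks the highest clock the current load allows. The power model, the 200 W budget and the frequency steps are all made up for illustration; they are not the actual console figures.

# Toy contrast of the two clocking strategies discussed in this thread.
# The power model, the 200 W budget and the frequency steps are invented
# purely for illustration.

POWER_LIMIT_W = 200  # hypothetical SoC power budget

def power_draw(freq_ghz, activity):
    # Made-up model: power grows with workload activity (0..1) and
    # roughly with the cube of frequency.
    return 40 + 180 * activity * (freq_ghz / 2.23) ** 3

FREQS = (1.8, 2.0, 2.23)

def fixed_clock():
    # Fixed-clock approach: choose a frequency low enough that even the
    # worst-case workload (activity = 1.0) stays under the limit.
    return max(f for f in FREQS if power_draw(f, 1.0) <= POWER_LIMIT_W)

def boost_clock(activity):
    # Power-capped approach: each interval, pick the highest frequency
    # the *current* workload allows.
    return max(f for f in FREQS if power_draw(f, activity) <= POWER_LIMIT_W)

print(fixed_clock())      # 2.0  -> conservative, never changes
print(boost_clock(0.6))   # 2.23 -> typical load runs at the cap
print(boost_clock(1.0))   # 2.0  -> worst-case load drops the clock

In that toy, the fixed-clock machine leaves performance on the table under typical loads, while the power-capped machine trades a guaranteed clock for a guaranteed power draw, which is exactly the tradeoff both camps keep arguing about.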

Well, thank you very much for the enlightenment. You were clear.

As for my question about Cerny, I was asking if you knew what his exact words were. But I'll check that for myself.
 
As for my question about Cerny, I was asking if you knew what his exact words were. But I'll check that for myself.

Ah right, I get you! Yeah, if you listen from about five minutes on from here, he explains the consistency across PS5s, how it's a continuously varying boost (complete with graphic!), mentions AMD SmartShift, says why they have the max boost clock they do, gives an example of a 2% clock drop saving 10% power, and also says this:

"... [of 2.23 ghz] we expect the GPU to spend most of it's time at, or close to, that frequency and performance"
.


So basically, if you see someone saying "PS5 spends most of its time at 2.23, Cerny said!" or "It only ever drops by 2%, Cerny said!" you now know they're either mistaken, or attempting to mislead.

His actual words are far more interesting and enlightening than some of the console war edits going round. :D
 
This won't happen. Even for BC they want to deactivate CUs, downclock, etc. Somehow Sony is not able to achieve BC on "standard" x86 hardware. How is this still possible in 2020 (or even on PS4 Pro)?
If you're referring to the GPU, the widely accepted speculation is that GNM in particular is very thin, in that games are shoving data straight into GPU registers, so you need the new chip to work exactly like the old one: there is no fat API to take legacy API input, adapt it for new/changed hardware and return the value expected by the legacy calling code. In his talk early this year, Mark Cerny referred to this as a "unification of functionality" which "took years of effort by AMD" because "any [GPU architecture] roadmap advancement creates a potential divergence in logic running PS4 and PS5 titles". Cerny also commented on boosted frequencies running too fast for some code, but he didn't give specific examples.

It seems evident that when engineering the design of the PS4, Sony gave little thought to running PS4 code on a much different architecture in the future. It has probably limited some of their technical choices for the PS5, whether they care to admit it or not.
 
Ah right, I get you! Yeah, if you listen from about five minutes on from here, he explains the consistency across PS5s, how it's a continuously varying boost (complete with graphic!), mentions AMD SmartShift, says why they have the max boost clock they do, gives an example of a 2% clock drop saving 10% power, and also says this:

"... [of 2.23 ghz] we expect the GPU to spend most of it's time at, or close to, that frequency and performance"
.


So basically, if you see someone saying "PS5 spends most of its time at 2.23, Cerny said!" or "It only ever drops by 2%, Cerny said!" you now know they're either mistaken, or attempting to mislead.

His actual words are far more interesting and enlightening than some of the console war edits going round. :D
Why do we always take Cerny's words as given, but on the other hand don't believe what other producers say about their hardware? Even developer events are more or less PR events, just for developers ;)

The easiest thing would just have been to build in a bigger PSU, if the cooling can handle it. We know the GPU can sustain 2.23 GHz all the time, because in the dev kits you can fix that frequency (even if it's just for testing purposes).

I know it is a more efficient way to get everything out of a specific power limit that you set yourself. On the other hand, you could say the Xbox has wasted potential by not opening its clocks up to some dynamic clocking.
 
Why do we always take Cerny's words as given, but on the other hand don't believe what other producers say about their hardware? Even developer events are more or less PR events, just for developers ;)

What hardware designers/architects/engineers are being dismissed? :???:
 
We know the GPU can sustain 2.23 GHz all the time, because in the dev kits you can fix that frequency (even if it's just for testing purposes).

Is there a source on that? We know the clocks can be fixed on dev kits, but I've not seen anything that says the clocks can be fixed at 2.23, much less regardless of whatever the heck you throw at it!
 
Why do we always take Cerny's words as given, but on the other hand don't believe what other producers say about their hardware? Even developer events are more or less PR events, just for developers ;)

Cerny isn't the sole 'engineer' behind the console design, or at least I hope not. There's probably a whole team of engineers designing the thing, with some input from him.
Besides, of course he is going to present the best-case scenario: millions of people were going to watch that video, so it's also marketing, especially as it was the first reveal.

If the APU basically never dropped its clocks, except in extremely rare cases (and then only by a few percent), they wouldn't bother with this downclocking SmartShift tech, which surely isn't cheap to design into the SoC. It also wouldn't explain how they could go from 2.0/3.0 GHz to 2.23/3.5 GHz if the problem it addresses would never happen anyway.
 
Ah right, I get you! Yeah, if you listen from about five minutes on from here, he explains the consistency across PS5s, how it's a continuously varying boost (complete with graphic!), mentions AMD SmartShift, says why they have the max boost clock they do, gives an example of a 2% clock drop saving 10% power, and also says this:

"... [of 2.23 ghz] we expect the GPU to spend most of it's time at, or close to, that frequency and performance"
.


So basically, if you see someone saying "PS5 spends most of its time at 2.23, Cerny said!" or "It only ever drops by 2%, Cerny said!" you now know they're either mistaken, or attempting to mislead.

His actual words are far more interesting and enlightening than some of the console war edits going round. :D

??? Now I'm lost...
First, because that has nothing to do with AVX, and second, because since you have variable clocks adjusted every 2 ms according to the load, you could never say anything different. Could you?
Besides, the 2% drop was mentioned in a worst-case scenario where, for some reason, you would exceed the power envelope on either the CPU or GPU and SmartShift was unable to compensate. In that case a drop of a couple of percent in frequency would allow you to gain 10% power back.
 
??? Now I'm lost...
First, because that has nothing to do with AVX, and second, because since you have variable clocks adjusted every 2 ms according to the load, you could never say anything different. Could you?
Besides, the 2% drop was mentioned in a worst-case scenario where, for some reason, you would exceed the power envelope on either the CPU or GPU and SmartShift was unable to compensate. In that case a drop of a couple of percent in frequency would allow you to gain 10% power back.
It's been a while since I watched the Cerny presentation, but I'm pretty sure it was "a couple of percent to drop consumption 10%", which doesn't necessarily mean it's exactly 2%. Also, the power consumption drop isn't linear at all; in fact, "a couple of percent to drop consumption 10%" sounds more like they've already gone past the optimal clock range for that SoC.
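
That matches the usual dynamic-power picture, where P scales roughly with f·V² and voltage has to climb with frequency near the top of the curve, so a small clock drop buys a disproportionate power saving. Here's a minimal sketch with a completely made-up linear V/f relationship, just to show the shape of the effect; none of these numbers are the PS5's actual voltage/frequency table.

# Rough sketch of why a ~2% frequency drop can save ~10% power.
# Uses the classic dynamic-power model P ~ f * V^2 with an invented
# linear V(f) near the top of the range; purely illustrative numbers.

F_MAX = 2.23   # GHz, the max boost clock discussed in the thread
V_MAX = 1.0    # arbitrary voltage units at F_MAX
V_SLOPE = 2.0  # made-up: voltage falls ~2% per 1% frequency drop up here

def rel_power(freq_ghz):
    """Relative dynamic power at a given frequency (arbitrary units)."""
    volts = V_MAX + V_SLOPE * V_MAX * (freq_ghz - F_MAX) / F_MAX
    return freq_ghz * volts ** 2

p_full = rel_power(F_MAX)
p_down = rel_power(F_MAX * 0.98)  # ~2% clock drop
print(f"power saved: {100 * (1 - p_down / p_full):.1f}%")  # ~10% with these numbers

The exact shape depends entirely on the real V/f curve, which is why the "couple of percent for 10%" figure shouldn't be read as a general rule that extrapolates further down the curve.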
 
??? Now I'm lost...
First, because that has nothing to do with AVX, and second, because since you have variable clocks adjusted every 2 ms according to the load, you could never say anything different. Could you?
Besides, the 2% drop was mentioned in a worst-case scenario where, for some reason, you would exceed the power envelope on either the CPU or GPU and SmartShift was unable to compensate. In that case a drop of a couple of percent in frequency would allow you to gain 10% power back.

That part was mostly about the GPU and CPU power and clock balance (including with AVX loads), but didn't specifically mention AVX. To hear him specifically say that AVX256 takes a lot of power (which is obvious), you'd have to have scanned back 50 seconds.


You did hear him say that with fixed clocks, running at 3.0 GHz on the CPU was causing headaches though, with the context being heat and power. Take a wild guess at what's likely to take the most power and generate the most heat. ;)

(And yes, he's basically saying in this presentation that they chose not to provision for full AVX load at full clocks under all conditions, whereas at Hot Chips MS explicitly gave their max AVX figure and said that the clock is constant at 3.8 GHz and the GPU has no impact. Different approaches; both are valid, both have tradeoffs.)

And just because you have the capacity to vary clocks every 2 ms doesn't mean you will. You will only do that if you need to.

And finally: The 2% GPU drop was not a worst case example. It was merely "an example" - of how frequency can affect power. He did not give a worst case example. He did not attempt to. He doesn't know what the worst case will be.

But I can tell you that without a doubt the worst case loads can vary by far, far more than 10% from the typical.
 
This won't happen. Even for BC they want to deactivate CUs, downclock, etc. Somehow Sony is not able to achieve BC on "standard" x86 hardware. How is this still possible in 2020 (or even on PS4 Pro)?
These two things are unrelated. BC has nothing to do with it.

PS4 and Xbox One both got mid-lifetime updates giving developers more CPU or memory resources on already-released hardware. Xbox One got a clock bump after production had already started.
 
But I can tell you that without a doubt the worst case loads can vary by far, far more than 10% from the typical.

I know this for a fact. But people are free to believe what they want. It's still a 9 to 10 TF console at heart; SmartShift helped it out a lot versus being a static 9 TF. You basically get what PC GPUs have had for years, just the other way around: not temperature dependent, but power dependent.
 
It's been a while since I watched the Cerny presentation, but I'm pretty sure it was "a couple of percent to drop consumption 10%", which doesn't necessarily mean it's exactly 2%. Also, the power consumption drop isn't linear at all; in fact, "a couple of percent to drop consumption 10%" sounds more like they've already gone past the optimal clock range for that SoC.

That's a shame, because I was going to underclock my PS5 by around 700 MHz and have it consume no power at all.

Damn you, physics!
 
Ah right, I get you! Yeah, if you listen from about five minutes on from here, he explains the consistency across PS5s, how it's a continuously varying boost (complete with graphic!), mentions AMD SmartShift, says why they have the max boost clock they do, gives an example of a 2% clock drop saving 10% power, and also says this:

"... [of 2.23 ghz] we expect the GPU to spend most of it's time at, or close to, that frequency and performance"
.


So basically, if you see someone saying "PS5 spends most of its time at 2.23, Cerny said!" or "It only ever drops by 2%, Cerny said!" you now know they're either mistaken, or attempting to mislead.

His actual words are far more interesting and enlightening than some of the console war edits going round. :D
It has already been confirmed by a known insider at ResetEra working closely on PS5 games (the mod Matt) that what Cerny said was true. The PS5 GPU clock is variable, yes, but it should be considered a ~10 TF machine, as it spends most of its time at that level.
 
It has already been confirmed by a known insider at ResetEra working closely on PS5 games (the mod Matt) that what Cerny said was true. The PS5 GPU clock is variable, yes, but it should be considered a ~10 TF machine, as it spends most of its time at that level.

I mean, my whole spiel is that people should listen to Cerny. But real Cerny and not some console war spin on what he said.

I really don't have a problem with some dev saying it spends most of its time at the clocks that make it a "~10 TF machine". It's what I'd expect based on what I've seen, especially when it's early days. Of course, as the machine is pushed harder through release and in the years to come, those clocks will probably drop a little, just as the XSX will probably push its PSU and aspects of its cooling harder.

Edit: I'm also expecting fewer of the sharp, narrow troughs you see on some PC frequency graphs! I'm sure Sony will have tools to help you engineer out PC-style wild dips. No doubt the profiler and the lack of a PC-style driver will help!
 
Is there a source on that? We know the clocks can be fixed on dev kits, but I've not seen anything that says the clocks can be fixed at 2.23, much less regardless of whatever the heck you throw at it!

Why in the hell would you want fixed clocks on the dev kits? If your game won't cause any downclocking, you basically have a fixed clock without any intervention (flipping a fixed-clock switch). If your game does force downclocking, wouldn't you want to know so you can address it as soon as possible?
 