Predict: Next gen console tech (10th generation edition) [2028+]

It wasn't PR. The presentation was intended as a reveal for developers at GDC. It wasn't ever targeting the general population and wasn't in any way trying to relate to the public. It was simply Cerny explaining the architecture and thinking behind it, focussing on the customisations because "It's a ZEN/RDNA AMD SOC" doesn't need any explaining.

Oh, it definitely was PR. Not solely, but it was definitely written as a pitch to the gaming media and the likes of YouTube influencers.

For a start it's not really a developer reveal months after PS5 devkits and documentation are already in the hands of developers. And you don't actually have to explain to professional developers that e.g. you have to build environments around hiding loading.

There are 16 million views of the video on the official consumer PlayStation YouTube channel alone. What proportion of those are likely to be developers? And would developers actually go to the advert-publishing official consumer PlayStation YouTube channel for development resources when there are no others there? And is that really where you'd think to upload a day-one presentation for devs?

So yeah, it was very clearly a pitch to the games press (who mirrored the video) and to YouTube influencers as much as to developers. And it was clearly very successful given the viewing figures, the buzz in the gaming press (and pseudo gaming press), and that, for example, the video gave just about everyone the impression that the "Geometry Engine" was some piece of secret sauce unique to PS5.

As another example, if R2PS5 really was for developers and not also intended for general marketing and hype building, then when talking about the power requirements of CPU vector instructions and boost clocks, they might have mentioned that they'd cut the vector units in half! That's the kind of thing developers might actually need to be aware of when you're talking about exactly that thing...?

I enjoyed R2PS5, but it hid as much as it revealed, it was as much flavour as substance, and it was the perfect time and place to do technical marketing to the general gaming public - months after the machine was already in the hands of developers.

Next gen will be no different. Marketing blending with technical information, obfuscation built into the reveal, and a massively undue emphasis placed on limited differentiating characteristics. MS and Sony have both done it and will continue to do it. I'll be a bit more sceptical of everyone and everything next time round.
 
Thanks, that saves me some time having to write up something myself. lol Well said. It was far too dumbed down if it was genuinely intended for developers.
 
It was a butterfly setup IIRC. When running BC they shut off half the GPU; the easiest, most straightforward way to do BC. They didn't have enhanced games IIRC; only games that were patched could run at higher settings.

I expect it's the same thing for PS4 BC on PS5, with the exception of "game boost" titles that have been modified for some quasi-enhanced mode. I think Xbox does something similar in terms of quasi emulation.
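
Purely to illustrate the idea being described here (this is a toy sketch with made-up names and speculative numbers, not how Sony's firmware actually works), a BC mode on a butterfly-style or larger chip just means enabling the CU count and clock the original title was built against and leaving the rest of the GPU dark:

```python
# Toy illustration of "shut off part of the GPU for BC" (hypothetical values/names).
# A legacy mode simply enables the CU count and clock the original title was
# built against and leaves the remaining CUs powered down.

LEGACY_MODES = {
    "ps4_base": {"active_cus": 18, "clock_mhz": 800},   # original PS4 profile
    "ps4_pro":  {"active_cus": 36, "clock_mhz": 911},   # Pro-patched profile
    "native":   {"active_cus": 60, "clock_mhz": 2500},  # speculative full chip
}

def configure_gpu(total_cus: int, mode: str) -> list[bool]:
    """Return a per-CU enable mask: the first N CUs on, the rest powered down."""
    active = LEGACY_MODES[mode]["active_cus"]
    if active > total_cus:
        raise ValueError("mode needs more CUs than the chip has")
    return [i < active for i in range(total_cus)]

# e.g. a speculative 60 CU part running an unpatched PS4 title:
mask = configure_gpu(total_cus=60, mode="ps4_base")
print(sum(mask), "CUs enabled of", len(mask))   # -> 18 CUs enabled of 60
```

Whether the real hardware can gate CUs at that granularity, or has to switch off whole shader engines, is what the rest of the thread goes back and forth on.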
Correct, but that's why I say it's odd that people ever acted as though a larger GPU would pose any difficulty for BC.
 
Correct, but that's why I say it's odd that people ever acted as though a larger GPU would pose any difficulty for BC.
It’s certainly less work, as it runs natively without emulation or patching. I don’t think a larger GPU is the issue, just whether the number of CUs would still be a multiple of 18 and/or 36. So 54 or 72 CUs is actually what I expected; the latter being a cost issue, 54 makes the most sense. But that doesn’t seem to be in the leaks.
 
Why does it have to be a multiple of 18? What does the scheduling type work flow management do that requires everything to be a multiple of 18? Why can't just 18 or 36 CUs be addressable for BC modes?
 
Why does it have to be a multiple of 18? What does the scheduling type work flow management do that requires everything to be a multiple of 18? Why can't just 18 or 36 CUs be addressable for BC modes?

I'd guess that below a certain point, the compiled code is talking to specific microcode or directly to individual bits of hardware. Without some kind of abstraction things might behave unpredictably or not work at all, so it's best to have a hardware mode that works exactly as the old one did (similar to how the Mega Drive had a Master System mode).

MS did a lot of work to enable games to run on newer hardware with their VMs and stuff, and on PC things are at an even higher level (though with a lot of overhead).
 
I'd guess that below a certain point, the compiled code is talking to specific microcode or directly to individual bits of hardware.
But how does that map onto the CUs such that they need to be kept in groups of 18? What's the structure that's 18 CUs wide, or... I dunno. The whole point of work schedulers is that they scale across all hardware, which is why PC GPUs can have any number of CUs, from small parts for portables up to monster rigs. I get that Sony devs may have been poking at specific hardware, but 1) is that still a thing, and 2) what precisely is the barrier to the scaling we get on PC?
 
Why does it have to be a multiple of 18? What does the scheduling type work flow management do that requires everything to be a multiple of 18? Why can't just 18 or 36 CUs be addressable for BC modes?
My understanding is that they shut off shader engines (18 CUs per shader engine) to provide native running rather than running via emulation. That was my understanding of how it worked at a high level, at least. To be able to use more hardware, a patch or actual emulation code is required.
 
What determines a shader engine has to be 18 CUs? Can they not have an 18 CU engine and then an arbitrary other one? What happens if a shader engine is 24 CUs instead of 18 - how does that break the game?
 
I do find the whole multiples of 18 system very elegant.

On PS4:
18CU @ 800MHz - PS4 | Base Mode

On PS4 Pro:
18CU @ 800MHz - PS4 | Base Mode
18CU @ 911MHz - PS4 | Boost Mode
36CU @ 911MHz - PS4 Pro Patched

On PS5:
18CU @ 800MHz - PS4 | Base Mode
18CU @ 2233MHz - PS4 | Boost Mode
36CU @ 911MHz - PS4 Pro Patched
36CU @ 2233MHz - PS4 Pro Patched | Boost Mode
36CU @ 2233MHz - PS5 | Base Mode


And thinking ahead, for example:

On PS5 Pro:
18CU @ 800MHz - PS4 | Base Mode
18CU @ 2500MHz - PS4 | Boost Mode
36CU @ 911MHz - PS4 Pro Patched
36CU @ 2500MHz - PS4 Pro Patched | Boost Mode
36CU @ 2233MHz - PS5 | Base Mode
36CU @ 2500MHz - PS5 | Boost Mode
54CU @ 2500MHz - PS5 Pro Patched

On PS6:
18CU @ 800MHz - PS4 | Base Mode
18CU @ 3300MHz - PS4 | Boost Mode
36CU @ 911MHz - PS4 Pro Patched
36CU @ 3300MHz - PS4 Pro Patched | Boost Mode
36CU @ 2233MHz - PS5 | Base Mode
36CU @ 2500MHz - PS5 | Boost Mode
54CU @ 2500MHz - PS5 Pro Patched
54CU @ 3300MHz - PS5 Pro Patched | Boost Mode
108CU @ 3300MHz - PS6 | Base Mode


As to whether it's a requirement to provide easier compatibility at this point, I honestly don't know. But it sure does satisfy my borderline OCD tendencies.

(also, not entirely sure if all modes on PS5 are handled quite like above..)
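
To put rough numbers on a few of the modes above, here's a quick back-of-envelope sketch (my own arithmetic, assuming the usual 64 FP32 lanes per CU and 2 ops per clock; the Pro and PS6 rows are as speculative as the list itself):

```python
# Back-of-envelope FP32 throughput for a few of the modes listed above.
# Assumes 64 FP32 lanes per CU and 2 ops/clock (FMA); the Pro/PS6 entries
# are pure speculation, like the list they come from.
def tflops(cus: int, mhz: float) -> float:
    return cus * 64 * 2 * mhz * 1e6 / 1e12

modes = [
    ("PS4 base         (18 CU @  800 MHz)", 18, 800),
    ("PS4 Pro patched  (36 CU @  911 MHz)", 36, 911),
    ("PS5 base         (36 CU @ 2233 MHz)", 36, 2233),
    ("Speculative Pro  (54 CU @ 2500 MHz)", 54, 2500),
    ("Speculative PS6 (108 CU @ 3300 MHz)", 108, 3300),
]

for name, cus, mhz in modes:
    print(f"{name}: {tflops(cus, mhz):5.2f} TFLOPs")
# -> 1.84, 4.20, 10.29, 17.28, 45.62
```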
 
PS5 Pro is 60 CUs according to 2 trustworthy leakers (also the only ones we have). The second specified 60 CUs activated. If designed properly, there should be no problem activating only 18/36 CUs for BC. The PS5 Pro GPU will likely be heavily modified for BC, like PS5 was for PS4 BC.
 
What determines a shader engine has to be 18 CUs? Can they not have an 18 CU engine and then an arbitrary other one? What happens if a shader engine is 24 CUs instead of 18 - how does that break the game?
They decide how many CUs are in a shader engine. They have redundancy to account for, which is why they're a little off from the equivalent PC GPUs of the architecture they're based on.

What breaks the game? If you use the hardware natively and run a binary compiled for a different SoC and a different GPU, with a different scheduler, against that specific version of the SDK at that time, would it be expecting no more than 18 CUs and 32 ROPs? I dunno, this is well beyond me. On the MS side they have similar challenges, except that they have virtual machines responsible for emulating BC titles, which have been optimized to run across all the hardware. And any tweaking and performance gains can come from how they modify the VM.

Without those virtual machines I don’t think MS could do backwards compatibility period. They would probably have to follow what Sony has been doing.
 
But how does that map onto the CUs such that they need to be kept in groups of 18? What's the structure that's 18 CUs wide, or... I dunno. The whole point of work schedulers is that they scale across all hardware, which is why PC GPUs can have any number of CUs, from small parts for portables up to monster rigs. I get that Sony devs may have been poking at specific hardware, but 1) is that still a thing, and 2) what precisely is the barrier to the scaling we get on PC?

It's 18 CUs on PS4 (with 9 per shader engine IIRC), and I think the games are distributed as builds that talk directly to PS4-specific microcode(?) built specifically for this hardware. In compatibility mode the PS4 Pro is probably using microcode that behaves very similarly to how it did on the original PS4, and it was probably much easier to do this by having hardware that could run in a "PS4 Amateur" mode.

I'm pretty sure I remember MS talking about how they had to do a lot of work to get the BC they have, and I think they also talked about having to do some clever stuff with microcode, but I can't find it now so that may be an alcohol-induced hallucination.

During their Hot Chips presentation for Series X, one of the Xbox developers talked about how Xbox games are built using a "smash driver" model where the game and the driver are built together, resulting in vastly lower overheads for DX12 than on PC. On PC, the driver is separate and specific to each piece of hardware. That means great scalability, but the game that's distributed needs installing, needs its shaders compiling, doesn't really talk directly to the GPU, and there's a lot more CPU overhead.

(I think the above is fairly accurate, but if anyone in the know has anything to add or correct, please go for it!)
 
I believe BC is more important to MS than ever. Imagine getting the next-gen Xbox only to find out that your Game Pass subscription gets you next-gen-only games, which at launch would be around 10 or so, and you can no longer play the remaining 300+ games Series X and S owners can. That would greatly reduce Game Pass value at the start of a new generation, and I can't see MS sacrificing BC for the sake of a better SoC. This makes MS more limited in hardware options, maybe even more than Sony.
 
I believe BC is more important to MS than ever. Imagine getting the next-gen Xbox only to find out that your Game Pass subscription gets you next-gen-only games, which at launch would be around 10 or so, and you can no longer play the remaining 300+ games Series X and S owners can. That would greatly reduce Game Pass value at the start of a new generation, and I can't see MS sacrificing BC for the sake of a better SoC. This makes MS more limited in hardware options, maybe even more than Sony.

I think you've hit on a very important point, and while I partially agree with you, I have almost the complete opposite take on it.

Agree, Back-compat is more important to MS than Sony.
However I think Back-compat is basically essential to BOTH, neither company could afford to launch a new console without almost full back-compat.
I also think that back-compat is much easier for MS to achieve than Sony. Their console software stack uses a VM, and potentially multiple layers of VMs, to provide at least one level of software abstraction
to the console games themselves. In addition, the abstraction layer that DX / GDK provides also helps in any future efforts to provide back-compat for this gen.
I think this actually makes it much EASIER for MS to change the underlying hardware and still maintain 99% or more back-compat.

I don't know much about the details of Sony's SDK interface, but if they want to provide full back-compat in the future they need to emulate things at a much lower level,
and it becomes a lot trickier to do so. I think it also makes it much harder for them to go with a fully brand-new architecture.

tldr: MS has much more flexibility in their console HW options going forward,
and eventually Sony's lack of a VM and use of direct-to-metal SDKs will actually end up hurting their ability to create brand new HW for their consoles.
 
In theory, because Microsoft has the Xbox Series S, the new console can technically maintain library backwards compatibility just by being backwards compatible with the Series S and not the Series X. Which in turn means, in theory, they have much more headroom to absorb any efficiency loss from an architecture departure.

There's going to be a licensing component here, but I wonder if they can provide "backwards" compatibility via PC/Windows to some extent as well, if they need to go that route.
 
It's 18 CUs on PS4 (with 9 per shader engine IIRC), and I think the games are distributed as builds that talk directly to PS4-specific microcode(?) built specifically for this hardware.
But what do they talk to that they need everything grouped in 18s? If they address the shaders directly (I remember hearing that was the case talking about PS4Pro's design), indexed one to eighteen (or zero to seventeen!), more shaders shouldn't affect it. If they talk to the schedulers, don't they distribute work across available CUs?

I guess what I'm saying is GPUs are designed to scale arbitrarily, and I don't understand what use case breaks that. Why can't we have a 60 CU part with the bottom 18 marked or addressable for BC mode or something, and an extra 42 CUs for PS6 mode? Why would the entire GPU have to be structured around multiples of 18 due to PS4's design?
 
There's going to be a licensing component here, but I wonder if they can provide "backwards" compatibility via PC/Windows to some extent as well, if they need to go that route.

I strongly suspect that MS is now talking about licensing in a "for Xbox platform" way, rather than a "for specific console" way.
Specifically so they don't run into the problems of the past, of having to go back and re-license music or other assets for the same game running on a new platform via BC.

OT (but not really): Where's my Tony Hawk 1+2 on Game Pass, Microsoft?!!
 
But what do they talk to that they need everything grouped in 18s? If they address the shaders directly (I remember hearing that was the case talking about PS4Pro's design), indexed one to eighteen (or zero to seventeen!), more shaders shouldn't affect it. If they talk to the schedulers, don't they distribute work across available CUs?

I guess what I'm saying is GPUs are designed to scale arbitrarily, and I don't understand what use case breaks that. Why can't we have a 60 CU part with the bottom 18 marked or addressable for BC mode or something, and an extra 42 CUs for PS6 mode? Why would the entire GPU have to be structured around multiples of 18 due to PS4's design?

I don't know, unfortunately. Like you, I think it'd be interesting to know.

Maybe the schedulers distributed work across the GPU in a way that wasn't guaranteed to be 'safe' for games baked for PS4. Who knows what assumptions and optimisations Sony were able to make when they were designing the PS4. It's interesting that the PS4 Pro had PS4-matched clocks in its core (original?) PS4 compatibility mode. Something there would seem to be timing dependent - at least if you wanted to be super sure. MS meanwhile could alter clocks at will with even just a refresh, but then again their software is supposed to be more abstracted.

I guess the closer you get to the metal and the more you bake into the product, the easier it is to break something with a change of hardware. Someone should message Mark Cerny!
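
For what it's worth, here's a toy model (Python, everything made up) of the two sides of the argument: a scheduler happily spreads wavefronts over whatever CUs are enabled, and the only thing that breaks is code or microcode that bakes in the old physical topology or timing:

```python
# Toy model of a GPU front-end distributing wavefronts over the enabled CUs.
# Hypothetical throughout; it just illustrates why an arbitrary CU count
# *should* be fine, unless something assumes the old physical layout.
from itertools import cycle

def dispatch(wavefronts: int, enabled_cus: list[int]) -> dict[int, int]:
    """Round-robin wavefronts across whichever CUs are enabled."""
    load = {cu: 0 for cu in enabled_cus}
    for _, cu in zip(range(wavefronts), cycle(enabled_cus)):
        load[cu] += 1
    return load

# A title that only submits work and waits for results doesn't care whether
# 18, 36 or 60 CUs executed it:
print(dispatch(1000, enabled_cus=list(range(18))))
print(dispatch(1000, enabled_cus=list(range(60))))

# Trouble only appears if the compiled game or its microcode assumes the
# PS4-era topology (e.g. "there are exactly 18 CUs") or PS4-era timing,
# which is why a dedicated mode that reproduces the old CU count and clock
# is the safe option.
```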
 