This may be due to there being no mention of it in low-level system code, or a lack of provision for it in the command packets submitted to the GPU. Enabling various fixed-function features means sending commands to the command processor that set things like rasterizer modes, things not observable at the compute ISA level. If there's no reference in the system code, or the command formats lack a place to enable the feature, then the feature cannot be toggled. The lack of a reference isn't a guarantee of absence, since GPU microcode can be updated and the command processors can enact new behaviors afterwards, but it's also not an encouraging sign that nothing mentions it.

I'm completely unfamiliar with the term "no microcode reference."
With no API, I could still imagine the hardware is there but yet to be exposed. Might the absence of a microcode reference be a relatively strong indication that the hardware simply isn't there to expose?
It at least has a form similar to what AMD called primitive shaders for Vega and later, though those were more focused on the culling aspect and didn't delve as much into the deferred-attribute portion.

Very interesting...
Does the PS5 have primitive shaders (as per AMD patent https://patentimages.storage.googleapis.com/66/54/00/c86d30f0c1c61e/US20200193703A1.pdf)?
Some of the wording from Sony about what the PS5's primitive shaders can do indicates capabilities that AMD only mentioned as future possibilities.
One thing I noted is how little AMD has talked about primitive shaders since the fanfare around the failed attempt with Vega. We know the culling-focused primitive shaders are used much more substantially in the most recent generation, but there's barely a blip in the marketing. Meanwhile, Sony's more feature-laden description gets called primitive shaders, so did they sort of take over the marketing name?
Microsoft's solution was able to handle standard Zen 2 FPU thermal density with a silicone thermal compound. Sony's solution has a liquid-metal interface and substantially increased GPU thermal density, so it seems to have some notably better measures for handling heat without cutting the silicon.

MS want commonality with the PC - that's the reason for the move to the GDK. Games that do use AVX-256 need to be able to run just as fast on Xbox with minimal work. It might also increase the flexibility of cloud units - they won't always be full up with games.
The downside is that MS have to be able to deal with the high thermal density of AVX-256 no matter what, while staying almost silent. A tiny bit more die, too.
Sony probably have a bit more leeway to cut back on this particular aspect of the CPU. It's all about dem tradeoffs, and they won't be the same for everyone.
I'm still not sure about the wins from reducing the FPU. In the grand scheme of things, was Sony that desperate to limit that dimension of the chip? There's non-CPU silicon on either end of the Zen 2 section, and it looks like at least half of the saved area didn't go into silicon with any known use, so they potentially hobbled performance for a handful of mm²?