Digital Foundry Article Technical Discussion [2024]

I don’t see any reason to believe the PS5 is not using the RDNA2 front end, especially now that Remedy says support is essentially the same. The ability to output a single primitive from a mesh shader would be a critical difference.
 
Sebbbi and others have indicated it in the other tweet found on the other thread.

It’s a single difference, and it’s the only one between the two. We know AMD rushed to get this ready for RDNA2, because it wasn’t until RDNA3 that mesh shading was properly sorted out.

For alignment’s sake, the entirety of DX12U, the Series consoles, and AMD GPUs all moved as a single group. MS cited that they needed to wait for AMD to finish, and this was long after developers had had a year with PS5 devkits. I don’t think it’s plausible that AMD could have finished RDNA2 a whole year earlier, with RDNA1 just released and Sony already there with kits while MS wasn’t. It’s just not a likely scenario.
 
DX12U is an abstraction of the hardware. These arbitrary requirements are not necessarily of any importance in a console environment that doesn't use DX.
 
I don’t see any indication in sebbbi’s tweet that PS5 is RDNA1 or RDNA2. He just says AMD calls it primitive shaders, which is what AMD uses at the driver/hardware level: DX12U mesh shaders implemented with primitive shaders.
 
Unless my comprehension is wrong:

He’s saying that the first implementation didn’t match DX12 perfectly and was not exposed. Sony calls this primitive shaders.

The latter sentence refers to the one prior. As in, Sony is still using AMD’s first implementation of mesh shaders.
 
Don't know who this Locuza person is, how credible they are or what their track record is. As for sebbbi's tweet, all he says is that Sony still calls the feature primitive shaders. That doesn't really determine whether it's RDNA1 or RDNA2. What we do know is Remedy said the mesh shader implementation is almost the same as PC, which means it should be able to take meshlets, skip the input assembler, and output primitives to the rasterizer. AMD, starting with RDNA2, supports mesh shaders by mapping them onto their NGG pipeline. They basically reuse the NGG pipeline and made changes to support the mesh shader model. So what's the likely scenario here? Sounds to me like PS5 is RDNA2, and when Sony announced the hardware they just used AMD's actual hardware names (NGG, primitive shaders) instead of using the Mesh Shader name, which is an API-level abstraction from Microsoft and then Vulkan.
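For what it's worth, on the PC/Xbox side that API-level abstraction is all an application ever sees: a capability tier and a dispatch call, with the NGG/primitive shader mapping hidden underneath. A minimal D3D12 sketch of that (error handling omitted; `device` and `cmdList` are assumed to already exist, and the helper function names here are just illustrative):

```cpp
#include <d3d12.h>

// Query whether the driver exposes mesh shaders at all. On AMD this bit is
// only reported on RDNA2+ parts, regardless of how the hardware (NGG /
// primitive shaders) implements the feature underneath.
bool SupportsMeshShaders(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                           &options7, sizeof(options7))))
        return false;
    return options7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1;
}

// Launching work is a single call on the command list: X*Y*Z threadgroups of
// either the amplification shader (if one is bound) or the mesh shader.
void DrawMeshlets(ID3D12GraphicsCommandList6* cmdList, UINT meshletGroups)
{
    cmdList->DispatchMesh(meshletGroups, 1, 1);
}
```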
 
Almost the same means it isn't the same; I think that much is pretty evident. The missing piece, from what we've been able to piece together, is that the PS5 doesn't support amplification shaders, which are part of what MS calls Mesh Shaders in DX12. Hence Remedy choosing not to use amplification shaders.

Regards,
SB
 
In the video I think they say there are small differences like recommended meshlet sizes.

Also how and where did this get pieced together?
 
Going back to DF Direct 135 where Alex talks about AW2:


"...they're also using Mesh Shaders, or Primitive Shaders on PS5, and actually when we were at Gamescom I did talk with Tatu Alto about this who was their TD (Technical Director) and he mentioned that they were using Mesh Shaders or the equivalent of Primitive Shading on PS5 which is something that RDNA 1 has, it's very close to Mesh Shading - just doesn't have the amplification stage that's about it - and there they're using it to fine grain cull away geometry using meshlets <snip>"

Going by this conversation it certainly doesn't sound like Primitive Shaders on PS5 can do everything that Mesh and Amplification shaders on Series X / RDNA2 can do. So it seems unlikely it's using the same front end.

From the MS developer blog on Mesh and Amplification shaders:

"What does an Amplification Shader do?
While the Mesh Shader is a fairly flexible tool, it does not allow for all tessellation scenarios and is not always the most efficient way to implement per-instance culling. For this we have the Amplification Shader. What it does is simple: dispatch threadgroups of Mesh Shaders. Each Mesh Shader has access to the data from the parent Amplification Shader and does not return anything. The Amplification Shader is optional, and also has access to groupshared memory, making it a powerful tool to allow the Mesh Shader to replace any current pipeline scenario.
"

Also, RDNA1 doesn't support per-primitive attributes. Perhaps PS5 does, but there's been nothing to suggest it's different to RDNA1 so far, afaik.
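As an aside on the "fine grain cull away geometry using meshlets" part of the quote above: whether the test runs in an amplification shader (Series X / PC RDNA2) or inside the mesh/primitive shader itself, the per-meshlet work is the same kind of bounding-volume test. A rough C++ sketch of the usual sphere plus normal-cone backface test (the struct layout and function names are illustrative, not Remedy's):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  Sub(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static float Length(Vec3 v)      { return std::sqrt(Dot(v, v)); }

// Per-meshlet culling data, precomputed offline when the mesh is split into
// meshlets (typically a few dozen vertices/triangles each).
struct MeshletBounds {
    Vec3  center;      // bounding sphere center
    float radius;      // bounding sphere radius
    Vec3  coneAxis;    // average triangle facing direction
    float coneCutoff;  // cosine of the cone half-angle, widened to be conservative
};

// Returns true if the whole meshlet can be skipped because every triangle in
// it faces away from the camera. Frustum and occlusion tests would be layered
// on top of this in the same place.
bool MeshletIsBackfacing(const MeshletBounds& m, Vec3 cameraPos)
{
    Vec3  toMeshlet = Sub(m.center, cameraPos);
    float dist      = Length(toMeshlet);
    return Dot(toMeshlet, m.coneAxis) >= m.coneCutoff * dist + m.radius;
}
```

The point of the amplification stage in the MS blog quote is just that a test like this can reject an entire threadgroup of mesh shader work before it ever launches.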
 
DX12 supports task shaders, used in conjunction with mesh shaders, which is an Nvidia solution. AMD has primitive shaders, which are analogous to mesh shaders, but instead of amplification/task shaders AMD initially offered surface shaders: pre-tessellation shaders that require use of the hardware tessellator. Task shaders on AMD are implemented with compute shaders that run in parallel with the graphics pipeline, and most of the hard part of making that work is hidden behind firmware.
 
Yes. But amplification shaders still have a hardware requirement to work. They can’t be backported to RDNA1 as per Timothy’s blog. It would never work.

As far as I understand his blog, hardware support is required for the graphics and compute queues to communicate with each other in a task.

Yes it’s a compute shader. But it is highly specialized. It can be emulated, but you’ll miss out on the efficiency compared to the hardware variant.
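To make the "graphics and compute queues communicate with each other" point concrete: conceptually, what the firmware does on RDNA2 is the same cross-queue handshake an application could write by hand with a fence, except a hand-rolled version goes through memory rather than keeping the data on chip. A bare-bones D3D12 sketch of that handshake (queue, fence and command list creation omitted; this is an illustration of the pattern, not what the driver literally does internally):

```cpp
#include <d3d12.h>

// Run a compute workload (the task-shader-like part) on the async compute
// queue, then have the graphics queue wait on it before consuming the result.
void SubmitTaskThenDraw(ID3D12CommandQueue*       computeQueue,
                        ID3D12CommandQueue*       graphicsQueue,
                        ID3D12CommandList* const* computeLists,
                        ID3D12CommandList* const* graphicsLists,
                        ID3D12Fence*              fence,
                        UINT64                    fenceValue)
{
    // Compute work goes first; signal the fence when it completes on the GPU.
    computeQueue->ExecuteCommandLists(1, computeLists);
    computeQueue->Signal(fence, fenceValue);

    // The graphics queue stalls GPU-side (no CPU wait) until the signal,
    // then runs the draw / mesh-shading work that depends on the result.
    graphicsQueue->Wait(fence, fenceValue);
    graphicsQueue->ExecuteCommandLists(1, graphicsLists);
}
```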
 
Timur or Timothy? Timur mentions that RDNA1 is missing some features, such as per-primitive outputs, that make mesh shading support impossible. I don't recall anything specifically targeting task shaders.
 
Timur. My thanks on the correction.

The issue comes down to this: to realize the speed gains, the data has to stay on chip, and the graphics pipeline and compute shaders don’t share a cache. For there to be a benefit, both have to be able to access each other’s cache; otherwise, with emulation, you’re always sending data back to RAM.
 
Yes it’s a compute shader. But it is highly specialized. It can be emulated, but you’ll miss out on the efficiency compared to the hardware variant.
I suspect on fixed hardware this is more of an academic point than a practical one; you can schedule around whatever tiny inefficiency you encounter here, and it’s unlikely to be a bottleneck in a real production. On PC any wrinkle like this is a much bigger challenge due to the variety of hardware configurations, not to mention the split memory.
 
I also suspect this is much ado about nothing. DX is more of an anchor than an enabler of efficient hardware usage.
 
Reviewing Timur's blog:

The relevant details here are that most of the hard work is implemented in the firmware (good news, because that means I don’t have to implement it), and that task shaders are executed on an async compute queue and that the driver now has to submit compute and graphics work in parallel.

Keep in mind that the API hides this detail and pretends that the mesh shading pipeline is just another graphics pipeline that the application can submit to a graphics queue. So, once again we have a mismatch between the API programming model and what the HW actually does.

Task shaders (amplification shaders) don't map to AMD hardware on RDNA2. They basically make them work on an async compute queue and there are firmware changes to handle the dispatch. I would guess RDNA1 is obviously missing the necessary firmware updates, but that's likely because RDNA1 can't support mesh shaders anyway, and the only purpose of task shaders is to launch mesh shaders.

NGG (Next Generation Geometry) is the technology that is responsible for any vertex and geometry processing in RDNA GPUs (with some caveats). Also known as “primitive shader”, the main innovations of NGG are:
  • Shaders are aware of not only vertices, but also primitives (this is why they are called primitive shaders).
  • The output topology is entirely up to the shader, meaning that it can create output vertices and primitives with an arbitrary topology regardless of its input.
  • On RDNA2 and newer, per-primitive output attributes are also supported.

So the change here to support mesh shaders on RDNA is per-primitive output attributes.

From the DF interview with Remedy:


Mesh shaders were picked just because of nicely uniform support on the platforms, on the console platforms, and the pc that we're running. And then the fact that it's possible to build more granular culling ... So it was kind of natural to take that as the basic building block going forward, and it's also supported on the current generation consoles so that's kind of like the low end for, or the baseline, for the stuff that we do now since the previous generation was cropped out.

Playstation is actually very close to pc specs, so I don't think we did anything specific to meshlets really. The meshlet sizes are slightly different on platforms, but that's basically conversion time stuff, so not really. Obviously there's been a lot of platform specific optimizations, but those are not so much specific to Playstation, Xbox or AMD or Nvidia, but they are kind of like general optimizations that help usually everything on all of the platforms, and some of them happen to work a bit better on some platform than other ones.

What I'm getting from this is that the API Sony provides is essentially so close to a mesh shader implementation that they had a general solution that worked across all of the platforms, to the point that they didn't have to do different optimizations for each platform.

So my conclusion from this is that the PlayStation 5 likely has the RDNA2 changes to support mesh shaders, because otherwise it would require a tailored solution. In terms of task shader (amplification shader) support, RDNA2 on PC and Xbox doesn't map it well in hardware either. It's compute shaders dispatched by firmware made to work like task shaders. Maybe PS5 doesn't have that firmware update. Maybe it does. But I'm not really seeing anything that would tell me the PS5 isn't capable, and since it most likely has the mesh shader changes it probably has the firmware changes for task shaders too.

Edit ** It's either that or RDNA1's NGG is so close to RDNA2's that they can essentially implement the same thing, and there was some small technicality that prevented them from complying with the spec on PC. Or they had no way to reliably push the firmware change for amplification to GPUs that were already sold. The most likely conclusion, for me, is that these GPUs are practically the same. It's APUs for consoles launching at almost the same time from the same vendor, with the same CPU family, and likely the same GPU family **

When Sony talked about the PS5 hardware they likely discussed it in terms of NGG and primitive shaders, because that's how the hardware is actually implemented (on PC, Xbox and PlayStation). They're not likely to talk about "mesh shaders" because that was a Microsoft API concept that was actually modelled closer to Nvidia hardware. Sony's API, on the other hand, sounds like it has shaders that function so similarly to PC mesh shaders that you can build a generalized solution that works for both. They might call it primitive shaders or something, but the ergonomics of the API sound like they're almost the same. It's close enough that Remedy is calling it mesh shaders, calling it uniform support and saying it didn't require specific optimization.
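One last aside on the "meshlet sizes are slightly different on platforms, but that's basically conversion time stuff" remark: it just means the offline meshlet builder takes the size limits as parameters, so retargeting a platform is a re-export of the asset data rather than a shader rewrite. A toy greedy builder to show the shape of it (hypothetical types and limits, not Remedy's pipeline; real tools like meshoptimizer also optimize for vertex locality and cull-cone quality):

```cpp
#include <cstdint>
#include <set>
#include <vector>

struct Meshlet {
    std::vector<uint32_t> vertices;   // indices into the mesh vertex buffer
    std::vector<uint32_t> triangles;  // 3 entries per triangle, local to 'vertices'
};

// Greedily pack triangles into meshlets, respecting per-platform limits
// (e.g. 64 vertices / 126 triangles is one common recommendation; the exact
// numbers are the per-platform "conversion time" part).
std::vector<Meshlet> BuildMeshlets(const std::vector<uint32_t>& indices,
                                   size_t maxVertices, size_t maxTriangles)
{
    std::vector<Meshlet> meshlets(1);
    std::set<uint32_t>   used;  // vertices referenced by the current meshlet

    for (size_t i = 0; i + 2 < indices.size(); i += 3) {
        Meshlet& m = meshlets.back();

        // Count how many new vertices this triangle would add.
        size_t newVerts = 0;
        for (size_t k = 0; k < 3; ++k)
            if (!used.count(indices[i + k])) ++newVerts;

        // Start a new meshlet if either limit would be exceeded.
        if (m.vertices.size() + newVerts > maxVertices ||
            m.triangles.size() / 3 + 1 > maxTriangles) {
            meshlets.emplace_back();
            used.clear();
        }

        Meshlet& cur = meshlets.back();
        for (size_t k = 0; k < 3; ++k) {
            uint32_t v = indices[i + k];
            if (!used.count(v)) {
                used.insert(v);
                cur.vertices.push_back(v);
            }
            // Store the triangle as local offsets into cur.vertices.
            uint32_t local = 0;
            while (cur.vertices[local] != v) ++local;
            cur.triangles.push_back(local);
        }
    }
    return meshlets;
}
```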
 
Happy to see The Finals get some love. It's nice to see the destruction make a comeback. In terms of the actual game, I'd say it's actually a game where RT GI is an advantage and you do not want to turn it off. If you set it to static GI, many areas will become way too dark as interior lights and things are destroyed; you're just fighting in darkness. At least with my 3080 I have Nvidia RTX GI set to dynamic epic and I can still get very high fps with DLSS quality. My optimized settings are everything low, except draw distance epic, textures epic and RTX global illumination dynamic epic (keep global illumination resolution on low as that seems to be the real perf killer). What you want is the light to accumulate quickly so rooms will brighten as walls are blown open, but it doesn't need to be super accurate. Draw distance probably on low if you have an older CPU. I think shadows on low disables most shadows, but bumping to medium will give you pretty much all of the shadows at a lower quality. Don't think the hit is too big on that one, but I wanted the extra frames. Pretty sure I'm normally in the 160-220 range at 1440p (DLSS quality) on an RTX 3080.

Edit: usually in these games draw distance does not affect players, so you can safely set draw distance to low if your cpu isn't up to the higher settings. Also, I'd try rtx global illumination dynamic low before static if you need the performance for the reasons explained above and as mentioned in the video. You want the extra light in constrained spaces.
 