Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

They literally stated that they only use primitive shaders for geometry that is not performant using Nanite (foliage, hair, grass, etc.).

And there is no reason the Xbox or PC couldn't also use primitive shaders/mesh shaders for that geometry as well.

Are you sure about the primitive shaders part? I seem to remember Epic only talking about doing rasterization in compute for small triangles and using hw rasterization for larger triangles. The actual format in which the geometry is stored is still unknown. There are some guesses that the geometry could be somehow baked into textures + mip levels to allow streaming and LOD.
 
Where? Can you share the link?
In one of the presentations they had a slide stating that only 10% of the geometry being rendered in the demo was using primitive shaders, or something along those lines. The same slide stated that Nanite only works for rigid geometry, at least for now. And I think in the Digital Foundry article after the demo was released they quoted somebody at Epic saying that primitive shaders are a fallback rasterization technique for geometry that Nanite can't really handle well. This info is out there.
 
I found this quote. So it does indeed look like primitive shaders are being used on PS5 to gain performance.

The vast majority of triangles are software rasterised using hyper-optimised compute shaders specifically designed for the advantages we can exploit. As a result, we've been able to leave hardware rasterisers in the dust at this specific task. Software rasterisation is a core component of Nanite that allows it to achieve what it does. We can't beat hardware rasterisers in all cases though so we'll use hardware when we've determined it's the faster path. On PlayStation 5 we use primitive shaders for that path which is considerably faster than using the old pipeline we had before with vertex shaders.
https://wccftech.com/lumen-gi-uses-...nly-nanite-exploits-primitive-shaders-on-ps5/
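
To make the path split described in that quote a bit more concrete, here is a rough CPU-side sketch of the kind of per-cluster decision involved: roughly pixel-sized triangles go to a compute-based software rasterizer, larger ones go to the hardware rasterizer (via primitive/mesh shaders where available). The struct, the size estimate, and the one-pixel threshold are illustrative assumptions, not Epic's actual code.

```cpp
// Illustrative per-cluster rasterizer path selection. All names, the size
// estimate, and the 1-pixel threshold are assumptions, not Nanite's code.
#include <cmath>

struct Cluster {
    float boundingRadius;  // object-space bounding-sphere radius
    float viewDepth;       // distance from the camera
    int   triangleCount;
};

enum class RasterPath { SoftwareCompute, HardwarePrimitiveShader };

// Rough average projected triangle size in pixels, assuming a simple pinhole
// projection where focalLengthPx is the screen-space scale factor.
float EstimateTriangleSizePx(const Cluster& c, float focalLengthPx) {
    float projectedRadiusPx = focalLengthPx * c.boundingRadius / c.viewDepth;
    float clusterAreaPx = 3.14159f * projectedRadiusPx * projectedRadiusPx;
    return std::sqrt(clusterAreaPx / static_cast<float>(c.triangleCount));
}

// Pixel-sized (or smaller) triangles go to the software compute rasterizer;
// larger ones go to the hardware rasterizer path.
RasterPath ChoosePath(const Cluster& c, float focalLengthPx) {
    const float kSmallTrianglePx = 1.0f;  // assumed cutoff
    return EstimateTriangleSizePx(c, focalLengthPx) <= kSmallTrianglePx
               ? RasterPath::SoftwareCompute
               : RasterPath::HardwarePrimitiveShader;
}
```

On PS5 the hardware branch would be the primitive-shader path the quote mentions; on PC and Xbox it would be mesh shaders or the traditional pipeline.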
 
The quote referencing the PlayStation's shaders is ambiguous. You can take it to mean that:

- the PS5's primitive shaders are used when the hardware path beats the software path, i.e. they mix compute-shader rasterization and hardware rasterization within the same engine on the same hardware,

Or

- the PS5 is an example of the hardware path being the faster option, and everything on PS5 uses the primitive shaders in hardware.

The calculations above, if true, as well as the potential complexity of mixing and matching in software, probably suggest the latter?
 
I believe using sw rasterization refers to those triangles that are one pixel or smaller, which become very inefficient to shade with hw. But you are right, technically it's possible to interpret it in multiple ways.
 
@CorralX on Era; he is a dev.

https://twitter.com/Corralx

https://www.resetera.com/threads/graphic-fidelity-i-expect-next-gen.184101/post-65676278

Nanite and mesh shaders work at different levels of abstraction.
Mesh shaders are about a new way of feeding geometry to the rasterizer, allowing cooperative multitasking similar to compute in the graphics pipeline (one of the biggest constraints of the traditional pipeline is that every thread is independent and cannot use any information produced by the others), and avoiding some fixed-function h/w blocks becoming the bottleneck with arbitrary amounts of geometry.
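
(To make the "cooperative" point above concrete, here is a rough CPU-side analogue of what one mesh-shader workgroup does: all lanes share the same meshlet data and jointly produce one compacted primitive list, instead of each thread processing a single vertex in isolation. The meshlet layout and the toy culling test are assumptions for illustration, not any particular API or engine.)

```cpp
// CPU-side analogue of one mesh-shader workgroup: lanes share meshlet data
// (like groupshared memory) and cooperatively build one output primitive list.
// The Meshlet layout and the backface test are illustrative assumptions.
#include <array>
#include <cstddef>
#include <cstdint>
#include <vector>

struct Meshlet {
    std::vector<std::array<float, 3>> positions;     // shared vertex data
    std::vector<std::array<uint32_t, 3>> triangles;  // indices into positions
};

std::vector<std::array<uint32_t, 3>> CullMeshlet(const Meshlet& m,
                                                 unsigned laneCount) {
    std::vector<std::array<uint32_t, 3>> visible;
    // Each "lane" walks a strided slice of the meshlet's triangles; survivors
    // are compacted into a single list handed to the rasterizer as a group.
    for (unsigned lane = 0; lane < laneCount; ++lane) {
        for (std::size_t t = lane; t < m.triangles.size(); t += laneCount) {
            const auto& tri = m.triangles[t];
            const auto& a = m.positions[tri[0]];
            const auto& b = m.positions[tri[1]];
            const auto& c = m.positions[tri[2]];
            // Toy screen-space backface test (sign of the 2D cross product).
            float nz = (b[0] - a[0]) * (c[1] - a[1]) -
                       (b[1] - a[1]) * (c[0] - a[0]);
            if (nz > 0.0f) visible.push_back(tri);  // keep front-facing tris
        }
    }
    return visible;  // in a real mesh shader these become the emitted primitives
}
```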

Nanite is about how you handle and process geometry in an efficient way, and decide *what* to send to the rasterizer in the first place. If tris are big enough, you feed them to the h/w rasterizer through mesh/primitive shaders (not a requirement though, you could use the traditional pipeline as well); if they're 1-pixel-sized or smaller, you feed them to a software-based compute rasterizer (which can be faster than the h/w one).
In a sense Nanite is both a client and a superset of mesh shaders, and it does LOD management and tris classification on top to decide what the optimal path is for each.
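
(As a rough illustration of the LOD-management side, a generic cluster-LOD heuristic might pick the coarsest level whose projected simplification error stays under about a pixel. The error metric and threshold here are assumptions, not Nanite's actual criterion.)

```cpp
// Generic cluster-LOD selection sketch: pick the coarsest LOD whose projected
// error is below ~1 pixel. Error metric and threshold are assumptions.
#include <vector>

struct ClusterLod {
    float simplificationError;  // object-space error introduced by this LOD
};

// lods are ordered finest (index 0) to coarsest; focalLengthPx converts an
// object-space error at the given view depth into pixels.
int SelectLod(const std::vector<ClusterLod>& lods,
              float viewDepth, float focalLengthPx) {
    const float kMaxErrorPx = 1.0f;  // assumed "invisible" error budget
    int chosen = 0;                  // fall back to the finest LOD
    for (int i = static_cast<int>(lods.size()) - 1; i >= 0; --i) {
        float errorPx = focalLengthPx * lods[i].simplificationError / viewDepth;
        if (errorPx <= kMaxErrorPx) {
            chosen = i;  // coarsest LOD that still fits the error budget
            break;
        }
    }
    return chosen;
}
```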

Primitive shaders is AMD's name for the h/w blocks that implement mesh shaders ("mesh shaders" being both Nvidia's name for the h/w feature and DirectX's name for the API functionality).
Somewhat confusingly, AMD decided to stick with the primitive shaders name they introduced with Vega GPUs, which was a proto-mesh-shader functionality, but less flexible and not intended to be exposed to developers; instead, the driver was supposed to automatically transform Vertex/Geometry/Tessellation workloads into primitive shaders.
That never really worked (outside of synthetic benchmarks and non-gaming applications), but the hardware was extended and re-purposed to implement mesh shaders in RDNA, which are explicitly exposed and left for developers to implement.
In practice, there are no public details about the differences between Nvidia's and AMD's implementations of mesh shaders, and there's no real use case yet, as adoption in games is going to be even slower than ray tracing.
 
I believe using sw rasterization refers to those triangles that are one pixel or smaller, which become very inefficient to shade with hw.
Pretty sure that even with Nanite, UE5 uses deferred shading. Triangle sizes don't matter with deferred shading, since all pixel shading (in compute or via pixel shaders) happens for the full-screen quad anyway, with the attributes being fetched from the already rasterized geometry buffer (normals, albedo, roughness, etc.).

The same goes for the rasterizers not being used efficiently: as long as those rasterizers are faster than the SW kernel, you simply don't care about their efficiency, and if you can beat them in SW then you go for it (probably, because you can still use async to do more by abusing the rasterizers and doing some compute in parallel).
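
To illustrate the deferred-shading point above: once the G-buffer is written, shading cost is one evaluation per screen pixel, regardless of how many or how small the rasterized triangles were. This is a minimal sketch with simplified structures and a toy Lambert term, not UE5's actual shading code.

```cpp
// Minimal deferred-shading sketch: shading reads per-pixel attributes from an
// already rasterized G-buffer, so its cost is independent of triangle size.
// Structures and the lighting model are simplified assumptions.
#include <array>
#include <cstddef>
#include <vector>

struct GBufferTexel {
    std::array<float, 3> albedo;
    std::array<float, 3> normal;  // world-space, normalized
    float roughness;              // unused by the toy light model below
};

// Full-screen pass: one shading evaluation per pixel, whether the attributes
// were written by the hardware rasterizer or the software compute rasterizer.
std::vector<std::array<float, 3>> ShadeDeferred(
    const std::vector<GBufferTexel>& gbuffer,
    const std::array<float, 3>& lightDir) {
    std::vector<std::array<float, 3>> output(gbuffer.size());
    for (std::size_t i = 0; i < gbuffer.size(); ++i) {
        const GBufferTexel& g = gbuffer[i];
        // Toy Lambert term; a real engine would evaluate a full BRDF here.
        float ndotl = g.normal[0] * lightDir[0] + g.normal[1] * lightDir[1] +
                      g.normal[2] * lightDir[2];
        if (ndotl < 0.0f) ndotl = 0.0f;
        for (int c = 0; c < 3; ++c) output[i][c] = g.albedo[c] * ndotl;
    }
    return output;
}
```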
 
Really cool stuff in UE5. Collaborative level building seems useful. The graphics look sterile without plants, vegetation, etc. This would be perfect for Fallout as it is. I suppose vegetation and other stuff comes later and isn't the technological focus here.
 
The preview or "early access" version and the new demo are available to download now.

https://www.unrealengine.com/en-US/unreal-engine-5

The Valley of the Ancient demo is around 100 GB.

System requirements for the demo, which needs Early Access UE5 to run:


"Valley of the Ancient is a separate download of around 100 GB. If you want to run the full demo, the minimum system requirements are an NVIDIA GTX 1080 or AMD RX Vega 64 graphics card or higher, with 8 GB of VRAM and 32 GB of system RAM. For 30 FPS, we recommend a 12-core CPU at 3.4 GHz, with an NVIDIA RTX 2080 or AMD Radeon 5700 XT graphics card or higher, and 64 GB of system RAM. We have successfully run the demo on PlayStation 5 and Xbox Series X consoles at full performance."
 
The new AA solution, Temporal Super Resolution, keeps up with all this new geometric detail to create a sharper, more stable image than before, with quality approaching true native 4K at the cost of 1080p.
 
The new AA solution, Temporal Super Resolution, keeps up with all this new geometric detail to create a sharper, more stable image than before, with quality approaching true native 4K at the cost of 1080p.

That made me think their Lumen performance improvements have ended up as just "upscale more". ;)
 
This is super sweet.

Hard for someone to look at all this free, powerful stuff and not want to use it. I'm sitting here looking at my Unity project, pure jelly.

It also should be more tempting than ever for people to just move to UE5 movies. Which is also an option for sure.

I also have a hard time believing my PC can handle that level of rendering in the editor itself. Curious to see the requirements here.
 
Not only does it run fine on all platforms, but they imitated the Ratchet world-switch effect. Condolences to the console warriors throughout this thread. The preview looks amazing; grabbing this to start working in it this week.
 