Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

Wukong has some pretty crazy geometric detail courtesy of Nanite in places.

[Attached screenshots: Screen-Shot00001.jpg, Screen-Shot00003.jpg]
 
MegaLights docs are up.
So MegaLights uses Hardware Lumen by default; it also uses ray-traced shadows instead of Virtual Shadow Maps by default, because ray-traced shadows are cheaper to generate.

MegaLights exposes two shadowing methods, which can be selected per light using light component’s properties:
  • Ray Tracing is the default and recommended method. It doesn't add any extra cost per light and can achieve correct area shadows. The downside is that the quality of the shadows depends on the simplified Ray Tracing Scene.
  • Virtual Shadow Maps trace rays against a Virtual Shadow Map. Virtual Shadow Maps are generated per light using rasterization and can capture full Nanite mesh details. The downside is that they can only approximate area shadows, and they have a significant additional per-light cost in terms of memory, CPU, and GPU overhead used to generate shadow depths. This should be used sparingly.
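To make the per-light selection concrete, here is a minimal C++ sketch. Every name in it (`EShadowMethod`, `LightDesc`, `CountVsmLights`) is made up for illustration and is not the actual UE5 API; the point is only that the shadowing method is a per-light property, with ray tracing as the default:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical types, not the real UE5 API: each light carries its own
// shadowing method, mirroring the two options described in the docs.
enum class EShadowMethod { RayTracing, VirtualShadowMap };

struct LightDesc {
    std::string Name;
    EShadowMethod Shadow = EShadowMethod::RayTracing; // default per the docs
};

// VSMs carry a significant per-light cost, so a budget check like this
// is one way a project might keep their use sparing.
int CountVsmLights(const std::vector<LightDesc>& Lights) {
    int Count = 0;
    for (const LightDesc& L : Lights)
        if (L.Shadow == EShadowMethod::VirtualShadowMap)
            ++Count;
    return Count;
}
```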

Ideally, MegaLights should be used with Lumen HWRT, which allows sharing Ray Tracing Scene overhead and optimizations between both systems.
 
So MegaLights uses Hardware Lumen by default; it also uses ray-traced shadows instead of Virtual Shadow Maps by default, because ray-traced shadows are cheaper to generate.
It uses hardware triangle ray tracing by default - it is not directly related to Lumen. As noted in the docs, the suggestion is to use it together with hardware RT Lumen, because then the cost of generating the RT acceleration structures is at least shared.

The VSM tradeoff is similar - if you are spending time to generate an RT acceleration structure it is generally cheaper to sample it rather than additionally generating a second entirely different structure (SDFs, shadow maps, etc) to sample. There are of course exceptions as noted in the docs, but usually if you can afford to have a RT scene of sufficient quality then using it for multiple queries is going to be the best.
 
@DavidGraham The doc is pretty nice and thorough in describing pros and cons. I do kind of wonder about the scene, though. In an outdoor scene where there's basically one major light (the sun), would VSMs win in that case? I remember some of those YouTube demos with people playing around, and some of them suggested toggling MegaLights per scene. Would be interesting to see switching between indoors and outdoors in more open-world type games, and where the pros and cons fall.

Also curious about their importance sampling, and whether it's novel or similar to known solutions like ReSTIR.
 
@DavidGraham The doc is pretty nice and thorough in describing pros and cons. I do kind of wonder about the scene, though. In an outdoor scene where there's basically one major light (the sun), would VSMs win in that case? I remember some of those YouTube demos with people playing around, and some of them suggested toggling MegaLights per scene. Would be interesting to see switching between indoors and outdoors in more open-world type games, and where the pros and cons fall.
MegaLights does not currently support directional lights (e.g. the sun). As I noted earlier in the thread, there's no fundamental reason, but the tradeoffs definitely swing a bit, since you typically want sharper shadows over larger areas, which makes it less feasible to maintain an appropriate RT structure. Nanite's LOD system and VSMs work well in that kind of case, whereas with local lights, sharing a global structure to query obviously scales much better.

In any case you don't really need to toggle MegaLights per scene - you can mix and match lights through both paths. VSM local lights can still go through MegaLights for projection, which garners some of the benefits.

Also curious about their importance sampling, and whether it's novel or similar to known solutions like ReSTIR.
You can skim the source if you are interested in the nitty gritty details. At a very broad level these methods are similar in that they are ways to stochastically pick light samples, but the specifics are pretty different (but still evolving).
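For context on the comparison: at a broad level, ReSTIR-style methods stochastically pick one light per pixel using weighted reservoir sampling. Below is a minimal, self-contained C++ sketch of just that building block; the weights stand in for per-light contribution estimates, and MegaLights' actual heuristics are not public and, per the thread, still evolving:

```cpp
#include <cassert>
#include <random>
#include <vector>

// One weighted reservoir sample over a stream of candidate lights:
// the basic building block behind ReSTIR-style stochastic light picking.
struct Reservoir {
    int   PickedLight = -1;  // index of the surviving sample
    float WeightSum   = 0.f; // running sum of candidate weights
};

void Update(Reservoir& R, int LightIndex, float Weight, float Rand01) {
    R.WeightSum += Weight;
    // Replace the current pick with probability Weight / WeightSum,
    // which yields a sample distributed proportionally to the weights.
    if (R.WeightSum > 0.f && Rand01 < Weight / R.WeightSum)
        R.PickedLight = LightIndex;
}

int PickLight(const std::vector<float>& Weights, std::mt19937& Rng) {
    std::uniform_real_distribution<float> U(0.f, 1.f);
    Reservoir R;
    for (int i = 0; i < static_cast<int>(Weights.size()); ++i)
        Update(R, i, Weights[i], U(Rng));
    return R.PickedLight;
}
```

A zero-weight light is never picked, and the single nonzero-weight light always survives, which makes the behavior easy to sanity-check.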
 
Nice and quick behind-the-scenes story on MegaLights. Five posts in total.


Krzysztof Narkowicz said:
MegaLights release took me a bit by surprise. It was just a small prototype on a backburner, not planned to be showcased in UE 5.5. Now trying to catch up after the demo push, starting from writing the official docs which just went online.

Years ago I talked to a bunch of companies at GDC and everyone was bringing up optimizing number of lights, disabling shadows with distance, light overlap... It felt like there was a big opportunity to transform their workflows, so I started prototyping ideas between other work. Main goal was to make it work across all target HW, including consoles. Otherwise it would just add extra work for artists setting up another lighting scenario, and high-end GPU lighting would be constrained by content choices, which had to be made to ship on the slower hardware. Two months ago there was this crazy idea: could we dogfood the latest prototype using a demo and shape it into an experimental feature? It was quite a ride, but thanks to @TiagoCostaV joining MegaLights, help from other engineers and an amazing art team we made it.

It’s early days. Lots of ideas on P4 shelves, algorithms to try and work left. Not ready yet to talk about tech details. Still, officially the cat is out of the bag, so it’s time to buckle up and deliver on the promise. Can't wait to see the first games shipping with this tech.
 


This is interesting. I don't know if someone could use MegaLights in the Matrix Awakens demo, but it would be interesting to see the cost of the BVH build with Nanite meshes.

Nanite Fallback Mesh triangle count and number of instances included in the Ray Tracing Scene will affect ray tracing BVH build times, used memory, and ray tracing performance. We recommend carefully increasing them according to the available performance and memory budget of your project.

And
For non real-time rendering, it’s also possible to use r.RayTracing.Nanite.Mode 1, which builds the Ray Tracing Scene out of full detail Nanite Meshes. This has a large performance and memory cost and can result in hitches during scene or camera animation when the Nanite LOD cut changes and its BVH needs to be rebuilt.
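As a hedged example, a cvar like that can be pinned at startup through the engine's `ConsoleVariables.ini` (a standard UE mechanism; verify the exact file and section against your engine version's docs):

```ini
; Engine/Config/ConsoleVariables.ini
[Startup]
; Build the Ray Tracing Scene from full-detail Nanite meshes.
; Large performance/memory cost - offline rendering and cinematics only.
r.RayTracing.Nanite.Mode=1
```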

It would be interesting to see the cost of ray tracing against a micropolygon BVH.

EDIT: At least during night mode with emissive light
 
OMG, are they using Perforce? Why, this is crazy!
Yeah it's kind of sad but the reality is all the other systems suck a lot with big binary data in the repo. And you do really need one system for both the binary data and code changes in game development to keep all the serialization changes and hashes aligned.

I wish there was a better solution, but for bigger projects everything else I've tried breaks in bad ways with 1TB+ repositories (that's the sync size, not including the history) and huge files.
This is interesting. I don't know if someone could use MegaLights in the Matrix Awakens demo, but it would be interesting to see the cost of the BVH build with Nanite meshes.
There aren't any local lights in CitySample by default, so you'd have to add them or MegaLights wouldn't do anything. The RT scene is already set up there of course, but mainly for the level of quality needed for reflections and GI, which may not be good enough for sharp local light shadows. That said would be easy to try and a fun experiment.

r.RayTracing.Nanite.Mode 1 is good for offline stuff and cinematics but not really usable for real-time. We need API changes before true Nanite streaming w/ RT can be done well.
 
Yeah it's kind of sad but the reality is all the other systems suck a lot with big binary data in the repo. And you do really need one system for both the binary data and code changes in game development to keep all the serialization changes and hashes aligned.

I can understand why you would want to use Perforce for Fortnite, but the repo for Unreal Engine should not have that many binary assets. And you have a source dump on Github as well!
 
I can understand why you would want to use Perforce for Fortnite, but the repo for Unreal Engine should not have that many binary assets. And you have a source dump on Github as well!
There's *tons* of samples, internal/external demos, QA assets and other things that need to stay in sync. How do you think we develop the engine without test content? :)

The source dump on github is indeed mirrored from the P4, but it doesn't include content, so things like serialization/content changes that happen between engine releases are not usually available until the next engine release. In the vast majority of cases the engine will handle content backwards compatibility but sometimes there will be unsupported intermediate states.

Don't get me wrong, I'd love to use git or similar for source, but the reality of often having to commit content and code in synchronized atomic chunks trumps it.
 
Of course you know better than me what is better for Unreal Engine development. However it feels like the games industry is really bad at managing code in a good way, and I think that is one of the reasons why games today are built on years old versions of UE5.

On a side note, it was claimed that BioWare couldn't even version their patches on top of the Frostbite engine during development on DA2!
 
On a side note, it was claimed that BioWare couldn't even version their patches on top of the Fortnite engine during development on DA2!
I'm guessing that you meant to type "Frostbite Engine" :unsure: . Did you mean DA: Inquisition or DA: Veilguard? DA2 used BioWare's internal engine, not Frostbite.
 
Of course you know better than me what is better for Unreal Engine development. However it feels like the games industry is really bad at managing code in a good way, and I think that is one of the reasons why games today are built on years old versions of UE5.

On a side note, it was claimed that BioWare couldn't even version their patches on top of the Frostbite engine during development on DA2!
Games have very complex content and very complex runtime state -- we certainly do some things wrong, but I've heard this kind of complaint about not taking patches fast enough from traditional tech before, and people are usually disabused of it pretty quickly after coming over to games for a bit. Outside of specific types of games it's not very practical to update "underlying" tech that often -- it's a lot harder to have ultra clean interfaces where changes to dependencies don't impact state or performance, or to cover the possible surface area in tests.
 
I've heard this kind of complaint about not taking patches fast enough from traditional tech before, and people are usually disabused of it pretty quick after coming over to games for a bit.

Is it an inherent problem though or is it hard because of poor past decisions? There are similar fragility issues with legacy software in traditional tech too.
 
Outside of specific types of games it's not very practical to update "underlying" tech that often -- it's a lot harder to have ultra clean interfaces where changes to dependencies don't impact state or performance, or to cover the possible surface area in tests.

You don't need clean interfaces if you have good patch management. It's the same thing as out-of-tree drivers and file systems in Linux: they also have to continuously rebase onto the latest version.
 