MegaLights docs are up.

So MegaLights uses Hardware Lumen by default; it also uses ray traced shadows instead of Virtual Shadow Maps by default, because ray traced shadows are cheaper to generate.
MegaLights exposes two shadowing methods, which can be selected per light using the light component's properties:
- Ray Tracing is the default and recommended method. It doesn't add any extra cost per light and can achieve correct area shadows. The downside is that the quality of the shadows depends on the simplified Ray Tracing Scene.
- Virtual Shadow Maps traces rays against a Virtual Shadow Map. Virtual Shadow Maps are generated per light using rasterization and can capture full Nanite mesh details. The downside is that it can only approximate area shadows, and it has a significant additional per light cost in terms of memory, CPU, and GPU overhead used to generate shadow depths. This should be used sparingly.
Ideally, MegaLights should be used with Lumen HWRT, which allows sharing Ray Tracing Scene overhead and optimizations between both systems.
It uses hardware triangle ray tracing by default; it is not directly related to Lumen. As noted in the docs, the suggestion is to use it together with hardware RT Lumen, because then the cost of generating the RT acceleration structures is at least shared.
@DavidGraham The doc is pretty nice and thorough in describing pros and cons. I do kind of wonder about the scene, though. In an outdoor scene where there's basically one major light (the sun), would VSMs win in that case? I remember some of those YouTube demos with people playing around, and some of them suggested toggling MegaLights per scene. It would be interesting to see switching between indoors and outdoors in more open-world-type games, and where the pros and cons fall.

MegaLights does not currently support directional lights (e.g. the sun). As I noted earlier in the thread, there's no fundamental reason, but the tradeoffs definitely swing a bit, since you typically want sharper shadows over larger areas, which makes it less feasible to maintain an appropriate RT structure. Nanite's LOD system and VSMs work well in that kind of case, whereas with local lights, sharing a global structure to query obviously scales much better.
Also curious about their importance sampling, and whether it's novel, or similar to known solutions like ReSTIR.

You can skim the source if you are interested in the nitty-gritty details. At a very broad level these methods are similar in that they are ways to stochastically pick light samples, but the specifics are pretty different (and still evolving).
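For a concrete flavor of the "stochastically pick light samples" idea mentioned above, here is a minimal sketch of weighted reservoir sampling, the single-pass selection primitive at the heart of ReSTIR-style resampled importance sampling. This is an illustration of the known technique, not MegaLights' implementation; the light representation and the weight function are assumptions made up for the example.

```python
import random

def pick_light(lights, shade_weight, rng=None):
    """Weighted reservoir sampling: choose one light from a stream of
    candidates with probability proportional to its weight, in a single
    pass and O(1) extra memory, without knowing the total weight upfront.
    This is the core primitive behind ReSTIR-style resampled importance
    sampling; it is NOT MegaLights' actual algorithm."""
    rng = rng or random.Random()
    chosen, w_sum = None, 0.0
    for light in lights:
        w = shade_weight(light)  # e.g. an unshadowed-contribution estimate
        if w <= 0.0:
            continue
        w_sum += w
        # Keep the running pick, or replace it with probability w / w_sum;
        # after the loop, each light ends up chosen with prob w_i / sum(w).
        if rng.random() < w / w_sum:
            chosen = light
    return chosen, w_sum

# Usage sketch: weight each light by intensity over squared distance, a
# common (assumed) proxy for its unshadowed contribution at a shading point.
lights = [{"intensity": 10.0, "dist": 2.0}, {"intensity": 1.0, "dist": 1.0}]
light, total = pick_light(lights, lambda l: l["intensity"] / l["dist"] ** 2)
```

In a renderer the returned total weight is what lets the estimator stay unbiased (the chosen sample is reweighted by `w_sum / w_chosen`); the sketch just returns it alongside the pick.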
Krzysztof Narkowicz said: MegaLights release took me a bit by surprise. It was just a small prototype on a backburner, not planned to be showcased in UE 5.5. Now trying to catch up after the demo push, starting from writing the official docs which just went online.
Years ago I talked to a bunch of companies at GDC and everyone was bringing up optimizing number of lights, disabling shadows with distance, light overlap... It felt like there was a big opportunity to transform their workflows, so I started prototyping ideas between other work. Main goal was to make it work across all target HW, including consoles. Otherwise it would just add extra work for artists setting up another lighting scenario, and high-end GPU lighting would be constrained by content choices, which had to be made to ship on the slower hardware. Two months ago there was this crazy idea: could we dogfood the latest prototype using a demo and shape it into an experimental feature? It was quite a ride, but thanks to @TiagoCostaV joining MegaLights, help from other engineers and an amazing art team we made it.
It’s early days. Lots of ideas on P4 shelves, algorithms to try and work left. Not ready yet to talk about tech details. Still, officially the cat is out of the bag, so it’s time to buckle up and deliver on the promise. Can't wait to see the first games shipping with this tech.
Nice and quick behind-the-scenes story on MegaLights. Five posts in total.
Nanite Fallback Mesh triangle count and number of instances included in the Ray Tracing Scene will affect ray tracing BVH build times, used memory, and ray tracing performance. We recommend carefully increasing them according to the available performance and memory budget of your project.
For non real-time rendering, it’s also possible to use r.RayTracing.Nanite.Mode 1, which builds the Ray Tracing Scene out of full detail Nanite Meshes. This has a large performance and memory cost and can result in hitches during scene or camera animation when the Nanite LOD cut changes and its BVH needs to be rebuilt.
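For reference, that mode can be set from a config file or the in-game console; a sketch assuming the standard UE ConsoleVariables.ini mechanism (the file location and comment text are assumptions; only `r.RayTracing.Nanite.Mode 1` and its tradeoffs come from the docs excerpt above):

```ini
; Engine/Config/ConsoleVariables.ini (applied at startup), or enter the
; same variable in the in-game console.
; 1 = build the Ray Tracing Scene from full-detail Nanite meshes
;     (non real-time rendering only: large performance/memory cost, and
;     possible hitches when the Nanite LOD cut changes and the BVH rebuilds)
r.RayTracing.Nanite.Mode=1
```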
OMG, are they using Perforce? Why, this is crazy!

Yeah, it's kind of sad, but the reality is all the other systems suck a lot with big binary data in the repo. And you really do need one system for both the binary data and code changes in game development to keep all the serialization changes and hashes aligned.
This is interesting. I don't know if someone can use MegaLights in the Matrix Awakens demo, but it would be interesting to see the cost of the BVH build with Nanite meshes.

There aren't any local lights in CitySample by default, so you'd have to add them or MegaLights wouldn't do anything. The RT scene is already set up there, of course, but mainly for the level of quality needed for reflections and GI, which may not be good enough for sharp local light shadows. That said, it would be easy to try and a fun experiment.
I can understand why you would want to use Perforce for Fortnite, but the repo for Unreal Engine should not have that many binary assets. And you have a source dump on GitHub as well!

There's *tons* of samples, internal/external demos, QA assets, and other things that need to stay in sync. How do you think we develop the engine without test content?
On a side note, it was claimed that BioWare couldn't even version their patches on top of the Fortnite engine during development on DA2!

I'm guessing that you meant to type "Frostbite Engine". Did you mean DA: Inquisition or DA: Veilguard? DA2 used BioWare's internal engine, not Frostbite.
Of course you know better than me what is best for Unreal Engine development. However, it feels like the games industry is really bad at managing code in a good way, and I think that is one of the reasons why games today are built on years-old versions of UE5.

Games have very complex content and very complex runtime state -- we certainly do some things wrong, but I've heard this kind of complaint about not taking patches fast enough from traditional tech before, and people are usually disabused of it pretty quickly after coming over to games for a bit. Outside of specific types of games it's not very practical to update "underlying" tech that often -- it's a lot harder to have ultra-clean interfaces where changes to dependencies don't impact state or performance, or to cover the possible surface area in tests.