Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

I'm not sure I totally buy the distinction you're making... feedback algorithms for multi-bounce are taking samples that are themselves "multi-bounce" after the first frame. And if you want to draw the line at doing that without taking additional visibility samples, ReSTIR, NRC and friends do the same with both temporal and neighbor path reuse.

Well that’s the thing: NRC and ReSTIR don’t rely solely on reuse and feedback. They’re also tracing raw multi-bounce paths each frame. Lumen may also be able to claim this in the near future, as one proposed enhancement is to bypass the surface cache altogether on future hardware. Lumen may one day cast multi-bounce paths each frame and cache those paths in screen space, similar to ReSTIR.
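For context on what "reuse" means mechanically here, a minimal sketch of the weighted-reservoir idea at the heart of ReSTIR (illustrative Python; the names and structure are mine, not any shipping implementation):

```python
import random

class Reservoir:
    # Minimal weighted reservoir in the ReSTIR spirit: stream candidates in
    # and keep one with probability proportional to its weight.
    def __init__(self):
        self.sample = None   # the surviving candidate
        self.w_sum = 0.0     # total weight seen so far
        self.m = 0           # number of candidates seen

    def update(self, sample, weight):
        self.w_sum += weight
        self.m += 1
        if self.w_sum > 0.0 and random.random() < weight / self.w_sum:
            self.sample = sample

    def merge(self, other):
        # Temporal reuse: last frame's reservoir enters as a single candidate
        # carrying its whole accumulated weight, so the history is reused
        # without re-tracing the paths that built it.
        m_before = self.m
        self.update(other.sample, other.w_sum)
        self.m = m_before + other.m
```

Spatial (neighbor) reuse is the same `merge` applied to nearby pixels' reservoirs, which is why the line between "fresh samples" and "feedback" gets blurry.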

Lumen is really an amazing feat of software engineering, but compared to other solutions it does seem really complicated, and maybe unnecessarily so. It uses 3 radiance caches (screen space, texture space and world space probes), 3 geometry representations (detail SDF, global SDF and triangle BVH) and 3 tracing methods (screen, SDF, HWRT). That’s a ton of stuff to support. Hopefully it was worth it, but we’re already seeing signs that in real use cases some of those options may go unused. In the Matrix demo, for example, BVH tracing was a net quality win over SDFs and performance was similar.
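The way those three tracing methods typically combine is a fallback chain: try the cheapest applicable trace first, then fall through. A rough sketch with purely hypothetical function names (this is not Lumen's actual code):

```python
def hybrid_trace(ray, tracers):
    # Try each tracing method in priority order (e.g. screen trace first,
    # then SDF trace, then hardware RT). A tracer returns None when it
    # cannot resolve the ray, e.g. the hit point is offscreen for a screen
    # trace, and we fall through to the next, more expensive method.
    for trace in tracers:
        hit = trace(ray)
        if hit is not None:
            return hit
    return None  # total miss: caller falls back to sky / world-space probes

# Stand-in tracers for illustration only.
screen_trace = lambda ray: None      # pretend the hit point was offscreen
sdf_trace = lambda ray: "sdf-hit"    # the SDF trace resolves the ray
```

So `hybrid_trace(ray, [screen_trace, sdf_trace])` returns the SDF result here; the cost of supporting the scheme is that every representation in the chain has to stay roughly consistent with the others.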

 
I don't know what you mean here. The PDFs are a part of the BRDF which is per-pixel.
After finding a hit, the next ray direction is selected purely based on probability, instead of relying on separating lighting into components and solving them individually.
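To illustrate "selected purely based on probability": a minimal sketch of cosine-weighted hemisphere sampling for a diffuse bounce, where the sampling density alone picks the next direction (illustrative Python, not any engine's code):

```python
import math
import random

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cosine_weighted_direction(normal):
    # Pick the next bounce direction with pdf proportional to cos(theta)
    # about `normal`: the probability density alone decides where the ray
    # goes, with no separate diffuse/specular/GI passes to stitch together.
    u1, u2 = random.random(), random.random()
    r, phi = math.sqrt(u1), 2.0 * math.pi * u2
    local = (r * math.cos(phi), r * math.sin(phi), math.sqrt(1.0 - u1))
    # Build an orthonormal basis (t, b, normal) to rotate into world space.
    helper = (1.0, 0.0, 0.0) if abs(normal[0]) < 0.9 else (0.0, 1.0, 0.0)
    t = normalize(cross(helper, normal))
    b = cross(normal, t)
    return tuple(local[0] * t[i] + local[1] * b[i] + local[2] * normal[i]
                 for i in range(3))
```

A full path tracer would sample the whole BRDF this way (picking a specular or diffuse lobe probabilistically too), which is exactly the "no separate lighting components" property being described.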

I wouldn't be that strict on the term, personally. Today's GPUs don't have enough bandwidth to implement the offline world's techniques. I'm OK with real-time path-tracing definitions for games. They certainly can't be done the same way and will need tricks/hardware/special drivers/etc. in order to get something looking fairly decent.
I agree that it can't, and it's amazing what NV achieved in Cyberpunk. Even in the offline world, path tracing was long regarded as an impractical, purely theoretical algorithm that would never be used in production. Still, we have established terminology, and IMO "full ray tracing / path tracing" should rather be called "hybrid recursive ray tracing"; "path tracing" should be reserved for a 10090ti or whatever other insane GPU NV releases in the future, indicating that it can now render images just like in the movies.
 
In my opinion, this whole ray-tracing technology is overrated for its usefulness. It doesn't really add as much to a game's visuals as can be achieved with other, traditional methods, e.g. a high-quality cube map in racing games. It's good for some things and technically interesting, but there are probably a dozen other areas of game graphics that could be fundamentally more spectacular. One example is the professional application of HDR, which requires expertise rather than a lot of hardware performance.

UE5's software Lumen is a good thing: it can be implemented on current consoles, delivering good-enough lighting and shading this generation for Nanite's almost unlimited geometry.

Furthermore, I think it's more important that nowadays much more spectacular graphics can be achieved with quality assets, artistic arrangement, and even static or hybrid lighting, for which the current-generation consoles are well suited! That is why implementing really good graphics is nowadays much more a matter of MONEY.
 
That new UE5 game looks really impressive. I'm surprised it is coming out in the summer. Can this give me some hope that other games may be able to move to UE5.1 despite being developed a while ago? Final Fantasy 7 Rebirth, let's go
 
@QPlayer Nah, rasterized games are way off in terms of lighting. There are a ton of problems that never get solved. If you bake lighting you can get a natural look, but it's very limiting because you can't have dynamic environments.

Also one of the main drivers is saving money. Baking out lighting for blockbuster games is a huge money and time sink. It’s a dead end.
 
Lumen is really an amazing feat of software engineering but compared to other solutions it does seem really complicated and maybe unnecessarily so. It uses 3 radiance caches (screen space, texture space and world space probes), 3 geometry representations (detail SDF, global SDF and triangle BVH) and 3 tracing methods (screen, SDF, HWRT). That’s a ton of stuff to support.
I think that's unfortunately the nature of the beast in an engine that is used for such a variety of uses. At least currently, each of these axes has compelling use cases for certain types of content, and each falls off a cliff for some other types.

Hopefully it was worth it, but we’re already seeing signs that in real use cases some of those options may go unused. In the Matrix demo, for example, BVH tracing was a net quality win over SDFs and performance was similar.
Right, but to speak to the above, triangle RT is absolutely a non-starter in something like AncientGame. You can of course argue that the way they constructed that content is actively hostile to triangle RT, but the same problem existed to a lesser degree in the Lumen in the Land of Nanite demo. Triangle RT is also relatively inefficient for stuff like animated foliage. SDFs will probably still have a place for a while for dynamic aggregate geometry and kit-bashed stuff.

It of course would/will be great to get more unified, but for now if you want to make the best use of a wide range of hardware across a wide range of content, you do still need multiple solutions.

UE5's software Lumen is a good thing: it can be implemented on current consoles, delivering good-enough lighting and shading this generation for Nanite's almost unlimited geometry.
Sure, but even "software" Lumen is still ray tracing... Hell, virtual shadow map filtering is now pretty ray-tracing adjacent. I realize many people often say just "raytracing" when they're talking about triangle RT, but even that gets fuzzy once acceleration structures, different representations/meshes in BVHs, and complex reconstruction algorithms get involved.

That aside, triangle RT is indisputably an important part of the near/medium-term future of real-time rendering. Not the be-all/end-all, I don't think, but an important tool.
 
That "Open" image is pretty interesting for sure. Not sure about the other comparison... still pretty subtle and looks like it could be reflection or other differences rather than emissive specifically.

I found a few cases where I would have expected there to be light from an emissive that there clearly isn't (I'm not at the right PC, but there were some Chinese-style lamps, for instance, that were clearly an emissive material but not casting light anywhere), but it's impossible to know if that's just because the scale was messed up or something on those specific objects. The Open sign difference does point to there being real emissive casting light there. I assume you can still see the light cast from it when it is offscreen?
 

bodycam Unreal game

probably using photogrammetry?

not sure how they managed to create the quite realistic animations though. Animation capture from VR controllers?
 
That "Open" image is pretty interesting for sure. Not sure about the other comparison... still pretty subtle and looks like it could be reflection or other differences rather than emissive specifically.

I found a few cases where I would have expected there to be light from an emissive that there clearly isn't (I'm not at the right PC, but there were some Chinese-style lamps, for instance, that were clearly an emissive material but not casting light anywhere), but it's impossible to know if that's just because the scale was messed up or something on those specific objects. The Open sign difference does point to there being real emissive casting light there. I assume you can still see the light cast from it when it is offscreen?
Definitely real emissives for diffuse and specular with RT overdrive. The original Psycho RT setting (or even medium) supported diffuse emissives.
Max Raster:
[Max_Raster.png]
RT Psycho:
[RT_Psycho.png]
RT Overdrive:
[RT_Overdrive.png]

And here is an example of the difference of the specular response between Psycho RT and RT Overdrive for an emissive surface.

RT Psycho:
[RT_Psycho2.png]

RT Overdrive:
[RT_Overdrive2.png]
 
BTW, maybe all of this RT talk should have its own thread? It is indeed an interesting topic, but it has spread wider than just Unreal Engine 5.
 
And here is an example of the difference of the specular response between Psycho RT and RT Overdrive for an emissive surface.
Hmm, so why is this case different? This is just a pure single bounce specular case that is really just raytraced reflections, right? It looks like the BRDF itself is different here... is this just because the RT reflection in "psycho" mode is being clamped by a roughness threshold or something?
 
Hmm, so why is this case different? This is just a pure single bounce specular case that is really just raytraced reflections, right? It looks like the BRDF itself is different here... is this just because the RT reflection in "psycho" mode is being clamped by a roughness threshold or something?
I can't know for certain; it could be that, or a difference in what exactly is being traced against for the reflections. Maybe the Psycho RT setting doesn't actually have the emissive texture there to trace against.
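For what the roughness-threshold guess would look like in practice, a hypothetical sketch (the cutoff value, names and structure are invented for illustration; they are not Cyberpunk's actual settings):

```python
def trace_reflection(roughness, rt_reflect, probe_fallback,
                     max_rt_roughness=0.4):
    # Hypothetical sketch: trace a real reflection ray only for surfaces
    # smoother than some cutoff, and fall back to a cheaper probe/cube-map
    # lookup above it. If the fallback's environment data lacks the emissive
    # surface, the specular response visibly changes at the threshold.
    if roughness <= max_rt_roughness:
        return rt_reflect()
    return probe_fallback()
```

Under that scheme, two modes with different cutoffs (or different scene representations behind `rt_reflect`) would produce exactly the kind of BRDF-looking mismatch in the screenshots.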
 
In my opinion, this whole Ray-tracing technology is overrated for its usefulness. It doesn't really add as much to the visuals of a game as can be achieved with other traditional methods, e.g. with a high-quality Cube map in e.g. racing games. It's good for something, technically interesting, but there are probably a dozen other areas in terms of game graphics that could be fundamentally more spectacular. Such is, for example, the professional application of HDR, which does not require a lot of hardware performance, but rather expertise.

Only to those who are narrow-minded. We are now at a point where overall visual quality is so high that effects that aren't equally high-quality stick out badly.

Horizon Burning Shores is a prime example: it's stunning to look at, but there's no hiding its dreadful SSR artefacts, which completely ruin the immersion, and as we get deeper into this generation those artefacts are going to stick out even more.

But people need to stop focusing on what it does for gamers and look at what it does for developers. Imagine them no longer having to make cube maps, or no longer needing to spend hours upon hours pre-calculating GI; the time and thus cost savings from RT for developers are huge.

Look at Spiderman: it has so many cube maps that someone had to spend many, many hours creating them, and then even more hours lining them up with the environment so they sort of match, and even then they only look just 'OK', not amazing, in the finished product.

Now, if Spiderman used ray tracing only for reflections, all of those many man-hours creating cube maps would be replaced by a simple 'RT reflections' tick box that takes two seconds to turn on.

Think about that for a second: something that used to take multiple people months now takes two seconds.

Ray tracing is ultimately going to enable shorter development times and better graphics at the same time.
 
Ray tracing is ultimately going to enable shorter development times and better graphics at the same time.

I like the sentiment. With regard to UE5, they've demonstrated that non-baked solutions* are covered with enough performance on console (the Matrix demo and Fortnite, at both ends of the visual-feature scale).

It's hard to see the production benefit of non-baked lighting with UE5 in particular when we're only a year into the full release. This year we have one big-budget title launching for certain, with Avium. The indie & AA scene is really quite silent. Hopefully we'll have a raft of things shown off during the summer events.

* HW raytracing vs other approaches seems like it's own lengthy topic.
 
I like the sentiment. With regard to UE5, they've demonstrated that non-baked solutions* are covered with enough performance on console (the Matrix demo and Fortnite, at both ends of the visual-feature scale).

It's hard to see the production benefit of non-baked lighting with UE5 in particular when we're only a year into the full release. This year we have one big-budget title launching for certain, with Avium. The indie & AA scene is really quite silent. Hopefully we'll have a raft of things shown off during the summer events.

* HW raytracing vs other approaches seems like it's own lengthy topic.
The only thing putting a snag in this at the moment, from what I'm hearing from a number of devs, is that Lumen quality is not very consistent for use in a project with diverse environments, particularly indoor scenes. You can get pretty great results with outdoor scene lighting without a lot of issues (as Epic's demos show), but as soon as you go indoors, or overlap indoor and outdoor, you start running into light leaking, noise, or the surface cache being black. Some rooms are fine, some rooms are noisy, some rooms have completely black indirect lighting. Some of that is down to Lumen fidelity being limited because it has to be performant on lower-end hardware, but some of it is art asset setup (convex vs. concave areas in a mesh). I think that latter bit is causing issues at the moment. I hope Epic gets rid of that art-asset limitation so devs embrace real-time lighting more often.
 