Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

Looks natural to me? It's very bright outside the open wall.
Doesn't that depend on where you want proper exposure?

Lumen seems to want the insides properly exposed, while non-Lumen wants the outside properly exposed.
 
Doesn't that depend on where you want proper exposure?

Lumen seems to want the insides properly exposed, while non-Lumen wants the outside properly exposed.

They do that iris adjustment thing. If you step into a dark room you'll see exposure increase, and then everything outside the windows will look a little overexposed. Not the biggest fan of that approach, because you can see the transition a little too much. But overall I think everything looks pretty nice and natural.
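For what it's worth, the mechanic behind that is simple: smooth the exposure toward whatever the average scene luminance implies, so the adjustment lags the scene. A minimal sketch, with made-up names and constants rather than Unreal's actual eye adaptation:

```cpp
// Toy auto-exposure ("iris adjustment"). The smoothing is what makes the
// transition visible: step into a dark room and, until adaptation catches
// up, the bright windows are pushed into overexposure.
#include <algorithm>
#include <cmath>

struct AutoExposure {
    float currentEV = 0.0f;  // current exposure, in stops (log2 luminance)

    // Adapt toward the exposure implied by the average scene luminance.
    void update(float avgLuminance, float dt) {
        float targetEV = std::log2(std::max(avgLuminance, 1e-4f));
        float blend    = 1.0f - std::exp(-dt * 2.0f);  // time constant ~0.5s (made up)
        currentEV += (targetEV - currentEV) * blend;
    }

    // Multiply linear scene color by this before tone mapping.
    float exposureScale() const { return std::exp2(-currentEV); }
};
```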
 
BTW, I asked about hardware acceleration of signed distance fields in the DX12 Discord. This was someone's response:
I think part of that answer may be something a little lost in translation -- it's very focused on mathematical functions and curves to define sdfs. "Signed distance field" means, well, what it sounds like -- a field where at any given point you can sample a distance from a surface, and the distance is a positive number if you're outside the surface and a negative number if you're inside it. It's quite easy to construct these with pure math -- you can get the signed distance from the surface of a sphere by subtracting the radius of the sphere from the distance between your point and the center of the sphere. Of course, more complex surfaces get into some pretty intense math. Within reason you can compute the sdf of a bunch of simple shapes -- cubes, spheres, etc -- in real time, which is how a lot of cool vfx on shadertoy work.
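The sphere case is small enough to write out directly. A quick sketch (the Vec3 type and names are just for illustration):

```cpp
// Signed distance to a sphere: distance from the sample point to the
// center, minus the radius. Positive outside, zero on the surface,
// negative inside -- exactly the definition above.
#include <cmath>

struct Vec3 { float x, y, z; };

float length(const Vec3& v) {
    return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
}

float sdfSphere(const Vec3& p, const Vec3& center, float radius) {
    Vec3 d{p.x - center.x, p.y - center.y, p.z - center.z};
    return length(d) - radius;
}
```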

Unreal engine (and most videogames) uses a different approach to get sdfs though -- it stores an approximate sdf in a 3d texture. So there's a 3d grid of voxels with distance values, plus mip maps at progressively lower resolutions with more approximate values. This has a number of pros and cons -- it's vastly less precise than the mathematically perfect alternative and extremely memory-intensive vs. just a list of shapes, but it has a predictable cost, it can be traversed quickly and approximately by starting at the low-resolution mip maps and working toward the high ones, and it's easy to generate automatically for any possible 3d mesh.
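The "traversed quickly" part is the classic sphere-tracing loop, which works the same whether sdf() is the analytic sphere above or a trilinear sample of the 3d texture. A rough sketch, with illustrative names:

```cpp
// Sphere tracing: at each step, sample the SDF and advance by that distance.
// The signed distance guarantees nothing is closer than that, so the step is
// always safe. With a grid-based SDF, sdf() would sample the 3D texture,
// starting at low-res mips for big cheap steps.
#include <cmath>
#include <functional>

struct Vec3 { float x, y, z; };

// Returns the hit distance along the (normalized) ray direction, or -1 on miss.
float sphereTrace(Vec3 origin, Vec3 dir,
                  const std::function<float(Vec3)>& sdf,
                  float maxDist = 100.0f, float eps = 1e-3f) {
    float t = 0.0f;
    while (t < maxDist) {
        Vec3 p{origin.x + dir.x * t, origin.y + dir.y * t, origin.z + dir.z * t};
        float d = sdf(p);
        if (d < eps) return t;  // within tolerance of the surface: hit
        t += d;                 // safe step: no surface is closer than d
    }
    return -1.0f;               // marched past maxDist without a hit
}
```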

Anyway -- that's some general info on what's being done, but unlike the person on the dx12 discord I am not smart or knowledgeable enough to answer your question about hardware. Here's a paper that came up as the first Google result, though -- it looks like it's specifically interested in the grid-based approach I described, but I can't really follow it: https://www.jcgt.org/published/0011/03/06/paper-lowres.pdf It seems like an area of active research; I'm sure it's something Epic is looking into.
 
I could have had a look at Fortnite on the Xbox but decided to have a go on my aging 1060 laptop instead. I forgot to turn on Lumen on the first go, which made me appreciate the difference. I was running at 50% internal res scaled to 1080p and almost hit 60fps at some points. :)

I was expecting more GI lag when knocking down walls. I didn't notice anything egregious.

I came 3rd on my second go. Given I was mostly knocking down walls and then driving around on a dirt bike, it probably says something about the game's matchmaking. I only died because the pickaxe doesn't seem to be the most powerful melee weapon. What a letdown.
 
@Andrew Lauritzen would that be a reasonable way to accelerate SW Lumen with ray accelerators/RT cores in UE5?
Tracing SDFs is actually pretty fast in most cases. The limitations with SDFs are generally more around memory and precompute... they are both quite expensive/slow to precompute and they take a lot of space (as they are effectively volumetric textures for everything). So as with the various voxel techniques, they can scale pretty poorly with world size, and there is no reasonable way to handle deformation at all since they cost so much to compute. There are of course ways to mitigate the various disadvantages, but I think if you were to look at some sort of acceleration, the biggest help would be on the *construction* side. That said, they are fairly fundamentally expensive from a computational perspective, so it's hard to guess at how much HW acceleration would even help.
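To make the construction cost concrete: a naive mesh-to-SDF bake tests every voxel against every triangle. The sketch below is illustrative, not UE's bake; pointTriangleDistance() is an assumed helper:

```cpp
// Naive unsigned-distance bake for an n^3 grid: O(n^3 * triangles).
// Real bakes add spatial acceleration and an inside/outside test for the
// sign, but the volumetric cost is what makes SDFs slow to build and
// hostile to per-frame deformation.
#include <algorithm>
#include <cfloat>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };
struct Tri  { Vec3 a, b, c; };

float pointTriangleDistance(const Vec3& p, const Tri& t);  // assumed helper

std::vector<float> bakeDistanceField(const std::vector<Tri>& tris,
                                     Vec3 gridMin, float voxelSize, int n) {
    std::vector<float> field(std::size_t(n) * n * n, FLT_MAX);
    for (int z = 0; z < n; ++z)
        for (int y = 0; y < n; ++y)
            for (int x = 0; x < n; ++x) {
                Vec3 p{gridMin.x + x * voxelSize,
                       gridMin.y + y * voxelSize,
                       gridMin.z + z * voxelSize};
                float& d = field[(std::size_t(z) * n + y) * n + x];
                for (const Tri& t : tris)  // the expensive inner loop
                    d = std::min(d, pointTriangleDistance(p, t));
            }
    return field;  // 128^3 voxels x 100k tris is ~2e11 distance tests
}
```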

(Aside: I believe the presentation from Brian I linked earlier in the thread touches on SDFs and voxels a little bit too.)

I wonder if the 120fps mode still works
It does, but I believe it disables most of the new stuff (Nanite, Lumen, VSM - unsure on TSR).
They do that iris adjustment thing. If you step into a dark room you'll see exposure increase, and then everything outside the windows will look a little overexposed. Not the biggest fan of that approach, because you can see the transition a little too much. But overall I think everything looks pretty nice and natural.
Local tone mapping is also basically a necessity once you start doing GI with a mix of interiors and exteriors.
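The rough idea, as a toy sketch (not Unreal's actual operator): expose each pixel against a heavily blurred local average rather than one global value, so a bright exterior seen through a window and a dim interior can both land in displayable range:

```cpp
// Local tone mapping, toy version: divide each pixel's luminance by a
// wide-blurred copy of the image (its "local average"), then compress the
// remaining detail with a Reinhard-style curve. A single global exposure
// would crush either the interior or the exterior.
#include <algorithm>
#include <cstddef>
#include <vector>

std::vector<float> localToneMap(const std::vector<float>& luminance,  // linear HDR
                                const std::vector<float>& blurred) {  // wide low-pass of it
    std::vector<float> out(luminance.size());
    for (std::size_t i = 0; i < luminance.size(); ++i) {
        float local = std::max(blurred[i], 1e-4f);  // avoid divide-by-zero
        float ratio = luminance[i] / local;         // detail vs. local average
        out[i] = ratio / (1.0f + ratio);            // compress to [0, 1)
    }
    return out;
}
```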

[snip]
With Lumen
Yeah, I was watching some side-by-side spectating with the new stuff on vs off, and IMO the biggest obvious differences are the interiors, where Lumen makes a huge difference and makes things look so much less flat.

[snip ... in freaking Fortnite!
Someone mentioned this earlier already, but just to reiterate: you may need to adjust your preconceptions about Fortnite content now, to be honest :D Stuff like the trees in particular are likely the most complex assets shipped in a 60fps game to date. Yes, the game is intentionally stylized, but in terms of the underlying rendering tech and complexity it's certainly one of the most demanding:
1) dynamic world with lots of active/potential deformation on pretty much everything
2) time of day system, continuously moving sun
3) fully dynamic lighting and GI from both the sun and local lights
4) fairly high poly on many assets including geometrically-modelled foliage
5) multiplayer and building mean the dynamic range of content is very large and hard to test

I'm not saying you have to like the end result or anything (personally I can go either way on the visual style), but in terms of the raw complexity of the rendering challenge, it is in many ways a more complex set of constraints than even the Matrix demo and previous UE5 stuff.
 
Tracing SDFs is actually pretty fast in most cases. The limitations with SDFs are generally more around memory and precompute... they are both quite expensive/slow to precompute and they take a lot of space (as they are effectively volumetric textures for everything). So as with the various voxel techniques, they can scale pretty poorly with world size, and there is no reasonable way to handle deformation at all since they cost so much to compute. There are of course ways to mitigate the various disadvantages, but I think if you were to look at some sort of acceleration, the biggest help would be on the *construction* side. That said, they are fairly fundamentally expensive from a computational perspective, so it's hard to guess at how much HW acceleration would even help.

(Aside: I believe the presentation from Brian I linked earlier in the thread touches on SDFs and voxels a little bit too.)

There's some interesting work on compressing SDFs with neural nets: https://arxiv.org/pdf/1901.05103.pdf as well as some realtime mesh-to-SDF conversion, though only for proxy meshes and still .Xms per mesh: https://github.com/Unity-Technologies/com.unity.demoteam.mesh-to-sdf

It's interesting to see what can be done. I wish Media Molecule would explain what they finally settled on for Dreams; they mentioned maybe switching from point splatting to SDFs at one point? And foliage of all things just animates arbitrarily there.

There's also the question of surfaces for SDFs: they get bad as you go high-res. People messing with NeRFs have tried SDFs for the basic shape representation, with point splatting for surface detail, with some success.
 
Some comparisons between the 60fps and 120fps modes on PS5:
 

[Attachments: 10 Fortnite screenshots]
Continued:
 

[Attachments: 10 Fortnite screenshots]
There's some interesting work on compressing SDFs with neural nets: https://arxiv.org/pdf/1901.05103.pdf as well as some realtime mesh-to-SDF conversion, though only for proxy meshes and still .Xms per mesh: https://github.com/Unity-Technologies/com.unity.demoteam.mesh-to-sdf

It's interesting to see what can be done. I wish Media Molecule would explain what they finally settled on for Dreams; they mentioned maybe switching from point splatting to SDFs at one point? And foliage of all things just animates arbitrarily there.

There's also the question of surfaces for SDFs: they get bad as you go high-res. People messing with NeRFs have tried SDFs for the basic shape representation, with point splatting for surface detail, with some success.
Media Molecule goes into detail in their big presentation on Dreams' rendering: they're point splatting *onto* SDFs, rather than raymarching them.

Meshing SDFs isn't great even if you can get it fast; the problem is that the resulting meshes are too low precision. Polygonal meshes have varying resolution: edges are 100% precise and can be placed exactly where needed. A uniform grid can't do that; to capture a hard-edged surface you need voxels about the size of your edge tolerance, and since surface cells scale with the square of the grid resolution, a 0.1mm tolerance on a 1m object already implies a ~10,000-per-axis grid and hundreds of millions of verts. So it's a non-starter outside of content authoring tools (like ZBrush or Substance Modeler). Like Andrew said above, the Nanite SIGGRAPH presentations go over this a bit, as does the Media Molecule Dreams rendering presentation.

Edit: sorry, realizing I misread your post; you're talking about mesh to SDF, not SDF to mesh. Leaving the above for posterity. AFAIK from limited playing with it, UE4/5's SDF generation is similarly fast to that. Mesh to SDF is surely something RT hardware can accelerate, but I imagine the bottleneck is actually saving out the texture, not calculating the distance field.
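If RT hardware does help anywhere in mesh-to-SDF, the most natural fit is the sign (inside/outside) test, which is pure ray counting. In the sketch below, countRayHits() stands in for a BVH-backed ray query (e.g. DXR inline ray tracing); it's an assumption for illustration, not UE's actual bake:

```cpp
// Sign determination by ray parity: a point is inside a closed mesh iff a
// ray from it crosses the surface an odd number of times. Counting those
// crossings is exactly the query RT cores accelerate. Robust bakes shoot
// several rays and vote, since real meshes are rarely watertight.
#include <cstdint>

struct Vec3 { float x, y, z; };

std::uint32_t countRayHits(const Vec3& origin, const Vec3& dir);  // assumed HW ray query

float signAt(const Vec3& p) {
    std::uint32_t hits = countRayHits(p, Vec3{1.0f, 0.0f, 0.0f});
    return (hits % 2 == 1) ? -1.0f : 1.0f;  // negative inside, positive outside
}
```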
 
Last one, because it shows a great improvement in geometry.
It's like going from PS1 to PS2!
 

[Attachments: 2 Fortnite screenshots]
Some HW-RT on/off comparisons.


The second one is wrongly titled Nanite/Lumen on/off; that's just a typo, in reality it's an HWRT on/off comparison.
You really begin to see just how wrong real-time graphics and lighting have been since the very beginning. I've told many people before... once your eyes get used to seeing RTGI and AO... you'll look back and wonder how you ever thought game graphics were realistic before that point.
 