Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

Unfortunately, UE5 is leaving a lot of performance on the table. The 4090 is barely doing 4K60 native at max settings with SW-Lumen in freaking Fortnite! I don't think that's acceptable. The 4090 flies in Metro Exodus at 4K native; heck, it even flies in some path-traced games at 4K.

And when it comes to HW-Lumen, it runs similarly to SW-Lumen but at higher quality, so on PC, activating HW-Lumen should be a no-brainer.

UE5 needs more acceleration to extract more performance from the hardware; otherwise, other engines will surpass it in raw performance.
I see you are currently at stage 2 in the 5 stages of grief. Don't worry, I've been there too. ;)

I disagree that HW-Lumen is the obvious choice here. Why? Because it's barely doing 4K60, like you said. With HW-Lumen, that "barely" would turn into not hitting 4K60 at all. It runs similarly but not identically, and how much the performance differs varies from scene to scene. In one scene they could perform similarly, but in another HW-Lumen would be noticeably slower. And given that even we, as tech enthusiasts, have trouble telling the difference between the modes, I don't think it's worth it.

I guess we just have to accept that the combination of Nanite and Lumen is very demanding. In return, it really does look great. We've taken a major step closer to CGI; if that means sacrificing Ray Accelerators/RT cores, then so be it...

By the way "it's just Fortnite" is a pretty bad argument. The polygon count of some assets have been massively improved and of course, the lighting is a whole other level. Cartoony graphics can be demanding too (look at Pixar movies)

BTW, I've requested acceleration of signed distance fields in the DX12 Discord. This was someone's response:

that's as much of a hardware problem as a spec one
and converting SDFs to hardware logic is nontrivial, too, you need to specify a system of n-dimensional curves & tiling schemes that can be combined into valid SDFs, and then the hardware/driver needs to be able to verify that the resulting curve is well-formed
The reason SDFs haven't really taken off as a modeling scheme is that making anything complex with them is dependent on elaborate, abstract undergrad & postgrad maths. It's not like with triangles where there's a single well-known intersection scheme + you can mostly throw them together and have a reasonable mesh (even if it does have gaps, you can still put the triangles in a BVH and render something)
and what I forgot to mention here, the function composed from all these curves & schemes can't necessarily be simplified
so you could end up with a huge hardware surface to implement all the different permutations of whatever SDF representation scheme you settled on
what would be a reasonable alternative is to enable driver & hardware support for SDFs through an intermediate meshing stage (with marching cubes I guess)
Since you already have driver support for arbitrary SDFs with intersection shaders (which are just run like compute shaders & don't do any verification on the SDF), this is probably the best way to get the acceleration you're looking for
(though ofc you could also implement marching cubes yourself and make a BVH from the output mesh)
Now, that's beyond my technical comprehension. @Andrew Lauritzen would that be a reasonable way to accelerate SW-Lumen with Ray Accelerators/RT cores in UE5? Thank you for taking part in the discussion by the way, I really appreciate it!
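
As far as I can follow, the "intersection shaders on an arbitrary SDF" route he mentions boils down to sphere tracing: step along the ray by whatever distance the SDF reports until you land on the surface. Here's a rough, untested CPU-side sketch of just that idea (toy analytic sphere SDF and made-up names; a real shader would sample Lumen's mesh distance fields instead of an analytic function):

Code:
// Illustrative sphere tracing over a signed distance field.
// Stand-in for what an intersection shader would do per ray.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float length(Vec3 a) { return std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z); }

// Toy SDF: a unit sphere at the origin (stand-in for a mesh distance field).
static float sdf(Vec3 p) { return length(p) - 1.0f; }

// Returns the hit distance along the ray, or a negative value on a miss.
static float sphereTrace(Vec3 origin, Vec3 dir, float maxT) {
    float t = 0.0f;
    for (int i = 0; i < 128 && t < maxT; ++i) {
        float d = sdf(add(origin, mul(dir, t)));
        if (d < 1e-3f) return t;   // close enough to the surface: report a hit
        t += d;                    // safe step: the SDF guarantees no surface is closer
    }
    return -1.0f;
}

int main() {
    float t = sphereTrace({0.0f, 0.0f, -3.0f}, {0.0f, 0.0f, 1.0f}, 100.0f);
    std::printf("hit t = %f\n", t); // expect roughly 2.0: ray starts 3 units away, sphere radius 1
    return 0;
}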
 
And given that even we, as tech enthusiasts, have trouble telling the difference between the modes, I don't think it's worth it.
I don't have trouble differentiating between the two. With HW-Lumen, reflections on all surfaces are more elaborate and dynamic (they reflect characters), even on icy surfaces (of which there are plenty), and I'm pretty sure HW-GI is a lot more accurate.

In one scene they could perform similarly, but in another HW-Lumen would be noticeably slower.
I've never noticed that, nor have I seen benchmarks showing it. Both have pretty similar performance, so I stand by my statement: HW-RT is a no-brainer here.

The polygon count of some assets has been massively increased, and of course the lighting is on a whole other level.
Metro Exodus has the same polygonal complexity, if not more, and uses heavy Tessellation, yet it delivers a lot more performance. GI in Metro is no less advanced or complex; in fact, I think it has more visual features than Lumen.
 
Metro Exodus is nowhere close to the polygon density shown in Fortnite UE5.1
Metro Exodus uses Tessellation for the terrain, and its objects have a higher poly count. But in the end, every game since DX11 could have this level of geometry and use Tessellation for other areas like hair and vegetation (Final Fantasy 15), etc.
 
Metro Exodus uses Tessellation for the terrain, and its objects have a higher poly count. But in the end, every game since DX11 could have this level of geometry and use Tessellation for other areas like hair and vegetation (Final Fantasy 15), etc.
Metro Exodus has terrible pop-in during "open world" exploration; it was ruining the whole experience for me. Anyway


soon!
 
I don't have trouble differentiating between the two. With HW-Lumen, reflections on all surfaces are more elaborate and dynamic (they reflect characters), even on icy surfaces (of which there are plenty), and I'm pretty sure HW-GI is a lot more accurate.


I've never noticed that, nor have I seen benchmarks showing it. Both have pretty similar performance, so I stand by my statement: HW-RT is a no-brainer here.


Metro Exodus has the same polygonal complexity, if not more, and uses heavy Tessellation, yet it delivers a lot more performance. GI in Metro is no less advanced or complex; in fact, I think it has more visual features than Lumen.

But HW-RT is here on PC. I don't understand the problem.
 
Metro Exodus has terrible pop-in during "open world" exploration; it was ruining the whole experience for me. Anyway


soon!
We already know how big the difference is between Nanite/Lumen on and off. What I'm more interested in is the difference between HW and SW Lumen, and how it performs on a 2060S/5700. I hope Alex will cover that as well and that this won't be the only video about Fortnite's tech update.
 
The main character still isn't directly reflected on surfaces on console! But what about the vague "shadow" reflections above, to the left, and to the right? Nanite and Lumen are a great achievement on consoles, but this lack of self-reflections really spoils the whole thing for me.

dKqRabB.png


This should be the most important reflection to implement, IMO, and it has been done without dedicated hardware before:
ExS9w39XMAMbkh3.jpg
 
The main character still isn't directly reflected on surfaces on console! But what about the vague "shadow" reflections above, to the left, and to the right? Nanite and Lumen are a great achievement on consoles, but this lack of self-reflections really spoils the whole thing for me.

dKqRabB.png


This should be the most important reflection to implement, IMO, and it has been done without dedicated hardware before:
ExS9w39XMAMbkh3.jpg
SW-Lumen RT does not do skinned meshes; it just catches them with SSR.

Also, I noticed that the "moving" static meshes it does handle tend to appear black?
 
I see you are currently at stage 2 in the 5 stages of grief. Don't worry, I've been there too. ;)

There's no reason for him or any other PC gamer to be "at those stages". Someone else's emotional state has nothing to do with the discussion anyway.

By the way "it's just Fortnite" is a pretty bad argument.

If anything, Fortnite on UE5.1 is a rather good showcase on all platforms, playing to each one's strengths.
 
Very impressed to see the transformative difference Lumen and Nanite can bring to a large multiplayer game like Fortnite.

The improvements brought over their previous render path are clear as day. Most users would be able to load up the game and immediately tell that it looks better.

I look forward to seeing UE5 evolve and to footage from single-player games using it.

Kudos to Epic
 
Looking very good. Can't wait for John's video.

Hitting 60 fps with Nanite plus Lumen on consoles takes careful optimization, but judging by what I'm seeing, I think it's worth it in most cases.
 
From my very limited testing, enabling HW ray tracing on my 3080 did not actually decrease performance; the frame rate did not change at all. Lumen Epic settings have a pretty big hit over Lumen High, but I was not able to spot the differences. That might require more detailed direct comparisons. Just from a superficial experience spending a very small amount of time in the game, I'd go with both Lumen settings on High. I'm pretty much able to stay in the 90-110 fps range with all the good stuff on High and TSR on Recommended, which sets the 3D resolution scale to 60%. It looks surprisingly clean and doesn't show any signs of weird sharpening. I'm hoping that when DLSS comes back it'll be a bit cheaper on the GPU than TSR and I can get even a bit more performance.

Also, Lumen reflections did not cost nearly as much as I thought they would. Standing by a lake, I think I was dropping from 115 to 108 or something when going from reflections Off to Lumen Reflections High (with Lumen GI High in both cases).
 