Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

I'd guess the next time we really hear more is when Fortnite is ported over; I think their plan was to get it out during summer.

I think we will see a bunch of games around E3 time that are running on this engine, but they will all be 2022-and-beyond titles. We might see one or two from Epic, or someone close to Epic, that will be out sooner. I kinda feel COVID is going to push everything back.
 
I'm sure people will be (pleasantly) surprised how UE5 looks even on Android potato phones.

I was referring to the PR blunder where footage of a laptop running the demo better than the PS5 was leaked by an engineer; it made the PS5 look weak, and PR damage control was set in high gear....aka "politics".

But the "Streisand Effect" is hard to beat ;)

Unreal Engine 5 PS5 demo runs happily on current-gen graphics cards and SSDs | PC Gamer

It also made some fans have their panties in a bundle...in other news...water is still wet ;)
 

That article makes a pretty good point, though, that in retrospect makes all this almost a non-issue: the GPU in the laptop was a 2080, the SSD was a 3.5 GB/s EVO, and the frame rate was unlocked.

So all in all it feels a bit like a storm in a teacup rather than the "Streisand Effect", although PR-wise that was at play as well to some extent. Generally, I'd say we've now all got a pretty good grasp of where the consoles are vs. PC hardware, we wouldn't be nearly as surprised by these results today, and we might even consider it a good result for the $399 PS5. You have more chance of getting a PS5 than a 2080 at this point, and I don't even know what the minimum price of a 2080 would be right now? I think I managed a 2060 for close to $399.

My desktop with a six-core i5, the RTX 2060, 16 GB of RAM (a single stick, so it still needs a second one to double the bandwidth) and a half-decent SSD (not nearly PS5 speed) cost me something like €1,149. It is basically slower at everything, saved a little bit by Nvidia's DLSS ...
 
Hopefully UE5 will expand upon UE4 greatly and improve it in every aspect.

-Better and stutter-free DX12 integration, ditching DX11 completely.
-Full DX12 Ultimate support
-Improved RT performance (especially for reflections and GI, the latter of which should just be replaced by Nvidia's RTXGI entirely) as well as DLSS integration on day 1
-Improved texture streaming/compression and support for DirectStorage, as well as SSD streaming in general
-Allow Nanite and Lumen to make use of HW-Acceleration using Mesh Shading and HW-RT.
-Machine Learning accelerated physics and UI workflow

These would be my wishes. If all of these happen, then UE is going to be the next gen engine king again.
 

They have Lumen for GI in Unreal Engine 5.
 
Hopefully UE5 will expand upon UE4 greatly and improve it in every aspect.
Proper support for large worlds with 64-bit coordinates will most likely be quite nice for many developers.

There should be more already known ones.
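
On the 64-bit coordinates point, here's a tiny standalone C++ example (nothing Unreal-specific; centimeter units only to mirror UE's convention) showing why 32-bit float world coordinates fall apart far from the origin, which is what large-world support has to solve:

```cpp
// Sketch: smallest representable step (ULP) of float vs double at various
// distances from the world origin. Around 100 km (1e7 cm) a float can no longer
// represent sub-centimeter offsets, which shows up as vertex/animation jitter.
#include <cmath>
#include <cstdio>

int main() {
    const double positions_cm[] = {1e3, 1e5, 1e6, 1e7}; // 10 m, 1 km, 10 km, 100 km
    for (double p : positions_cm) {
        float  f      = static_cast<float>(p);
        float  f_step = std::nextafter(f, 2.0f * f) - f;  // float precision at this coordinate
        double d_step = std::nextafter(p, 2.0 * p) - p;   // double precision at this coordinate
        std::printf("at %10.0f cm: float step = %g cm, double step = %g cm\n",
                    p, static_cast<double>(f_step), d_step);
    }
    return 0;
}
```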
And? Doesn't mean they can't put RTXGI and RTXDI in there as well. You really think Nvidia made these solutions for a dying UE4?

Chances are RTXGI is superior to Lumen too.
RTXGI is not an engine-limited solution and should pretty much work on UE5 as is.

It's really hard to compare them yet; we will see when we can test them properly on the same scenes and tweak the settings.

RTXGI is similar to classic probe methods, so I wouldn't be surprised if there are some cases where Lumen has a quality advantage.
 

Don't forget Lumen was a work in progress, and some research has improved alternative GI methods, e.g. completely eliminating light leaking.

https://deepai.org/publication/signed-distance-fields-dynamic-diffuse-global-illumination
 
-Improved RT performance (especially for reflections and GI, the latter of which should just be replaced by Nvidia's RTXGI entirely)
I don't know how UE4 RT GI works, but I assume it's a path-tracing-like approach limited to one bounce in practice. Very high quality in theory.

RTXGI uses a volume grid of probes, which supports infinite bounces but lacks surface detail, because lighting is projected from probes in empty space onto the surface.
In empty space the GI signal is low frequency, but on the surface it has strong discontinuities which define how realistic GI looks. The error is big perceptually, so RTXGI is better than nothing but far from realistic. It also has grid quantization problems if the probe grid resolution is too low in comparison to scene detail (which will happen a lot).
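
To make the probe-grid point concrete, here's a toy sketch of my own (not RTXGI's actual code): irradiance lives at the points of a regular 3D grid and is trilinearly interpolated onto the shaded position, so anything finer than the probe spacing is smeared out; that's the missing surface detail and the quantization issue described above.

```cpp
// Toy probe-grid GI lookup: irradiance stored at grid points, trilinearly
// interpolated to the surface. Grid size and spacing are made-up values.
#include <cstdio>

struct Vec3 { float x, y, z; };

constexpr int   N       = 8;     // probes per axis (hypothetical)
constexpr float Spacing = 1.0f;  // world units between probes (hypothetical)

static Vec3 probeIrradiance[N][N][N]; // filled elsewhere, e.g. by tracing rays from each probe

Vec3 lerp(const Vec3& a, const Vec3& b, float t) {
    return { a.x + (b.x - a.x) * t, a.y + (b.y - a.y) * t, a.z + (b.z - a.z) * t };
}

// Trilinear interpolation of the 8 surrounding probes onto position p.
// Assumes p lies strictly inside the probe volume.
// Note: the result can only vary as fast as the probe spacing allows, so
// fine indirect shadowing / contact darkening on the surface is lost.
Vec3 sampleProbes(const Vec3& p) {
    float gx = p.x / Spacing, gy = p.y / Spacing, gz = p.z / Spacing;
    int ix = (int)gx, iy = (int)gy, iz = (int)gz;
    float fx = gx - ix, fy = gy - iy, fz = gz - iz;
    auto P = [&](int dx, int dy, int dz) { return probeIrradiance[ix + dx][iy + dy][iz + dz]; };
    Vec3 c00 = lerp(P(0,0,0), P(1,0,0), fx);
    Vec3 c10 = lerp(P(0,1,0), P(1,1,0), fx);
    Vec3 c01 = lerp(P(0,0,1), P(1,0,1), fx);
    Vec3 c11 = lerp(P(0,1,1), P(1,1,1), fx);
    return lerp(lerp(c00, c10, fy), lerp(c01, c11, fy), fz);
}

int main() {
    // Fill the field with a fake gradient and query one surface point.
    for (int x = 0; x < N; ++x)
        for (int y = 0; y < N; ++y)
            for (int z = 0; z < N; ++z)
                probeIrradiance[x][y][z] = { (float)x, (float)y, (float)z };
    Vec3 e = sampleProbes({2.3f, 4.7f, 1.1f});
    std::printf("irradiance ~ (%.2f, %.2f, %.2f)\n", e.x, e.y, e.z);
}
```

As far as I know, the real RTXGI/DDGI also stores depth per probe and weights the interpolation by a visibility test to cut down on leaking, but the spatial resolution limit stays.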

I assume UE5 GI also uses a probe grid, with SDF models + screen-space refinement to calculate visibility instead of using HW RT. Dynamic character support could be added with a capsule representation, but then we still have a gap between characters and other non-rigid dynamic objects, where a capsule representation remains difficult. Another expected restriction might be that such dynamic objects can block light (soft indirect shadows) but can't reflect indirect lighting.
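
And for the SDF-visibility guess, here's a minimal sphere-tracing sketch (my own toy code; a single analytic sphere stands in for the engine's baked distance fields) showing how "software ray tracing" against an SDF can answer an occlusion query without any HW RT:

```cpp
// Sphere tracing an SDF for visibility: march along the ray, stepping by the
// distance value, which guarantees we never step through a surface.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };
static Vec3  add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  mul(Vec3 a, float s){ return {a.x * s, a.y * s, a.z * s}; }
static float len(Vec3 a)         { return std::sqrt(a.x*a.x + a.y*a.y + a.z*a.z); }

// Signed distance to the scene (illustrative: one sphere of radius 1 at the origin).
float sceneSDF(Vec3 p) { return len(p) - 1.0f; }

// Returns true if the segment from 'origin' along unit 'dir' is unoccluded up to maxT.
bool visible(Vec3 origin, Vec3 dir, float maxT) {
    float t = 0.01f;                      // small offset to avoid self-intersection
    for (int i = 0; i < 128 && t < maxT; ++i) {
        float d = sceneSDF(add(origin, mul(dir, t)));
        if (d < 1e-3f) return false;      // hit something: occluded
        t += d;                           // safe step: nothing is closer than d
    }
    return true;
}

int main() {
    // A ray aimed through the sphere is blocked; one passing above it is not.
    std::printf("blocked ray visible? %d\n", visible({-3, 0, 0}, {1, 0, 0}, 6.0f));
    std::printf("clear   ray visible? %d\n", visible({-3, 2, 0}, {1, 0, 0}, 6.0f));
}
```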

So all those approaches have their flaws. I'm more impressed by Exodus Enhanced. I guess it also uses a path-tracing approach to capture the signal on the surface accurately, plus some other method to cache and accumulate infinite bounces. Maybe using mesh vertices or faces, or a probe grid - IDK.
I doubt Lumen or RTXGI can match this, but ofc. there are temporal issues and the HW RT requirement.
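
For the "cache and accumulate infinite bounces" part, a common trick (DDGI-style feedback; I'm not claiming this is what Exodus Enhanced actually does internally) is to trace only one bounce per frame but read last frame's cached result at the hit point, so multi-bounce light converges over a few frames. Tiny numeric sketch with two patches facing each other:

```cpp
// Each frame computes a single-bounce estimate, but the radiance read at the hit
// point comes from last frame's cache, so multi-bounce light builds up over time.
#include <cstdio>
#include <vector>

int main() {
    // Two diffuse patches facing each other; patch 0 is directly lit, patch 1 is not.
    const float albedo   = 0.5f;
    const float direct[] = {1.0f, 0.0f};
    std::vector<float> cache = {0.0f, 0.0f};   // "last frame" irradiance per patch

    for (int frame = 0; frame < 10; ++frame) {
        std::vector<float> next(2);
        for (int i = 0; i < 2; ++i) {
            int hit = 1 - i;                    // the one ray we trace lands on the other patch
            // Single bounce this frame + whatever the cache had accumulated before.
            next[i] = direct[i] + albedo * cache[hit];
        }
        cache = next;
        std::printf("frame %d: patch0 = %.4f, patch1 = %.4f\n", frame, cache[0], cache[1]);
    }
    // Converges toward the analytic multi-bounce result (geometric series):
    // patch0 -> 1/(1 - albedo^2) ~= 1.3333, patch1 -> albedo/(1 - albedo^2) ~= 0.6667
}
```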
 
You have more chance of getting a PS5 than a 2080 at this point, and I don't even know what the minimum price of a 2080 would be right now? I think I managed a 2060 for close to $399.

An RTX 2080 is going to be more capable than the PS5 GPU though, both in raster and especially in RT/reconstruction performance. Also, neither GPUs nor PS5s are really available right now.
That is, IF someone is even after an ancient, almost three-year-old 2080 by now (overpriced at that).
 

I doubt virtualised geometry like Nanite is raytracing-friendly. The full scene geometry is not in memory. I suppose titles using Nanite will use Lumen, and titles using classic geometry will use RT in Unreal Engine 5.

It will be a choice and a tradeoff: either a very detailed environment with approximate GI, or less geometry and better lighting. It depends on the art direction of the title. RT is available in UE4; there's no reason it will disappear from UE5.
 
That's not acceptable :)
I still hope on Epic to push 'fix broken DXR' :D
 
I doubt virtualised geometry like Nanite is raytracing-friendly. The full scene geometry is not in memory. I suppose titles using Nanite will use Lumen, and titles using classic geometry will use RT in Unreal Engine 5.

I don't think that's very likely. Lumen is already based around infinite-bounce GI where GI can accumulate from outside of the screen -- so that existing GI method already has some representation of the full scene to work against. (This is a wild guess, but my money is on an SDF or voxelized representation of the whole scene.) At worst, I expect to see their approach accelerated with RTX hardware to handle whatever the main cost is (ray-box intersections or whatever).
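
Since ray-box intersections came up as the likely hot spot, this is the standard slab test a software BVH traversal would run per node (plain C++, nothing engine-specific); it's exactly the kind of work the RT hardware does in fixed function:

```cpp
// Slab test: intersect a ray with an axis-aligned bounding box.
#include <algorithm>
#include <cstdio>
#include <utility>

struct Vec3 { float x, y, z; };

// Returns true if origin + t*dir (t in [tMin, tMax]) hits the AABB [lo, hi].
// invDir holds the per-axis reciprocals of the ray direction.
bool rayIntersectsAABB(Vec3 origin, Vec3 invDir, Vec3 lo, Vec3 hi,
                       float tMin, float tMax) {
    const float o[3]  = {origin.x, origin.y, origin.z};
    const float id[3] = {invDir.x, invDir.y, invDir.z};
    const float l[3]  = {lo.x, lo.y, lo.z};
    const float h[3]  = {hi.x, hi.y, hi.z};
    for (int a = 0; a < 3; ++a) {
        float t0 = (l[a] - o[a]) * id[a];
        float t1 = (h[a] - o[a]) * id[a];
        if (id[a] < 0.0f) std::swap(t0, t1);
        tMin = std::max(tMin, t0);
        tMax = std::min(tMax, t1);
        if (tMax < tMin) return false;    // slabs don't overlap: miss
    }
    return true;
}

int main() {
    Vec3 origin{0, 0, -5}, dir{0, 0, 1};
    Vec3 invDir{1.0f / dir.x, 1.0f / dir.y, 1.0f / dir.z}; // inf on zero-direction axes is fine here
    std::printf("hits unit box? %d\n",
                rayIntersectsAABB(origin, invDir, {-1, -1, -1}, {1, 1, 1}, 0.0f, 100.0f));
}
```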
 

They use a different representation of the scene: for far objects it is voxel-based, for medium-distance objects it is signed distance fields, and for details and nearby objects it is screen-space GI. I agree it can use RT/ray-box intersection acceleration.

I'm speaking of triangle-based GI with a BVH, like in Metro Exodus Enhanced. IMO the best lighting in any video game.
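
Just to spell out that hybrid idea in code form; the thresholds, names, and distance-based selection below are all made up by me for illustration, not Lumen's actual values or API:

```cpp
// Sketch: pick a cheaper, coarser scene representation the further a GI ray
// has to go. Hypothetical thresholds, purely to illustrate the tradeoff.
#include <cstdio>

enum class TraceMethod { ScreenSpace, MeshSDF, GlobalVoxel };

TraceMethod chooseTrace(float distanceFromCamera, float maxRayLength) {
    if (distanceFromCamera < 2.0f && maxRayLength < 2.0f)
        return TraceMethod::ScreenSpace;   // fine detail, data already in the G-buffer
    if (distanceFromCamera < 200.0f)
        return TraceMethod::MeshSDF;       // per-mesh distance fields, medium range
    return TraceMethod::GlobalVoxel;       // coarse whole-scene data for far lighting
}

int main() {
    std::printf("%d %d %d\n",
                (int)chooseTrace(1.0f, 1.0f),
                (int)chooseTrace(50.0f, 100.0f),
                (int)chooseTrace(500.0f, 1000.0f));
}
```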
 
I looked it up.

According to the Unreal Engine subreddit, UE5's early alpha has been delayed to summer, while the UE5 Preview (where Lumen and Nanite are currently said to be production-ready) has been delayed to the end of the year (the Preview was supposed to be out around now).
Bummer.

Saw this on Era; it seems UE5 is late.
 