Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

This would be a nice comeback for D3D11 OOP and Dynamic Shader Linkage. Why were shader libraries and similar features deprecated in the first place? Creating shader libraries seems like the simplest solution to the current stuttering issues, so why isn't it part of DirectX 12?
Because none of the IHVs would actually do separate compilation. They'd just store the IR and inline it all in the end anyways, similar to what happens with raytracing DXIL libraries today. There are GPU architecture issues at play here, but we can definitely do somewhat better in software too (and IMO it's long past time to devote some transistors to this on the hardware side, but competitive forces don't incentivize that currently).
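For reference, this is roughly what the D3D11 dynamic shader linkage path looked like from the application side. A minimal sketch with error handling omitted; the HLSL instance name ("PointLightModel") is made up for illustration:

```cpp
// Minimal sketch of D3D11 dynamic shader linkage (error handling omitted).
// The HLSL side declares an interface (e.g. a light model) with several
// class implementations; the app picks one per draw without recompiling.
#include <d3d11.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void BindShaderWithLinkage(ID3D11Device* device,
                           ID3D11DeviceContext* context,
                           const void* psBytecode, SIZE_T psSize)
{
    // Class linkage object shared by shaders that use interfaces.
    ComPtr<ID3D11ClassLinkage> linkage;
    device->CreateClassLinkage(&linkage);

    // Pixel shader created against the linkage object.
    ComPtr<ID3D11PixelShader> ps;
    device->CreatePixelShader(psBytecode, psSize, linkage.Get(), &ps);

    // Pick a concrete implementation of the HLSL interface at bind time.
    // "PointLightModel" is a hypothetical instance declared in the shader.
    ComPtr<ID3D11ClassInstance> lightModel;
    linkage->GetClassInstance("PointLightModel", 0, &lightModel);

    ID3D11ClassInstance* instances[] = { lightModel.Get() };
    context->PSSetShader(ps.Get(), instances, 1);
}
```

In practice, as noted above, drivers still tended to specialize and inline everything during the final compile, so the linkage was more of a front-end convenience than true separate compilation.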
 
The official Unreal Engine channel posted this video about real-time path tracing in UE5, presented by NVIDIA, where they explain that they only path trace direct lighting, indirect lighting and some reflections (rough and specular); mirror reflections rely on ray tracing/Lumen. Translucency is also ray traced, as well as hair and single layer water. Volumetric fog and post processing (depth of field, bloom, motion blur) are rasterized.

 
They'd just store the IR and inline it all in the end anyways, similar to what happens with raytracing DXIL libraries today. There are GPU architecture issues at play here, but we can definitely do somewhat better in software too (and IMO it's long past time to devote some transistors to this on the hardware side, but competitive forces don't incentivize that currently).
That's something I could understand for the hardware of the time, but true function calls from the device were introduced with Kepler over 12 years ago. Yet there's still no support in graphics to this day? This feels more like a software limitation to me than a hardware one.
 
...where they explain that they only path trace direct lighting, indirect lighting and some reflections (rough and specular); mirror reflections rely on ray tracing/Lumen. Translucency is also ray traced...
These are the same algorithm. What's the distinction in terms of execution such that they are called out for being different? Notably, direct light should be 'raytraced' as it's a single iteration.
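To illustrate the "same algorithm" point: in a typical unidirectional path tracer, direct lighting is just the light sampling performed at each path vertex, so "ray traced" direct lighting is what falls out of the same loop when you stop after the first hit. A simplified sketch with placeholder types and stub functions, not UE5's or NVIDIA's code:

```cpp
// Placeholder math/scene types; not engine API.
struct Vec3 { float x = 0, y = 0, z = 0; };
static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator*(Vec3 a, Vec3 b) { return {a.x * b.x, a.y * b.y, a.z * b.z}; }

struct Ray { Vec3 origin, dir; };
struct Hit { bool valid = false; Vec3 position, normal; };

// Stubs standing in for the real intersection/sampling routines (no scene here).
static Hit  traceClosest(const Ray&)      { return {}; }  // closest-hit query
static Vec3 sampleLightDirect(const Hit&) { return {}; }  // shadow ray toward a light (NEE)
static Ray  sampleBrdf(const Hit&, const Ray& in, Vec3* w) { *w = {1.f, 1.f, 1.f}; return in; }

// One camera ray's contribution. With maxBounces == 1 this is exactly
// single-bounce "ray traced" direct lighting; larger values add the
// indirect terms by continuing the very same loop.
Vec3 shadePath(Ray ray, int maxBounces)
{
    Vec3 radiance;
    Vec3 throughput{1.f, 1.f, 1.f};
    for (int bounce = 0; bounce < maxBounces; ++bounce) {
        Hit hit = traceClosest(ray);
        if (!hit.valid) break;                                     // ray escaped the scene
        radiance = radiance + throughput * sampleLightDirect(hit); // direct lighting at this vertex
        Vec3 brdfWeight;
        ray = sampleBrdf(hit, ray, &brdfWeight);                   // sample the next direction
        throughput = throughput * brdfWeight;
    }
    return radiance;
}
```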
 
These are the same algorithm. What's the distinction in terms of execution such that they are called out for being different? Notably, direct light should be 'raytraced' as it's a single iteration.

I also found that wording ambiguous. The possible explanation I had in mind is that only direct lighting uses a ReSTIR reservoir? If so, that's just NVIDIA's MegaLights...
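For context on the term: the core of ReSTIR-style direct lighting is weighted reservoir sampling over candidate light samples, with the reservoirs then reused across pixels and frames (hence the name: reservoir-based spatiotemporal importance resampling). A bare-bones sketch of the basic reservoir update, simplified from the published algorithm rather than taken from any engine:

```cpp
#include <random>

// Minimal weighted reservoir for streaming light-sample selection, the
// building block of ReSTIR-style direct lighting. LightSample is a
// placeholder for whatever candidate data a renderer would keep.
struct LightSample { int lightIndex = -1; float pdf = 0.f; };

struct Reservoir {
    LightSample selected;   // the sample currently held by the reservoir
    float weightSum = 0.f;  // sum of all candidate weights seen so far
    int   numSeen   = 0;    // M: number of candidates streamed through

    // Stream one candidate through the reservoir; it replaces the held
    // sample with probability weight / weightSum.
    void update(const LightSample& candidate, float weight, std::mt19937& rng)
    {
        weightSum += weight;
        numSeen   += 1;
        std::uniform_real_distribution<float> uni(0.f, 1.f);
        if (uni(rng) * weightSum < weight)
            selected = candidate;
    }
};
```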
 
Even a game as basic as Dragon Quest 3 HD-2D suffers from traversal stuttering on my 9800X3D. Plainly put, I'm really starting to hate Unreal Engine.

Final Fantasy Tactics is one of my favorite games of all time and there's a rumor Square Enix is making a remaster of that game... and I can't even be excited for it because I'm almost certain it will be using this damn engine.
 
A new version of Off the Grid has been added to the Xbox marketplace, and it surprisingly has much improved resolution on the console. It seems it is still possible to achieve pleasingly good image quality of around 1440p at 60 fps on a console with this engine; it's just a matter of optimization! Note: the game is free to try. It is worth raising the brightness settings in the game menu, because they are set to low by default!

I think this is now the UE5 benchmark for console.

 
A new version of Off the Grid has been added to the Xbox marketplace, and it surprisingly has much improved resolution on the console. It seems it is still possible to achieve pleasingly good image quality of around 1440p at 60 fps on a console with this engine,
Of course. So long as you don't turn everything on and pick the features that work well in your game, you can even get 2160p120 via UE5.
 
Of course. So long as you don't turn everything on and pick the features that work well in your game, you can even get 2160p120 via UE5.
Yes, but it uses both Nanite and Lumen. The landmarks are very detailed when viewed up close.

Anyway, there is also a 120 Hz mode, but the resolution is lower in that mode. It is worth setting the console output to 60 Hz, because that way the game has a higher native resolution.
 
Yes, but it uses both Nanite and Lumen. The landmarks are very detailed when viewed up close.

Anyway, there is also a 120 Hz mode, but the resolution is lower in that mode. It is worth setting the console output to 60 Hz, because that way the game has a higher native resolution.

I would guess the 120 Hz mode uses frame gen. They added frame gen on PC. CPU performance is pretty bad.
 
I would guess the 120 Hz mode uses frame gen. They added frame gen on PC. CPU performance is pretty bad.
If they use frame gen on consoles, that's a really good job! But as I said, it looks better in 60 Hz mode. I compared it with the build from a month and a half ago; since then there has been a significant improvement in resolution. In any case, it runs surprisingly well on Series X.
 
Even a game as basic as Dragon Quest 3 HD-2D suffers from traversal stuttering on my 9800X3D. Plainly put, I'm really starting to hate Unreal Engine.
Obviously not something you should forgive but the drive-bys on Unreal are getting tiresome. You literally just used the opposite logic in the DF thread regarding DLSS and PSSR (it must be the game's fault, despite the fact that those techniques are way more black box and have way smaller API surfaces than something like a game engine).

There's obviously plenty of games at this point that run well without issues on UE5 that are way more complicated than DQ3. There's also plenty of games that run like shit that don't use Unreal. You're not considering the prior probability (i.e. that a large number of games use Unreal these days), and letting your emotional response just pick a target that you can consistently blame rather than the reality that lots of games have lots of different issues.

... and of course so am I. As I like to remind myself and everyone, it's obviously fine to call out issues in games. Pattern matching can be useful, although it's worth reminding ourselves that it's a better debugging tool than a root-cause tool. The place where people tend to overstep their bounds is when they confidently claim a root cause for issues they don't really understand.

Ultimately consumers are not the customers of Unreal Engine (outside some stuff like UEFN, which blurs the line a bit but also comes with significant guardrails). Game developers are the customers of UE, and consumers are customers of the game developers. At B3D we like to break things down into the tech which is fantastic, but when it starts to stray into misdirected internet rage it just distracts from that discussion. I realize this is what gets clicks for the tech reviewers, but we don't really need to bring it here as well.

IMO the more interesting discussion (if people even want to have it) is what sorts of pitfalls folks potentially run into that cause these kinds of issues that other people (e.g. The Finals, Satisfactory, Hellblade, ...) are able to avoid. In reality it's gonna be a ton of different things, but naively I'd guess that the ease of use of the engine from the gameplay side is potentially a bit too enticing and can lead people down some bad paths. A common one that has been discussed in quite a few UE talks is relying on blueprints too much, but I'd extend that to the entire actor/component infrastructure. The naive way to do things (treating actors as self-contained blueprints/prototypes and then creating large numbers of them to fill out a world) can lead to significant overhead. This has of course always been an issue even in UE3 and 4, but with people creating larger and more complex worlds it becomes even more important to address. This is of course what MASS and PCG and similar systems (demo'd in CitySample and ElectricDreams among others) tackle in various ways, but really the expectation is that if you are going to create a large scene with tons of objects, you're going to need to build appropriate infrastructure in the game itself, based on these frameworks or otherwise. Satisfactory is a great example of handling tons of objects, but they certainly don't have a blueprint ticking for every item on a conveyor belt...
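To make the actor/component point concrete, here is the rough shape of the alternative: one manager actor ticking a flat array of lightweight items, instead of one ticking actor (or blueprint) per item. This is a hand-written sketch of the general idea (the class and struct names are made up), not how MASS or any shipping game actually structures it:

```cpp
// ConveyorItemManager.h -- illustrative sketch only.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "ConveyorItemManager.generated.h"

// Plain data per item: no per-item actor, component, or tick overhead.
USTRUCT()
struct FConveyorItem
{
    GENERATED_BODY()

    UPROPERTY() FVector Position = FVector::ZeroVector;
    UPROPERTY() float   Speed    = 100.f;   // units per second
};

UCLASS()
class AConveyorItemManager : public AActor
{
    GENERATED_BODY()

public:
    AConveyorItemManager()
    {
        // One tick for the whole system instead of thousands of actor ticks.
        PrimaryActorTick.bCanEverTick = true;
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        for (FConveyorItem& Item : Items)
        {
            Item.Position.X += Item.Speed * DeltaSeconds;
        }
        // Rendering for items like this would typically go through instanced
        // meshes (e.g. UInstancedStaticMeshComponent) rather than per-item actors.
    }

private:
    UPROPERTY() TArray<FConveyorItem> Items;
};
```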

HLOD is another area of the engine where I suspect people get tripped up, because it is both very necessary once you have worlds of a certain size and a bit of a bespoke workflow. Obviously engine improvements can help here and they do continually come, but I suspect some folks effectively get in a bit over their heads with respect to their ambitions.
 
Obviously not something you should forgive but the drive-bys on Unreal are getting tiresome. You literally just used the opposite logic in the DF thread regarding DLSS and PSSR (it must be the game's fault, despite the fact that those techniques are way more black box and have way smaller API surfaces than something like a game engine).

There's obviously plenty of games at this point that run well without issues on UE5 that are way more complicated than DQ3. There's also plenty of games that run like shit that don't use Unreal. You're not considering the prior probability (i.e. that a large number of games use Unreal these days), and letting your emotional response just pick a target that you can consistently blame rather than the reality that lots of games have lots of different issues.

... and of course so am I. As I like to remind myself and everyone, it's obviously fine to call out issues in games. Pattern matching can be useful, although it's worth reminding ourselves that it's a better debugging tool than a root-cause tool. The place where people tend to overstep their bounds is when they confidently claim a root cause for issues they don't really understand.

Ultimately consumers are not the customers of Unreal Engine (outside some stuff like UEFN, which blurs the line a bit but also comes with significant guardrails). Game developers are the customers of UE, and consumers are customers of the game developers. At B3D we like to break things down into the tech which is fantastic, but when it starts to stray into misdirected internet rage it just distracts from that discussion. I realize this is what gets clicks for the tech reviewers, but we don't really need to bring it here as well.

IMO the more interesting discussion (if people even want to have it) is what sorts of pitfalls folks potentially run into that cause these kinds of issues that other people (e.g. The Finals, Satisfactory, Hellblade, ...) are able to avoid. In reality it's gonna be a ton of different things, but naively I'd guess that the ease of use of the engine from the gameplay side is potentially a bit too enticing and can lead people down some bad paths. A common one that has been discussed in quite a few UE talks is relying on blueprints too much, but I'd extend that to the entire actor/component infrastructure. The naive way to do things (treating actors as self-contained blueprints/prototypes and then creating large numbers of them to fill out a world) can lead to significant overhead. This has of course always been an issue even in UE3 and 4, but with people creating larger and more complex worlds it becomes even more important to address. This is of course what MASS and PCG and similar systems (demo'd in CitySample and ElectricDreams among others) tackle in various ways, but really the expectation is that if you are going to create a large scene with tons of objects, you're going to need to build appropriate infrastructure in the game itself, based on these frameworks or otherwise. Satisfactory is a great example of handling tons of objects, but they certainly don't have a blueprint ticking for every item on a conveyor belt...

HLOD is another area of the engine where I suspect people get tripped up, because it is both very necessary once you have worlds of a certain size and a bit of a bespoke workflow. Obviously engine improvements can help here and they do continually come, but I suspect some folks effectively get in a bit over their heads with respect to their ambitions.
I'm convinced that this is also behind many console ports that use low resolutions these days. In other words, developers do not devote enough energy to writing custom code, but rather automatically use the options offered by the engine. And the result of this is a lack of optimization, as evidenced by numerous examples in recent times.
But as you can see from the game I wrote about above, it can be done right as well; it's just a matter of attitude. I respect more and more those developers who use custom code and unique solutions, e.g. for UE, in order to achieve better visuals and performance.
 
If they use frame gen on consoles, that's a really good job! But as I said, it looks better in 60 Hz mode. I compared it with the build from a month and a half ago; since then there has been a significant improvement in resolution. In any case, it runs surprisingly well on Series X.

I tried it when it first came out on PC, and even with a 5800X3D it was rough. But I've seen several patch notes mentioning CPU improvements. There was one where they added something they called "Fluid Motion" or "motion boost" or something along those lines. Seems like it was just frame gen.
 