Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

When it's a driver-level hack versus when it's actually explicitly coded for in the game.

Older cards are far less likely to have new features added to their drivers (especially hacky ones like HyperRX that may or may not work in a given game) versus continued optimization.

Regards,
SB
RDNA1&2 isn't legacy yet.
What you're saying is worse, as it's not a hardware reason; they're just choosing not to support older cards.

The point is they have their reasons to make it only available on RDNA3.
Nvidia has reasons to make frame gen only available on 40-series cards.

I don't have an issue with what either company is doing. Just found his presentation hilarious.
 

NV frame gen (DLSS3) isn't a driver-level hack like HyperRX. DLSS3 is the same kind of thing as FSR3: one is limited to certain hardware, the other isn't. Support (QA, tech support) for DLSS3 and FSR3 is significantly simpler than for a driver-level hack, which can attempt to apply itself to any game.

I could certainly see where you are coming from if FSR3 was limited to a certain generation of cards from a single IHV.

It's entirely possible that for HyperRX they are relying on the relatively unused 2nd ALU in each CU. Since it isn't being used heavily in games, it can potentially be used at the driver level for other things (like HyperRX).
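
To make the integration-model difference concrete, here's a minimal C++ sketch. Every name in it is hypothetical (it's not a real SDK or driver API); the point is only what data each model has access to:

```cpp
// Conceptual sketch only: all names are made up, not a real SDK/driver API.

struct FrameData
{
    const void* ColorBuffer;    // final lit color for the frame
    const void* DepthBuffer;    // scene depth the engine already has
    const void* MotionVectors;  // per-pixel motion vectors from the engine
    float       DeltaTime;      // frame timing from the game loop
};

// In-game integration (DLSS3 / FSR3 model): the game explicitly hands the
// frame-generation library engine data every frame, so QA only has to
// validate this one code path in this one title.
void SubmitFrameToFrameGenLibrary(const FrameData& Frame)
{
    (void)Frame; // placeholder for the per-title SDK call the game would make
}

// Driver-level model (HyperRX-style): the driver intercepts the present call
// and only ever sees a finished frame. There are no engine-provided motion
// vectors or depth, so it has to estimate everything itself, and it can
// attempt to attach to any game, whether or not that game was ever tested.
void DriverPresentHook(const void* FinishedFrame)
{
    (void)FinishedFrame; // placeholder: estimate motion, insert generated frame
}
```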

Regards,
SB
 
Latest UE5 game: Fort Solis... It's also heavy AF on the GPU.

View attachment 9498

And the frame rate on a 4070 Ti, max settings at native 1440p in that shot? 33fps!
I've tried it with my Vega 56. Surprisingly, performance does not feel too bad. It's perfectly playable.
However, it's a walking simulator. I do like WS, but in this case everything is animation first, so controls feel laggy and imprecise. There were also some QTEs where I had to click buttons like a monkey in a lab. So that's no game for me personally. It feels too much like an interactive movie for my taste.

Back to tech: the game can't change resolution in fullscreen mode. Ouch. So I had to play at 1440p with FSR2 on. I would guess the framerate was 30. Fine for a walking sim, but I would go down to 1080p or less eventually. I was just too lazy to change the resolution in Windows to see how much it helps. (I even increased the GI setting to high and left everything else at defaults, which may vary with detected HW.)

The game is pretty dark, so the wins of Lumen and Nanite did not really shine. It's no big jump from similar UE4 games, for me at least. But it works and looks good, aside from constant reconstruction artifacts, which are way worse than TAA, for example.
So my impression of the first real UE5 game is slightly positive. But for an action game it would not suffice.
 
NV frame gen (DLSS3) isn't a driver-level hack like HyperRX. DLSS3 is the same kind of thing as FSR3: one is limited to certain hardware, the other isn't. Support (QA, tech support) for DLSS3 and FSR3 is significantly simpler than for a driver-level hack, which can attempt to apply itself to any game.

I could certainly see where you are coming from if FSR3 was limited to a certain generation of cards from a single IHV.

It's entirely possible that for HyperRX they are relying on the relatively unused 2nd ALU in each CU. Since it isn't being used heavily in games, it can potentially be used at the driver level for other things (like HyperRX).

Regards,
SB
I have no issue with what AMD is doing, I just found the presentation funny.

You're giving reasons why it may only be possible on RDNA3, which could be hardware related (which I assumed is the case). But in the end it's RDNA3 only.
DLSS FG: the reason given for it only being available on 40-series cards is the updated optical flow hardware, or whatever it is, so it's the same kind of reason.

As for Nvidia not making its tech open source or work on other hardware, that's a wider conversation, but both companies are doing things based on their position in the market.
I'm trying not to get too deep into this topic because, like I said, I just found it funny how he was going on about supporting older hardware, then had an RDNA3-only feature as the "but wait, there's more" moment.
No more, no less. Everything else is a wider and deeper topic of conversation.
 
I've tried it with my Vega 56. Surprisingly, performance does not feel too bad. It's perfectly playable.
However, it's a walking simulator. I do like WS, but in this case everything is animation first, so controls feel laggy and imprecise. There were also some QTEs where I had to click buttons like a monkey in a lab. So that's no game for me personally. It feels too much like an interactive movie for my taste.

Back to tech: the game can't change resolution in fullscreen mode. Ouch. So I had to play at 1440p with FSR2 on. I would guess the framerate was 30. Fine for a walking sim, but I would go down to 1080p or less eventually. I was just too lazy to change the resolution in Windows to see how much it helps. (I even increased the GI setting to high and left everything else at defaults, which may vary with detected HW.)

The game is pretty dark, so the wins of Lumen and Nanite did not really shine. It's no big jump from similar UE4 games, for me at least. But it works and looks good, aside from constant reconstruction artifacts, which are way worse than TAA, for example.
So my impression of the first real UE5 game is slightly positive. But for an action game it would not suffice.

I'm in full screen mode and can change resolutions no problem.
 
The developer of DESORDRE has put out a beta branch with path tracing based on ReSTIR GI and DI:
In contrast to Cyberpunk in its overdrive mode, Desordre employs a different set of settings and algorithms. Specifically, I use the 'brute force' technique within ReSTIR GI for global illumination, casting only a single ray per pixel at half the internal resolution. This produces highly accurate lighting but is notably demanding on system performance. By default, two bounces per ray are calculated. For reflections, two bounces are also computed. Within those reflections, however, global illumination is limited to a single bounce, a feature uniquely optimized by Nvidia's algorithms for performance efficiency. Regarding direct lighting (RTXDI), I sidestep the use of RIS and ReGIR, opting instead for Nvidia's default Random Sampling approach within RTXDI. However, it's worth noting that all of this is still under development and subject to change, so the final settings are yet to be determined.

Looks out of this world and runs so well on a Lovelace GPU: https://imgsli.com/MjAyNjkx
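
For reference, the quoted settings can be summarized in one place. This is not the game's actual code or Nvidia's SDK; it's just a hypothetical C++ struct restating the numbers from the developer's description:

```cpp
// Hypothetical summary of the path-tracing configuration described above.
// None of these names come from the game or from Nvidia's SDKs; they simply
// restate the quoted settings for easier comparison.
struct DesordrePathTracingSettings
{
    // ReSTIR GI ("brute force" mode per the developer)
    float GIResolutionScale = 0.5f; // GI traced at half the internal resolution
    int   GIRaysPerPixel    = 1;    // a single ray per pixel
    int   GIBounces         = 2;    // two bounces per ray by default

    // Reflections
    int   ReflectionBounces      = 2; // two reflection bounces
    int   GIBouncesInReflections = 1; // GI inside reflections limited to one bounce

    // Direct lighting (RTXDI)
    bool  bUseRIS            = false; // RIS sidestepped
    bool  bUseReGIR          = false; // ReGIR sidestepped
    bool  bUseRandomSampling = true;  // Nvidia's default random sampling instead
};
```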
 
Desordre

Hardware Lumen vs. Full Ray Tracing 1: https://imgsli.com/MjAyNjk5
Hardware Lumen vs. Full Ray Tracing 2: https://imgsli.com/MjAyNzAw
Hardware Lumen vs. Full Ray Tracing 3: https://imgsli.com/MjAyNzA3
Hardware Lumen vs. Hardware Lumen + RTXDI: https://imgsli.com/MjAyNzAy
Hardware Lumen + RTXDI vs. Full Ray Tracing: https://imgsli.com/MjAyNzA0
Software Lumen vs. Hardware Lumen: https://imgsli.com/MjAyNzAz
These aren't apples-to-apples comparisons, right? The content is different: they turn off the emissive contribution in Lumen and don't replace it with lights.

(This may be an argument for future improvements to how emissives are handled, if developers can't be induced to re-light their scenes, but it doesn't fairly represent what Lumen can do.)
 
The developer said:
One of the challenges with Lumen is the handling of emissive lighting; the light cubes in Desordre are now excluded from Lumen's calculations. These cubes produce light that flickers excessively, making the overall rendering unstable.
If Epic Games improves this feature in the future, the light cubes will, of course, be supported by Lumen again. However, the primary goal is to achieve stable lighting.


UE has a problem with emissive lights.
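
For what it's worth, one way that kind of exclusion can be done on the engine side is via the per-primitive lighting flags. A minimal UE5 C++ sketch, assuming the stock UPrimitiveComponent flags (the game's actual mechanism isn't stated, so treat the flag names and behaviour as assumptions to verify against your engine version):

```cpp
// Illustrative UE5 sketch: stop an emissive mesh from feeding Lumen GI,
// which is one way "exclude the cubes from Lumen" could be implemented.
// Flag names assume UPrimitiveComponent's lighting options; verify against
// your engine version before relying on this.
#include "Components/StaticMeshComponent.h"

void ExcludeEmissiveCubeFromLumen(UStaticMeshComponent* CubeMesh)
{
    if (!CubeMesh)
    {
        return;
    }

    // Don't treat this mesh's emissive material as a light source for Lumen.
    CubeMesh->bEmissiveLightSource = false;

    // Don't let the primitive contribute to dynamic indirect lighting at all.
    CubeMesh->bAffectDynamicIndirectLighting = false;

    // Re-create the render state so the renderer picks up the new flags.
    CubeMesh->MarkRenderStateDirty();
}
```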
 
Those emissive cubes don't seem to be lighting the environment properly. The environment lighting is far too saturated to match the color/intensity of the cubes.

Still looks amazing though!
 
Desordre

Hardware Lumen vs. Full Ray Tracing 1: https://imgsli.com/MjAyNjk5
Hardware Lumen vs. Full Ray Tracing 2: https://imgsli.com/MjAyNzAw
Hardware Lumen vs. Full Ray Tracing 3: https://imgsli.com/MjAyNzA3
Hardware Lumen vs. Hardware Lumen + RTXDI: https://imgsli.com/MjAyNzAy
Hardware Lumen + RTXDI vs. Full Ray Tracing: https://imgsli.com/MjAyNzA0
Software Lumen vs. Hardware Lumen: https://imgsli.com/MjAyNzAz

Just looks like all the settings are wrong for Lumen; those emissive cubes should definitely be adding light. Not a great comparison.
 
Maybe those cubes are dynamic objects and thus ignored by Lumen? But then HW Lumen should still capture them, I'd guess?
I would have expected emissive materials to work and to be cheaper (actually completely free) than analytical lights.
I agree they would have needed to put an analytical reddish light into those cubes at least, not only for a fair comparison; it looks really off as is.
 
I have stated in post #5,189 the reason why these cubes do not light the environment.

Regarding the red glow: I asked the developer, and the red glow was deliberately chosen. Maybe I will still suggest that some other red color tones might look better in this case.

This shot is a good academic exercise on the difference between HW and SW Lumen. HW Lumen has extended GI range and much better reflections.
I'm going to take another look at it. There are many settings in the game, and these were the ones that I thought indicated Software Lumen.
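
If it helps with that double-check: the switch is usually a console variable rather than a menu label. A minimal sketch, assuming the stock r.Lumen.HardwareRayTracing cvar is what the game's setting maps to (a game may expose it under a differently named option):

```cpp
// Illustrative UE5 snippet: toggle hardware ray tracing for Lumen at runtime
// to compare SW vs HW Lumen. Assumes the stock r.Lumen.HardwareRayTracing
// console variable and that the project has HW ray tracing support enabled.
#include "HAL/IConsoleManager.h"

void SetLumenHardwareRayTracing(bool bUseHardware)
{
    if (IConsoleVariable* CVar =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.Lumen.HardwareRayTracing")))
    {
        // 0 = Software Lumen (distance field tracing), 1 = Hardware Lumen (RT hardware).
        CVar->Set(bUseHardware ? 1 : 0);
    }
}
```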
 
17 mins of Lords of the Fallen gameplay. Not sure if this is the 30 or 60 fps mode, due to my middle-aged eyes.

 


Much better looking than Aveum

Except for the particles, which, as always, are obviously low-res and blocky. I feel like I'm the only person who ever sees that, or at least complains about it, though. So I guess I can see why it's not a priority for devs, at least not when translucency is so expensive anyway.
 
Just looks like all the settings are wrong for Lumen; those emissive cubes should definitely be adding light. Not a great comparison.
There does seem to be something weird going on in some of those shots... why are there not even any screen space reflections of the bright surfaces? Those should be there regardless of any weirdness in the Lumen scene. There also seem to be various issues in the "path tracing" shots as well that I would not expect (missing occlusion in reflections/secondary bounces, etc). The large difference in overall contrast in the reflections is also clearly not expected... I imagine something is not supported or scaled differently in the two paths (skylight?) as large scale differences in the material like that are not expected. The main difference should be multi-bounce cases, but even simple reflections to the sky look different in these comparisons, which they should not. Guess I'll have to give it a try again when I get a chance to fool around.

Also something funky going on with that image comparison there... even if you set the same image between the two sides the one on the left looks more aliased depending on the size of the browser window...??

And yeah no reason that I can think of not to use an analytic point light for the cubes; there are really not that many of them.
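
It's a cheap thing to try, too. A sketch of attaching an analytic point light to a cube using standard UE component APIs; the intensity/color/radius values are made up, not taken from the game:

```cpp
// Illustrative UE5 sketch: give an emissive cube an analytic point light so
// it actually lights the environment even when excluded from Lumen GI.
// All numeric values are placeholders, not tuned for Desordre.
#include "Components/PointLightComponent.h"
#include "GameFramework/Actor.h"

void AddAnalyticLightToCube(AActor* CubeActor)
{
    if (!CubeActor)
    {
        return;
    }

    UPointLightComponent* Light =
        NewObject<UPointLightComponent>(CubeActor, FName(TEXT("CubeFillLight")));

    Light->SetupAttachment(CubeActor->GetRootComponent());
    Light->RegisterComponent();

    // Rough reddish fill to match the cube's emissive look.
    Light->SetLightColor(FLinearColor(1.0f, 0.25f, 0.2f));
    Light->SetIntensity(5000.0f);        // placeholder intensity
    Light->SetAttenuationRadius(800.0f); // keep the radius tight to limit cost
    Light->SetCastShadows(false);        // keep per-light cost low
}
```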

Anyways it's neat that the game provides various options to play with but I'd be a bit careful trying to draw general conclusions about the techniques based on the one example. There are many different algorithms, tradeoffs and settings that can be changed depending on the needs of the game/content.
 