Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

I think it's funny that on its 4th major iteration, Nanite went back to displacement and tessellation approaches for making geometry.

It makes sense to save on storage and looks fantastic too. Wonder if the displacement is visible to the physics engine.

That Marvel trailer looks amazing, but I just don't believe the game will look like that.

It looks incredible. The MetaHuman stuff is truly impressive. The smoke looks unrealistically dense and compact though.
 
Well, she said on stage, multiple times, that these are the actual graphics of the game. Whether it will keep the same level of polish everywhere remains to be seen, but what we saw will be in the final game for sure. That was surely running on a high-end PC, though, so I wonder how it will end up looking on consoles; the Series S in particular will be interesting, since they made huge perf increases with UE5.4.
People should try Fortnite again with the latest update. It looks gorgeous, and at 60fps!
 
Skydance New Media have said a few times that they're a 'small team' compared to most studios. I'd love to know how many people are actually working on Rise of Hydra, and for how long.

I still hold some hope that once developers gain more experience, UE5 will get us out of the rut of horribly long dev cycles.
 
"the addition of software variable rate shading (VRS) via Nanite compute materials brings substantial performance gains"

I can't believe it. Why not use hardware VRS? RTX 2000, RTX 3000, RTX 4000, RX 6000, RX 7000, Xbox Series S, Xbox Series X, and even mobile chips like the latest Adrenos and the Exynos 2200 all support hardware VRS!

And I bet Nanite still doesn't use mesh shading properly.

Why is Epic refusing to utilize the hardware I've paid a lot of money for?

Yes, I'm aware UE5 technically supports mesh shaders and HW VRS. But I say technically because neither improves performance by more than a tiny amount. In contrast, we have games like Alan Wake 2 that benefit massively. It really rubs me the wrong way that Epic tries to do everything in software. There's a reason there's hardware support in GPUs; you wouldn't play, for example, an AV1 video in software when there's a hardware block available on the GPU. It's just inefficient and slower.
 
Why is Epic refusing to utilize the hardware I've paid a lot of money for?

Epic's choice to do things in software is because that's what it took to make Nanite work. Nanite 'wins' against any other approach to geometry. Putting art and layout aside, AW2 sports lots of geometry but it's really not comparable to something like Robocop.

Ultimately, if dedicated hardware features improve geometry or lighting performance then they'll be rolled into UE at some point. Assuming of course they're the type of features that are likely to hang around for the future.

It's not like Epic are sitting still on performance optimizations.
 
Alan Wake 2 looks much better than Robocop. Even without raytracing, the lighting system is more stable. Sometimes a compromise is better than doing one thing fantastically while sacrificing everything else.
 
Alan Wake 2 looks much better than Robocop. Even without raytracing, the lighting system is more stable. Sometimes a compromise is better than doing one thing fantastically while sacrificing everything else.

I'd agree that AW2 is the better looking game, and better looking than most games. I was only comparing geometry between the two titles, not their overall tech/art choices.
 
Alan Wake 2 looks much better than Robocop. Even without raytracing, the lighting system is more stable. Sometimes a compromise is better than doing one thing fantastically while sacrificing everything else.

In terms of sheer geometric detail, Robocop is leagues ahead though.
 
Nobody said otherwise. But geometry alone doesn't make a frame. Robocop is a great example of how UE5 is unable to provide proper lighting information.

The lighting is fine 90% of the time; it's only in the odd scene that Lumen fails to do its job.

But Alan Wake 2 isn't perfect.

And I was probably impressed with more scenes overall in Robocop than I was in Alan Wake 2.
 
I can't believe it. Why not use hardware VRS? RTX 2000, RTX 3000, RTX 4000, RX 6000, RX 7000, Xbox Series S, Xbox Series X, and even mobile chips like the latest Adrenos and the Exynos 2200 all support hardware VRS!
You can't use HW VRS with compute shaders, and modern AAA games like to use their compute shaders ...
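To illustrate (a minimal D3D12 sketch of my own, with device/PSO setup elided; this is not anything from UE's source): the shading rate is graphics-pipeline state on the command list, so it influences draws during rasterization but does nothing for compute dispatches, which is where a visibility-buffer renderer does its shading.

Code:
#include <d3d12.h>

// Sketch: HW VRS is set as graphics state on the command list.
// It only affects pixel-shader invocations spawned by rasterization;
// compute shading (e.g. of a visibility buffer) ignores it entirely.
void RecordFrame(ID3D12GraphicsCommandList5* cmd)
{
    // Request coarse 2x2 shading for subsequent draw calls.
    D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH
    };
    cmd->RSSetShadingRate(D3D12_SHADING_RATE_2X2, combiners);

    cmd->DrawInstanced(3, 1, 0, 0); // affected: PS may run once per 2x2 block
    cmd->Dispatch(120, 68, 1);      // NOT affected: e.g. shading 1080p in 16x16 tiles
}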
And I bet Nanite still doesn't use mesh shading properly.

Why is Epic refusing to utilize the hardware I've paid a lot of money for?

Yes, I'm aware UE5 technically supports mesh shaders and HW VRS. But I say technically because neither improves performance by more than a tiny amount. In contrast, we have games like Alan Wake 2 that benefit massively. It really rubs me the wrong way that Epic tries to do everything in software. There's a reason there's hardware support in GPUs; you wouldn't play, for example, an AV1 video in software when there's a hardware block available on the GPU. It's just inefficient and slower.
HW VRS is only used for stereo rendering (XR/VR content), and mesh shaders don't give us custom 64-bit integer depth format support either, which is needed to perform the necessary depth testing phase during rasterization ...
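For the curious, the 64-bit point refers to Nanite's software rasterizer packing depth and a visibility payload into a single 64-bit value, so that one atomic max performs the depth test and the write together (akin to a 64-bit InterlockedMax on a UAV with HLSL SM 6.6 atomics). A rough CPU-side sketch of the idea follows; it's my own illustration, not Epic's code, and it assumes non-negative depth with a larger-wins convention like reversed-Z.

Code:
#include <atomic>
#include <cstdint>
#include <cstring>
#include <vector>

// Depth goes in the high 32 bits, the visibility payload (e.g. a
// cluster/triangle ID) in the low 32 bits. For non-negative floats the
// IEEE-754 bit pattern is monotonic, so an atomic max on the packed
// value is a depth test and framebuffer write in one operation.
static uint64_t Pack(float depth, uint32_t visibilityId)
{
    uint32_t depthBits;
    std::memcpy(&depthBits, &depth, sizeof(depthBits));
    return (uint64_t(depthBits) << 32) | visibilityId;
}

struct VisBuffer
{
    // Zero-initialized: depth 0 is "farthest" under reversed-Z.
    std::vector<std::atomic<uint64_t>> pixels;

    explicit VisBuffer(size_t count) : pixels(count) {}

    // CAS loop standing in for a hardware 64-bit atomic max.
    void WritePixel(size_t i, float depth, uint32_t id)
    {
        uint64_t candidate = Pack(depth, id);
        uint64_t current = pixels[i].load(std::memory_order_relaxed);
        while (candidate > current &&
               !pixels[i].compare_exchange_weak(current, candidate))
        {
            // 'current' is refreshed by compare_exchange_weak on failure.
        }
    }
};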
 
I can't believe it. Why not use hardware VRS? RTX 2000, RTX 3000, RTX 4000, RX 6000, RX 7000, Xbox Series S, Xbox Series X, and even mobile chips like the latest Adrenos and the Exynos 2200 all support hardware VRS!
From the question, I'm not sure you really understand how this stuff works. "Hardware VRS" is nothing magic and has no specific hardware acceleration outside of effectively what has been the MSAA hardware for the last couple decades. It's useful if you need some amount of VRS while doing triangle rasterization (i.e. forward shading), but it is already less important with deferred shading, and simply not applicable with visibility buffers.

And I bet Nanite still doesn't use mesh shading properly.
Do you have the knowledge to back up such a statement? The source code is public, why don't you go ahead and point us to the parts that you consider "improper" and explain why rather than make baseless claims.

Yes, I'm aware UE5 technically supports mesh shaders and HW VRS. But I say technically because neither improves performance by more than a tiny amount.
We've had this discussion before but this is just bad logic. In the case of mesh shaders, they make less of an improvement because Nanite renders all geometry more efficiently in both paths. i.e. Alan Wake 2's non-mesh-shader path is unnecessarily slow (because they chose not to focus on it, which is fine), while Nanite's is still pretty good. There are certainly cases where mesh shaders make a non-trivial difference to Nanite performance as well, but you have to be pretty technical to understand the platform differences and relevant situations. This is not really a topic that can be trivialized into "hardware feature makes gpu go brrrrr".

With VRS, an additional advantage of the Nanite compute materials implementation is that it can pack *pixels*, not just quads, due to material shader analysis and analytic derivatives. Hopefully Graham's slides will be public soon so folks can see the details, but even if so-called "hardware VRS" applied to visibility buffers (which it doesn't), there would still be advantages to Nanite's VRS approach.
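For anyone wondering what analytic derivatives buy you: hardware quad shading exists largely so ddx/ddy can be finite-differenced across a 2x2 block, which forces helper lanes on small triangles. When you interpolate attributes from a visibility buffer yourself, the screen-space gradients fall out of the barycentric derivatives analytically, so lone pixels can be shaded and packed with no quad neighbours. A rough sketch of the gradient math (my own illustration, ignoring perspective correction for brevity):

Code:
#include <cstdio>

struct Vec2 { float x, y; };

// Screen-space gradients of the two free barycentrics; constant per
// triangle, derived from its edge equations.
struct BaryGradients { Vec2 dB1, dB2; };

// Linear interpolation: attr(b1, b2) = a0 + b1*(a1 - a0) + b2*(a2 - a0),
// so its screen-space gradient is exact and per-pixel: no 2x2 quad
// finite differences, no helper lanes required.
Vec2 AttributeGradient(float a0, float a1, float a2, const BaryGradients& g)
{
    const float e1 = a1 - a0;
    const float e2 = a2 - a0;
    return { e1 * g.dB1.x + e2 * g.dB2.x,    // d(attr)/dx
             e1 * g.dB1.y + e2 * g.dB2.y };  // d(attr)/dy
}

int main()
{
    // Hypothetical per-triangle constants, just to exercise the function.
    BaryGradients g{ {0.010f, 0.002f}, {-0.004f, 0.012f} };
    Vec2 dUV = AttributeGradient(0.0f, 1.0f, 0.5f, g);
    std::printf("du/dx = %f, du/dy = %f\n", dUV.x, dUV.y);
}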
 