Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

Bethesda gives the impression of being obstinately "backwards" on tech, and there was certainly a lack of investment. But their what-you-see-is-what-you-get realtime editor built for kitbashing, and player-facing tech that can stream a world of effectively unlimited size, were two things missing from a lot of studios/engines thought of as far more "techy".

Sony Santa Monica and Naughty Dog both still use Maya as a game editor. UE4 had to be painfully bashed into shape to get the two big open world games on it, Hogwarts Legacy and FFVII Rebirth, to really work well. You'll notice that in terms of open world features even Skyrim has more than those two do, and that was over a decade ago.

As Starfield was started before UE5, that wasn't an option. And it wasn't until UE5 that the engine and editor were built for open world streaming games. UE5's open world tooling still needs work, too: navmesh generation for large worlds is listed as "experimental" in 5.4, etc. etc.

But Bethesda's lack of investment in their own engine has become painfully obvious. Nanite, a deep procedural world generation system, Metahuman, and spline meshes all seem tailor made for the sort of workflow Bethesda uses to produce masses of content relatively quickly. Open world streaming is effectively built in at multiple levels, while it's still half missing in Starfield. Realtime lighting would be a huge workflow benefit. I can easily see them switching for ES6, but admittedly I can also see them not doing so.
The biggest problems with the Creation Engine games are the way characters, and particularly faces, animate; if they could do a big lift there it would help a lot.
 
Is that really a technical issue or is it just because faces aren't mocapped?
You don't need mocap; Cyberpunk used an AI-generated facial animation system. But you could use mocap: BG3 managed it for their hundred-plus-hour game.

Either way, Starfield was still pretty poor here, and some sort of improved solution would be beneficial. Metahuman allows an incredibly flexible range of faces to be mapped onto the same animation rig, so they could produce a giant variety of faces and still be able to display detailed animation for all of them in realtime.

But how do you produce the animations themselves, even once you have the ability to display them? One example is Nvidia's Audio2Face. It looks vacant and a bit uncanny-valley creepy compared to Cyberpunk's excellent results, but it would still be an improvement for Bethesda.
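Purely as an illustration of the shape of such a pipeline (the types and functions below are made up, not Audio2Face's or any engine's actual API): the audio model emits per-frame blendshape weights, and any face that exposes the same named morph targets can play them back, which is exactly what a shared rig buys you.

```cpp
#include <algorithm>
#include <cstddef>
#include <map>
#include <string>
#include <vector>

// One frame of output from an audio-driven facial model:
// morph target name -> weight in [0, 1].
using BlendshapeFrame = std::map<std::string, float>;

// A face only needs to expose the shared set of morph targets by name;
// the mesh itself can be completely different per character.
struct FaceMesh {
    std::map<std::string, float> morphTargetWeights;
};

// Apply one animation frame to one face. Because every face uses the same
// naming convention, the same curve can drive any number of characters.
void ApplyFacialFrame(FaceMesh& face, const BlendshapeFrame& frame) {
    for (const auto& [name, weight] : frame) {
        auto it = face.morphTargetWeights.find(name);
        if (it != face.morphTargetWeights.end())
            it->second = weight;  // drive the morph target directly
    }
}

// Playback: sample the generated curve at the current time and push that
// frame onto the speaking character(s).
void TickDialogue(std::vector<FaceMesh>& speakers,
                  const std::vector<BlendshapeFrame>& curve,
                  double timeSeconds, double frameRate) {
    if (curve.empty()) return;
    const std::size_t idx = std::min(curve.size() - 1,
                                     static_cast<std::size_t>(timeSeconds * frameRate));
    for (FaceMesh& face : speakers)
        ApplyFacialFrame(face, curve[idx]);
}
```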

 
You don't need mocap, Cyberpunk used an AI generated facial animation system

'AI' sells short the amount of manual work that still needs to happen. They tune the emphasis/expression timeline for each language's dialogue. Looks great imho though.

One thing that's interesting is that CDPR will likely be bringing that whole pipeline to Metahumans in their next game. It'll be fun to put that next to heavily mocapped characters in the likes of Rise of Hydra and other big budget 2025+ UE5 titles.
 
@Andrew Lauritzen - How easy would it be for a dev team to go from software Lumen to hardware?

I'm thinking of PS5 Pro in this case as I think that could be a quick upgrade devs could offer for potentially little work on their end?

What about the games that don't use Nanite, would that be an easy switch?
 
So up front, the usual caveat here - I'll give my 2c, but HWRT is not my primary area of the engine, so take everything with a grain of salt.

@Andrew Lauritzen - How easy would it be for a dev team to go from software Lumen to hardware?
This seems to get thrown around a lot as something that's "just an easy option to expose", and while in some cases that can be functionally true, there are a lot of considerations that come with maintaining a BVH for raytracing, which is ultimately what most of them come down to. As I've noted ad nauseam at this point, tracing rays is not really the difficult part; it's maintaining a raytracing acceleration structure at sufficient quality and accuracy in dynamic scenes.
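To make that concrete, here's a purely conceptual sketch of the per-frame maintenance work (not UE5 code; the structs, helpers, and thresholds are invented for illustration):

```cpp
#include <vector>

// Bottom-level acceleration structure for one mesh.
struct BLAS {
    bool  deformedThisFrame = false; // skinned characters, foliage, cloth, ...
    float refitDrift        = 0.0f;  // quality lost since the last full build
};

struct Instance { BLAS* blas = nullptr; /* transform, masks, flags, ... */ };

// Placeholders so the sketch is self-contained; real builds happen on the GPU.
void RefitBLAS(BLAS&)                          {}
void RebuildBLAS(BLAS&)                        {}
void RebuildTLAS(const std::vector<Instance>&) {}

// The per-frame cost of keeping the raytracing scene valid.
void MaintainRaytracingScene(std::vector<BLAS>& blases,
                             std::vector<Instance>& instances) {
    for (BLAS& b : blases) {
        if (!b.deformedThisFrame)
            continue;                 // static geometry costs nothing here
        if (b.refitDrift < 0.5f) {    // invented threshold
            RefitBLAS(b);             // cheap update, but quality slowly degrades
            b.refitDrift += 0.1f;
        } else {
            RebuildBLAS(b);           // expensive full build to restore quality
            b.refitDrift = 0.0f;
        }
    }
    // The top-level structure has to be built over the whole instance list;
    // there's no good way to stream or partially update it, which is what makes
    // very large instance counts painful.
    RebuildTLAS(instances);
}
```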

To that end, there are generally a few considerations:

1) Memory. RT BVHs can get big quickly, especially if they need to span a large part of the scene and be at relatively high quality. This can sometimes be partially offset by removing distance fields, but many games use distance fields for other things (VFX being a big one), so they can't be dropped without alternatives. HWRT is often not a good substitute for these DF-based VFX uses, which want smooth, continuous fields rather than visibility queries (a small sketch of that kind of DF query follows after this list).

2) TLAS and BVH update cost. For performance reasons on PC, you are generally limited to a few hundred thousand instances max, as there's no good way to stream or partially update the TLAS. Many modern Nanite scenes use far more than that, so various compromises have to be made, generally by dropping small instances, heavily clamping RT scene distances, and/or leaning heavily on HLOD to try and combine and simplify the static scene (see the instance-filtering sketch after this list). The latter brings one a little bit back towards a baking-style workflow, which is of course undesirable. BVH update cost can be an issue in scenes with lots of deformable vertex animation (characters, foliage, etc.)... basically anything that is Nanite unfriendly is *even more* unfriendly to raytracing.
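On the distance-field point in 1), this is roughly the kind of query VFX systems rely on: a smooth field you can sample anywhere for a distance and an approximate normal, which a visibility-oriented triangle BVH doesn't give you. Entirely illustrative; the placeholder field and names are made up.

```cpp
#include <array>
#include <cmath>

using Vec3 = std::array<float, 3>;

// Made-up placeholder field so the sketch is self-contained: a single sphere of
// radius 1 at the origin, standing in for an engine's global distance field.
float SampleGlobalSDF(const Vec3& p) {
    return std::sqrt(p[0] * p[0] + p[1] * p[1] + p[2] * p[2]) - 1.0f;
}

// Finite-difference gradient of the field, used as an approximate surface normal.
Vec3 SDFGradient(const Vec3& p, float eps = 0.05f) {
    return {
        SampleGlobalSDF({p[0] + eps, p[1], p[2]}) - SampleGlobalSDF({p[0] - eps, p[1], p[2]}),
        SampleGlobalSDF({p[0], p[1] + eps, p[2]}) - SampleGlobalSDF({p[0], p[1] - eps, p[2]}),
        SampleGlobalSDF({p[0], p[1], p[2] + eps}) - SampleGlobalSDF({p[0], p[1], p[2] - eps}),
    };
}

// Particle collision: one field sample says both "am I inside something" and
// roughly which way to push out; cheap and smooth, which is what VFX wants.
bool ResolveParticleCollision(Vec3& position, float particleRadius) {
    const float d = SampleGlobalSDF(position);
    if (d >= particleRadius)
        return false;                                // no contact
    Vec3 n = SDFGradient(position);
    const float len = std::sqrt(n[0] * n[0] + n[1] * n[1] + n[2] * n[2]);
    if (len < 1e-6f)
        return false;                                // degenerate gradient
    const float push = (particleRadius - d) / len;   // push back onto the surface
    position = { position[0] + n[0] * push,
                 position[1] + n[1] * push,
                 position[2] + n[2] * push };
    return true;
}
```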
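And on the instance-count point in 2), the compromises usually boil down to a filtering heuristic of roughly this shape; the struct and thresholds are invented for illustration, not engine defaults.

```cpp
#include <algorithm>
#include <vector>

struct RTInstance {
    float boundsRadius = 0.0f;  // world-space bounding sphere radius
    float distance     = 0.0f;  // distance from the camera
    bool  isHLODProxy  = false; // pre-combined proxy for the static scene
};

// Invented thresholds, purely for illustration.
bool IncludeInTLAS(const RTInstance& inst,
                   float maxRTDistance = 200.0f,    // clamp the RT scene range
                   float minScreenSize = 0.005f) {  // drop tiny instances
    if (inst.isHLODProxy)
        return true;                                // proxies stand in for far detail
    if (inst.distance > maxRTDistance)
        return false;                               // beyond the clamped RT range
    const float projectedSize =
        inst.boundsRadius / std::max(inst.distance, 1.0f);
    return projectedSize >= minScreenSize;          // too small to be worth a BVH entry
}

// The (expensive) TLAS build only ever sees the filtered list.
std::vector<RTInstance> BuildTLASInstanceList(const std::vector<RTInstance>& all) {
    std::vector<RTInstance> kept;
    for (const RTInstance& inst : all)
        if (IncludeInTLAS(inst))
            kept.push_back(inst);
    return kept;
}
```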

Given the above, the question really is: for a given set of content, is it reasonable to maintain a BVH at sufficient quality that it at least provides improved visuals without completely destroying performance? Despite the consumer sentiment of "just expose everything!" I think AAA game devs are right to try and maintain a reasonable level of QA on things they ship. Several recent DF videos have touched on cases where I think it's clear that certain combinations of options should simply not have been exposed at all.

I'm thinking of PS5 Pro in this case as I think that could be a quick upgrade devs could offer for potentially little work on their end?
Consoles are an additional consideration, since there you generally ship the acceleration structures themselves and stream them. There are a lot of upsides to that, but the downside is that it means recooking the game and potentially large patches. Overall though the considerations will depend a lot on the content of the given game, as described above.

What about the games that don't use Nanite, would that be an easy switch?
No simple answer to that one. On one hand, not using Nanite generally implies simpler geometry and smaller instance counts, which does make RT somewhat easier. On the other hand, Nanite can accelerate a bunch of stuff with Lumen, so there are likely to be some caveats in practice depending on the original content target.

In general, while the engine does make it a lot easier to do HWRT than implementing it from scratch (and this is continually improving), the details matter. We're not yet at the point where you can just turn it on and assume that engine magic will take any content and make it work well with raytracing. Hopefully we'll get to that point eventually, but we do need some graphics API, engine, and hardware evolution to that end.
 

Lots on show, like the snow deformation. I feel the game looks its best with direct lighting, in contrast to most games. The GI works really well in the sunlit scenes. In the more overcast scenes it doesn't seem that different from older titles.

The game's releasing in a couple of weeks, so it must look like this in play. I guess the question then is what performance is like.
 
There were rumors of a playable demo prior to release; I hope it turns out to be true. I'm not a fan of the genre, but I'd like to try it to see the graphics up close.
They released a benchmark. I haven't had a chance to play with it yet, but this might be enough to satiate your urge to check it out, and of course anyone else's here.


edit: it's 8GB.
 
They released a benchmark. I haven't had a chance to play with it yet, but this might be enough to satiate your urge to check it out, and of course anyone else's here.


edit: it's 8GB.

7fps @ 4K native max settings on my 3090 😄
 
They released a benchmark. I haven't had a chance to play with it yet, but this might be enough to satiate your urge to check it out, and of course anyone else's here.


edit: it's 8GB.
Honestly, a flyby isn't a good benchmark. When you're up close to the bosses with effects covering the screen, that's a real benchmark.
 
They released a benchmark. I haven't had a chance to play with it yet, but this might be enough to satiate your urge to check it out, and of course anyone else's here.


edit: it's 8GB.
At the settings it automatically chose, 3 hitches.

(benchmark results screenshot)
 
Had to close the benchmark and start it again from Steam when I enabled raytracing.

Fewer big hitches, but I still counted 3 noticeable ones; the graph shows 2?

(benchmark results screenshot)
 
With the DLSS slider set to 100 (DLAA):
Even with these settings there is noticeable pop-in of vegetation and some other LOD stuff. I noticed it around the heads in the water and in the tufts of grass after the river portion is over.

(benchmark results screenshot)
 
Quick summary - how does it look and how does it run, just from eyeballing?


So it's just a scenery flypast: pretty, but low framerate. The PS5 version will be interesting; 60 fps seems quite a big ask, but we'll see how UE5 scales from this.
 