Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

It seems like a meaningless distinction. It has plenty of sophistication in its NPC and physics behaviour. You could build a game with the components it demos. The open question is the scope of the experience you could squeeze in alongside the elements already on show.
Yeah minor aside on this if you guys will permit.

Recently - specifically during some of the discussion and apologetics around Starfield's performance - I've started to realize that people have some strange ideas about what really costs performance in games. We've had a lot of discussion about one side of that here (people underestimate the cost of rendering dynamic lighting), but people also seem to have strange ideas about how much "gameplay stuff" costs intrinsically.

Now of course every game is different to some extent, but while things like physics, animation and pathfinding can certainly add up, I don't know why people have this notion that things like quest mechanics, tracking various progress through the game or remembering where objects were placed are expensive in any way. That stuff is sometimes complicated from an engineering perspective in terms of keeping it all organized and handling all the edge cases and QA and so on, but it's not expensive from a performance budget point of view. There's a reason why scripting languages are generally fine for that level of game code. Remembering where a dynamic object was rather than resetting it to some initial position is only a save file size consideration, and even then... this is a tiny amount of data compared to the stuff that engines deal with in producing a single frame.
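To put rough numbers on that last point (every figure below is invented, but in a plausible ballpark), here's the kind of back-of-envelope comparison I mean:

```python
# Back-of-envelope sketch; all numbers are assumptions, purely for illustration.
# Point: persisted gameplay state is tiny next to what rendering touches per frame.

NUM_DYNAMIC_OBJECTS = 10_000          # assumed: a generous open-world object count
BYTES_PER_OBJECT = 4 + 3 * 4 + 4 * 4  # id + position (3 floats) + rotation (quaternion)

state_bytes = NUM_DYNAMIC_OBJECTS * BYTES_PER_OBJECT
print(f"persisted object state: ~{state_bytes / 1024:.0f} KiB")     # ~312 KiB

# Versus a single 1080p G-buffer at a modest 16 bytes/pixel:
gbuffer_bytes = 1920 * 1080 * 16
print(f"one 1080p G-buffer:     ~{gbuffer_bytes / 2**20:.0f} MiB")  # ~32 MiB
```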

To bring it back on topic - IMO the Matrix/CitySample demo does include most of the performance-relevant systems that you would have in many full games. Obviously a significant amount of *work* would be needed to make it into a full game of any complexity, but I think it serves as a pretty good proof of concept that a game of that fidelity would definitely be possible. And indeed I would expect that one could improve performance further in newer engine versions and with some optimization.

Now how/when/if we get games that try and push that level of quality is a more complicated question, because that involves a lot of production decisions, prioritization, content resources, etc.
 
Only speaking for myself here, but that is what I have always seen stated as one of the primary reasons why games and tech demos are not comparable: all the non-visual aspects are still quite expensive in terms of performance.
 
Tech demos are only hard to compare to full games because programmers and artists get to spend everything on a single scene, a "look what we could do with unlimited budget!" kind of thing. For a full-length game, only Rockstar has that sort of budget. Fortunately, The Matrix Awakens is much closer to a "game demo" level of content: a day/night cycle, traffic, a big map and all.

Immortals of Aveum looks good once you get past the terrible image quality on consoles, ignore the obvious mid-development engine transition, and accept that they had bigger ambitions than budget. UE5 isn't some magic thing: Fortnite is a very well funded game that made a lot of cuts to hit 60fps on consoles, whereas Aveum ended up with "just lower the rendering resolution more, it'll work" for whatever internal reasons they had, and it came out the worse for it.
 
I would counter that even when you look at games with small scale and very high budgets like TLOU, the result doesn't come close to something like the Infiltrator demo, which came out almost 10 years prior and runs just fine on a GTX 680.

I can’t agree on Immortals. I think it looks mediocre compared to good-looking last-gen games.
 
A presentation on some recent UE5 updates from today might be of interest to some folks here, particularly some of the later architecture parts:

There are a bunch of other interesting talks about optimizing performance in UE5 and so on as well, but those are targeted a bit more specifically at developers who are already familiar with the UE technology. Feel free to browse around in the three live streams (six, I guess, over the two days)!
 
Epic is investing in Nanite and Ray Tracing, and also Path Tracing. Interesting. They specifically talk about enabling 60fps for Hardware Lumen on consoles, and about enabling Ray Traced shadows with Nanite.
 
Of course, the same day I accuse Epic of no longer talking about the issue of shader compilation, I find out they talked about it in their livestream from the day before! :D

Outside of that, there are some very nice improvements on the horizon, it seems! I'd really love to see Epic and the teams responsible for The Matrix Awakens demo implement all the improvements made to the engine since then, and showcase just how much better it can be now on the exact same hardware.

Very excited for this:

[Image: Screenshot-2023-10-04-192923.png]
 
Wow nice, UE5 with Nanite and raytraced GI, reflections and shadows will be incredible. Looks like the team is working on some serious optimizations across the board.
 
The mobile Lumen presentation was great. Definitely excited to see Lumen on low-powered devices.
 
The Substrate system in UE truly is a fundamental change to the material system. UE's legacy Gbuffer layout consisted of 4x RGBA8 (16 bytes/pixel) render targets, whereas the new Gbuffer layout can potentially reach up to 100 bytes/pixel in the worst case depending on the material! With a 1080p render target, the new Gbuffer could consume as much as ~200MB of memory, and given that it's somewhat common to see a couple dozen full-screen passes (AO/SSR/decals/lighting/post-processing), we could very easily generate several GB of memory traffic per frame!
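For anyone who wants to sanity-check that, here's the arithmetic spelled out (the 100 bytes/pixel and the pass count are the worst-case assumptions from above, and this naively assumes every pass touches the full Gbuffer):

```python
# Worst-case Substrate Gbuffer traffic, using the assumptions from the post above.

WIDTH, HEIGHT = 1920, 1080   # 1080p render target
BYTES_PER_PIXEL = 100        # assumed worst-case Substrate footprint per pixel
FULLSCREEN_PASSES = 24       # "a couple dozen" AO/SSR/decal/lighting/post passes

gbuffer_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
traffic_bytes = gbuffer_bytes * FULLSCREEN_PASSES  # naive: every pass reads it all

print(f"Gbuffer size:      ~{gbuffer_bytes / 2**20:.0f} MiB")  # ~198 MiB
print(f"traffic per frame: ~{traffic_bytes / 2**30:.1f} GiB")  # ~4.6 GiB
```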
 
Substrate gets baked down to a single, if large-ish, render target. I don't know how they do this exactly; the presentation seemed a bit hazy on this to me, but maybe I missed something. Other than not being a thin Gbuffer, though, it shouldn't be anything worse than the current one once baked.
 
As for "Infiltrator" that's not even a tech demo, that's a cinematic. Play any modern cinematic heavy game and you can see just how much you can push realtime cinematics vs gameplay. It's often down to lighting, you can fake a ton of lighting in cinematics just because it's not gameplay, put lights that don't make any sense and would look broken and buggy if the player controlled the camera, but they don't!

For Immortals, it can look pretty good; there are as many polys on the static objects here as in Forbidden West on PS5, though the foliage is maybe about on par with the PS4 version. But that's what you get from a long-in-development, mid-engine-switch game:

[Image: mason-sinkula-ioa-artblast-lucium-06-mason.jpg]


As a reminder, here's what AC Mirage looks like. Yes, this is a screenshot. And this is on an engine that was decent (if certainly not the best) AAA tech three years ago:

[Image: assassins-creed-mirage-review-1.jpg]
 
You can free-roam around the Infiltrator demo and everything still looks amazing. Watching it again now, I still think it looks better than any available game. The solidity of the materials and rendering is just stunning.


As for AC, those games have been very average by AAA standards ever since Unity. That said, a better image comparison would have both shots at a similar viewing distance, so the LOD models are comparable:

[Image: image_assassin_s_creed_mirage-45016-5058_0002.jpg]
 
https://advances.realtimerendering.com/s2023/2023 Siggraph - Substrate.pdf (details on page 65+)

So the way Substrate's Gbuffer layout works is that it uses a variable number of bytes per pixel for its data structure, depending on the material features in question. They already pack the Gbuffer tightly so as not to be wasteful with memory and bandwidth consumption, so we only pay these extra costs on the specific pixels that use those features ...
[Slide: Microsoft PowerPoint - 2023 Siggraph - Substrate Final-1.png]
If you stick to the same materials as in the legacy shading model prior to the introduction of Substrate, the new Gbuffer layout isn't any worse, but it will absolutely get progressively more expensive if we start using features like anisotropy, coating slabs, specular look-up tables, and glints, as they start occupying larger sections of our Gbuffer per pixel ...
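To make the "pay per feature" idea concrete, here's a toy sketch of a variable bytes-per-pixel layout (all byte counts are invented for illustration; Epic's actual packing is in the linked slides):

```python
# Toy model of a variable bytes-per-pixel Gbuffer; NOT Epic's actual layout.

BASE_SLAB_BYTES = 16   # assumed: baseline albedo/normal/roughness packing

FEATURE_BYTES = {      # assumed per-feature costs, purely for illustration
    "anisotropy": 8,
    "coating_slab": 24,
    "specular_lut": 12,
    "glints": 16,
}

def pixel_footprint(features: set[str]) -> int:
    """Bytes a pixel occupies: pay only for the features its material uses."""
    return BASE_SLAB_BYTES + sum(FEATURE_BYTES[f] for f in features)

print(pixel_footprint(set()))                                     # 16: legacy-style material
print(pixel_footprint({"anisotropy", "coating_slab", "glints"}))  # 64: feature-heavy pixel
```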
 
You can free-roam around the Infiltrator demo and everything still looks amazing.
Infiltrator is indeed more towards the tech-demo end of the spectrum. There's no real physics, dynamic actor counts are low, and only limited animation runs for the few key cinematics. The fly-around mode is all static geometry, and of course the demo is largely baked lighting.

The point is really that Matrix/CitySample does have most of the expensive systems a full game would have in place; it's certainly a lot closer to a real game than your typical rendering demo.
Watching it again now, I still think it looks better than any available game. The solidity of the materials and rendering is still just amazing.
It definitely has very good art fitted to the tech of the time, but I don't think I'd personally go as far as you there. Bumpy+shiny is a crowd-pleaser, though, and has been since the early days of rendering. I personally think most parts of a modern game like Jedi Survivor look better than that demo, and do so in much larger environments. Going further back, I'd even put the original Star Wars Battlefront up against any of these demos.
 
To my eyes it looks much closer to CGI, while the titles you mentioned still look quite a bit more gamey by comparison.
 