Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

Without dedicated hardware there is no ray tracing and no AI super resolution, so hardware plays a major role.
With Tensor and RT cores, Nvidia has ushered in a new and exciting graphics era; I didn't see much development before that. Without ray tracing, the games of today still wouldn't stand out much from Star Wars Battlefront (2015).

Hellblade from Microsoft could currently be the best-looking console game. The flight simulator will also be technically complex. Judging by the videos, the level of detail in the new flight simulator is very good, but there are a lot of shadow issues in the lighting.

I haven't seen much technical ambition from Sony recently, whether in PC ports or Sony exclusives. They used to be further ahead; now they are lagging behind.

People often complain about Ubisoft, but Avatar and Star Wars Outlaws are currently among the best-looking games on PC.

Today smaller studios like Remedy can also achieve a lot. Maxed-out Alan Wake 2 is only for hardware enthusiasts, but it looks extremely good.

Using Unreal, small studios can produce very good-looking games that most big studios can hardly keep up with.
I don't disagree, but at this moment in time we're seeing a lot of momentum behind the compute pipeline. The UE team has accomplished a significant amount in various areas with compute shaders.

It enables them to backport their work to much older GPUs. That isn't to say RT cores and AI processing aren't important; they're both landmark features. But the engine needs to be beefy to really make games look next gen, whether a team decides to leverage that hardware or not.
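As an illustration of what "software based" tracing on compute looks like in practice, here is a minimal sphere-tracing sketch against a signed distance field, which is roughly the kind of query software Lumen issues from plain compute shaders. Hypothetical standalone C++ for illustration, not Epic's actual code:

```cpp
// Minimal sketch of sphere tracing against a signed distance field (SDF).
// Hypothetical example code; not Epic's implementation.
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

// Example scene SDF: distance from point p to a unit sphere at the origin.
float SceneSdf(Vec3 p) {
    return std::sqrt(p.x * p.x + p.y * p.y + p.z * p.z) - 1.0f;
}

// March along the ray, stepping by the SDF value each iteration. Because the
// SDF bounds the distance to the nearest surface, no step can overshoot it.
bool SphereTrace(Vec3 origin, Vec3 dir, float maxT, float* hitT) {
    float t = 0.0f;
    for (int i = 0; i < 128 && t < maxT; ++i) {
        float d = SceneSdf(add(origin, mul(dir, t)));
        if (d < 1e-3f) { *hitT = t; return true; }  // close enough: hit
        t += d;                                     // safe step size
    }
    return false;  // missed, or ran out of steps
}
```

Because this needs nothing beyond compute support, it runs on GPUs that predate RT cores entirely.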

I don’t think any other engine is close to what UE5.5 can do right now (hardware or software based).

It's just a crazy amount of work that needs to be done; studios that can develop both a game and an engine at the same time are likely reconsidering.

With so many support studios picking it up, and with everyone knowing Unreal, it just seems to make sense for so many studios to make the switch.
 
Silent Hill 2 running on PS2 (2001).



Silent Hill 2 remake running on PS5 (2024).



Do I have to elaborate? I don't care how they did it on PS2. The PS5 is like 100 times more powerful than the PS2.
 
I wonder why, with software Lumen, you can't have at least just the main character traced against triangles instead of the SDFs.

But in some scenes, I don't know how, they are doing the main character's reflection.
Maybe that's what they are doing here :unsure:

Edit: found some footage of the reflection on a mirror (minor spoilers)

 
Edit: Some comments from the guy on Twitter say accumulation is 12 frames by default. You can lower that to reduce ghosting, but then the tradeoff is noise. I imagine, like all accumulation problems, at 120 fps or higher it'll look a lot nicer, and MegaLights would help people hit 120 fps.
Yep, or increase sample counts or lights/pixel. There are a myriad of tradeoffs that can be made, but as I think everyone in real time is increasingly aware (and as offline has known for decades...), the fundamental tradeoffs between performance, noise, ghosting and sampling are going to be the main dials for the foreseeable future. You can shake up how you expose those dials and mess with various heuristics for different content, but there's no escaping signal theory in the end.
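To make that 12-frame knob concrete: exponential accumulation with blend factor alpha = 1/N behaves like averaging roughly the last N frames, so raising N suppresses noise but drags stale history behind moving objects. A minimal sketch in C++, as a hypothetical helper rather than MegaLights' actual denoiser:

```cpp
// Minimal sketch of exponential temporal accumulation (hypothetical helper,
// not MegaLights' actual denoiser). Blending with alpha = 1/N approximates
// averaging the last N frames: larger N = less noise, more ghosting;
// smaller N = the reverse.
struct Pixel { float r, g, b; };

Pixel Accumulate(Pixel history, Pixel current, int framesToAccumulate) {
    float alpha = 1.0f / static_cast<float>(framesToAccumulate);  // e.g. 1/12
    return {
        history.r + (current.r - history.r) * alpha,
        history.g + (current.g - history.g) * alpha,
        history.b + (current.b - history.b) * alpha,
    };
}
```

At 120 fps the same 12-frame window spans about 100 ms of history instead of about 400 ms at 30 fps, which is one reason higher frame rates shrink ghost trails without adding noise.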

MegaLights itself will continue to improve but as with similar algorithms in other games, there's no free lunch. But you can certainly crank up sample counts and such on high end PCs to be more directly comparable to more expensive stuff like RTXDI.
 
But you can certainly crank up sample counts and such on high end PCs to be more directly comparable to more expensive stuff like RTXDI.

Is there something about RTXDI that makes it inherently more expensive? Or is it just that RTXDI turns those same knobs higher by default?
 
Additional UE5 developer plugins are available.
For starters, the Audio2Face 3D Plugin is now available for both Unreal Engine 5 and Autodesk Maya. This tool enables AI-powered facial animation and lip syncing: it analyses audio and then generates animation to best match what the character is expressing.

The Nemotron-Mini 4B Instruct Plugin and the Retrieval-Augmented Generation (RAG) Plugin are also now available for Unreal Engine 5. These two tools provide response generation for interactive character dialogue and supply contextual information to enhance character interactions.

Finally, Nvidia has also announced that Epic's Unreal Pixel Streaming technology now supports Nvidia ACE, allowing developers to stream high-fidelity MetaHuman characters via Web Real-Time Communication (WebRTC). This is designed to help developers bring AI-powered digital humans to games and applications with low latency and minimal memory usage on Windows PCs.
 
MegaLights itself will continue to improve but as with similar algorithms in other games, there's no free lunch. But you can certainly crank up sample counts and such on high end PCs to be more directly comparable to more expensive stuff like RTXDI.
RTXDI exhibits image blurring.

With both now available in UE5, it's only a matter of time before someone does a direct comparison.
 
Is there something about RTXDI that makes it inherently more expensive? Or is it just that RTXDI turns those same knobs higher by default?
There are definitely algorithmic differences; different design targets imply different choices. MegaLights is meant to work on consoles with some scaling up from there. RTXDI is targeted at high end PCs with some scaling down. I'm sure people will compare things in the middle ground and I imagine there will be different tradeoffs depending on how high or low the performance/hardware target is. Things are still very early for MegaLights so both it and RTXDI will likely continue to change a fair bit. While there are similar overall goals, I imagine the primary design targets will remain a bit different in the near term.
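For a sense of what those knobs actually sample: both approaches stochastically pick a handful of lights per pixel out of potentially thousands of candidates, and RTXDI in particular is built on ReSTIR-style resampling. A minimal weighted reservoir sampling sketch in C++, purely illustrative and not either product's code:

```cpp
// Minimal sketch of weighted reservoir sampling for picking one light out of
// many per pixel -- the building block behind RTXDI/ReSTIR-style direct
// lighting, and conceptually similar to what MegaLights does. Hypothetical
// standalone C++; intensity stands in for the real importance estimate.
#include <cstdlib>
#include <vector>

struct Light { float intensity; /* position, radius, etc. */ };

float Rand01() { return static_cast<float>(std::rand()) / RAND_MAX; }

// Stream over candidate lights, keeping one survivor with probability
// proportional to its weight. More candidates = less noise, more cost.
int PickLight(const std::vector<Light>& lights, int numCandidates) {
    int picked = -1;
    float weightSum = 0.0f;
    for (int i = 0; i < numCandidates; ++i) {
        int candidate = std::rand() % static_cast<int>(lights.size());
        float w = lights[candidate].intensity;
        weightSum += w;
        if (Rand01() < w / weightSum) picked = candidate;  // reservoir update
    }
    return picked;  // shade the pixel with this one light this frame
}
```

The candidates-per-pixel count is exactly the kind of dial discussed above: raising it cuts noise per frame but costs more, independent of whatever temporal reuse and denoising are layered on top.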

RTXDI exhibits image blurring.
Of course it does in some places, as does MegaLights. And both exhibit ghosting in various places too (see the discussion around the issues following the Cyberpunk 2077 ray reconstruction update). There's no magic. Anyone claiming these algorithms are immune to these tradeoffs is selling something. Various algorithms will have strengths and weaknesses, of course, but these tradeoffs are fundamental to stochastic algorithms with temporal reuse and denoising. Unfortunately, this is also what makes it really easy for anyone to cherry-pick good and bad cases to support whatever narrative they desire.
 
I'm sure Sweeney talked about the next 10 years in his keynote, but I didn't watch it. The Verge has an interview with him. Looks like the key aim for UE6 is not ultramegalights but bringing UE and Unreal Editor for Fortnite together. It's the metaverse, baby! 🤮

Ok, I can dig that... but at the same time the key aim should be making all the features they have now more performant and fixing the serious pain points that keep recurring across Unreal Engine releases. Go to the forum of any Unreal Engine based game and you will find everyone criticizing, and quite frankly making fun of, Unreal Engine and its traversal stuttering. More and more games now pre-compile shaders, which is great, but that shifts the focus hard onto these loading stutters. Enough of a stink was made about shader compilation that Epic took notice and made major changes and improvements... hopefully the same can happen here and they do something about it, because it IS true that very few Unreal Engine games are free of this issue.
 
That seems to be true of any complex software. Progress is made by extending features rather than getting what's already there to 100%. But as the two activities, adding features and bug-fixing, are largely orthogonal, I don't know how much devs should focus entirely on fixing things. Though it certainly feels like 'quality' is a lower priority these days than it used to be, with applications seeming to stay buggy for longer and sometimes pretty simple-looking bugs going unfixed over many iterations.
 
That seems to be true of any complex software. Progress is made by extending features rather than getting what's already there to 100%. But as the two activities, adding features and bug-fixing, are largely orthogonal, I don't know how much devs should focus entirely on fixing things. Though it certainly feels like 'quality' is a lower priority these days than it used to be, with applications seeming to stay buggy for longer and sometimes pretty simple-looking bugs going unfixed over many iterations.
To be nitpicky: bug-fixing and optimizing are two different things. Unfortunately, you can't easily prove that something is always working sanely. I appear to be tiptoeing around "bug" here, but there's just so much code around that is broken yet harmless. Most companies I worked for consider something a bug only when it shows apocalyptic misbehaviour.
So I think all three stages, features, bug-fixing and optimizing, are subject to cuts or a weakening of rigor: "this subfeature requires me to rewrite system xy, so I left it out"; "this code blows up when malloc() returns nullptr, but if that happens we have no way to continue anyway"; "this feature has quadratic complexity, but the code is only 100 lines long, and making it fast would mutate it into 100k lines, so live with it as it is".
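That last tradeoff is easy to show in miniature; a hypothetical example of "quadratic but short" versus "fast but bigger":

```cpp
// Illustration of the "quadratic but short" tradeoff: the naive check is a
// few obvious lines; the fast one drags in more machinery, and real systems
// scale that gap up enormously. Hypothetical example, not from any engine.
#include <unordered_set>
#include <vector>

// O(n^2), trivially correct, fine until n gets large.
bool HasDuplicateNaive(const std::vector<int>& v) {
    for (size_t i = 0; i < v.size(); ++i)
        for (size_t j = i + 1; j < v.size(); ++j)
            if (v[i] == v[j]) return true;
    return false;
}

// O(n) expected, but brings in a hash set and its own edge cases.
bool HasDuplicateFast(const std::vector<int>& v) {
    std::unordered_set<int> seen;
    for (int x : v)
        if (!seen.insert(x).second) return true;  // insert failed: duplicate
    return false;
}
```

Scale that gap up to a real subsystem and "live with it as it is" starts to look like a defensible call.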

I think, from my experience, the "problem" is that most code is just not universal/general; it's specific/contextual. And even if you swear on your family that this refactor is the refactor to end them all... it's just not true. Adding a new feature produces a cascade of changes, with new bugs and new suboptimalities. And you get taught the mantra that everything should be as specific as possible.
Ultimately, I guess, it's just that our way of expressing all of "this", the code, might not be the right way to do it. Not if you want to make changes and, as a consequence, not get bugs or slowdowns.
 
It's ironic that 343 chose to maintain its own fork of Bungie's Blam engine, despite retaining few if any ex-Bungie devs experienced with that engine, only to eventually move to Unreal anyway, while The Coalition inherited a franchise that already ran on Unreal Engine along with many experienced ex-Epic devs.
 