
Yes, and Brian Karis also said in that same stream that "I have the whole level loaded in memory right now".

Time-stamped: [embedded video]


Okay, but what's your point? If it's to suggest the only reason the demo was able to run on a PC is because the whole thing was stored in memory, and would otherwise be too much to stream, then let's look at the math of that...

The demo is about 8:30 (~510 seconds) in length.
We know the Nanite data was only 6.14GB on disk.
So the average geometry streaming requirement would be around 6.14GB / 510s ≈ 12MB/s!

Okay, so texture data is much larger, as mentioned in the video, but how much larger? We know the entire demo fit in Brian's PC's memory, so let's go crazy and assume he has 128GB of RAM in that thing and it's entirely full with this demo.

Even at 128GB the average streaming requirement would still be only about 250MB/s - well within the capabilities of a SATA SSD, let alone an NVMe drive.
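
To make the arithmetic explicit, here's the back-of-the-envelope version of the above (the 128GB figure is my deliberately pessimistic assumption, not a known spec of Brian's machine):

```python
# Back-of-the-envelope streaming math for the demo figures above.
# 6.14GB of Nanite data and the 8:30 runtime come from the demo itself;
# 128GB is the deliberately pessimistic "demo fills all of RAM" guess.
demo_seconds = 8 * 60 + 30                  # 8:30 -> 510 s

nanite_mb = 6.14 * 1000                     # Nanite geometry on disk
worst_case_mb = 128.0 * 1000                # assume RAM is entirely full

print(f"Geometry: {nanite_mb / demo_seconds:.1f} MB/s")       # ~12.0 MB/s
print(f"Worst case: {worst_case_mb / demo_seconds:.1f} MB/s") # ~251.0 MB/s
```

Either number is a small fraction of even SATA SSD bandwidth (~550MB/s sequential).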

Then we've got this post specifically telling us that the PS5's I/O isn't required to run this demo.
 
Yes, and Brian Karis also said in that same stream that "I have the whole level loaded in memory right now".
I already explained what he meant by that here: https://forum.beyond3d.com/threads/...ilability-2022-q1.61740/page-112#post-2208251. Again, it has nothing to do with Nanite... the context was just why the scripting that controlled the light direction throughout the demo was not running. Virtual textures and Nanite were streaming as he flew around there just like they always do, with no issues. There's not even currently a way to forcibly prefetch Nanite data, because there's no reason to.
 
I admit that the statement is a bit misleading in the context of demoing on the PS5, but I'll also point out that it was made clear at the time that it did not require the PS5 (among other places).

There's no debunking needed for Nanite and Lumen being possible on platforms other than the PS5. That much was made perfectly clear 13 months ago in this very same thread.

My question, which I should have clarified better (TBH I was making the most out of some well-deserved vacations with the kids, so I just shot off a short post in-between pool games, sorry about that), is whether that specific statement from Nick Penwarden is incorrect or should be interpreted in a different way.
My interpretation of his claim is that all the data from the assets seen at certain points in the demo wouldn't fit inside the console's RAM (not just Nanite / geometry, but texture data too I guess?), so triangles+textures needed to be "constantly streamed in" from the SSD. Him saying the PS5's I/O was "a key feature" that allowed them to achieve that level of realism is what tilted many of us into thinking the data throughput had to be massive. Obviously, I'm referring to the specific IQ settings at the ~1440p30 target we saw in last year's demo running on the PS5, and not to Nanite/Lumen in general.

I guess my simplified question: is the 2020 demo "Lumen in the land of Nanite" rendered at the PS5's IQ settings + resolution + framerate not pushing anywhere close to the raw 5.5GB/s / effective >8GB/s I/O? Would it be solely bottlenecked by GPU compute if paired with e.g. an entry-level 2GB/s NVMe SSD without a fast fixed-function decompressor?




Also, this isn't aimed at you because you're our local authority on the subject and I thank you for the clarification, but I don't get other people's need to debunk something that doesn't need debunking. It's been obvious throughout this thread that Nanite and Lumen were never going to be exclusive to the PS5. Perhaps such an idea circulated in YouTube comments and on Twitter, but I haven't seen a single post here claiming such a thing. Making that a requirement would marginalize most of UE5's addressable market, and it would be a terrible business choice.
Perhaps making lots of gotcha! posts might feel satisfying somehow, but they're not applicable in this thread and they're not really bringing anything to the table IMO.

Lastly, I'd really like to thank you and your team for the amazing work you've done with UE5. My gamer friends and I haven't felt this level of breakthrough in realtime graphics since the E3 demo of Doom 3, and that was almost 20 years ago!



There is a flaw in the logic here: referring to X instances of the same statue stresses neither memory nor storage. So I would assume this is probably a failure to reproduce what was initially meant. I mean, such interviews get edited and shortened, and journalists make plenty of mistakes too.
The sentence isn't coming from an edited interview; it's a direct quote from last year's roundtable at Summer Game Fest. Here is Nick Penwarden's statement, properly timestamped.
There's also a bunch of statements from Tim Sweeney clarifying that Nanite is possible on PC and Xbox Series X (Series S hadn't been disclosed back then), not just the PS5.
Though after listening to the video, it seems he wasn't talking about the statues' room. He's talking about realism in the demo in general.
 
I guess my simplified question: is the 2020 demo "Lumen in the land of Nanite" rendered at the PS5's IQ settings + resolution + framerate not pushing anywhere close to the raw 5.5GB/s / effective >8GB/s I/O? Would it be solely bottlenecked by GPU compute if paired with e.g. an entry-level 2GB/s NVMe SSD without a fast fixed-function decompressor?
Again, I won't put words in Nick's mouth, but to your specific question I can reiterate to be absolutely clear: the demo Brian was running on his PC during the Twitch stream with just a regular SSD (no DirectStorage or anything fancy) is the same demo/content that was run last year on the PS5. Obviously there have been engine improvements since then, but nothing that really affects the I/O question here. An SSD is important to make this stuff work well, but it doesn't need to be a super fancy one. All the Nanite and virtual texture data can happily stream as you move around the world dynamically. Fabian's Twitter thread that was linked earlier is a great summary in general.

Lastly, I'd really like to thank you and your team for the amazing work you've done with UE5. My gamer friends and I haven't felt this level of breakthrough in realtime graphics since the E3 demo of Doom 3, and that was almost 20 years ago!
I'll defer the majority of credit to Brian and the others as I'm new here, but I'm glad you're excited! Honestly, while I am impressed by this stuff at a technological level, the thing that actually gets me hyped is being able to jump into the editor, easily import some meshes and models, and throw together some really nice looking scenes as a non-artist. Obviously real artists are still going to make stuff that looks way better than my amateur attempts, but raising the floor and making it much easier for everyone to be creative is really empowering, as we're already seeing from the crazy stuff people are making in the short time since early access has been available. Personally, realizing that we're kinda getting to the point where it's faster to iterate and mess around with stuff in UE5 than in traditional CAD tools for a lot of tasks, even if the goal is not to make a real-time application, is kind of mind-blowing.
 
I think allowing creative people to work without as much worry about polygon and draw budgets will be a great thing for gaming. Obviously there will always be performance considerations, but it’s one less thing to worry about.
 
I think allowing creative people to work without as much worry about polygon and draw budgets will be a great thing for gaming. Obviously there will always be performance considerations, but it’s one less thing to worry about.
I doubt that polygon and other asset budgets will go away entirely, due to the simple need to fit the game onto the console's storage.
The recent demo - being what, 10 minutes long? - is already 25GB of cooked data, which is about a quarter of what you'd realistically want your whole game to weigh in distribution.
 
The recent demo - being what, 10 minutes long? - is already 25GB of cooked data, which is about a quarter of what you'd realistically want your whole game to weigh in distribution.

There seems to have been very little if any thought given to optimizing asset sizes in the demo. Looks like the team quickly threw it together with a crazy amount of overlapping assets / overdraw.

Nanite took care of that from a performance standpoint but there’s probably a lot that could’ve been done to reduce asset sizes on disk by properly modeling the environment instead of just throwing a bunch of rock clusters on top of each other.
 
There seems to have been very little if any thought given to optimizing asset sizes in the demo. Looks like the team quickly threw it together with a crazy amount of overlapping assets / overdraw.

Nanite took care of that from a performance standpoint but there’s probably a lot that could’ve been done to reduce asset sizes on disk by properly modeling the environment instead of just throwing a bunch of rock clusters on top of each other.
Yeah, but that's exactly what artists will have to care about. So while Nanite (and Lumen, to a lesser degree) solves the technical reasons designers had to care about how they design assets, there will still be other limitations which won't allow them to just dump raw models into the project and be done with it.
 
Nanite took care of that from a performance standpoint but there’s probably a lot that could’ve been done to reduce asset sizes on disk by properly modeling the environment instead of just throwing a bunch of rock clusters on top of each other.
I think it's the exact opposite: modeling with (few) rock instances needs less storage than modeling a whole environment at similar detail.
That's what is most shocking to me about UE5 and the workflow it suggests. It has its artistic limitations, but there is no better form of compression.
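
To put rough numbers on that (the sizes are invented purely to show the scale of the effect, not measured from the demo):

```python
# Why instancing is such effective "compression": the mesh is stored
# once and each placement only costs a transform. Sizes are made-up
# round numbers for illustration, not measurements from the demo.
mesh_bytes = 100 * 1024**2         # one detailed rock mesh: 100 MB
instances = 1000                   # times it is placed in the level
transform_bytes = 64               # 4x4 float32 matrix per placement

instanced = mesh_bytes + instances * transform_bytes
unique = instances * mesh_bytes    # if every rock were a unique mesh

print(f"Instanced: {instanced / 1024**2:.2f} MB")  # ~100.06 MB
print(f"Unique:    {unique / 1024**3:.1f} GB")     # ~97.7 GB
```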
 
The only problem is that in games you start noticing the same features repeated as they're reused. Even across different games.
 
I think it's the exact opposite: modeling with (few) rock instances needs less storage than modeling a whole environment at similar detail.
That's what is most shocking to me about UE5 and the workflow it suggests. It has its artistic limitations, but there is no better form of compression.

That's true. There was a lot of asset reuse, but there was also a lot of back-facing / occluded geometry. Would be interesting to see how it compares to an optimized version with more unique assets, but we're not so lucky.
 
Even across different games.
That's a big one. Guess they need to pump more money into scanning. The Quixel library feels tiny in comparison to something like Shutterstock.

Help games get better by playing even more Fortnite! :D

Edit: Killer storage idea: Shared local asset cache for all UE5 games. Download only what's missing. :)
 
How effective can compression be on geometry using zlib or Kraken?
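
Not an authoritative answer, but here's a minimal sketch of how you could measure it yourself with zlib (Kraken is part of the proprietary Oodle SDK, so there's no stdlib binding to show). The mesh is synthetic, so the printed ratios are only illustrative; real Nanite data would behave differently:

```python
# Measure zlib's effectiveness on a raw vertex buffer. Kraken (Oodle)
# is proprietary, so only zlib from the stdlib is shown here.
import struct
import zlib

# Synthetic "terrain": a 256x256 grid of float32 (x, height, y) triples.
verts = []
for y in range(256):
    for x in range(256):
        h = ((x * 31 + y * 17) % 97) / 97.0   # cheap deterministic bumps
        verts += [float(x), h, float(y)]

raw = struct.pack(f"{len(verts)}f", *verts)

# Quantizing to 16-bit ints first usually helps any entropy coder a lot.
scale = 65535.0 / 256.0
quantized = struct.pack(
    f"{len(verts)}H", *(min(65535, int(v * scale)) for v in verts)
)

for label, buf in (("raw float32", raw), ("quantized 16-bit", quantized)):
    packed = zlib.compress(buf, 9)
    print(f"{label}: {len(buf)} -> {len(packed)} bytes "
          f"({len(buf) / len(packed):.2f}:1)")
```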
 