Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

One big issue I see: what if this 9-minute demo was tens or hundreds of GB because of the ultra-high-detail models? We will not see this level of detail in 10-hour-plus games. It would be fun having only one game to fill the SSD ;). I think this demo was intended to show us the extreme possibilities, and based on all the reactions they certainly succeeded. When model detail is reduced to normal game sizes we will see higher FPS and resolutions. In light of this new way of rendering the game world, engines and hardware are getting close to no longer being the limiting factor; it comes down more to storage space and how many resources developers are willing to spend. Of course games need more on top of this, but it's always nice to see the "wheel reinvented". Exciting times ahead. :yes:
 
At normal speed it is impossible to see the loss of detail, and the level of detail is so high that the streaming problem is not visible at all. It probably depends on the SSD speed, but it shows that an SSD is never fast enough.
What laptop was it? Is that confirmed?
 

On Era, one dev said that compression techniques make size a lesser issue, and that Epic has likely already taken that into account to make it viable for devs to use.
 

Nah, size is insane for virtualized anything; there is absolutely no silver bullet for compression. Mathematically it's strictly impossible beyond a certain point, and the demo they showed would take an incredibly large amount of space for a full game. There are already jokes on dev Twitter about how many Blu-rays and terabytes you'd need for a 10-hour game. Hell, install size is one of the reasons the newest id Tech, the one that powers Doom Eternal, ditched much the same virtualized texture system, partly because of the large install size it requires.

Tim Sweeney did have ideas for getting near the maximum possible compression, but he talked about them back when UE4 was first released. Seven years later, nothing has come of it.
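To illustrate the "no silver bullet" point in a rough way: once data is close to maximum entropy, a general-purpose compressor gains almost nothing, while repetitive data shrinks enormously. A quick Python sketch (the two buffers are just stand-ins I made up for "unique noisy assets" vs. "instanced, repeated content"):

```python
import os
import zlib

# High-entropy data (a stand-in for unique scanned geometry) barely
# compresses; repetitive data (a stand-in for instanced content) collapses.
random_data = os.urandom(1_000_000)
repetitive_data = b"rock_tile" * 111_112  # ~1 MB of the same 9 bytes

compressed_random = zlib.compress(random_data, 9)
compressed_repetitive = zlib.compress(repetitive_data, 9)

print(len(compressed_random) / len(random_data))          # ~1.0: no gain
print(len(compressed_repetitive) / len(repetitive_data))  # tiny fraction
```

The ratio for the random buffer stays at essentially 1.0 no matter the compressor, which is the mathematical wall being referred to.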
 
If that's true, then all this tech they introduced is pretty much useless in most cases, no? Since it would take up so much space on its own. Seems like a silly thing for Epic to focus on for a new engine with no path forward, but I guess it wouldn't be the first time. What a fuck up.
 
Is it really necessary to go with the LOD level 0 assets? I think this is an option, not a must.

How about generating an LOD level 1 or level 2 asset and using that as the baseline for the engine? This should still be far better overall compared to the traditional detail workflow, as no normal maps or hierarchy of LOD levels need to be generated.

I do not know how memory scales with LOD level, but if one LOD step corresponds to roughly doubling the edge length of a triangle, the number of tris should be reduced by a factor of 4, and hence the memory too? So level 2 would reduce memory by a factor of 16 or so.

What LOD levels are typically used in modern games in close-up cinematics (the highest LOD)? Shouldn't this highest LOD level in current top PC games be a good baseline level for UE5? Or maybe one level less, to push fidelity beyond that?
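To put rough numbers on the scaling argument above, here's a quick Python sketch. The 33M-triangle figure is the one Epic quoted for the statue meshes in the demo, and the per-triangle byte cost is my own assumption, so treat the absolute sizes as illustrative; the 4x-per-level ratio is the point:

```python
# If each LOD step roughly doubles triangle edge length, triangle area
# quadruples, so the triangle count (and, to a first approximation, the
# geometry memory) drops by 4x per level.

def tris_at_lod(lod0_tris: int, level: int) -> int:
    """Approximate triangle count after `level` LOD reduction steps."""
    return lod0_tris // (4 ** level)

LOD0_TRIS = 33_000_000
BYTES_PER_TRI = 36  # assumption: ~3 vertices x 12 bytes of position data

for level in range(4):
    tris = tris_at_lod(LOD0_TRIS, level)
    print(f"LOD{level}: {tris:>11,} tris, ~{tris * BYTES_PER_TRI / 1e9:.2f} GB")
```

So LOD2 does land at 1/16th of the LOD0 memory, as the post estimates.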
 

I'd say go with whatever LOD / detail level you're comfortable with and run with that.
If you're the one scanning your own assets, you don't necessarily have to scan *billions and billions* of triangles, and you don't necessarily have to use the best assets. Find a level you're comfortable with and run with it.
 
I'm curious how game size limitations might be worked around by way of streaming.

A 400GB game isn't reasonable for the vast majority of players (although I'd personally be quite happy buying a photorealistic Skyrim VR on its own, dedicated SSD.) But if a PS+ or XBL subscription could grant streaming access to such a game, I wonder how popular that would be?

And in such an instance, would it make more sense to limit the max resolution to something like 1440p at the server end, and use the considerable grunt of the PS5/XSX to upscale it locally? Would it be wasteful for the server-console (for want of a better term) to spend resources upscaling to a full 4K image, only for some amount of that resolution to be lost in the compression->decompression process the streaming entails?

Or is the upscaling so intrinsic to the engine that it only makes sense to do so locally? Given their emphasis on temporal accumulation, I assume that some amount of upscaling and AA is a consequence of local temporal data, which presumably would be difficult to decouple in a server->client sense.

Edit: clarity and consistency.
 
Why would you need level 1 or 2, though? The way Unreal 5 works, scaling down the assets doesn't affect performance much, since it is now tied to the pixel count on screen, unless we are talking about a model like the character, which will use traditional rasterization in the scene. Or are we talking about the size of the in-game assets sitting on the drive?
 
Well, I wouldn't expect a typical CEO to know anything about their products; they tend to have other concerns and hire people to do the engineering. A CEO who is still hands-on with the technical implementation is fairly exceptional. So TBH, people here being better experts on something than the CEO of the multinational that makes it isn't that unrealistic. ;)
Tim Sweeney isn't a typical CEO: he started out developing his own games and still works on the Unreal Engine. The man understands technology and certainly has better insight into UE5 than anybody here, unless we have some Epic engineers floating around.
 

Yes, this is a real-life problem.
If games require SSD installation, next-gen consoles need to fit at least 8-10 games on the drive; with 800-1000 GB drives, that means around 100 GB (already compressed) each.
Some games that are not using LOD0 assets, such as COD, already need 200 GB of storage, and this will only become more standard as time passes.

Having hundreds of GB of geometry assets certainly doesn't help; maybe specialized compression plus LOD1-2 can.
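The per-game budget above works out like this (the 100 GB reserved for the OS and system data is my own assumption; drive sizes are the rough figures from the post):

```python
# Back-of-the-envelope: how big can each install be if a console wants
# `games` titles resident on its internal SSD at once?

def max_install_gb(drive_gb: int, games: int, reserved_gb: int = 100) -> float:
    """Per-game budget after reserving space for the OS (assumed 100 GB)."""
    return (drive_gb - reserved_gb) / games

print(max_install_gb(825, 8))    # roughly a PS5-class drive, 8 games
print(max_install_gb(1000, 10))  # a 1 TB drive, 10 games
```

Either way the budget lands around 90 GB per game, well under a 200 GB COD-style install.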
 
I think there is a mix of things here: I don't see why we need lots of repeated textures on disk (but maybe I'm being a bit dumb). Another thing is that disk space will increase over time, but as mentioned earlier, I'd be happy to have a dedicated drive for a game. Then again, I guess we run into the costs of producing the assets.
 
Fewer repeated textures on the SSD will also help bring the size down a bit, I believe.
 

Doesn't COD have massive install sizes because of the HDD? My understanding is that they're essentially laying out the data on the HDD so that a map can be read without constant seeking, which means data are duplicated many times.

Presumably SSDs, with their negligible seek times, wouldn't need to do this for efficiency's sake. The same game with the same assets *should* be GBs smaller.
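A toy calculation shows how much that duplication can cost; all the numbers here are made up for illustration, not from any real game:

```python
# HDD-era layout: shared assets are copied into every map's data so the
# disk head can read a level mostly sequentially. With an SSD's negligible
# seek time, one shared copy referenced by every map suffices.

SHARED_ASSETS_GB = 30   # assets used by every map
UNIQUE_PER_MAP_GB = 5   # content exclusive to one map
MAPS = 12

duplicated = MAPS * (SHARED_ASSETS_GB + UNIQUE_PER_MAP_GB)  # HDD layout
deduplicated = SHARED_ASSETS_GB + MAPS * UNIQUE_PER_MAP_GB  # SSD layout

print(duplicated, deduplicated)  # 420 vs 90 GB
```

The gap grows with the number of maps sharing the same content, which is why the dedup savings can be so dramatic.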
 

Megatexture was bloated because of the idea of unique assets; here it is not so extreme, there is tons of instancing.

Here it reminds me of Dreams, with the same mesh used multiple times.
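The instancing point is easy to quantify: store one mesh plus a cheap transform per placement, instead of many full copies. A minimal sketch with made-up sizes (the ratio is what matters, not the absolute numbers):

```python
# One high-detail mesh reused many times via per-instance transforms.

MESH_BYTES = 50_000_000   # one high-detail rock mesh
TRANSFORM_BYTES = 64      # a 4x4 float matrix per placed instance
INSTANCES = 500           # the same rock scattered around a scene

unique_copies = INSTANCES * MESH_BYTES             # megatexture-style uniqueness
instanced = MESH_BYTES + INSTANCES * TRANSFORM_BYTES

print(unique_copies / instanced)  # hundreds of times smaller on disk
```

This is why heavy instancing keeps Nanite-style geometry from ballooning the way fully unique megatexture data did.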
 
Yes, I agree. The LOD0 doesn't have to be at the super-insane quality we saw for every object in the demo, though. It's not like we can stick our faces against the walls and rocks of the environment to check micro-detail like in this demo. We can decide the optimal amount of detail based on how close the in-game camera will get to the objects most of the time. There's a chance we might save huge amounts of memory as well, since a lot of detail that was represented with high-res textures like normal maps and height maps (and maybe even the albedos themselves in some cases?) is no longer needed. We also don't need duplication of assets on the drive, thanks to the SSD's super-fast access.
Far-away areas will surely use lower-detail assets.
I am pretty sure the demo could be replicated with barely noticeable loss of detail using smaller assets. I think the detail was potentially overkill; why bother much with optimizations based on camera distance when the point was to show the capabilities and potential of the engine?
 
I’d be happy to have a dedicated drive for a game.

I don't know, but my feeling is that it will never be accepted by the gaming community.
One of the consoles has a strong feature: switching in no time from one game to another without loading, using the SSD. If I'm not wrong, with a limit of 5-8 games, this can't be done if a game becomes too big (the whole game plus a memory dump must be stored on the SSD).

And this opens new problems: in order to process giant geometry assets via compute, they will compete with textures for memory bandwidth.
 
If Epic doesn't have a reasonable solution for game size, this tech is dead in the water. I'm positive it would have been part of their design.

I highly doubt they went through all this trouble and didn't put any emphasis on game size.
 
There's already jokes on dev twitter about how many Blu-rays and terabytes you'd need for a 10 hour game.
Once we become really tired of all this repetition, which now moves from textures to geometry... well, this is why cloud streaming seems the only way forward.
I do not want to depend on being always online. We'll see how good artists are at hiding repetition, but sooner or later there won't be any alternative. :/
 

I think so too, but the only alternative I'm hearing is that Epic is so incompetent that they didn't think about any scenario outside of showing off a pretty demo.

I think it's a silly notion on the face of it. Hence my earlier sarcasm.
 