Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

Feel free to pick your own numbers and compression to try and get a whole world made out of source-data quality onto a 100 GB game. ;) I look forward to seeing people's creative data manipulation to come up with a solution to this.
Negative.
I'm going to do the smart thing and ask some of our industry CGI movie professionals on the board.
So... I hope I'm right, but @Ike Turner should have some ballpark estimates!
 
Wow!! Super impressive :love: :D. One funny thing: when I linked to Unreal Engine 4.25 with next-gen console support some days back, I was thinking, why 4.25 and not use the chance to go to Unreal Engine 5? Well, here it is!
 
1440p 30fps according to DF.

And in their written article they state:

Penwarden also confirms that the temporal accumulation system seen in Unreal Engine 4 - which essentially adds detail from prior frames to increase resolution in the current one - is also used in UE5 and in this demo. The transparency here from Epic is impressive. We've spent a long time poring over a range of 3840x2160 uncompressed PNG screenshots supplied by the firm. They defy pixel-counting, with resolution as a metric pretty much as meaningless as it is for, say, a Blu-ray movie. But temporal accumulation does so much more for UE5 than just anti-aliasing or image reconstruction - it underpins the Lumen GI system.
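For anyone who hasn't run into temporal accumulation before, the core idea is simple enough to sketch in a few lines (a toy illustration of the general technique, not Epic's implementation): render each frame with a sub-pixel jitter and blend it into a history buffer, so a static scene converges toward a supersampled image over time.

```python
import numpy as np

def accumulate(history, current, alpha=0.1):
    """Exponential blend of the current jittered frame into the history.

    Lower alpha keeps more detail from prior frames (better effective
    resolution) at the cost of more ghosting when things move.
    """
    return (1.0 - alpha) * history + alpha * current

# Toy usage on a static "scene": accumulation averages away the
# per-frame sampling noise, which is exactly the extra detail the
# accumulated image gains over any single frame.
rng = np.random.default_rng(0)
truth = rng.random((4, 4, 3))                 # stand-in ground-truth image
history = np.zeros_like(truth)
for _ in range(64):
    noisy = truth + rng.normal(0.0, 0.05, truth.shape)  # one jittered sample
    history = accumulate(history, noisy)
print(np.abs(history - truth).mean())         # small residual error
```

In a real renderer the history has to be reprojected with motion vectors before blending, and rejected where it's stale; that's where most of the engineering lives.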
 
I have to say, watching it a second time, I'm not very impressed with the character model. It looks out of place in the environment.
 
Some important tidbits from DF:
We can have UE5 games on this generation in theory. Should still be a big uplift over UE4 today.

Epic is keen to stress a strong commitment to the interoperability of its new technology across multiple systems, despite demonstrating on PlayStation 5, where Sony has made strong arguments about the need for extreme bandwidth from storage. Meanwhile, Microsoft has developed DirectX 12 Ultimate, which also includes a radical revamp of how storage is handled on PC, but apparently the firm isn't leaning heavily on any one system's strength. However, subsequent to our interview, Epic did confirm that the next-gen primitive shader systems are in use in UE5 - but only when the hardware acceleration provides faster results than what the firm describes as its 'hyper-optimised compute shaders'.

"A number of different components are required to render this level of detail, right?" offers Sweeney. "One is the GPU performance and GPU architecture to draw an incredible amount of geometry that you're talking about - a very large number of teraflops being required for this. The other is the ability to load and stream it efficiently. One of the big efforts that's been done and is ongoing in Unreal Engine 5 now is optimising for next generation storage to make loading faster by multiples of current performance. Not just a little bit faster but a lot faster, so that you can bring in this geometry and display it, despite it not all fitting and memory, you know, taking advantage of next generation SSD architectures and everything else... Sony is pioneering here with the PlayStation 5 architecture. It's got a God-tier storage system which is pretty far ahead of PCs, bon (but) a high-end PC with an SSD and especially with NVMe, you get awesome performance too."

And that, in a nutshell, is the definition of a micro-polygon engine. The cost in terms of GPU resources is likely to be very high, but with next-gen, there's the horsepower to pull it off and the advantages are self-evident. Rendering one triangle per pixel essentially means that performance scales closely with resolution. "Interestingly, it does work very well with our dynamic resolution technique as well," adds Penwarden. "So, when GPU load gets high we can lower the screen resolution a bit and then we can adapt to that. In the demo we actually did use dynamic resolution, although it ends up rendering at about 1440p most of the time."
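That dynamic resolution behaviour is easy to picture with a toy controller (illustrative numbers only; Epic's actual heuristics aren't public): compare GPU frame time against the 33.3 ms budget for 30 fps and scale the render resolution to match, exploiting the fact that a one-triangle-per-pixel renderer's cost tracks pixel count.

```python
TARGET_MS = 1000.0 / 30.0        # 33.3 ms frame budget at 30 fps

def next_resolution_scale(scale, gpu_ms, lo=2/3, hi=1.0):
    """Adjust the render-resolution scale so GPU time tracks the budget.

    Cost is roughly proportional to pixel count (scale**2) when you
    render about one triangle per pixel, hence the square root.
    lo = 2/3 of 2160p is 1440p, about where the demo settles.
    """
    corrected = scale * (TARGET_MS / gpu_ms) ** 0.5
    return max(lo, min(hi, corrected))

scale = 1.0                                   # start at native 2160p
for gpu_ms in [45.0, 40.0, 34.0, 32.0]:       # hypothetical frame times
    scale = next_resolution_scale(scale, gpu_ms)
    print(f"render at {int(2160 * scale)}p")
```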
 
Yikes, 1440p. It will be really interesting to see what Navi 2 and GeForce 3x00 cards can do with this demo.
 
I remember being so impressed with the Samaritan and Elemental demos back in the day. This is also really impressive!
 
Yikes, 1440p. It will be really interesting to see what Navi 2 and GeForce 3x00 cards can do with this demo.
It's a heavy demo.

If games are to look like this, we're likely looking at going back to 1080p/1440p at 30fps once you factor in enemies, effects, etc.
We'll need some other form of upscaling to move the resolution back up to mainstream televisions... or just not care about it, which is also possible.
Going without temporal accumulation would have reduced the latency on lighting and the like, but I guess there is only so much power available.
 
I have to say, watching it a second time, I'm not very impressed with the character model. It looks out of place in the environment.
Yep. It's like she isn't part of the same lighting, which makes one wonder if Lumen is for scenery, not interactives? That seems very likely given what we know (expect?) of the hardware: full-on traced-everything lighting isn't really doable (in this manner).

It paves the way for a new generation of VFX content production, though. Things like animated TV series should be far cheaper to make on new realtime engines. Movie previs will also be greatly improved going forward.
 
So from what I'm reading, the researcher at Epic who started this initiative for revamping the geometry engine has come up with a solution based on a few things that were proposed in the past. One is geometry textures that store a polygon per texel, or something like that, and one is a sparse voxel octree, which does essentially the same thing. So essentially it's analogous to virtual texturing, but for geometry. You don't have to load the whole model from the SSD; you load the parts of the model that are visible in the scene, or close to it. You can avoid sub-pixel polygons and get roughly one polygon per texel. That massively cuts down on the amount of geometry they need in memory. You can probably avoid loading the entire model into memory and then culling the whole thing. At least that's how I'm understanding what I'm reading.
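Sketching that virtual-geometry idea in code (every name here is hypothetical; this is just the virtual-texturing analogy made concrete, not Epic's system): split the mesh into streamable pages at multiple detail levels, then each frame request only the visible pages whose level of detail keeps triangles at roughly a pixel.

```python
from dataclasses import dataclass

@dataclass
class GeometryPage:
    """Hypothetical streamable chunk of geometry, like a texture page."""
    page_id: int
    lod: int                 # 0 = full source detail
    resident: bool = False   # already in memory?

def required_lod(distance, base_edge_m, px_per_m_at_1m=2048.0, max_lod=12):
    """Finest LOD whose triangles still cover at least ~1 pixel.

    base_edge_m: edge length of a full-detail triangle in metres.
    Each coarser LOD doubles the edge, so stepping up until the
    projected edge reaches ~1 px avoids wasteful sub-pixel triangles.
    """
    edge_px = base_edge_m * px_per_m_at_1m / max(distance, 1e-3)
    lod = 0
    while edge_px < 1.0 and lod < max_lod:
        edge_px *= 2.0
        lod += 1
    return lod

def pages_to_stream(pages, distance, visible_ids, base_edge_m=0.001):
    """Request only visible, non-resident pages at the needed LOD.

    Anything culled or already coarse enough never touches memory or
    the SSD, which is what would keep a 33M-triangle statue from ever
    being fully loaded, let alone fully drawn.
    """
    lod = required_lod(distance, base_edge_m)
    return [p for p in pages
            if p.page_id in visible_ids and p.lod == lod and not p.resident]
```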
 
For a demo, sure. The entire 825 GB of the PS5's SSD filled with the one scene, perhaps. ;)

Let's do some very foggy maths. Every triangle consists of 3 vertices, but let's say we can optimise down to one vertex per triangle. Positions need to be defined in 3 dimensions at 2 bytes each for 16-bit precision (which I'm sure is too little, and they're really 32-bit...). That's 6 bytes per vertex × 33 million triangles, which is about 200 million bytes for one statue.
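Running those foggy numbers (same optimistic assumptions, so read the result as a lower bound):

```python
BYTES_PER_COMPONENT = 2          # 16-bit position components (optimistic)
COMPONENTS = 3                   # x, y, z
VERTS_PER_TRI = 1                # assume perfect vertex sharing
TRIS = 33_000_000                # triangles quoted for one statue

bytes_per_statue = TRIS * VERTS_PER_TRI * COMPONENTS * BYTES_PER_COMPONENT
print(bytes_per_statue / 1e6)    # ~198 MB, before normals, UVs, compression

GAME_SIZE = 100e9                # a 100 GB game
print(GAME_SIZE / bytes_per_statue)  # ~505 such statues fill the disc
```

So even storing positions alone at half precision, a 100 GB game holds only about 500 unique assets at that density, which is why the question below about whether every triangle is unique (i.e. instancing) is exactly the right one.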

Feel free to pick your own numbers and compression to try and get a whole world made out of source-data quality onto a 100 GB game. ;) I look forward to seeing people's creative data manipulation to come up with a solution to this.

Why?

Is every single triangle in there unique?
 
So I was wrong, everything is compute-based lighting:
Hardware accelerated ray tracing will be supported in Unreal Engine 5, for example, but it's not a part of the PS5 tech demo revealed today.

I don't even know if RT will work with these polygon counts. Curious to see how the 2080 Ti, Ampere, and big Navi tackle this.

There are a lot of things running through my head with this demo. They flipped everything upside down. There are no more traditional draw calls; geometry is sampled from some kind of 3D texture or data structure, it sounds like. That throws a wrench in pretty much everything that follows.
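To make "no more traditional draw calls" concrete, here's a toy software rasteriser for pixel-sized triangles (pure speculation about what a compute path might look like, not Epic's shaders): once a triangle covers about one pixel, you can skip edge walking entirely and just depth-test a single splat per triangle.

```python
import numpy as np

def splat_micropolygons(points, depths, colors, width, height):
    """Depth-tested point splatting: one pixel-sized triangle -> one pixel.

    points: (N, 2) projected screen positions of triangle centroids
    depths: (N,)   view-space depth per triangle
    colors: (N, 3) shaded colour per triangle
    A GPU compute shader would do the same with an atomic min on a
    packed depth+payload value per pixel; numpy loops stand in here.
    """
    depth_buf = np.full((height, width), np.inf)
    color_buf = np.zeros((height, width, 3))
    xs = np.clip(points[:, 0].astype(int), 0, width - 1)
    ys = np.clip(points[:, 1].astype(int), 0, height - 1)
    for i in range(len(depths)):          # plain z-buffer test per splat
        x, y = xs[i], ys[i]
        if depths[i] < depth_buf[y, x]:
            depth_buf[y, x] = depths[i]
            color_buf[y, x] = colors[i]
    return color_buf
```

The interesting part is everything this sketch leaves out: the LOD selection that guarantees triangles really are pixel-sized, and falling back to hardware (or primitive-shader) rasterisation for larger triangles, which matches the DF note above about hardware paths only being used where they beat the compute shaders.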

Edit: I imagine rendering devs, industry-wide, are looking at their geometry engines and wondering how many years it'll take them to catch up to UE.
 
It's a heavy demo.

If games are to look like this, we're likely looking at going back to 1080p/1440p at 30fps once you factor in enemies, effects, etc.
We'll need some other form of upscaling to move the resolution back up to mainstream televisions... or just not care about it, which is also possible.
Going without temporal accumulation would have reduced the latency on lighting and the like, but I guess there is only so much power available.

It's a shame. Hopefully there is a happy middle ground between this tech demo and a playable game. I am not sure how happy I would be if we went back to 1080p/1440p, although I can just buy a faster video card. It will suck getting hampered by another console generation.

Yep. It's like she isn't part of the same lighting, which makes one wonder if Lumen is for scenery, not interactives? That seems very likely given what we know (expect?) of the hardware: full-on traced-everything lighting isn't really doable (in this manner).

It paves the way for a new generation of VFX content production, though. Things like animated TV series should be far cheaper to make on new realtime engines. Movie previs will also be greatly improved going forward.

Maybe using ray tracing on just the character and what they interact with could fix the issue?

Also, not just animated shows but regular shows. Look at The Mandalorian.
 