Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

1440p is a great choice. There's a huge amount of rendering fidelity that's untapped at lower resolutions. Most games at 1080p don't look anything close to offline CG yet. I think it's smarter to pursue bridging that gap at lower resolutions than chasing native 4K.
If this is 1440p 30fps with just a few bats and one character on the screen, how do you make a living world with multiple characters and maintain that fidelity at that resolution? Are they going to have to go down to 1080p to support that, or further? Cut back on features?

Hopefully, since the engine won't be available till 2021, they can tighten up performance.
 
You can probably avoid loading the entire model into memory and then culling the whole thing. At least that's how I'm understanding what I'm reading.
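A toy sketch of that idea (the cluster layout and all numbers here are entirely hypothetical, not Epic's format): split the model into clusters with precomputed bounds, run the visibility test first, and only ever fetch the surviving clusters from disk.

```python
import math

# Each cluster: (centre, radius, byte_offset, byte_size) -- hypothetical layout.
CLUSTERS = [
    ((0.0, 0.0, 5.0), 1.0, 0, 4096),       # in front of the camera
    ((0.0, 0.0, -50.0), 1.0, 4096, 4096),  # behind the camera
    ((100.0, 0.0, 5.0), 1.0, 8192, 4096),  # far off to the side
]

def visible(centre, radius, cam_pos=(0.0, 0.0, 0.0), max_dist=20.0):
    """Crude visibility test: in front of the camera (looking +z) and in range."""
    if centre[2] - cam_pos[2] + radius < 0:
        return False  # entirely behind the camera plane
    return math.dist(centre, cam_pos) - radius <= max_dist

# Only the surviving clusters are ever read from disk.
to_load = [(off, size) for centre, r, off, size in CLUSTERS if visible(centre, r)]
print(to_load)  # [(0, 4096)]
```

The point is the ordering: the model never has to be resident in memory to be culled, because the bounds are tiny compared to the geometry they describe.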
Inevitably. That's where Sony were going with their streaming intentions, and it was the dream even as far back as 2007, when virtual texturing was happening and virtual geometry was being talked about. I wonder if the SSD baseline was a catalyst, or just happened to coincide with what Epic were doing? I also wonder if they pursued it for gaming, or as a tech anyway (for non-gaming applications) that turned out to be a great fit for PS5, so they leverage that IP association for PR purposes?
 
If this is 1440p 30fps with just a few bats and one character on the screen, how do you make a living world with multiple characters and maintain that fidelity at that resolution?
It depends what the limiting factor was in creating the demo. Is it incapable of handling multiple characters and stuff, or did Epic want to showcase their new tech for environments and so put out something simpler than the best they could possibly do?
 
If this is 1440p 30fps with just a few bats and one character on the screen, how do you make a living world with multiple characters and maintain that fidelity at that resolution? Are they going to have to go down to 1080p to support that, or further? Cut back on features?
Remember, this tech demo is using some models with 33 million polygons and 8k textures throughout. It's an extreme and unrealistic example of what the engine is technically capable of doing, which is going to be a way away from what devs will be asking of UE5 in actual games.

Not to mention this is the state of UE5 now on devkit PS5. It's going to get optimised endlessly.
 
Inevitably. That's where Sony were going with their streaming intentions, and it was the dream even as far back as 2007, when virtual texturing was happening and virtual geometry was being talked about. I wonder if the SSD baseline was a catalyst, or just happened to coincide with what Epic were doing? I also wonder if they pursued it for gaming, or as a tech anyway (for non-gaming applications) that turned out to be a great fit for PS5, so they leverage that IP association for PR purposes?

The engineer at Epic has been looking at it since at least 2009, so I imagine it's a combination of a more flexible rendering pipeline that dropped the vertex and tessellation shaders for a more generic compute-based approach that isn't single-threaded in nature, and a sudden massive base-level increase in disk I/O.
 
Some important tidbits from DF:
We can have UE5 games on this generation in theory. Should still be a big uplift over UE4 today.

Epic is keen to stress a strong commitment to the interoperability of its new technology across multiple systems, despite demonstrating on PlayStation 5, where Sony has made strong arguments about the need for extreme bandwidth from storage. Meanwhile, Microsoft has developed DirectX 12 Ultimate, which also includes a radical revamp of how storage is handled on PC, but apparently the firm isn't leaning heavily on any one system's strength. However, subsequent to our interview, Epic did confirm that the next-gen primitive shader systems are in use in UE5 - but only when the hardware acceleration provides faster results than what the firm describes as its 'hyper-optimised compute shaders'.

"A number of different components are required to render this level of detail, right?" offers Sweeney. "One is the GPU performance and GPU architecture to draw an incredible amount of geometry that you're talking about - a very large number of teraflops being required for this. The other is the ability to load and stream it efficiently. One of the big efforts that's been done and is ongoing in Unreal Engine 5 now is optimising for next generation storage to make loading faster by multiples of current performance. Not just a little bit faster but a lot faster, so that you can bring in this geometry and display it, despite it not all fitting in memory, you know, taking advantage of next generation SSD architectures and everything else... Sony is pioneering here with the PlayStation 5 architecture. It's got a God-tier storage system which is pretty far ahead of PCs, but [on] a high-end PC with an SSD and especially with NVMe, you get awesome performance too."

And that, in a nutshell, is the definition of a micro-polygon engine. The cost in terms of GPU resources is likely to be very high, but with next-gen, there's the horsepower to pull it off and the advantages are self-evident. Rendering one triangle per pixel essentially means that performance scales closely with resolution. "Interestingly, it does work very well with our dynamic resolution technique as well," adds Penwarden. "So, when GPU load gets high we can lower the screen resolution a bit and then we can adapt to that. In the demo we actually did use dynamic resolution, although it ends up rendering at about 1440p most of the time."
Sounds like the raw teraflops advantage of the XSX and the SSD speed advantage of the PS5 more or less cancel each other out in the end for this technique. Just guessing, lol.
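The dynamic-resolution behaviour Penwarden describes can be sketched as a simple feedback loop. This is a hypothetical controller, not Epic's code, and all the numbers are made up:

```python
def next_resolution_scale(gpu_frame_ms, target_ms=33.3, scale=1.0,
                          min_scale=0.6, max_scale=1.0):
    """Nudge the render scale toward the frame-time budget."""
    # Pixel cost grows with scale^2, so adjust the linear scale by the
    # square root of the budget/actual frame-time ratio, then clamp.
    new_scale = scale * (target_ms / gpu_frame_ms) ** 0.5
    return max(min_scale, min(max_scale, new_scale))

# A 40 ms frame against a 30 fps (33.3 ms) budget: drop to ~91% of full res.
print(round(next_resolution_scale(40.0), 2))  # 0.91
```

Run every frame, this is why the demo can sit "at about 1440p most of the time" while dipping under heavy load.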

I hope UE5 is as widespread as UE4 is this gen, so that Nanite and Lumen can be used extensively in next-gen titles.
 
Inevitably. That's where Sony were going with their streaming intentions, and it was the dream even as far back as 2007, when virtual texturing was happening and virtual geometry was being talked about. I wonder if the SSD baseline was a catalyst, or just happened to coincide with what Epic were doing? I also wonder if they pursued it for gaming, or as a tech anyway (for non-gaming applications) that turned out to be a great fit for PS5, so they leverage that IP association for PR purposes?
Everything presented today will run on this generation of devices. So reduce the geometry and asset sizes, and the requirements will go down, as will the compute requirements.

The real question is how well the GCN architecture handles this.
 
If this is 1440p 30fps with just a few bats and one character on the screen, how do you make a living world with multiple characters and maintain that fidelity at that resolution? Are they going to have to go down to 1080p to support that, or further? Cut back on features?

Hopefully, since the engine won't be available till 2021, they can tighten up performance.

I think it's a little too early to be concerned about a playable tech demo that pretty much demolished the industry standards for geometry and texturing.
 
Inevitably. That's where Sony were going with their streaming intentions, and it was the dream even as far back as 2007, when virtual texturing was happening and virtual geometry was being talked about. I wonder if the SSD baseline was a catalyst, or just happened to coincide with what Epic were doing? I also wonder if they pursued it for gaming, or as a tech anyway (for non-gaming applications) that turned out to be a great fit for PS5, so they leverage that IP association for PR purposes?

As usual it's a joint "marketing" deal in this case:
"As for Microsoft’s Xbox Series X, Sweeney isn’t saying the new Xbox won’t be able to achieve something similar; both are using custom SSDs that promise blazing speeds. But he says Epic’s strong relationship with Sony means the company is working more closely with the PlayStation creator than it does with Microsoft on this specific area." link
This is Sweeney's bog-standard copy/paste answer every time a new tech is announced alongside a HW manufacturer (usually Nvidia).
Tim literally said on the live stream that both next-gen consoles and high-end PCs would run Nanite as seen in today's demo.
 
Everything presented today will run on this generation of devices. So reduce the geometry and asset sizes, and the requirements will go down, as will the compute requirements.

The real question is how well the GCN architecture handles this.

I'm very interested in how the virtual geometry works. If it's one polygon per texel, then in theory geometry will now roughly scale with resolution. I'm sure you end up reading geometric detail in chunks and having to cull the parts you don't need, but it should scale much more closely.
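The scaling claim is easy to put numbers on: at roughly one triangle per pixel, the visible triangle budget is simply the pixel count. A back-of-the-envelope sketch, not how Nanite actually budgets:

```python
def visible_triangle_budget(width, height, tris_per_pixel=1.0):
    """At ~one triangle per pixel, visible geometry tracks the pixel count."""
    return int(width * height * tris_per_pixel)

for label, (w, h) in [("1080p", (1920, 1080)),
                      ("1440p", (2560, 1440)),
                      ("4K", (3840, 2160))]:
    print(label, visible_triangle_budget(w, h))
# 1080p 2073600 / 1440p 3686400 / 4K 8294400 -- 4K costs 4x 1080p
```

That 4x jump from 1080p to 4K is exactly why dropping resolution buys so much headroom under this kind of renderer.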
 
Everything presented today will run on this generation of devices. So reduce the geometry and asset sizes, and the requirements will go down, as will the compute requirements.
If streaming is the intention, HDDs will be such a bottleneck that no game will scale back to them. I don't think these techniques will ever appear in a UE5 game on PS4 or XB1. I think by 'this gen', they mean current hardware including high-end PC GPUs that'll be using software solutions where the hardware features aren't present.
 
Geometry is sampled from some kind of 3D texture or data structure, it sounds like.
Not sure if this is a product of the DX12.1 feature set:
a) CR (conservative rasterisation)
b) ROVs (rasterizer ordered views)
c) ExecuteIndirect
d) Tier 3 Tiled Resources

and so forth.
 
I think it's a little too early to be concerned about a playable tech demo that pretty much demolished the industry standards for geometry and texturing.
I play games, not tech demos. Xbox One 2013, Xbox One X 2017, PS4 2013, PS4 Pro 2016. So if you're saying that it's too early to be concerned about the tech demo, should I just have no interest in it until I have to buy new hardware that might then run it properly?
 
If streaming is the intention, HDDs will be such a bottleneck no game will scale back. I don't think these techniques will ever appear in a UE5 game on PS4 or XB1. I think by 'this gen', they mean current hardware including high-end PC GPUs that'll be using software solutions where the hardware features aren't present.
Well, the engine is being made to run on the same platforms: iOS, Android, etc. Fortnite is being ported to UE5 and is meant to run on low-end systems.

I think streaming has always been the biggest intention for all of our games, even this generation; the buffer held in memory is just larger.

If you can store everything to be streamed, you just need a larger buffer of player space to travel in before your buffer runs out. That could be handled in a variety of ways, but if it's going to work the same way on Android and iOS, there's no reason it shouldn't work on XBO and PS4.
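That trade-off (slower storage, bigger resident buffer) can be put in a toy model; every number and name here is invented for illustration:

```python
def required_buffer_mb(demand_mbs, drive_mbs, travel_seconds):
    """Extra resident data needed when the drive can't keep pace.

    demand_mbs: asset data the player's movement consumes per second
    drive_mbs: sustained streaming throughput of the storage device
    travel_seconds: how long movement must stay covered by pre-loaded data
    """
    deficit = max(0.0, demand_mbs - drive_mbs)
    return deficit * travel_seconds

# A fast SSD keeps up outright; a HDD has to pre-buffer the shortfall.
print(required_buffer_mb(300, 5000, 10))  # 0.0
print(required_buffer_mb(300, 100, 10))   # 2000.0
```

The model shows why slower platforms aren't necessarily excluded: they just trade memory (and load times) for streaming throughput, until the deficit outgrows the RAM available.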
 
I play games, not tech demos. Xbox One 2013, Xbox One X 2017, PS4 2013, PS4 Pro 2016. So if you're saying that it's too early to be concerned about the tech demo, should I just have no interest in it until I have to buy new hardware that might then run it properly?

I have no idea what you're talking about. I'm not sure what point you're trying to make. I'm saying wait and see on the performance side. It's way too early to be making extrapolations about how many actors you can fit into a scene based on one tech demo.
 
I play games, not tech demos. Xbox One 2013, Xbox One X 2017, PS4 2013, PS4 Pro 2016. So if you're saying that it's too early to be concerned about the tech demo, should I just have no interest in it until I have to buy new hardware that might then run it properly?
I think Scott's saying just enjoy it for what it is (an evolutionary leap in realtime rendering software) and don't even try to hazard a guess whether it'll scale to real games or not because no-one knows. ;)
 