Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

I saw the Digital Foundry video; it seems like the new tech from UE5 has a lot of latency and artifacts, and it's inefficient with foliage/hair (it wastes 3/4 of the pixel calculations when they're organized in groups of four or more)

Because they are not using the raster hardware, having 1 polygon per pixel does not cause them to lose polygon throughput. The raster engines in RDNA can output 4 fragments per triangle, so if your polygon covers fewer than 4 fragments you lose performance. The UE5 solution does not have this issue because it does not use that hardware.
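To put rough numbers on the "groups of four" point, here's a back-of-the-envelope sketch (my own illustration, not Epic's or AMD's code) of how much fragment-shading work is wasted when hardware shades pixels in 2x2 quads and triangles shrink towards a single pixel:

```cpp
#include <cstdio>

// Fraction of fragment-shader work that is useful for a triangle covering
// `coveredPixels` pixels, assuming the hardware shades in 2x2 quads (4 lanes)
// and, optimistically, that the covered pixels pack into as few quads as possible.
double quadShadingEfficiency(int coveredPixels) {
    const int quadSize = 4;                                        // one 2x2 pixel quad
    int quadsTouched   = (coveredPixels + quadSize - 1) / quadSize; // best case
    int invocations    = quadsTouched * quadSize;
    return static_cast<double>(coveredPixels) / invocations;
}

int main() {
    for (int px : {1, 2, 4, 16}) {
        std::printf("triangle covering %2d px -> %3.0f%% of shading work useful\n",
                    px, 100.0 * quadShadingEfficiency(px));
    }
    // Pixel-sized triangles waste ~3/4 of the invocations, which a compute-based
    // (software) rasteriser that shades per pixel can avoid.
    return 0;
}
```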
 
Thinking about this further, I think @chris1515 linked a research paper that showed a way to store geometry as textures. Something similar would make sense, because one of the benefits of that approach is that you can keep the texture compressed in memory and then decompress it as needed. If you have a full model whose geometry is stored in tiles on disk, and you selectively load those tiles into memory as you need them and unload them when you don't, your in-memory representation of the model could have a lot of empty space, meaning it can be compressed. Then you'd need some scene hierarchy or data structure that maps pixels on screen to the texels associated with those triangles in the cache. I don't think they're actually using that geometry texture solution, but it was one of the stated inspirations, and something like that would make sense. I'm wondering if they build a scene representation with SDFs, which is what they use for their lighting, and then raymarch it to figure out which pieces of geometry they need to load in.
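To make the tile-streaming idea concrete, here's a very rough sketch (all names hypothetical; nothing to do with Epic's actual data structures) of a residency cache where visible pixels request geometry tiles, tiles are decompressed on demand, and the least recently used ones are unloaded:

```cpp
#include <cstdint>
#include <list>
#include <unordered_map>
#include <vector>

struct GeometryTile {                // a small cluster of triangles / geometry texels
    std::vector<float> vertexData;   // decompressed on load
};

class TileResidencyCache {
public:
    explicit TileResidencyCache(size_t maxTiles) : maxTiles_(maxTiles) {}

    // Called for each visible pixel (or cluster) once something -- e.g. an SDF
    // raymarch or a scene hierarchy -- has decided which tile that pixel needs.
    const GeometryTile& request(uint32_t tileId) {
        if (auto it = cache_.find(tileId); it != cache_.end()) {
            lru_.splice(lru_.begin(), lru_, it->second.lruIt);   // mark recently used
            return it->second.tile;
        }
        if (cache_.size() >= maxTiles_) evictOldest();
        lru_.push_front(tileId);
        auto [it, inserted] =
            cache_.emplace(tileId, Entry{loadAndDecompress(tileId), lru_.begin()});
        return it->second.tile;
    }

private:
    struct Entry {
        GeometryTile tile;
        std::list<uint32_t>::iterator lruIt;
    };

    static GeometryTile loadAndDecompress(uint32_t /*tileId*/) {
        return GeometryTile{};       // placeholder: read the compressed tile from disk
    }

    void evictOldest() {
        cache_.erase(lru_.back());   // unload the least recently used tile
        lru_.pop_back();
    }

    size_t maxTiles_;
    std::list<uint32_t> lru_;                     // most recently used at the front
    std::unordered_map<uint32_t, Entry> cache_;
};

int main() {
    TileResidencyCache cache(64);    // keep at most 64 tiles resident
    cache.request(7);                // first request loads the tile
    cache.request(7);                // second request is a cache hit
}
```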
 
Because they are not using the raster hardware, having 1 polygon per pixel does not cause them to lose polygon throughput. The raster engines in RDNA can output 4 fragments per triangle, so if your polygon covers fewer than 4 fragments you lose performance. The UE5 solution does not have this issue because it does not use that hardware.
Although that does also mean that silicon is redundant in the GPU. How long until GPUs forgo render backends completely (as Iroboto already said!)?
 
I saw the Digital Foundry video; it seems like the new tech from UE5 has a lot of latency and artifacts, and it's inefficient with foliage/hair (it wastes 3/4 of the pixel calculations when they're organized in groups of four or more)
Actually Alex said that he did not know how the engine deals with foliage and hair, so that is not a correct assumption.
 
Although that does also mean that silicon is redundant in the GPU. How long until GPUs forgo render backends completely (as Iroboto already said!)?

It's true. If you can beat the performance of the fixed hardware, why use it? Maybe this is the gen where more of this hardware becomes redundant. Maybe future hardware will emulate it in compute shaders for DX11 and DX12 titles that use that part of the render pipeline.
 
I'd guess this upcoming gen is the last one. RDNA still has that hardware. This gen will prove the value of software rendering, and by next gen, maybe 2027/2028, GPUs won't have rasterising hardware.
 
I may have alluded to this point elsewhere or here so apologies if I'm repeating. But I think we'll see GI solutions such as this become prominent and RT will simply be used in a supplementary fashion to patch the leaks for off-screen or occluded elements. The same in regards to SS Reflections and RT Reflections; RT can effectively be the fallback when SS fails, on a per or sub pixel level.
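A minimal sketch of that "screen space first, ray trace where it fails" pattern (purely my own illustration of the idea; the two trace functions are placeholders, not any engine's API):

```cpp
#include <cstdio>
#include <optional>

struct Ray   { float origin[3]; float dir[3]; };
struct Color { float r, g, b; };

// Placeholder: a cheap screen-space march; returns nothing when the reflected
// ray leaves the screen or the hit point is not visible in the depth buffer.
std::optional<Color> traceScreenSpace(const Ray&) { return std::nullopt; }

// Placeholder: full trace against the scene (hardware RT in practice).
Color traceRayTraced(const Ray&) { return {0.2f, 0.3f, 0.4f}; }

Color shadeReflection(const Ray& reflectedRay) {
    if (auto ss = traceScreenSpace(reflectedRay)) {
        return *ss;                      // screen space had the answer: cheap path
    }
    return traceRayTraced(reflectedRay); // fall back per pixel where SS fails
}

int main() {
    Ray r{{0.0f, 0.0f, 0.0f}, {0.0f, 1.0f, 0.0f}};
    Color c = shadeReflection(r);
    std::printf("reflection colour: %.2f %.2f %.2f\n", c.r, c.g, c.b);
}
```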
 
I'd guess this upcoming gen is the last one. RDNA still has that hardware. This gen will prove the value of software rendering, and by next gen, maybe 2027/2028, GPUs won't have rasterising hardware.
We have finally entered a post-DX11 world. GPU-driven dispatch everywhere has finally arrived. I have waited many days to see what the outcome would be like. I am not disappointed. This is what DX12 is supposed to be.
 
I may have alluded to this point elsewhere or here so apologies if I'm repeating. But I think we'll see GI solutions such as this become prominent and RT will simply be used in a supplementary fashion to patch the leaks for off-screen or occluded elements. The same in regards to SS Reflections and RT Reflections; RT can effectively be the fallback when SS fails, on a per or sub pixel level.
Or, if the performance improves, RT just replaces it entirely. At least for now, until someone proves that a pure RT engine can offer more and deliver higher performance (at the same fidelity).
 
Or, if the performance improves, RT just replaces it entirely. At least for now, until someone proves that a pure RT engine can offer more and deliver higher performance (at the same fidelity).

Yeah, I think we'll just see a gradual shift in the ratio of conventional to RT hardware in gaming GPUs. Perhaps by the time we see a PS6/XSX2 it could be half and half. I don't think we'll see RT take over entirely for a long time, if at all. General compute and rasterization may still prove more efficient for some functions for a long time to come.
 
I guess Sony first-party studios will be at the top of the list for early access to UE5, since according to this, Sony and Epic have been working together on storage tech and the engine.

"We’ve been working super close with Sony for quite a long time on storage"
"Sweeney says the two companies have been working closely together during the development of UE5 and the PS5"

https://www.theverge.com/21256299/e...-ps5-ssd-impressive-pc-gaming-future-next-gen
 
I find it very hard to believe Epic made this run on a PS5 back in March
Probably because it wasn't running on PS5 back in March. Had it not been delayed, I'd be willing to bet a fiver the whole presentation would've been shown on PC. With how odd/delayed some of the button overlays looked yesterday, I wouldn't be surprised if it was still running on PC (not saying it wouldn't be possible on a PS5 dev kit, just that it was easier to show on PC).
 
Or, if the performance improves, RT just replaces it entirely. At least for now, until someone proves that a pure RT engine can offer more and deliver higher performance (at the same fidelity).

We'll have to see how UE5 handles light leaking etc., which is one of the areas where Remedy and people like Morgan McGuire at Nvidia have used ray tracing to augment their pre-computed lighting models. I know UE5 isn't pre-computed, but it's calculating GI based on various lower-precision representations with voxels and SDFs, so there are potential limitations.
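As a toy illustration of the kind of limitation I mean (my own example, nothing to do with UE5's actual implementation): a thin occluder can effectively vanish when its signed distance field is stored on a grid coarser than the occluder is thick, so an occlusion ray marches straight through it and light leaks:

```cpp
#include <cmath>
#include <cstdio>

// Exact signed distance to a thin wall of half-thickness 0.05 centred at x = 0.5.
double exactSdf(double x) { return std::fabs(x - 0.5) - 0.05; }

// The same field stored only at grid points spaced `cell` apart (here the grid
// points land at ..., 0, 1, ...) and linearly interpolated, which is roughly
// what a coarse voxelised SDF gives you.
double coarseSdf(double x, double cell) {
    double g0 = std::floor(x / cell) * cell;
    double g1 = g0 + cell;
    double t  = (x - g0) / cell;
    return (1.0 - t) * exactSdf(g0) + t * exactSdf(g1);
}

// Sphere-trace an occlusion ray from x = -2 towards +x; returns true if the
// march gets close enough to the surface to count the wall as blocking light.
template <typename Sdf>
bool occluded(Sdf sdf) {
    double x = -2.0;
    for (int i = 0; i < 128 && x < 2.0; ++i) {
        double d = sdf(x);
        if (d < 1e-3) return true;   // reached the surface: light is blocked
        x += d;                      // step by the distance the field reports
    }
    return false;
}

int main() {
    std::printf("exact SDF : wall %s\n",
                occluded(exactSdf) ? "blocks light" : "is missed");
    std::printf("coarse SDF: wall %s\n",
                occluded([](double x) { return coarseSdf(x, 1.0); })
                    ? "blocks light" : "is missed -> light leaks");
}
```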
 
I may have alluded to this point elsewhere or here so apologies if I'm repeating. But I think we'll see GI solutions such as this become prominent and RT will simply be used in a supplementary fashion to patch the leaks for off-screen or occluded elements.
I think engines will move towards a more even balance, using RT hardware in better approximations than SDFs and the like. GI will improve with lower lighting latency, better accuracy, and more uniformity across scene elements. RTRT hardware will allow such GI approximation solutions to be far better than compute alone could manage.
 
I wouldn't be surprised if it was still running on PC (not saying it wouldn't be possible on a PS5 dev kit, just that it was easier to show on PC)

In that case, Epic lied in the interview with Geoff Keighley, where they stated that the capture was taken directly from the output of the PS5 devkit and that they could control the character with the PS controller.
 