Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

Cyberpunk's VR mode doesn't use Nanite. Nanite doesn't support VR, so maybe the problem is linked to that.

Yes, that may well be the case. My point was really that this isn't a hardware performance issue, it's a software one. The tweet implied that we just need to throw more power at it, whereas what we really need is a properly working VR implementation.
 
That's a somewhat tenuous argument when a tools developer like Epic Games expects content creators to have sane usage patterns. Don't expect great results when a tool is improperly used ...
 
It wasn't a criticism of UE5. It was a criticism of the VR implementation. Clearly a 4090 doing in VR what a Series S can do in standard 3D isn't going to be a problem from a performance perspective.
 
The basis still stands. XR developers are expected to do the sane thing with the VR implementation ...

Outside of fundamental content changes, I'm not even sure how much room there is for the VR implementation to be improved. Nanite and the deferred renderer can be nearly all compute shaders, so you can't exactly use HW VRS to do foveated rendering. Anecdotally, DLSS is hardly going to be a lifeline for stereoscopic rendering either, considering it doesn't reach satisfactory visual quality based on those samples ...
 
I don't really understand what you're saying. You said it yourself above that to do a stereoscopic implementation of a standard game you will need to render things twice. That's fine and expected, and exactly what we see in other VR mods, like Cyberpunk's, where performance is effectively halved in VR.

I've shown above that the 4090 has more than sufficient performance in the 2D version of the demo to allow that to be halved in VR and still provide a great experience.

Therefore, a good VR implementation should provide that - as a minimum. The fact that this one doesn't makes it a bad VR implementation. Now, that may simply be down to the fact that Nanite doesn't support VR, and thus any mod, no matter how skilled the modder, is going to be non-functional.

But nevertheless the point still stands. This isn't a hardware performance issue that requires VRS or DLSS to work around. It's a software issue, either because UE5/Nanite can't support VR, or because the implementation of the mod is poor.
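To make the "halved performance" arithmetic concrete, here's a minimal sketch; the frame rates and the flat 2x stereo cost factor are illustrative assumptions, not measurements from the demo:

```python
# Back-of-envelope check: if stereoscopic rendering roughly doubles cost,
# does a given 2D frame rate leave room for a VR refresh target?
# The numbers used below are hypothetical, not measurements.

def vr_budget_ok(fps_2d: float, vr_target_hz: float,
                 stereo_cost_factor: float = 2.0) -> bool:
    """True if the scaled-up 2D frame time still fits the VR frame budget."""
    frame_ms_2d = 1000.0 / fps_2d
    frame_ms_vr = frame_ms_2d * stereo_cost_factor  # naive: render everything twice
    return frame_ms_vr <= 1000.0 / vr_target_hz

print(vr_budget_ok(200.0, 90.0))  # 5 ms doubles to 10 ms, fits the 11.1 ms budget: True
print(vr_budget_ok(150.0, 90.0))  # 6.7 ms doubles to 13.3 ms, misses the budget: False
```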
 
Who knows exactly what he's doing or what his settings are? The thread hints that he might be using a debug tool. What's the field of view, and what CPU does he have? Some things we don't know ...
 

I don't think Epic Games are expecting anyone to use foveated rendering with Nanite-heavy content/the deferred renderer, and it's incompatible with DLSS too. They probably expect most XR developers to use mobile-quality content with their forward renderer (incompatible with Nanite) ...
If fixed foveated rendering is supported, presumably the engine could adapt to eye-tracked as well? It'd behove Sony to make it happen, as UE5 at its best quality on PSVR2 would be a major selling point. UE5 is standing head and shoulders above everything else for looks at the moment, and being the platform to bring that to VR would be a major coup.
 
UE5 uses HW VRS to implement foveated rendering. Epic Games would need major engine refactors to UE5 in order to make Nanite and Lumen compatible with HW VRS. With Nanite, they'd have to ditch their software rasterizer path since it doesn't interact well with the HW accelerated graphics pipeline. With Lumen, you'd be forced to switch from a compute-based deferred renderer to a forward renderer in order to see any performance benefit with HW VRS. I think it's better to list out what might happen ...

- Can't bypass the HW rasterizer bottleneck anymore
- Lose the ability to apply screen-space techniques (unless more information is stored in the V-buffer)
- Lose the ability to apply async-compute optimizations (if you want to maximize the benefit of HW VRS)
- See higher register pressure due to an integrated material and lighting pass

I don't think even Sony expects developers to have graphical parity between AAA and VR games in UE5, and Epic Games doesn't either, which is why UE5 has two different renderers to begin with ...
 
PS5/XSX can't even run the UE5 tech demo at anything above 30fps, at a resolution below what the 2016 consoles did. Expecting a full-blown UE5 game at that kind of settings/fidelity in VR is expecting too much, I think. A 4090 might just do it ... with DLSS.
 
It'd be possible for VR to match 2D games with enough work. You don't really need two views: you need one view, then to re-project the g-buffer and diffuse results into the other view, then hole-fill/re-trace/shade specular, which is still much cheaper than two native views.
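The one-view-plus-reprojection idea can be sketched with simple pinhole-camera math; the focal length and eye separation below are illustrative assumptions, not engine values:

```python
# Sketch of reprojecting a left-eye pixel into the right eye using depth
# (pinhole model). Focal length and eye separation are illustrative
# assumptions. Pixels whose shifted position lands on occluded or
# off-screen areas are the "holes" that need filling/re-tracing afterwards.

def reproject_left_to_right(x_left: float, depth_m: float,
                            focal_px: float = 1000.0,
                            baseline_m: float = 0.0625) -> float:
    """Horizontal disparity shrinks with depth: d = f * b / z."""
    disparity_px = focal_px * baseline_m / depth_m
    return x_left - disparity_px  # content shifts leftward in the right eye

# Near geometry moves a lot, distant geometry barely moves:
print(reproject_left_to_right(500.0, depth_m=0.5))   # 375.0  (125 px disparity)
print(reproject_left_to_right(500.0, depth_m=50.0))  # 498.75 (1.25 px disparity)
```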

You also don't need hardware VRS; there are reasons it's a bit useless, and you covered them well. Software VRS running on UE5 is perfectly possible, though, and you get a good amount of savings back from foveated rendering.

You also don't need anything faster than 45fps with hole filling: you track player head input, reproject the previous frame's results based on head movement, and hole-fill. It still ends up much cheaper than bumping to 90.

Anyway, eye tracking is required for foveated rendering to work at all; you'd just glance a bit to the side and the entire image would become hyper-blurry otherwise. That being said, the PSVR2 doesn't really take advantage of its tech that much. Exponential savings from foveated rendering are possible as you increase FOV, but its FOV is pretty narrow.
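A rough model of why savings grow with FOV: the high-acuity foveal region covers a roughly fixed visual angle, so widening the display FOV shrinks the full-resolution fraction of the screen. The angles and the peripheral shading rate below are illustrative assumptions:

```python
# Rough model: the foveal inset covers a fixed visual angle, so as display
# FOV grows, the full-rate fraction of the screen shrinks and the relative
# savings from foveated rendering grow. Angles and the peripheral shading
# rate are illustrative assumptions.

def shading_cost_fraction(fov_deg: float, fovea_deg: float = 20.0,
                          periphery_rate: float = 0.25) -> float:
    """Shading work relative to full-rate full-screen: a square foveal
    inset at full rate, with the rest shaded at periphery_rate."""
    full_rate_frac = min(1.0, (fovea_deg / fov_deg) ** 2)
    return full_rate_frac + (1.0 - full_rate_frac) * periphery_rate

for fov in (90.0, 110.0, 140.0):
    print(fov, shading_cost_fraction(fov))  # cost fraction drops as FOV widens
```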
 
I heard recent Nvidia GPUs have a hardware feature called Simultaneous Multi-Projection that can be used to implement single-pass stereo for geometry, but there are many caveats, covered in this series of posts ...

Also, it sounds like you're forced to use the graphics pipeline if you want HW-accelerated single-pass stereo, because it won't work with Nanite's software rasterizer, which is just another compute shader, so you'll need two geometry passes in the end regardless ...
 
That was a first demo on a preview of the engine. It'll get better and faster (as we're seeing). Also, foveated rendering means you only render the highest quality for 1/10th of the screen. I think people underestimate its potential (although we've yet to see what Sony's implementation is like). In short, scaling isn't linear - VR shouldn't be half the quality due to rendering two eyes.

UE5 uses HW VRS to implement foveated rendering
Sony has hardware support for Flexible Scale Rasterization. The question is if, and how, that can fit into UE's pipeline. If it can't, it'll be a tragic waste and a limit on UE's suitability for VR. Devs will still use the engine, but it won't be ideal for PSVR2. Once the scene is created, is it not just a case of rendering it at a few different resolutions and compositing?
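As a rough sketch of that "few resolutions and composite" budget (the panel size, inset fraction, and periphery scale below are assumed values for illustration, not PSVR2 specifics):

```python
# Rough pixel budget for "render a few resolutions and composite": a
# full-resolution center inset plus the remaining area rendered at a lower
# resolution, composited afterwards. Panel size, inset fraction, and the
# periphery scale are assumed values for illustration.

def multires_pixels(width: int, height: int,
                    inset_frac: float = 0.4,
                    periphery_scale: float = 0.5) -> int:
    """Pixels rasterized: full-res inset + down-scaled periphery."""
    inset = int(width * inset_frac) * int(height * inset_frac)
    full = width * height
    periphery = int((full - inset) * periphery_scale ** 2)
    return inset + periphery

full = 2000 * 2040  # assumed per-eye panel
print(multires_pixels(2000, 2040) / full)  # roughly 0.37 of a naive full-res render
```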
 

Flexible Scale Rasterization uses the hardware rasterizer; it doesn't work with Nanite. UE has a path for VR that uses a different renderer.
 
So Nanite's out, but Lumen is still usable?

In theory, yes, but from a performance point of view I doubt it will be the case. VR doesn't use any temporal AA or reconstruction method, because artefacts are a problem in VR. Most engines use forward rendering at full resolution with MSAA and a very high framerate; 60 fps is the minimum and not ideal.
 
I kind of want to see what a fully supported 60fps UE5.1 console game looks like now, just because people keep talking about it in generalities.
 