Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

Yea, the only reason worth mentioning that for a pre-rendered trailer is to build a bit of hype for the next Nvidia GPUs. It's almost certain they'll have a partnership for this game. But by the time this game releases we'll be looking at the 60 series release, I bet lol.
 
or: using an unannounced GPU for marketing purposes to get those sweet sweet nv sponsorship bucks
And how do you think they got their hands on the GPU? Developers a tenth their size get access to prototype GPUs, but if anyone will have some sort of partnership with nv, it's gonna be CDP. But in general I think people are reading too much into this. Tons of trailers these days start with a disclaimer "rendered using engine X on GPU Y". It just so happens that this was probably run on an RTX 50 series prototype, but they can't call it that.
 
CDPR said they are using a custom UE5 build, and they are probably using some NVIDIA features in it as well, like the NVRTX branch.
Everyone's UE5 should be custom for a studio the size of CDPR. They have the skill sets to make the modifications needed to get the performance they want out of the engine. Smaller developers that are strictly looking to pump out content will have a greater issue with that.
 
And how do you think they got their hands on the GPU? Developers a tenth their size get access to prototype GPUs, but if anyone will have some sort of partnership with nv, it's gonna be CDP. But in general I think people are reading too much into this. Tons of trailers these days start with a disclaimer "rendered using engine X on GPU Y". It just so happens that this was probably run on an RTX 50 series prototype, but they can't call it that.
yeah exactly, so it makes total sense on the economic side even if it isn't "needed" on the technical side
 
Hmm, I'm not sure they would go too far in integrating Nv-specific technologies, given that, I assume, it has to run on current-gen consoles?
 
Yeah, Cyberpunk 2077 treats the consoles as the floor for performance and image quality, then uses PC to see how high they can scale image quality.
I am hoping the next generation's top GPUs will allow me to run at 4K with DLAA and PT in Cyberpunk 2077, because this generation tops out at 1440p with DLAA and PT if you want playable FPS.
 
CDPR said they are using a custom UE5 build, and they are probably using some NVIDIA features in it as well, like the NVRTX branch.
Hope so. You can do many things with UE5; it's an impressive engine, versatility-wise, but that versatility comes at a price and the engine needs some serious fixes, imho.

Just look at the performance of certain games on other engines versus how UE5 games perform, even when they don't use advanced techniques like Lumen or Nanite, and you clearly see what happens when you try to do everything. This is something that affects the typical jack-of-all-trades engines like Unity, Godot, etc.

 
This is "everything looks like Gears of War" all over again. If you don't have the resources to customize enough things in UE to have a very distinct look, it stands to reason you wouldn't have resources to produce unique AAA quality engine from scratch.
 
Even some of his points about Nanite I can somewhat relate to, as I've thought similar things myself.

This was Sebbi's comment on the Nanite debate.

Nanite’s software raster solves quad overdraw. The problem is that software raster doesn’t have HiZ culling. Nanite must lean purely on cluster culling, and their clusters are over 100 triangles each. This results in significant overdraw to the V-buffer with kitbashed content (such as their own demos). But V-buffer is just a 64 bit triangle+instance ID. Overdraw doesn’t mean shading the pixel many times.
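
(As an aside, here's a minimal sketch of the write path sebbi is describing, assuming the publicly described Nanite-style trick of packing depth into the high bits so that a single 64-bit atomic max serves as both depth test and V-buffer write. Field widths below are illustrative, not Epic's actual layout.)

```cpp
#include <atomic>
#include <cstdint>

// Illustrative field widths only, not Epic's actual layout.
constexpr int kTriangleBits = 7;   // up to 128 triangles per cluster
constexpr int kClusterBits  = 25;  // cluster / instance ID
// The remaining 32 high bits hold quantized depth.

inline uint64_t PackVBufferEntry(uint32_t depth, uint32_t cluster, uint32_t tri) {
    return (uint64_t(depth) << (kClusterBits + kTriangleBits))
         | (uint64_t(cluster) << kTriangleBits)
         | uint64_t(tri);
}

// With depth in the high bits, one atomic max is simultaneously the depth
// test and the V-buffer write: overdraw is just a lost atomic, not a shaded
// pixel. The CAS loop stands in for the GPU's 64-bit InterlockedMax.
inline void WritePixel(std::atomic<uint64_t>& pixel,
                       uint32_t depth, uint32_t cluster, uint32_t tri) {
    uint64_t entry = PackVBufferEntry(depth, cluster, tri);
    uint64_t prev = pixel.load(std::memory_order_relaxed);
    while (entry > prev &&
           !pixel.compare_exchange_weak(prev, entry, std::memory_order_relaxed)) {
    }
}
```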

While the V-buffer is fast to write, it’s slow to resolve. Each pixel shader invocation needs to load the triangle and run code equivalent to the full vertex shader three times. The material resolve pass also needs to calculate analytic derivatives, and material binning has complexities (which manifest in potential performance cliffs).
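
(To make that cost concrete, here's a toy version of the per-pixel resolve. All names are invented, and the vertices arrive pre-transformed to keep it short; the real pass would first re-run the full vertex transform for all three, which is exactly the cost sebbi is pointing at.)

```cpp
#include <cstdint>

struct Vec2 { float x, y; };
struct Vec3 { float x, y, z; };
struct Vertex { Vec2 screenPos; Vec2 uv; };  // pre-transformed for brevity

// Signed area of edge (a, b) vs. point p; the basis of barycentrics.
static float EdgeFunction(Vec2 a, Vec2 b, Vec2 p) {
    return (p.x - a.x) * (b.y - a.y) - (p.y - a.y) * (b.x - a.x);
}

static Vec3 Barycentrics(Vec2 p, Vec2 v0, Vec2 v1, Vec2 v2) {
    float area = EdgeFunction(v0, v1, v2);
    return { EdgeFunction(v1, v2, p) / area,
             EdgeFunction(v2, v0, p) / area,
             EdgeFunction(v0, v1, p) / area };
}

// Resolve UVs plus screen-space derivatives for one pixel. There is no
// hardware quad ddx/ddy here, so derivatives are computed analytically:
// barycentrics are affine in screen position, so re-evaluating one pixel
// over in x and y gives exact derivatives, at the price of doing the
// interpolation three times.
static void ResolvePixelUV(Vec2 p, const Vertex v[3],
                           Vec2& uv, Vec2& dUVdx, Vec2& dUVdy) {
    auto interp = [&](Vec2 q) -> Vec2 {
        Vec3 b = Barycentrics(q, v[0].screenPos, v[1].screenPos, v[2].screenPos);
        return { b.x * v[0].uv.x + b.y * v[1].uv.x + b.z * v[2].uv.x,
                 b.x * v[0].uv.y + b.y * v[1].uv.y + b.z * v[2].uv.y };
    };
    uv = interp(p);
    Vec2 ux = interp({p.x + 1.0f, p.y});
    Vec2 uy = interp({p.x, p.y + 1.0f});
    dUVdx = {ux.x - uv.x, ux.y - uv.y};
    dUVdy = {uy.x - uv.x, uy.y - uv.y};
}
```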

It’s definitely possible to beat Nanite with a traditional pipeline if your content doesn’t suffer much from overdraw or quad-efficiency issues, and you have good batching techniques for everything you render.

However it’s worth noting that GPU-driven rendering doesn’t mandate a V-buffer, SW rasterizer or deferred material system like Nanite does. Those techniques have advantages, but they have big performance implications too. When I was working at Ubisoft (almost 10 years ago) we shipped several games with GPU-driven rendering (and virtual shadow mapping): Assassin’s Creed Unity with massive crowds in big city streets, Rainbow Six Siege with fully destructible environments, etc. These techniques were already usable on last-gen consoles (1.8 TFLOP/s GPU). Nanite is quite heavy in comparison. But they are targeting single-pixel triangles. We weren't.
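
(For anyone wondering what "GPU-driven" means concretely: visibility decisions move onto the GPU and the CPU just submits indirect draws. A toy sketch, with a plain loop standing in for the culling compute shader; the argument struct mirrors the usual indexed-indirect draw layout, everything else is made up for illustration.)

```cpp
#include <cstdint>
#include <vector>

// Mirrors a typical DrawIndexedInstancedIndirect argument layout: the GPU
// fills instanceCount itself, so the CPU never sees per-object visibility.
struct DrawArgs {
    uint32_t indexCount;
    uint32_t instanceCount;
    uint32_t firstIndex;
    int32_t  baseVertex;
    uint32_t firstInstance;
};

struct Instance { float center[3]; float radius; uint32_t meshIndex; };

// Bounding-sphere vs. frustum test: reject if fully behind any plane.
static bool SphereInFrustum(const float planes[6][4], const Instance& inst) {
    for (int i = 0; i < 6; ++i) {
        float d = planes[i][0] * inst.center[0] + planes[i][1] * inst.center[1]
                + planes[i][2] * inst.center[2] + planes[i][3];
        if (d < -inst.radius) return false;
    }
    return true;
}

// One "thread" per instance; survivors append their ID to the visible list
// and bump instanceCount in the indirect argument buffer. On real hardware
// this loop is a compute dispatch and the append is an atomic.
static void CullPass(const std::vector<Instance>& instances,
                     const float planes[6][4],
                     std::vector<uint32_t>& visibleList, DrawArgs& args) {
    args.instanceCount = 0;
    for (uint32_t i = 0; i < instances.size(); ++i) {
        if (SphereInFrustum(planes, instances[i])) {
            visibleList.push_back(i);
            ++args.instanceCount;
        }
    }
}
```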

I am glad that we are having this conversation. Also, mesh shaders are a perfect fit for a GPU-driven render pipeline. AFAIK Nanite is using mesh shaders (primitive shaders) on consoles at least, unless they use SW raster for big triangles too these days. It’s been a long time since I last analyzed Nanite (back in the UE5 preview). Back then their PC version was using non-indexed geometry for big triangles, which is slow.
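
(For reference, a typical meshlet descriptor of the kind mesh shaders consume; the limits follow common vendor guidance, not Nanite's actual cluster parameters. It also hints at why the non-indexed fallback sebbi mentions is slow: indexing is what lets a shared vertex be transformed once instead of once per triangle.)

```cpp
#include <cstdint>

// Hypothetical meshlet descriptor, sized so one mesh-shader workgroup can
// process a whole meshlet. Limits follow common vendor guidance
// (<= 64 vertices, <= 126 triangles), not Nanite's numbers.
struct Meshlet {
    uint32_t vertexOffset;    // into a shared vertex-index buffer
    uint32_t triangleOffset;  // into a packed micro-index (triangle) buffer
    uint8_t  vertexCount;     // <= 64
    uint8_t  triangleCount;   // <= 126
};

// Why non-indexed geometry hurts: a closed mesh has roughly twice as many
// triangles as vertices, so feeding 3 vertices per triangle with no index
// reuse transforms each vertex ~6x instead of ~1x.
```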

 
I like Nanite, but I still don't understand the need for it in the current generation of games, as geometric detail simply isn't high enough to justify the overhead.

And with nearly all (or all?) UE5-powered games having some form of pop-in and visible LOD transitions (especially STALKER 2), it's not providing enough of a visual improvement to justify its use, in my opinion.
 