Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

That's why we have feature levels in DirectX.
If you have a DirectX 12 card, it supports the core features of DX12.
If you have a DirectX 12 Ultimate card, it supports the core features of DX12U.

There are sub-features involved for all of these as well, and feature levels like 12_1, 12_2 etc.

If this technology only worked on DX12U, that wouldn't necessarily get people to move over to UE5, because the majority of compliant devices are closer to the base DX12 spec.
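
For reference, this is roughly how an engine tells a plain DX12 card from a DX12 Ultimate one at runtime. A minimal sketch, assuming a recent Windows SDK (D3D_FEATURE_LEVEL_12_2 and the OPTIONS7 query don't exist in older headers); error handling trimmed:

```cpp
// Sketch: query the maximum supported feature level and the DX12 Ultimate
// extras. Assumes a recent Windows SDK; error handling trimmed.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    // Create a device on the default adapter at the lowest level we accept.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // Ask the driver which of these levels it actually supports.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_12_0,
        D3D_FEATURE_LEVEL_12_1, D3D_FEATURE_LEVEL_12_2,
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels        = _countof(requested);
    levels.pFeatureLevelsRequested = requested;
    device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                &levels, sizeof(levels));
    std::printf("max feature level: 0x%x\n",
                (unsigned)levels.MaxSupportedFeatureLevel); // 0xc200 == 12_2

    // The DX12U headline features (mesh shaders, sampler feedback) are
    // reported separately, in OPTIONS7.
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                              &opts7, sizeof(opts7))))
        std::printf("mesh shader tier: %d\n", (int)opts7.MeshShaderTier);
    return 0;
}
```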

Yes, but a game will not be exactly the same if it relies on a feature set the card does not support, or supports so badly that it's not even worth having it "on". It's not only about raw performance capabilities like you imply. Nvidia did release drivers for RT on non-RTX cards, but it was mostly pointless as performance was too low. So yes, the pipeline could have been similar, but at such a great cost, why bother making visual parity?

I honestly don't understand the point you are really trying to make, other than just rhetoric.
 
Right! When Rangers says, "why isn't this running on current gen," the reason is the HW isn't fast enough. "Why isn't this tech in use today?" is a different question, to be followed with, "okay, if scaled down to current tech, what could be achieved with this streaming?" And that'll be something very cut down and quite different. I guess current gen could also render the path-traced Minecraft if you go low-res enough. Doom was squeezed onto the Amiga at something like 120x80 resolution and 10fps.
 
So UE5 will have tools that can scale down the content for lower-end devices that can't support the new virtual geometry engine natively. I'm assuming that's referring to primitive (AMD)/mesh (Nvidia) shader support. They'll probably cut down texture resolution and polygon density. Basically you're generating mips and LODs again, except per platform instead of within a particular game.
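
Mechanically, the offline tooling probably doesn't need to be much more than this. A hypothetical sketch: the platform budgets and the halve-until-it-fits policy are my invention, not anything Epic has described:

```cpp
// Hypothetical per-platform asset scaling: pick a triangle budget and a
// texture size cap per target, then step the source art down until it
// fits. All budget numbers here are invented for illustration.
#include <cstdio>

struct PlatformBudget {
    const char* name;
    long long   maxTriangles;  // per-asset cap after decimation
    int         maxTextureDim; // longest texture edge, in texels
};

struct Asset {
    long long triangles;
    int       textureDim;
};

// One LOD step ~ 1/4 the triangles, one mip step = 1/2 the texture edge.
Asset ScaleForPlatform(Asset a, const PlatformBudget& b) {
    while (a.triangles > b.maxTriangles) a.triangles /= 4;
    while (a.textureDim > b.maxTextureDim) a.textureDim /= 2;
    return a;
}

int main() {
    const Asset statue = { 33'000'000, 8192 }; // film-quality source asset
    const PlatformBudget targets[] = {
        { "next-gen",     20'000'000, 8192 },
        { "base console",  1'000'000, 2048 },
        { "mobile",          250'000, 1024 },
    };
    for (const auto& t : targets) {
        const Asset a = ScaleForPlatform(statue, t);
        std::printf("%-12s -> %9lld tris, %4d px textures\n",
                    t.name, a.triangles, a.textureDim);
    }
    return 0;
}
```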
 
It's dumb, and I forgot to comment on something: footprints. I noticed that they're missing from this impressive demo. Why? It was the first thing I noticed when watching the demo.
Wasn't it mostly rock? You'd have to be far on the wrong side of obese to leave footprints in strata. :LOL:
 
Right! When Rangers says, "why isn't this running on current gen," the reason is the HW isn't fast enough. "Why isn't this tech in use today?" is a different question, to be followed with, "okay, if scaled down to current tech, what could be achieved with this streaming?" And that'll be something very cut down and quite different. I guess current gen could also render the path-traced Minecraft if you go low-res enough. Doom was squeezed onto the Amiga at something like 120x80 resolution and 10fps.

Right, this demo at this fidelity cannot be run on this generation of devices; cut it down enough and it will run.

We can actually ballpark what the maximum limitations are.
The demo showcases the most important thing about its engine, which is that it can render 1 triangle per pixel and cull everything else out. That means performance is greatly tied to resolution.
The Xbox One does the following:
Realistic triangle performance: 7,812,500–21,875,000 32-pixel triangles/s, with 2 textures, lighting, Z-buffering, fogging and alpha blending (32-pixel triangles divided out of the realistic fillrate)
  • 130,208–364,583 triangles per frame at 60 fps
  • 260,416–729,166 triangles per frame at 30 fps
By resolution, if this were one triangle per pixel, taking the lower and upper bounds:
672 x 378 up to 1136 x 639.

And that is the limit the XBO can reach at 1 triangle per pixel: sub-720p at 30fps.
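
Working that ballpark through in code (same assumptions as above: the 32-pixel-triangle fillrate bounds, one triangle per pixel, and a 16:9 frame):

```cpp
// Reproduces the ballpark above: take the XBO's realistic triangle rates,
// divide by frame rate for a per-frame budget, then find the largest 16:9
// resolution with one triangle per pixel.
#include <cmath>
#include <cstdio>

void Bound(double trisPerSecond, double fps) {
    const double perFrame = trisPerSecond / fps;
    // Solve (16k) * (9k) = perFrame for a 16:9 frame.
    const double k = std::sqrt(perFrame / (16.0 * 9.0));
    std::printf("%11.0f tris/s @ %2.0f fps -> %7.0f tris/frame -> ~%.0f x %.0f\n",
                trisPerSecond, fps, perFrame, 16.0 * k, 9.0 * k);
}

int main() {
    Bound( 7'812'500, 30); // lower bound: ~680 x 383  (the post's 672 x 378)
    Bound(21'875'000, 30); // upper bound: ~1139 x 640 (the post's 1136 x 639)
    Bound( 7'812'500, 60);
    Bound(21'875'000, 60);
    return 0;
}
```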
 
The demo looks awesome and I would love to see games get to be this nice in the future. The global illumination is really good, and the draw distance and high-speed movement are very impressive.

But am I the only one who thinks that basically copy-pasting the statue a bunch of times just hides the fact that they are probably not able to do a lot of different high-poly models? With only 1 model, it simplifies memory use. The only thing repeating the hundreds of statues shows is that the geometry culling works on next gen. Loading this statue into memory and culling 99.9% of the triangles isn't all that impressive based on what we already know primitive (mesh) shaders can do.
 
So UE5 will have tools that can scale down the content for lower-end devices that can't support the new virtual geometry engine natively. I'm assuming that's referring to primitive (AMD)/mesh (Nvidia) shader support. They'll probably cut down texture resolution and polygon density. Basically you're generating mips and LODs again, except per platform instead of within a particular game.
You still want install sizes to be within reason for each specific platform you target.
 
https://www.resetera.com/goto/post?id=33884886

I wake up late on Wednesdays and I see a literally industry-shaking announcement.

This is absolutely impossible. So much of the current development pipeline is based on retopology, poly count budgets, draw call budgets, light baking and so on.

This just...gets rid of them.

They're gone.

Entirely.

I cannot even fathom this shit. Every developer I know is fucking losing their minds.
 
Yes, but a game will not be exactly the same if it relies on a feature set the card does not support, or supports so badly that it's not even worth having it "on". It's not only about raw performance capabilities like you imply. Nvidia did release drivers for RT on non-RTX cards, but it was mostly pointless as performance was too low. So yes, the pipeline could have been similar, but at such a great cost, why bother making visual parity?

I honestly don't understand the point you are really trying to make, other than just rhetoric.
It's not rhetoric. I'm looking at the engine from the viewpoint of someone who wants to release a product on multiple systems. It's a huge advantage to have the tools available to do this. If you want to release a multiplatform game, it's ideal that you don't need to write 2 separate pipelines for a single game; you only need to reduce the asset quality as required. The developers themselves will need to decide if the gap is too large to cross. But not every game is looking to have realistic graphics like this, and to have realistic graphics you will need a realistically increased amount of IO and horsepower. If you're not aiming for that, and most indie and AA studios with smaller budgets aren't, this is ideal.
 
I would hate to be releasing a game in the next year or two that's too late in the dev cycle to abandon a more traditional geometry pipeline. People are going to freak out over pop-in if their expectations are set high, especially if they're releasing against games that are already built that way.

Like Cyberpunk 2077?
 
Like Cyberpunk 2077?
I think Cyberpunk will do fine, but how quickly its graphics age is another question. CDPR have been pretty damn good at supporting their games long after release, but if this means re-authoring huge amounts of Cyberpunk's art and rendering pipeline, that may not be practical. That said, I would be very surprised if any of the graphical techniques shown in UE5 are that much of a surprise to CDPR. I'm hoping they have all this covered for next gen.
 
Hyperbole.. People have to realize that there are limitations to all of this (for one, only static meshes from the looks of it with Nanite) etc.. It's the same cycle of hype & PR every time something new is shown.
It took more than 2 years for the holy grail of holy grails of technologies (RT) to be production ready (GDC 2018 -> UE4.25 last week), just as I said here when it was announced.. and it isn't even used in the new lighting engine, Lumen! ...
Tech demos are nothing more than marketing vectors to sell products..
 
Right, this demo at this fidelity cannot be run on this generation of devices; cut it down enough and it will run.

We can actually ballpark what the maximum limitations are.
The demo showcases the most important thing about its engine, which is that it can render 1 triangle per pixel and cull everything else out. That means performance is greatly tied to resolution.
The Xbox One does the following:
Realistic triangle performance: 7,812,500–21,875,000 32-pixel triangles/s, with 2 textures, lighting, Z-buffering, fogging and alpha blending (32-pixel triangles divided out of the realistic fillrate)
  • 130,208–364,583 triangles per frame at 60 fps
  • 260,416–729,166 triangles per frame at 30 fps
By resolution, if this were one triangle per pixel, taking the lower and upper bounds:
672 x 378 up to 1136 x 639.

And that is the limit the XBO can reach at 1 triangle per pixel: sub-720p at 30fps.
What makes you think the XBO can supply data fast enough to enable that? Don't look at GPU performance and pixels but at drive performance, MBs of data, and the latency of random access across the HDD.
 
What makes you think the XBO can supply data fast enough to enable that? Don't look at GPU performance and pixels but at drive performance, MBs of data, and the latency of random access across the HDD.
I don't. That part I can't calculate. You're unlikely to render so fine with the XBO, so you'll need to supply assets with lower polygon counts and accompanying lower-quality textures. But how the engine interacts with those assets should be the same, just scaled down.

Maybe I'll put it in better perspective. One way to look at it is that in the demo they rendered about 20 million polygons per frame. Just looking at 4 triangles per clock × 2.23 GHz, divided by 30, you get close to 297M triangles per frame as a theoretical peak (real throughput on pixel-sized triangles will be far below that). But the resolution in this case is 1440p, which is 2560x1440, about 3,686,400 pixels. That's a lot more triangles than pixels on screen per frame; it culled a whack of triangles, like crazy amounts of culling. Basically reducing 20M triangles on screen at any given time to ~3.7M rasterized.

That's quite a powerful demonstration of the engine rendering only what it needs. Or, put another way: with this engine the PS5 can handle roughly 5x more triangles on screen than it has pixels.
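
The same back-of-envelope in code, assuming the 4 primitives/clock at 2.23 GHz figure and the demo's ~20M drawn triangles at 1440p:

```cpp
// Back-of-envelope for the PS5 numbers above: theoretical per-frame
// primitive ceiling vs. what the demo actually drew per 1440p frame.
#include <cstdio>

int main() {
    const double clockHz      = 2.23e9; // PS5 GPU clock
    const double trisPerClock = 4.0;    // assumed primitive rate
    const double fps          = 30.0;
    const double ceiling      = clockHz * trisPerClock / fps; // ~297M/frame

    const double drawn  = 20e6;            // quoted drawn triangles per frame
    const double pixels = 2560.0 * 1440.0; // ~3.69M pixels at 1440p

    std::printf("theoretical ceiling: %.0fM tris/frame\n", ceiling / 1e6);
    std::printf("drawn vs pixels:     %.1fx triangles per pixel\n",
                drawn / pixels); // ~5.4x
    return 0;
}
```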

I don't know how this will translate down, but I'm positive UE5 will run on both XBO and PS4.
 