Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

Yeah you’re right. BVH stuff in DXR is a black box so all the partitioning and CLAS stuff has to be in the driver.
Yep, which is why this needs API changes in the first place on PC. But the direction is obviously one that is designed to work well with Nanite and similar systems. Hopefully it can be standardized in DXR soon, but the NVAPI implementation is a very nice PoC and development vehicle in the meantime!
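To make the direction concrete, here's a tiny CPU-side model of the cluster idea (purely my own illustrative sketch, not the actual NVAPI or DXR interface): per-cluster structures get built once, and per frame only a coarse structure over whichever clusters the LOD selection picked needs to be (re)assembled, instead of re-running a full triangle-level BVH build every time Nanite swaps clusters.

```cpp
// Illustrative CPU-side model only -- hypothetical, not the real NVAPI/DXR API.
#include <algorithm>
#include <cstdio>
#include <vector>

struct Aabb { float mn[3], mx[3]; };

// Stands in for a Nanite-style cluster (~128 triangles in a real system).
struct Cluster {
    std::vector<float> verts;   // xyz triples
    Aabb bounds;                // built once when the cluster is created, then reused
};

static Aabb ComputeBounds(const std::vector<float>& verts) {
    Aabb b = {{1e30f, 1e30f, 1e30f}, {-1e30f, -1e30f, -1e30f}};
    for (size_t i = 0; i + 2 < verts.size(); i += 3)
        for (int a = 0; a < 3; ++a) {
            b.mn[a] = std::min(b.mn[a], verts[i + a]);
            b.mx[a] = std::max(b.mx[a], verts[i + a]);
        }
    return b;
}

int main() {
    // "Expensive" per-cluster build, done once (stands in for a per-cluster CLAS build).
    std::vector<Cluster> clusters(4);
    for (int c = 0; c < 4; ++c) {
        clusters[c].verts  = { float(c), 0, 0,  float(c) + 1, 0, 0,  float(c), 1, 0 };
        clusters[c].bounds = ComputeBounds(clusters[c].verts);
    }

    // Cheap per-frame step: LOD selection picks a cluster subset and only the
    // coarse structure over that subset is rebuilt; per-cluster data is untouched.
    std::vector<int> selected = {0, 2, 3};
    Aabb frame = {{1e30f, 1e30f, 1e30f}, {-1e30f, -1e30f, -1e30f}};
    for (int c : selected)
        for (int a = 0; a < 3; ++a) {
            frame.mn[a] = std::min(frame.mn[a], clusters[c].bounds.mn[a]);
            frame.mx[a] = std::max(frame.mx[a], clusters[c].bounds.mx[a]);
        }
    std::printf("coarse bounds over selected clusters: x in [%.1f, %.1f]\n",
                frame.mn[0], frame.mx[0]);
}
```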

Agreed the implementation with Nanite and such obviously needs to be done and owned by the core engine itself; it's not something that should really live in a branch or plugin for the long term.
 

Makes sense. Hopefully it works well and Microsoft/Epic etc. can integrate it relatively quickly. Seems like a natural path for UE5 to take, and it would make it a bit easier for devs to migrate to Nanite-level geometry.
 
AMD has its own cluster BVH research paper as well: https://gpuopen.com/download/publications/HPLOC.pdf

I do wonder if there's some holdup in someone's hardware keeping standardized API access from appearing in DXR. Oh well; others are still doing shadowmaps for many lights decently on mobile (https://threadreaderapp.com/thread/1912719317759635662.html), though I do suspect it should be even easier for distant lights.

As lights get more distant, their projected screen area should shrink to the point where you can just store one shadowmap per light rather than bothering with per-projected-tile storage. If it's a meter per texel (human scale), then the memory per light shadowmap will be tiny assuming smallish light sizes (<= 16 m radius), and hacks like screen-space traced shadows become less and less noticeable, since their artifacts come from parallax and thickness-estimation errors. Only store static shadowmaps, handle dynamic objects with screen traces, and you get 64k+ distant shadowed lights for cheap?
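Putting rough numbers on that (my own back-of-the-envelope, using the ~1 m per texel and <= 16 m radius assumptions above, plus 16-bit depth):

```cpp
// Back-of-the-envelope memory for tiny static shadow maps on distant local lights.
// Assumptions (from the post above): ~1 m per texel, <= 16 m light radius, 16-bit depth.
#include <cmath>
#include <cstdio>

int main() {
    const double lightRadiusMeters = 16.0;
    const double metersPerTexel    = 1.0;
    const int    bytesPerTexel     = 2;          // 16-bit depth
    const int    lightCount        = 64 * 1024;

    // A single projected map covering the light's footprint; an omni light with
    // a cube map would be roughly 6x this.
    const int res = (int)std::ceil(2.0 * lightRadiusMeters / metersPerTexel);   // 32x32

    const double perLightKB = res * res * bytesPerTexel / 1024.0;               // ~2 KB
    const double totalMB    = perLightKB * lightCount / 1024.0;                 // ~128 MB

    std::printf("%dx%d map = %.1f KB per light; %d lights = %.0f MB total\n",
                res, res, perLightKB, lightCount, totalMB);
}
```

So even the aggregate for 64k lights stays around 128 MB for single projected maps (a cube map per omni light would be about 6x that), which seems consistent with the "for cheap" ballpark.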
 
Agreed the implementation with Nanite and such obviously needs to be done and owned by the core engine itself; it's not something that should really live in a branch or plugin for the long term.

Who decides that? Is there a VP at Epic that makes decisions on what features get added to the engine? How are the implementation details decided upon?
 
Usually you have technical leads or principal engineers in specialized domains who help make those decisions.

I do IT for a very large healthcare organization, and you’d be surprised at the lack of standardization of practices. A single mid-level developer can decide how something gets coded just because he got assigned the Jira ticket. Or a low-level manager can decide which features get implemented and which don’t because he has to approve all work done by his team.

That’s why I’m curious how Epic does it.
 
Would mega geo be performant on AMD? Even with API support, that would be a requirement for widespread adoption.
 
And Intel as well. But I don't see why not; again, this is much less a hardware feature and much more just a change to how the API allows the application to interact with BVH management, a chunk of which the driver abstracts on PC.

Who decides that? Is there a VP at Epic that makes decisions on what features get added to the engine? How are the implementation details decided upon?
If you are asking about my comment that it belongs in the core, portable engine rather than a branch or plugin, I don't think that's controversial... I imagine everyone including NVIDIA/AMD/Intel would agree there. If you're asking about priorities and timelines of graphics features... there are way too many factors to give any sort of general answer on that front, unfortunately.
 

Oh, I don’t think it’s controversial, I was just wondering who would make the call. Someone has the responsibility of deciding what features ship in the core engine, yes?

Do you have internal “customers” that set development priorities?
 
As I said, it really just depends on way too many factors to give a general answer. I think you also may be assigning more structured planning to things in your mind than happens most of the time on the graphics team. I would say at least as many features come out of organic R&D ("I did this prototype and it seems like a good fit for release/licensee/internal/demo X, so let's spend a bit of time to polish it up") as come from top-down planning. And what's in a given engine release on the graphics features front is often as much a matter of "when did it branch" as of any specific planning.

That said, I expect other teams work in different ways and may be more (or even less?) structured in their approaches to deliverables. And that's not to say we don't also have some number of planned deliverables up front, more just that the priorities are affected by a lot of shifting factors and there's never an end to our ambitions/available work.
 
The game is running on Creation Engine. The graphics side is UE5. I'm not sure quite how CE talks to UE. Maybe we'll get a talk at some point.

Team Ninja did the equivalent with Ninja Gaiden 2.
People seem to think this is like... way more complicated than it really is, especially for a single player game. Most "gameplay" code is already something you write for a specific game. There's some example code and systems to get you started in engines, but it's very normal to not use any of that and build your own logic from scratch. It's just too dependent on the specific game to necessarily provide a bunch of reusable stuff in a broad engine designed for multiple genres.

Physics is probably the most commonly used piece of a game engine on the gameplay side, but it's also pretty reasonable to use your own code for that if there's a reason to. The main reason people often end up using engine physics solutions is that they are often coupled a bit more tightly to networking, which is something fewer people want to roll entirely on their own. Obviously not an issue in a single-player game.

So yeah, it's completely reasonable and not particularly complicated to use all your own gameplay logic, including the physics engine and so on, and drive all the UE rendering from that. It's possible to do that even in a networked game; it just gets a bit more gnarly as physics and the like interact with netcode state, prediction and so on. If you bring your own netcode along with your own physics, though, that's obviously still nicely decoupled.
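A minimal sketch of that separation (deliberately engine-agnostic; the Renderer below is a hypothetical stand-in for whatever the engine exposes, which in UE would roughly correspond to pushing transforms onto actors/scene proxies, not its actual API):

```cpp
// Minimal sketch of "bring your own gameplay + physics, drive the renderer from it".
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

struct Renderer {                           // hypothetical engine-facing interface
    void SubmitInstance(int id, const Vec3& pos) {
        std::printf("draw %d at (%.2f, %.2f, %.2f)\n", id, pos.x, pos.y, pos.z);
    }
};

// Game-owned simulation: the engine never sees or steps this state.
struct Body { Vec3 pos, vel; };

struct GameSim {
    std::vector<Body> bodies;
    void Step(float dt) {                   // your own "physics", however simple
        for (Body& b : bodies) {
            b.vel.y -= 9.81f * dt;          // gravity
            b.pos.x += b.vel.x * dt;
            b.pos.y += b.vel.y * dt;
            b.pos.z += b.vel.z * dt;
            if (b.pos.y < 0.0f) { b.pos.y = 0.0f; b.vel.y = 0.0f; }
        }
    }
};

int main() {
    GameSim sim;
    sim.bodies.push_back({{0, 5, 0}, {1, 0, 0}});
    Renderer renderer;

    for (int frame = 0; frame < 3; ++frame) {
        sim.Step(1.0f / 60.0f);             // gameplay/physics entirely game-side
        for (size_t i = 0; i < sim.bodies.size(); ++i)
            renderer.SubmitInstance((int)i, sim.bodies[i].pos);  // only transforms cross the boundary
    }
}
```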

In any case, there are also various game genres that I think are better suited to rolling your own code than using the stuff that is in these general-purpose engines. RTS is a big one, but anything with deterministic gameplay/netcode is often easier to do yourself rather than trying to make general engine systems deterministic.
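On the determinism point, one concrete reason rolling your own tends to be simpler: you can keep the whole simulation on a fixed tick with integer/fixed-point math, so every peer computes bit-identical state from the same inputs. A toy sketch of that pattern (my own, not any particular engine's approach):

```cpp
// Toy lockstep-style deterministic update: fixed timestep + fixed-point math,
// so the same inputs produce bit-identical state on every machine/compiler.
#include <cstdint>
#include <cstdio>

using Fixed = int64_t;                         // 16.16 fixed point
constexpr Fixed FromInt(int v)        { return (Fixed)v << 16; }
constexpr Fixed Mul(Fixed a, Fixed b) { return (a * b) >> 16; }

struct Unit { Fixed x, speed; };

int main() {
    const Fixed dt = FromInt(1) / 30;          // fixed 30 Hz tick, never a variable float dt
    Unit u{ FromInt(0), FromInt(3) };          // moves ~3 units per second

    for (int tick = 0; tick < 90; ++tick)      // 3 simulated seconds
        u.x += Mul(u.speed, dt);

    // Integer state can be hashed and compared across peers to detect desyncs.
    std::printf("x after 90 ticks = %f\n", u.x / 65536.0);
}
```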
 
Oblivion Remastered is apparently UE 5.3; the developer said the game is launching with every UE5 feature under the sun, including hardware Lumen ray tracing.
Does it have Nanite? If so, that's a big step up over Starfield. I hope Bethesda is planning serious upgrades to Creation Engine 2's renderer for TES VI, because it would be embarrassing if it were worse than this remaster.
 

Yes, it uses Nanite. There's no particular reason that Bethesda can't roll their own virtual geometry system for Creation Engine; Anvil and idTech both have virtualized geometry in their latest iterations. Whether they choose to is obviously another matter.

Bethesda do roll in new technologies with each title. Starfield's real-time GI can do a great job, just not consistently so. There's a really nice amount of geometry in more confined areas, and the updated materials pipeline yields some very good-looking objects.

It'll be interesting to see what they eventually deliver with ES6.
 
I am lost with this one. Normally a remaster is just better assets and a resolution bump. This looks like a remake to me. But I could be wrong.

Remaster to me means improved visuals and maybe re-recorded audio and cutscenes. Remake is a totally new game.

I’ve never played Oblivion or any of the ES games. How large is it compared to modern open world RPGs?
 