Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

I find it sad on PC or consoles; imagine a game with Mandalorian-quality assets at 4K on a 3090.

The fun part is that, for the various levels of storage and game resolutions, the artist shouldn't have to change the original assets. It's someone else's job, or the tools' job, to run the "stored Nanite version" quality sliders.
 
I find it sad on PC or consoles; imagine a game with Mandalorian-quality assets at 4K on a 3090.
That's a movie, so it's all facade. We can do this in games too. Make it all look like a movie... from a certain point of view. Well suited for the story focused and linear type of game Sony is known for.
But can we have this detail for open world / big MP games too? Yep, but only if we accept visible repetition and reduced variation. UE5 demo shows this, e.g. using rocks with very flat, right angled structure. That's not really natural, but it allows compromised art control to get pleasant, regular structures from the aligned rock planes, and ability to break grid like patterns with some variation of angle and placement. Still, it does not look like real rocks do. The result is too uniform, and it even breaks laws of physics here and there if we look close.

In this context the new tech does not improve over state-of-the-art methods. Procedural placement tools will help a lot with these storage restrictions, but I'm a bit afraid the storage wall will help the idea of games as a streaming service. What happens if Google makes a game which is 100TB in size?

Well, if there are some console vs. PC fanboy wars going on, enjoy them as long as they last. Soon we'll all shed a tear of nostalgia when we find those dusty PS6 / RTX4K ovens in our basements, sniff... :/
 
Epic disagrees:

Like most features of previous versions of UE, it's scalable.

Yes, UE5 is scalable; there's absolutely no argument there, as I've pointed out several times. But Epic made it quite clear in the Edge article that HDDs are a limiting factor for their next-gen tech (specifically Nanite; I'm not too sure about Lumen). The following statements from the Edge article are specifically in reference to Nanite:

"Sweeney says. Their discussion wasn't just about graphics, but about the growing realisation that storage architecture in game hardware – having to load data from a hard drive, the huge amounts of latency between mass storage and a processor – was a limiting factor in Epic's and all developers' future plans for game-making"

"The previous generation of consoles required you to support spinning hard disks, which means you have variable latency time, and it's not totally predictable how long it's going to take to get a piece of data. With the flash storage, you have way, way more predictability and much, much lower latency to be able to get those reads into memory."

"It's a key unblocker for what Brian and team have built here," Sweeney confirms.


The statement you've linked by Sweeney, "You could render a version of this [demo on a system with an HDD], it would just be a lot lower detail," in no way says that Nanite is supported on HDDs, at least not in its full form. It sounds much more like a confirmation of the earlier Sweeney tweet that I linked, which said:

"The Nanite and Lumen tech powering it will be fully supported on both PS5 and Xbox Series X and will be awesome on both. *And high end PCs. * And with features for scaling the content down to run on current generation platforms using traditional rendering and lighting techniques"

Where are you getting this information about Direct Storage?

You're asking where I am getting the information that DirectStorage makes the IO process more efficient and reduces CPU overhead? Basically any source of DirectStorage information on the internet. This is the main purpose of its existence. Here's one of many examples:

https://www.eurogamer.net/articles/digitalfoundry-2020-inside-xbox-series-x-full-specs

"The final component in the triumvirate is an extension to DirectX - DirectStorage - a necessary upgrade bearing in mind that existing file I/O protocols are knocking on for 30 years old, and in their current form would require two Zen CPU cores simply to cover the overhead, which DirectStorage reduces to just one tenth of single core."

Relative to what? :???:

Relative to total CPU power, of course. Or are you suggesting that two full CPU cores' worth of IO overhead for a modest 2.4GB/s of throughput isn't relatively high?
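To put rough numbers on why that overhead scales so badly: the 2.4GB/s figure is from the Digital Foundry article, but the request size and per-request CPU cost below are illustrative assumptions of mine, not published figures.

```python
# Back-of-envelope: why legacy file I/O can eat whole CPU cores at SSD speeds.
# 2.4 GB/s is the Series X raw throughput from the article; the request size
# and per-request CPU cost are illustrative assumptions, not measured values.

throughput = 2.4e9           # bytes/sec sustained off the SSD
request_size = 64 * 1024     # bytes per I/O request (assumption)
cpu_per_request = 55e-6      # seconds of CPU time per request (assumption)

requests_per_sec = throughput / request_size
cores_busy = requests_per_sec * cpu_per_request

print(f"{requests_per_sec:,.0f} requests/sec")   # 36,621 requests/sec
print(f"~{cores_busy:.1f} cores of overhead")    # ~2.0 cores of overhead
```

With those assumed numbers you land right around the "two Zen cores" Microsoft quoted; tweak the per-request cost and the core count moves proportionally, which is exactly the overhead DirectStorage targets.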

Disk operations should not be using significant CPU on any platform. Windows is the outlier here because of Windows' architecture. But your average sustained I/O on any typical device shouldn't be using more than 5% of your CPU. Windows' issue isn't that it causes high CPU usage; it's that the fundamental model bottlenecks I/O and introduces latency. You can test this on your own PC. Unless you're running an IDE controller from the 1990s, you should not be having issues.

I'm really not sure what point you're trying to make anymore. We're discussing Windows IO performance/overhead and Direct Storage. I've provided you with direct quotes from Digital Foundry's interview with Microsoft stating specifically that the CPU overhead associated with streaming 2.4GB/s of data off the SSD is 2 Zen2 cores without DirectStorage. Are you suggesting Microsoft are wrong?

Can you provide a link to this? I've seen this oft quoted on resetera but never seen the article itself.

I already did, literally in the same post and response you're quoting here. I can't understand how you could have missed it. I've provided the same link and quote above.

You need to accept that the references to CPU usage are not about the I/O itself, but about the way data is stored and the data flow for getting it into a usable state. This is different.

You're simply splitting hairs. Again, read the Digital Foundry quote that I've provided several times now. It's completely explicit that the old IO protocols result in unnecessarily high IO-associated overhead on the CPU. The DirectStorage blog posted above by chris1515 describes in detail how they reduce it.

You also need to stop making assumptions about Direct Storage given Microsoft themselves have said very little about the implementation on Windows.

You want me to not assume that DirectStorage will reduce CPU overhead related to IO operations despite Microsoft explaining in intricate detail here and in the article quoted above that this is exactly what it will do? I'm starting to wonder if you have any idea at all what DirectStorage is.

Here is one of several examples from the blog:

"In addition, existing storage APIs also incur a lot of ‘extra steps’ between an application making an IO request and the request being fulfilled by the storage device, resulting in unnecessary request overhead. These extra steps can be things like data transformations needed during certain parts of normal IO operation. However, these steps aren’t required for every IO request on every NVMe drive on every gaming machine. With a supported NVMe drive and properly configured gaming machine, DirectStorage will be able to detect up front that these extra steps are not required and skip all the unnecessary checks/operations making every IO request cheaper to fulfill".
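The "enqueue many requests, then submit once" model that quote alludes to can be caricatured in a few lines. This is only a toy sketch of the batching idea; `ToyIOQueue` and its methods are invented names, and the real DirectStorage API is a Windows COM interface that does far more (GPU decompression, priorities, etc.).

```python
import os
import tempfile

# Toy sketch of a batched I/O queue: callers enqueue (offset, size) requests
# cheaply, then a single submit() services the whole batch, amortizing the
# per-request overhead instead of paying a full API round-trip per read.
class ToyIOQueue:
    def __init__(self, path):
        self.fd = os.open(path, os.O_RDONLY)
        self.pending = []

    def enqueue(self, offset, size):
        # Cheap: just record the request, no kernel round-trip yet.
        self.pending.append((offset, size))

    def submit(self):
        # Service the whole batch in one pass (os.pread is POSIX-style).
        results = [os.pread(self.fd, size, offset)
                   for offset, size in self.pending]
        self.pending.clear()
        return results

# Demo on a throwaway file.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"ABCDEFGHIJ")
    path = f.name

q = ToyIOQueue(path)
q.enqueue(0, 3)
q.enqueue(5, 2)
print(q.submit())  # [b'ABC', b'FG']
```

The point of the design is that the expensive part happens once per batch rather than once per request, which is the same shape of saving the blog describes.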

I really don't know how much plainer I can make this.

I'm stating neither. I've made literally no references to lumen or nanite. :???:

Then what exactly are you arguing with me about? Here is your original comment that sparked this entire debate, made in response to my stating that nowhere in the Edge article is it mentioned that the UE5 demo is dependent on the PS5's specific IO:

Dsoup said:
If it's just an SSD, why is Tim Sweeney banging on about next-gen consoles and their I/O? Why did Epic just not demo this on a PC with a fast NVMe drive?

You're clearly implying here that there is something about the next-gen consoles' IO that enables the UE5 demo and is not possible on current PCs. So if it wasn't your intention to imply this, and you in fact agree with me that the demo will run on current PCs (and does, according to multiple Epic sources), then what was the purpose of this comment? If it was simply to highlight that there is more to the consoles' IO than a simple move from HDD to SSD, then that has never been in dispute.

What's the big deal with the Chinese stream? They're playing an mp4 recording of the PS5 UE5 demo. It's not running on the PC :/

No one's claimed that they were running the demo on a PC in the livestream. It's the comments that the Epic engineer makes about the demo running on his own PC that are under discussion. It has absolutely nothing to do with the video of the demo they played in the livestream.

And they basically said that running it on a 2070 with an SSD might be possible, but that it really requires a fast I/O subsystem coupled with a fast SSD. That's the tech Epic were working with Sony to perfect: the ability to place high-detail models straight into the right location in memory without the usual overhead.

This is absolutely not what the engineer said, at least not according to every translation I've seen. He specifically stated that it was running on his own development laptop at around 40fps. He also stated that the streaming requirements of the demo aren't particularly high, and don't require something as fast as the PS5's IO.
 
That's a movie, so it's all facade. We can do this in games too. Make it all look like a movie... from a certain point of view. Well suited for the story focused and linear type of game Sony is known for.

There's hope they move away from that. I think HZD and DS and even GoT were 'ok' in that department; still very much gameplay in them. GoW, Spider-Man, The Order: 1886, UC4...

Well, if there are some console vs. PC fanboy wars going on, enjoy them as long as they last. Soon we'll all shed a tear of nostalgia when we find those dusty PS6 / RTX4K ovens in our basements, sniff... :/

It's the SSD that stands a chance against what's available to the public today. Wars will be fought as long as there's hope on that one... even though I think it's already lost since DirectStorage made its entry a while ago. We are at 7GB/s before decompression, and seeing how effective GPUs are at such tasks, I think that's where the future lies anyway. Have some of the GPU's compute get the job done, much faster and much more flexibly, directly on the GPU, where the rendering is done. It seems more and more is being done on GPUs these days: audio, PhysX, compute tasks that used to be CPU ones...
Also, when the PS5/XSX are compared to the last generation's console HDDs, the leap is much larger than when compared to an HDD in a PC.
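The "7GB/s before decompression" point is worth making concrete: the effective asset bandwidth is the raw link rate times the compression ratio, which is the whole reason for doing decompression on the fly. The ratios below are illustrative assumptions, not platform specs.

```python
# Effective streaming bandwidth = raw SSD throughput x compression ratio,
# since assets sit on disk compressed and are expanded after the read.
# The ratios are illustrative assumptions; real ratios vary per asset type.
raw_gbps = 7.0  # GB/s, roughly what top PCIe 4.0 NVMe drives advertise

effective = {ratio: raw_gbps * ratio for ratio in (1.5, 2.0, 3.0)}
for ratio, gbps in effective.items():
    print(f"{ratio}:1 compression -> ~{gbps:.1f} GB/s of assets delivered")
```

So even a modest 2:1 ratio turns a 7GB/s drive into ~14GB/s of delivered assets, provided the decompressor (GPU or dedicated hardware) can keep up.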

A game like Tekken 8 on UE5 could go all out with the detail of the arenas and backgrounds.

Maybe something like GT or WipEout as well, or the next SSX of course :p
 
A game like Tekken 8 on UE5 could go all out with the detail of the arenas and backgrounds.
Should be amazing for games like UFO as well.

Artists not having to worry as much about high/low-poly workloads and possible LoDs...
They could check which quality level is usually visible and let Nanite decimate/change the 'mips' that get stored to disk, to bring the install size down.

Procedural Nanite mesh generation would be huge; hopefully that happens as well. (Tessellation, object scattering and combining.)
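The "keep only the quality levels that are usually visible" idea above could be as simple as capping how many refinement levels ship on disk. A hypothetical sketch: the function name, the sizes, and the roughly-4x-per-level size model are all invented for illustration, not how Nanite actually stores its clusters.

```python
# Hypothetical sketch of a "stored quality" slider for a Nanite-like mesh:
# keep only the detail levels the game ever displays and drop the rest from
# the shipped install. Names and the ~4x-per-level size model are invented.

def stored_levels(level_sizes_mb, finest_visible_level):
    """Keep levels up to the finest one ever visible; return (kept, MB saved)."""
    kept = level_sizes_mb[:finest_visible_level + 1]
    saved = sum(level_sizes_mb) - sum(kept)
    return kept, saved

# Sizes per level, coarse -> fine; each refinement ~4x the triangles.
sizes = [1, 4, 16, 64, 256]
kept, saved = stored_levels(sizes, finest_visible_level=2)
print(kept, f"saves {saved} MB")  # [1, 4, 16] saves 320 MB
```

Because the finest levels dominate the size, dropping even one level of never-seen detail saves most of the asset's footprint, which is why a per-platform "stored quality" slider is attractive.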
 
Just a general comment regarding UE5 and the PS5: isn't it worth keeping in mind that the initial data (and I believe almost all data?) so far is based around what's almost certainly a cross-marketing event in conjunction with the PS5? With that in mind, we might want to be wary of specifics in terms of how UE5 will function with respect to other hardware configurations.

Out of interest, does anyone know how Unreal licensing works with something like Game Pass? As far as I know, Unreal licensing is on a revenue basis, which would be hard to determine for a subscription service.

It can't be the case that Microsoft gets to use Unreal for 'free' if everything's just on Game Pass, or is there a clause that I haven't seen?

Why do you feel revenue would be hard to determine? The streaming service would be providing specific revenue which would be counted.

https://www.unrealengine.com/en-US/eula/publishing

e. Revenue from advance payments for a Product (from a publisher or otherwise);
f. Revenue received in connection with a Product’s inclusion in a streaming, subscription, or other game-delivery service (e.g., Apple Arcade, Microsoft GamePass, or any similar or successor services), including without limitation development funds and bonuses; and
g. Revenue in any other form actually attributable to a Product (unless excluded below).

If you mean Microsoft specifically I'd assume they are on a scale large enough to have negotiated a custom license regardless.
 
Why do you feel revenue would be hard to determine? The streaming service would be providing specific revenue which would be counted.

https://www.unrealengine.com/en-US/eula/publishing



If you mean Microsoft specifically I'd assume they are on a scale large enough to have negotiated a custom license regardless.


Yeah, I did mean Microsoft specifically, especially with their growing first party that is largely using Unreal Engine.

I think Microsoft might move to commercialise idTech and make it a more general-use engine like Unreal. They already have (or will have) a dedicated engine development team with id Frankfurt, maybe partnering with someone like EA to do engine development. I don't think they would force any of their first party to use this commercialised idTech; it's just important to have an Unreal Engine backup in their back pocket, in case something happens to Epic. Who knows, maybe Tencent buys the rest of Epic and then the US bans all Chinese-owned companies from trading with American-owned ones. A long shot of course, but after what happened at the US Capitol yesterday, I am firmly in the 'anything can happen' camp.

Besides, Microsoft and Amazon are copying each other in the cloud space to a large degree, and Amazon has their Lumberyard engine that hooks into all the AWS services. I could definitely see Microsoft wanting an engine for the same purpose.

Microsoft is investing billions into gaming, and it's never a good idea to build a business of the scope they're planning on someone else's foundation, even just from a monetary standpoint: if Xbox revenue gets to what PlayStation is doing currently, ~$20B, the 5% cut is $1B. I know not every game they make uses Unreal, but I'm sure it's a consideration for them. $1B per year in engine development would very quickly produce an engine comparable to Unreal, and soon surpass it.
 
I've found the id engine more impressive every gen... Seeing what Eternal did is amazing: 60fps on the consoles as well, with the Jaguars. Unreal engines have been good for demos.
 
I've found the id engine more impressive every gen... Seeing what Eternal did is amazing: 60fps on the consoles as well, with the Jaguars. Unreal engines have been good for demos.

idTech is rubbish. 11 yrs later and it's barely used across other Bethesda studios. ;-)
 
idTech is rubbish.

Seeing Eternal's performance in the DF analysis, I really think it was/is a very impressive showcase of what the engine can actually do. On the other hand, I find that Unreal Engine games are often nowhere close to what the tech demos have shown. How this will go in future remains to be seen.
 
Seeing Eternal's performance in the DF analysis, I really think it was/is a very impressive showcase of what the engine can actually do. On the other hand, I find that Unreal Engine games are often nowhere close to what the tech demos have shown. How this will go in future remains to be seen.

I was being silly. The modern Dooms are fantastic showcases. idTech is not a multi-genre platform like UE, though. UE has some shortcomings, but it ultimately boasts an amazing variety of genres and some fantastic-looking titles.
 
idTech is rubbish. 11 yrs later and it's barely used across other Bethesda studios. ;-)


Yeah, because they haven't put the resources/money into making it more approachable for a non-in-house team. They likely only have the resources to make idTech great for FPS games, and don't have the resources or inclination to make it a more general engine.
 
Yeah, because they haven't put the resources/money into making it more approachable for a non-in-house team. They likely only have the resources to make idTech great for FPS games, and don't have the resources or inclination to make it a more general engine.

Agree. UE itself and Frostbite have both had rocky roads from their FPS origins, and both continue to have issues. Is moving away from UE wholesale worth it from either a cost or tech perspective? Probably not.

This is swerving OT, but I'm not sure how much mileage there is in a thread on "what should MS do with their in-house engine resources".
 
Agree. UE itself and Frostbite have both had rocky roads from their FPS origins, and both continue to have issues.
I wonder if cars in the new NFS still have guns, due to the FPS origins of Frostbite. (At least the early ones did...)

It must be quite problematic to make big changes during production (or to design an engine in such a way that every kind of game would be plausible).
 
Agree. UE itself and Frostbite have both had rocky roads from their FPS origins, and both continue to have issues. Is moving away from UE wholesale worth it from either a cost or tech perspective? Probably not.

This is swerving OT, but I'm not sure how much mileage there is in a thread on "what should MS do with their in-house engine resources".


Yeah, it is a bit OT, sorry about that :D I'll get back on topic after this.

No, I don't think anyone is rushing to get off Unreal, but I do think that at the scale Microsoft, Sony, and the likes of EA and Ubisoft are operating, it almost becomes irresponsible not to have an engine they control. I'm not saying shove it down all your devs' throats like EA does with Frostbite, just that it gives you options and keeps you from being stuck between a rock and a hard place, in case Epic said 'F it' and raised the revenue cut to 30%, for instance.

Plus, as a differentiator to make devs happy to use the engine, you could add a bunch of the asset-generation and acoustics stuff Microsoft Research has been working on.
 
idTech is rubbish. 11 yrs later and it's barely used across other Bethesda studios. ;-)

This is true for previous idTech iterations because of virtual texturing -- I wonder how that will bode for UE5?

The new engine (Doom Eternal) is clearly the most impressive clustered forward renderer out there (edging out the MW engine; it's also some kind of forward+, isn't it?) -- that's a tech a lot of people believe in, and it makes at least some parts of the pipeline very simple. I wonder if we'll see it used more widely within Bethesda.
 
This is true for previous idTech iterations because of virtual texturing -- I wonder how that will bode for UE5?

*Ross voice* It was a joke!

You're mixing up megatextures and virtual textures. It's the former that caused issues and drama; every* engine should use the latter. Nanite isn't a monolithic baked level. It's lots of objects residing in memory with only the required detail, much more akin to virtual texturing than megatextures.

The commonality that could occur is in the need to compress assets into a given size. Rage in particular wasted artists' time, as the texture detail was squished out of the distributed game's textures.

*Give or take
 
*Ross voice* It was a joke!

You're mixing up megatextures and virtual textures. It's the former that caused issues and drama; every* engine should use the latter. Nanite isn't a monolithic baked level. It's lots of objects residing in memory with only the required detail, much more akin to virtual texturing than megatextures.

The commonality that could occur is in the need to compress assets into a given size. Rage in particular wasted artists' time, as the texture detail was squished out of the distributed game's textures.

*Give or take
Not to be pedantic, but "megatextures" is a marketing term for a kind of virtual texturing. I think you're right, though, that the problems with idTech were probably more down to the implementation and tools specific to how they did megatextures than to the technology as a whole.
 