Unreal Engine 5 Tech Demo, [UE5 Developer Availability 2022-04-05]

Discussion in 'Console Technology' started by mpg1, May 13, 2020.

  1. Slifer

    Newcomer

    Joined:
    May 19, 2020
    Messages:
    16
    Likes Received:
    34

There is no downgrade. He never said that. Lumen in the Land of Nanite was unchanged and not downgraded in any way.

Secondly, the demo doesn’t require 64 GB of RAM. It requires only about 3 GB of RAM and up to 7 GB of VRAM (allotted), with only about 4-5 GB actually used, according to @Dictator.

It’s the editor that needs 32-64 GB. What makes the editor so heavy is that you are accessing all the source assets: source textures, source materials, source shaders, uncompressed, etc.
This allows you to manipulate things: change colors, materials, renders. That is why you need 32-64 GB of memory.

When you compile/package the project, you have none of the source assets; everything is compressed and baked out and can't be changed. So you don't have uncompressed source assets and shaders hanging in memory just in case you need to edit them. That's why the demo project is 100 GB but the compiled version is ~25 GB.
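For reference, cooking and packaging is the step that strips all that editor-only source data. A rough sketch of a Windows packaging run (the engine path, project path, and version below are placeholders, not from the thread; RunUAT ships with the engine):

```
"C:\Program Files\Epic Games\UE_5.0\Engine\Build\BatchFiles\RunUAT.bat" ^
  BuildCookRun -project="C:\Projects\MyDemo\MyDemo.uproject" ^
  -platform=Win64 -clientconfig=Shipping ^
  -build -cook -stage -pak -compressed
```

The -cook, -pak, and -compressed flags are what produce the small, baked-out output instead of the raw source content.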
     
    pjbliverpool, cwjs and PSman1700 like this.
  2. cwjs

    Regular

    Joined:
    Nov 17, 2020
    Messages:
    373
    Likes Received:
    733
    Hey, thanks for posting this, can you point to a link or timestamp where they mention it being heavier? I think it's pretty obvious looking at the demos but I couldn't find a source.


    ----


Regarding RAM talk: I wish we could analyze this granularly; I suspect a significant portion of it is textures and Lumen data. It doesn't make sense to keep a whole big tree for every object in the scene in RAM if your whole goal is to access only a few leaf levels each time the content changes.
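To illustrate the point (my own sketch, not Epic's code): with a cluster hierarchy, a frame only needs the cut of the tree whose screen-space error is below threshold, so most nodes can stay on disk.

```python
# Sketch: why a cluster/LOD hierarchy keeps resident data small.
# Only the "cut" of the tree that satisfies the error threshold is
# selected; everything below that cut never has to be touched.

class Cluster:
    def __init__(self, error, children=None):
        self.error = error            # geometric error if we stop refining here
        self.children = children or []

def select_clusters(root, error_threshold):
    """Walk the hierarchy, descending only where more detail is needed."""
    selected, touched = [], 0
    stack = [root]
    while stack:
        node = stack.pop()
        touched += 1
        if node.error <= error_threshold or not node.children:
            selected.append(node)        # coarse enough: stream/draw this cluster
        else:
            stack.extend(node.children)  # too coarse: refine into children
    return selected, touched

# A tree where only one branch needs refinement at threshold 1.0:
leaves = [Cluster(0.1) for _ in range(4)]
mid = Cluster(5.0, leaves)                              # too coarse, refine
root = Cluster(10.0, [mid, Cluster(0.5), Cluster(0.5)])
sel, touched = select_clusters(root, 1.0)
print(len(sel), touched)  # 6 8
```

Only 8 nodes are visited regardless of how deep the untaken branches go, which is the intuition behind "access a few leaf levels" rather than holding the whole tree resident.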


It was; the poster artificially lowered their RAM or the streaming pool. Even so, it was the result of a bug:

     
    Deleted member 86764 likes this.
  3. manux

    Veteran

    Joined:
    Sep 7, 2002
    Messages:
    3,034
    Likes Received:
    2,276
    Location:
    Self Imposed Exhile
I'm not claiming any downgrade in engine features or performance; all of those should be better. The asset quality was downgraded, and the reason given was to make it work on all platforms. The original demo used 8K textures and extremely detailed assets. One reason for the asset downgrade is disk storage space; the other is so the demo fits in 64 GB of RAM.

It is difficult to stream many small blocks from an SSD efficiently on Windows until we get DirectStorage in Windows and DirectStorage integration in UE5, plus hardware/GPU-based decompression. Epic likely has this done already, but DirectStorage isn't yet available in consumer Windows.

Once UE5 ships, things should be fine. Consumer Windows with DirectStorage will be released and polished, just as UE5 itself is still a work in progress being developed and optimized.

edit: There is also the discrepancy between the editor and the original demo. If someone just looks at the footage, it's easy to come to the wrong conclusion because the editor doesn't enable all features.

     
    #2223 manux, Jun 7, 2021
    Last edited: Jun 7, 2021
  4. cwjs

    Regular

    Joined:
    Nov 17, 2020
    Messages:
    373
    Likes Received:
    733
What's the source on this? And why would this be a problem? Virtual texturing has been around for over ten years; we can put textures as big as we want on HDDs.
     
    PSman1700 likes this.
  5. PSman1700

    Legend

    Joined:
    Mar 22, 2019
    Messages:
    7,118
    Likes Received:
    3,088
Nope, it's due to the editor.
     
  6. manux

    Veteran

    Joined:
    Sep 7, 2002
    Messages:
    3,034
    Likes Received:
    2,276
    Location:
    Self Imposed Exhile
Brian Karis, on one of the UE5 streams, talked about the asset-quality downgrade and the reasons behind it. Unfortunately I don't have an exact timestamped link available. I didn't think this would become some kind of argument, as it was clearly stated in the stream.

Streaming and decompressing small random blocks on Windows is not very good until Windows with DirectStorage ships. UE5 is all about dynamically streaming what is needed: allow huge source assets on disk and create the appropriate LOD dynamically. The new demo circumvents the need to stream by letting the whole scene fit in RAM on machines with 64 GB or more.

    About DirectStorage

    About having the whole level in memory in pc to avoid streaming

Epic could have released a demo with more detailed assets, and it would likely have worked fine. But it would have required more disk space and RAM, or DirectStorage-enabled Windows, which is not widely available yet.

This is kind of crazy in the original demo. It also shows how the original content works in free roam, debunking the on-rails arguments.
     
  7. Slifer

    Newcomer

    Joined:
    May 19, 2020
    Messages:
    16
    Likes Received:
    34
    No it wasn’t.

Valley of the Ancient uses more texture data than Lumen in the Land of Nanite.

And no, it's the same assets from Quixel, and Nanite isn't affected by the amount or quality of the assets you use.

    Stop spreading FUD. You have been corrected many times, even by the creator of nanite.

The Nanite data for Lumen in the Land of Nanite is 6.14 GB.

    You don’t need a super fast SSD.
    You don’t need direct storage.
    You don’t need RTX IO.
    You don’t need 32-64 GB memory.

Valley of the Ancient needs only ~3 GB of RAM, with 4 GB of VRAM used out of 7 GB allotted (in some cases).

    There are no discrepancies.
The demo, as it is, ran on Brian’s PC. Period.
    You either accept the facts or you don’t.
     
  8. cwjs

    Regular

    Joined:
    Nov 17, 2020
    Messages:
    373
    Likes Received:
    733
1. It's not very good, no, but it's plenty good enough to access the tree for virtual textures or Nanite-style virtual geometry. That's one of the main problems Nanite is solving and why it's impressive -- the load is small enough for SSDs to handle because it only has to pull a few leaf nodes per mesh. (You remember playing games with virtual textures on your HDD, right? Textures are bigger than meshes, especially Nanite meshes.)
2. The 64 GB RAM claim is completely false and has been disproven repeatedly. The whole demo is less than 30 GB built, and VRAM loads are under ~5 GB.
    You're totally right about everything in this post, but:
tbf, it's safe to assume his PC is much better than yours or mine. Dev computers on big engine teams are usually nuts: Threadrippers with 128 GB+ of RAM. (But that's so they can do fast builds, not because it's the minimum requirement to run a demo.)
     
  9. Dictator

    Regular

    Joined:
    Feb 11, 2011
    Messages:
    681
    Likes Received:
    3,969
It was not Brian Karis talking about it; it was the artist responsible for the demo (Galen Davis), describing how they used tiling detail normal maps instead of discrete ones for disk-space reasons in distributing the demo's raw assets (it is 100 GB, after all). It was not because a wide range of devices needs to run it, but more a bandwidth concern for distribution.

    EDIT: 16:45
     
    #2229 Dictator, Jun 7, 2021
    Last edited: Jun 7, 2021
  10. LiveGamer

    Newcomer

    Joined:
    Jan 29, 2021
    Messages:
    29
    Likes Received:
    33
Just curious: is there any reason why the fire in the first shot is so pixelated? Also, everything appears a bit soft/blurry; I'm guessing that's the TSR?
     
    PSman1700 likes this.
  11. Andrew Lauritzen

    Andrew Lauritzen Moderator
    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,629
    Likes Received:
    1,227
    Location:
    British Columbia, Canada
    Why are people so convinced there are excessive or esoteric IO requirements for Nanite? Brian showed that demo running just fine on a regular PC in his live stream to try and quell some of these myths but they appear to persist.

    In general why do people even get the impression that lots of IO is even required? As noted in the live streams as well, the texture data in the demo exceeds the size of the geometric data and no one seems to think virtual texturing systems that have been in use for many years require DirectStorage or super high bandwidth SSDs.

    SSDs are definitely important (but not necessarily required) for latency reasons. Some amount of bandwidth is necessary to keep up with streaming while moving quickly through a world too, but in my experience it's not nearly as much as people seem to think it is. Where are people getting these impressions?

I think it's safe for me to reiterate from the stream that there is nothing special going on in either of the demos on PC in terms of IO... it's doing the same thing as the consoles (consoles use the platform-specific final package compression stuff of course, but if anything that makes it even smaller there). You can go play with the cvars and arbitrarily limit bandwidth and so on yourself now and see how they affect things.
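For anyone who wants to try those experiments, a few console variables to start from (names as I recall them from the public UE documentation, so verify with `help` in the console; note Nanite keeps its own pool settings under `r.Nanite.*`):

```
r.Streaming.PoolSize 1000
r.Streaming.LimitPoolSizeToVRAM 1
stat streaming
```

The first sets the texture streaming pool size in MB, the second clamps the pool to available VRAM, and `stat streaming` shows a live view of streaming pressure while you fly around.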

    Also to reiterate from the live stream - the new demo is heavier than the old one. The overdraw is pretty nuts and there are more lights with full screen coverage in the dark world segment.

    But most importantly: the full engine and source code is out now; there's no more need to speculate on stuff. Please go test it (and report bugs/issues) and see for yourselves! (Minor aside/reminder: if you're interested in representative game-like resource usage be sure to be running a fully cooked build in shipping config with DX12.)
     
  12. Andrew Lauritzen

    Andrew Lauritzen Moderator
    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,629
    Likes Received:
    1,227
    Location:
    British Columbia, Canada
    I would be remiss if I didn't also note something that was already mentioned on Twitter - Nanite currently targets the output resolution *before* TSR upscaling in terms of its polygon sizes, thus you will see differences in the default config. You can adjust the nanite CVars to compensate if the goal is to compare TSR itself.

    There's not necessarily a "right" way to do this per se and it may change in the future. There are similar questions for virtual shadow maps (which resolution do you target), which also currently target the pre-upscaled resolution (and can also be modified with cvars). These are all choices a specific game could make for various platforms in any case, so I wouldn't take the current defaults too seriously.

    In any case I just wanted to note that in case people are trying to compare "just TSR itself". If you want to do that, you should probably also modify Nanite's and VSM's resolution targets to be equivalent to what they would be at 4k native.
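A possible starting point for such a comparison (cvar names per the early-access docs as I recall them, so double-check them; the values are illustrative, not a recommendation):

```
r.ScreenPercentage 50
r.Nanite.MaxPixelsPerEdge 0.5
r.Shadow.Virtual.ResolutionLodBiasLocal -1
```

That is half-resolution internal rendering for TSR, with Nanite's pixels-per-edge target tightened and local VSM resolution biased up, so the detail they select roughly matches what 4K native would pick.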
     
    pjbliverpool and chris1515 like this.
  13. Andrew Lauritzen

    Andrew Lauritzen Moderator
    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,629
    Likes Received:
    1,227
    Location:
    British Columbia, Canada
The whole point there was to point out there wasn't any trickery at all in the original demo. There was no "loading tunnel" and it can easily stream all of the Nanite geometry/textures for the whole level dynamically. What he meant by "I have the whole level loaded" is that the old demo did not use world partition (it wasn't ready yet, as he mentions in other places); it instead used classic UE4-style sublevels. He just "loaded" all the sublevels (in terms of UE4 assets, not in terms of Nanite geometry/virtual texture streaming!) for the purposes of flying around all of it.

Nanite and virtual texture streaming still operate basically the same way in editor as they do at runtime, although the editor itself often keeps around more copies of uncompressed data for editing purposes. Hence the RAM cost in editor is higher, but it's not because of anything related to the cost of rendering the scene. As I noted above, if anything performance in editor is lower than in a shipping build due to various factors.

    All of this is pretty orthogonal to Nanite though and the tldr was meant to be "all of these platforms can happily stream all this data just fine, as you can see". Nothing fancy required.


    Yeah it looks better now than it did then because we've improved a lot of stuff! :)

    Brian is just pointing out that the directional light direction isn't exactly the same when he was just fooling around because that was part of the scripting in the original demo (ex. as the roof opened and so on). That in no way would make performance better. Indeed he had the directional light enabled for the entire time he flew around, in addition to the spot lights the artists used for the cave sequence. In the original demo it was only ever one or the other as our shadow caching and optimization was not at the same level it is now.

    Why does everyone think there's a conspiracy here? Just go play with it yourselves and see! IMO it's pretty easy for even amateurs like me to drop in a bunch of Quixel geometry and get a similar sort of test case as the various demos. Indeed the exact same assets are available to drop in for free! Sure our test scenes will look like trash compared to real artist-authored stuff, but you can observe the performance characteristics for yourselves.
     
    #2233 Andrew Lauritzen, Jun 7, 2021
    Last edited: Jun 7, 2021
    turkey, Kugai Calo, chris1515 and 5 others like this.
  14. Andrew Lauritzen

    Andrew Lauritzen Moderator
    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,629
    Likes Received:
    1,227
    Location:
    British Columbia, Canada
    And my poor SSD space on dev machines with multiple of these sorts of projects synced. 16TB SSDs when....
     
  15. Malo

    Malo Yak Mechanicum
    Legend Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    8,929
    Likes Received:
    5,528
    Location:
    Pennsylvania
    Probably because the demo last year was all about how this was only possible due to the PS5's insanely fast IO.
     
16. Just because the current demo loads everything into VRAM doesn't mean all future games built on UE5 will work that way.

    It's a lot like Ridge Racer containing the whole game in the PS1's RAM. This isn't a game, it's an engine demonstrating two brand new technologies.

Games certainly will use the SSD. Will they fully saturate it? Dunno, we'll have to wait and see.

    I want to see Wipeout on this engine.
     
  17. JoeJ

    Veteran

    Joined:
    Apr 1, 2018
    Messages:
    1,523
    Likes Received:
    1,772
I guess it's things like these:
Nanite shows insane detail, suggesting storage cost is now insane too.
Next-gen console marketing made the SSD a big selling point, suggesting it is necessary for the next-gen experience.
Those two points reinforce each other.
Traditionally, open-world games often stream in bigger chunks of the world. Because the chunks are not fine-grained, one could think increasing detail also increases those IO loads accordingly.

Edit: Maybe it's just this: it's not really obvious that the advantages Nanite enables for graphics performance and memory requirements apply to IO just as much. Personally, I would not have thought about that until somebody mentioned it here a few days ago.
     
    #2237 JoeJ, Jun 7, 2021
    Last edited: Jun 7, 2021
  18. Andrew Lauritzen

    Andrew Lauritzen Moderator
    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,629
    Likes Received:
    1,227
    Location:
    British Columbia, Canada
    Sure, but there's a difference between "we need/really want an SSD because they are hilariously faster and lower latency than HDDs and optical drives that previous consoles relied on" and "we need tons of bandwidth and lots of fancy stuff on top of that". Don't get me wrong, all the new fancy IO stuff is great and I'm sure will help improve things further in the future across the board in games, but I feel like Epic has been pretty clear from the start that this stuff works well on all of the target platforms and will continue to get better.

That's the whole point of Nanite, after all: virtualized geometry in the same way as virtual textures. You need not think about loading in big chunks of stuff at once; it just handles it automatically as you move around, sticking to a fixed physical memory budget.
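The fixed-budget behavior described above can be sketched as a small cache (my own illustration, not UE5 source): pages are loaded on demand, and when the budget is exceeded the least-recently-used page is evicted, so memory tracks the visible working set rather than scene size.

```python
# Sketch of a fixed-budget streaming pool with LRU eviction.
from collections import OrderedDict

class StreamingPool:
    def __init__(self, budget_pages):
        self.budget = budget_pages
        self.resident = OrderedDict()   # page_id -> data, in LRU order

    def request(self, page_id):
        if page_id in self.resident:             # already resident: mark hot
            self.resident.move_to_end(page_id)
            return self.resident[page_id]
        data = f"disk:{page_id}"                 # stand-in for an SSD read
        self.resident[page_id] = data
        while len(self.resident) > self.budget:  # over budget: evict coldest
            self.resident.popitem(last=False)
        return data

pool = StreamingPool(budget_pages=3)
for page in [1, 2, 3, 1, 4]:    # page 2 is coldest when 4 arrives
    pool.request(page)
print(list(pool.resident))      # [3, 1, 4]
```

However big the world gets, only `budget_pages` entries are ever resident; moving through the world just changes which pages those are.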
     
    #2238 Andrew Lauritzen, Jun 7, 2021
    Last edited: Jun 7, 2021
  19. https://www.gamesradar.com/epics-un...-ps5-vision-that-sony-has-only-told-us-about/


    Is he wrong?
     
The video with the guy showing the original PS5 demo on a PC states that they changed the technology from a streaming one to something else. I can't recall the specifics and have a toddler climbing on me.
     
    Deleted member 13524 likes this.