Unreal Engine 5 Tech Demo, Release Target Late 2021

Discussion in 'Console Technology' started by mpg1, May 13, 2020.

  1. Recop

    Veteran Newcomer

    Joined:
    Aug 28, 2015
    Messages:
    1,290
    Likes Received:
    625
He never said that, but some of his statements could confuse a general audience, and only a general audience. Developers obviously know that anything built on UE5 will work on other platforms.

Edit: by other platforms I mean modern hardware = next-gen + PC.
     
    #701 Recop, May 17, 2020
    Last edited: May 17, 2020
  2. Pinstripe

    Newcomer

    Joined:
    Feb 24, 2013
    Messages:
    118
    Likes Received:
    73
Did Epic explicitly say they will support PS4/XB1/Switch with UE5?
     
  3. BRiT

    BRiT Verified (╯°□°)╯
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    15,963
    Likes Received:
    14,914
    Location:
    Cleveland
Pretty much, on the official UE5 blog that was linked early on. They're making it incredibly easy to port from UE4 to UE5; it will be nothing like porting from UE3 to UE4. I'll link to it yet again: https://www.unrealengine.com/en-US/blog/a-first-look-at-unreal-engine-5

    Unreal Engine 4 & 5 timeline
    Unreal Engine 4.25 already supports next-generation console platforms from Sony and Microsoft, and Epic is working closely with console manufacturers and dozens of game developers and publishers using Unreal Engine 4 to build next-gen games.
    Unreal Engine 5 will be available in preview in early 2021, and in full release late in 2021, supporting next-generation consoles, current-generation consoles, PC, Mac, iOS, and Android.
    We’re designing for forward compatibility, so you can get started with next-gen development now in UE4 and move your projects to UE5 when ready.
    We will release Fortnite, built with UE4, on next-gen consoles at launch and, in keeping with our commitment to prove out industry-leading features through internal production, migrate the game to UE5 in mid-2021.
They also made other statements that their engine scales from mobile to next-gen, and that will continue with the UE5 revision. I think having their own game, Fortnite, ported to UE5 is a testament to the scalability of UE5.
     
  4. function

    function None functional
    Legend Veteran

    Joined:
    Mar 27, 2003
    Messages:
    5,351
    Likes Received:
    2,832
    Location:
    Wrong thread
    This is very interesting!!

    I wonder what we're seeing here. It could be loading delay, but I think it could alternatively be a processing delay. By that I mean that the job of constructing the mesh (from data on the SSD) that you actually feed to the GPU is going to take some work, and that work may need to be spread over several frames (depending on how much has changed, what resolution you're at, and how much work you can do each frame).

Ideally, in any computing work you want to avoid as much work as you possibly can, so you often try to re-use as much as you can (situation permitting, e.g. memory, cumulative errors, etc.).

Is what we're seeing here evidence of streaming limitations, or of the game's Nanite system having a cache of "draw ready" 3D data that gets added to / subtracted from each frame, depending on the amount of processing available?

    I could imagine both being responsible depending on circumstance.

Edit: I explained that badly. What I mean is there has to be a limit on the number of verts you can add to or take away from the "one poly per pixel" representation of the base assets. If this number is too low to re-create everything every frame (and why would you want to if you didn't need to?), then you may end up seeing this process happen across successive frames.

This is separate (or could be) from the process of loading into memory the data you then use to create or alter the mesh you're actually going to draw.
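A rough way to picture the "spread over several frames" idea above: a minimal sketch of a per-frame budget on how many triangles can be added to or removed from a draw-ready mesh cache. All names and numbers here are my own illustrative assumptions, not anything Epic has described.

```python
# Hypothetical sketch: refining a cached, draw-ready mesh toward a
# target triangle count under a fixed per-frame update budget.
# Numbers are illustrative assumptions, not Epic's.

def frames_to_converge(current_tris, target_tris, budget_per_frame):
    """How many frames until the cached mesh matches the target,
    if we can only add/remove budget_per_frame triangles per frame."""
    frames = 0
    while current_tris != target_tris:
        delta = target_tris - current_tris
        # Clamp the per-frame change to the budget in either direction.
        step = max(-budget_per_frame, min(budget_per_frame, delta))
        current_tris += step
        frames += 1
    return frames

# A big camera cut needs several frames; small motion converges in one.
print(frames_to_converge(1_000_000, 4_000_000, 500_000))  # 6
print(frames_to_converge(3_900_000, 4_000_000, 500_000))  # 1
```

Under this model, a hard cut to a completely different view is exactly where you would expect visible refinement over a few frames, which matches the artifacts being discussed.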
     
    #704 function, May 17, 2020
    Last edited: May 17, 2020
    tinokun, PSman1700 and BRiT like this.
  5. Pinstripe

    Newcomer

    Joined:
    Feb 24, 2013
    Messages:
    118
    Likes Received:
    73
More evidence, then, that UE5 = UE4 + Nanite & Lumen tacked on. For now, at least.
     
  6. Inuhanyou

    Veteran Regular

    Joined:
    Dec 23, 2012
    Messages:
    1,116
    Likes Received:
    291
    Location:
    New Jersey, USA
He didn't? I could have sworn he's said that both in interviews and when responding to people directly on Twitter.
     
  7. BRiT

    BRiT Verified (╯°□°)╯
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    15,963
    Likes Received:
    14,914
    Location:
    Cleveland
    That's not a bad thing at all. It should mean an even broader deployment and wider acceptance in the game development realm.
     
  8. Karamazov

    Veteran Regular

    Joined:
    Sep 20, 2005
    Messages:
    3,108
    Likes Received:
    2,623
    Location:
    France
    would be cool to have decima + nanite + lumen ^^
     
  9. Pinstripe

    Newcomer

    Joined:
    Feb 24, 2013
    Messages:
    118
    Likes Received:
    73
This just means it's a marketing ruse. They might just as well have continued with the UE4.x numbering, but the big 5 is a better sell.
     
    PSman1700 and BRiT like this.
  10. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,577
    Likes Received:
    16,029
    Location:
    Under my bridge
I don't think the engine scaling means all features will. E.g. Unity supports some form of real-time GI processing, but you don't run it on mobile.

I don't think people should expect Nanite and Lumen to work on HDD-based platforms or mobile until it's shown, just because the engine will run on those platforms. Epic are going to want Fortnite as a single project that builds and deploys for all platforms, so they are going to ensure the one engine scales okay by chucking out features low-spec platforms can't use.

    We presently have a demo run from SSD, comments from Epic saying it needs an SSD, and a PC example running from a very fast SSD. Let's just look at that part of it for the time being.
     
    sir doris, Picao84 and PSman1700 like this.
  11. SlmDnk

    Regular

    Joined:
    Feb 9, 2002
    Messages:
    597
    Likes Received:
    224
    It's the same for lighting: takes a few frames to "initialize" after each cut scene change. Probably been discussed already, though.
     
  12. chris1515

    Veteran Regular

    Joined:
    Jul 24, 2005
    Messages:
    4,786
    Likes Received:
    3,744
    Location:
    Barcelona Spain
Your reasoning is not logical. Brian Karis himself has told how he got this idea: it began with storing geometry as textures, and the engine tries to keep a 1:1 ratio between pixels and triangles. That means geometry is linked to resolution: if there is not enough power to draw the geometry, DRS will kick in and reduce the resolution, and without DRS we would see a small dip in framerate.

This is virtual geometry; the GPU was probably unable to keep the needed geometry in RAM.
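The 1:1 pixel-to-triangle coupling described above can be sketched as code: if the triangle budget for a frame is fixed, a dynamic resolution scaler shrinks the render resolution until the pixel count (which equals the target triangle count) fits. The function name and numbers are my own assumptions for illustration.

```python
# Hypothetical sketch of the "one triangle per pixel" coupling:
# when the per-frame triangle budget is smaller than the pixel count,
# dynamic resolution scaling shrinks the frame, preserving aspect ratio.

def drs_resolution(width, height, tri_budget):
    """Return a scaled (width, height) whose pixel count does not
    exceed tri_budget."""
    pixels = width * height
    if pixels <= tri_budget:
        return width, height
    # Area scales with the square of the linear scale factor.
    scale = (tri_budget / pixels) ** 0.5
    return int(width * scale), int(height * scale)

# e.g. 4K (~8.3M pixels) with a 4M-triangle budget drops to
# roughly 2666x1500; 1080p already fits and is left untouched.
print(drs_resolution(3840, 2160, 4_000_000))
print(drs_resolution(1920, 1080, 4_000_000))
```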

     
    BRiT likes this.
  13. scarythings

    Newcomer

    Joined:
    Dec 29, 2019
    Messages:
    20
    Likes Received:
    3
    Interesting info.

He mentions how vital it is that the connection between SSD and VRAM can stream in these hundreds of gigabytes of data.
     
    egoless likes this.
  14. function

    function None functional
    Legend Veteran

    Joined:
    Mar 27, 2003
    Messages:
    5,351
    Likes Received:
    2,832
    Location:
    Wrong thread
There has to be a cost to maintaining your virtualised geometry, and I'm not sure that if your resolution drops (because you can't maintain performance) you'd want to throw out everything you have and start again that frame. You're already losing performance; that'd cost you more.

    There has to be a degree of independence between the virtualised geometry and the particular dynamic resolution from frame to frame. Or at least, that's the way I see it.

    For example, Rage (a criminally underrated game) used virtual textures that were maintained at one resolution, but also had a dynamic resolution that was independent of that.

    Edited last paragraph for clarity.
     
    tinokun, sammyc, BRiT and 1 other person like this.
  15. chris1515

    Veteran Regular

    Joined:
    Jul 24, 2005
    Messages:
    4,786
    Likes Received:
    3,744
    Location:
    Barcelona Spain
Like every virtualised system, the goal is to display what is in memory: if you don't have the level of geometry you want, you display a lower level until the right one is resident in memory. It's difficult to say how many frames that takes (one or two?); without going frame by frame in a video program it's hard to tell.

    https://www.gamedev.net/forums/topic/700703-mipmap-in-procedural-virtual-texture/
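The fallback described above — draw the best level already resident, like mip fallback in virtual texturing — can be sketched as follows. The function and set of resident levels are my own illustrative assumptions, not anything from UE5.

```python
# Hypothetical sketch of virtualised-LOD fallback: request the level
# you want, but draw the nearest coarser level that is already
# resident in memory (the missing finer level would be streamed in
# for later frames).

def pick_lod(desired, resident, coarsest):
    """LOD 0 = finest. Walk toward coarser levels until a resident
    one is found; the coarsest level is assumed always available."""
    lod = desired
    while lod <= coarsest and lod not in resident:
        lod += 1  # step to a coarser stand-in level
    return lod if lod <= coarsest else coarsest

resident_lods = {3, 4, 5}             # only coarse levels are in memory
print(pick_lod(0, resident_lods, 5))  # finest not resident: falls back to 3
print(pick_lod(4, resident_lods, 5))  # 4 is resident already: drawn as-is
```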

    EDIT:
Someone found this while Andrew Maximov was trying to work out what they are doing.
     
  16. Inuhanyou

    Veteran Regular

    Joined:
    Dec 23, 2012
    Messages:
    1,116
    Likes Received:
    291
    Location:
    New Jersey, USA
Oh, I am aware these two features are a product of the next-gen advances, irrespective of UE5's scalability in general. I'm just saying the tech was also built for a variety of hardware, even if some of it can't use these features due to not having enough power. So they aren't going to make something that XBSX and PC, with their less advanced SSD tech, can't use at all.
     
  17. chris1515

    Veteran Regular

    Joined:
    Jul 24, 2005
    Messages:
    4,786
    Likes Received:
    3,744
    Location:
    Barcelona Spain
Besides, it's impossible to see the loss of detail at normal speed, and the level of detail is so high that the streaming problem isn't visible at all. It probably also depends on the SSD speed, but it shows that an SSD is never fast enough.
     
  18. Recop

    Veteran Newcomer

    Joined:
    Aug 28, 2015
    Messages:
    1,290
    Likes Received:
    625
OK, I did some research, and it's true that some statements made by Epic are far more explicit:

Tim Sweeney: “[The PS5] puts a vast amount of flash memory very, very close to the processor,” says Sweeney. “So much that it really fundamentally changes the trade-offs that games can make and stream in. And that’s absolutely critical to this kind of demo [...] This is not just a whole lot of polygons and memory. It’s also a lot of polygons being loaded every frame as you walk around through the environment and this sort of detail you don’t see in the world would absolutely not be possible at any scale without these breakthroughs that Sony’s made."

    https://www.ign.com/articles/ps5-ssd-breakthrough-beats-high-end-pc

Nick Penwarden: "There are tens of billions of triangles in that scene, and we simply couldn't have them all in memory at once," he says, referring to a bunch of statues in the demo. "So what we ended up needing to do is streaming in triangles as the camera is moving throughout the environment. The IO capabilities of PlayStation 5 are one of the key hardware features that enable us to achieve that level of realism."

    https://www.gamesradar.com/epics-un...-ps5-vision-that-sony-has-only-told-us-about/

But it's not clear whether they really needed 5 GB/s or more for this demo, at least for some scenes. People are claiming a laptop was able to run the demo, so an SSD, yes, but not necessarily the one in the PS5.
     
    ethernity and pjbliverpool like this.
  19. szymku

    Regular Newcomer

    Joined:
    Mar 2, 2007
    Messages:
    296
    Likes Received:
    141
You are aware that the laptop could have had 64 GB of RAM?
     
    disco_ likes this.
  20. j^aws

    Veteran

    Joined:
    Jun 1, 2004
    Messages:
    1,939
    Likes Received:
    42
    We may need new nomenclature to describe what is happening. A couple to describe first:

    - Epic mentioned that they use special normal maps for models. Not your usual type though. So what can it be? For REYES, you need a micropolygon map of your geometry.

    - Geometry maps - what can they be? Depending on granularity, they can be a map of fragmented meshes, or a map of the aforementioned micropolygons at a 1:1 ideal triangle to pixel quality.

    If texture is to texel:
    What is micropolygon to...? A microcel?

    If texture LOD is to mipmaps:
    What is mesh LOD...? A meshmap?

    They could store meshmaps to represent geometry LOD.
    They could store microcels (special normal maps) to represent geometry maps.

    At runtime, they will have a target resolution and load the appropriate texture and geometry LODs and maps.

For every object to draw, they can test against filling their normal maps (the appropriate microcel / micropolygon map).

    Then keep a counter of its fill level for that frame. If falling behind, choose a contingency lower quality meshmap.

    Shade texture with appropriate texel and mipmap.

    So, if falling behind with your frames, you'll see geometry artifacts as shown earlier.

    Kinda like that is what I'm thinking...
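The fill-counter idea sketched in the post above could look something like this in code. The terms "meshmap" and fill level follow the post; the function, the 4x-cheaper fallback, and the numbers are my own guesses, purely for illustration.

```python
# Hypothetical sketch of a per-frame micropolygon fill counter:
# objects are drawn at full quality until the frame's fill budget is
# exhausted, after which a contingency lower-quality meshmap is used.

def assign_meshmaps(object_costs, frame_budget):
    """object_costs: micropolygon cost per object at full quality,
    in draw order. Returns a 'full' or 'fallback' choice per object."""
    filled = 0
    choices = []
    for cost in object_costs:
        if filled + cost <= frame_budget:
            choices.append("full")
            filled += cost
        else:
            choices.append("fallback")   # contingency lower-quality meshmap
            filled += cost // 4          # assume fallback is ~4x cheaper
    return choices

# The third object would blow the budget, so it drops quality,
# which would show up as the geometry artifacts described earlier.
print(assign_meshmaps([500, 400, 300], 1000))
```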
     