Polygons, voxels, SDFs... what will our geometry be made of in the future?

Discussion in 'Rendering Technology and APIs' started by eloyc, Mar 18, 2017.

  1. Lurkmass

    Regular Newcomer

    Joined:
    Mar 3, 2020
    Messages:
    350
    Likes Received:
    392
    Not sure about his role, but I think Unity DOTS/ECS involves work on things like their C# Burst compiler, which has contributors like Neil Henning (former Vulkan rep for AMD who worked on shader compilers). Other than that, he's a big believer in the technology ...

    What they're doing over there is a multi-year effort to refactor Unity to cope with much more complex projects in the future ...
     
    milk, chris1515, BRiT and 1 other person like this.
  2. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    13,566
    Likes Received:
    16,623
    Location:
    The North
    It's definitely some really badass stuff. I completely forgot that they were moving in this direction, as it's been in beta for so long. Still waiting for the whole ECS to enter official release - lots to do there, because I think their renderer also needs to be updated to support it (at least the geometry/culling part).
     
  3. Frenetic Pony

    Regular Newcomer

    Joined:
    Nov 12, 2011
    Messages:
    742
    Likes Received:
    419
    The distance field geometry is just an open-source side project; his primary job is head of low-level optimization. Unity is definitely missing geometry and lighting for the high end, but the new material system for the "HDRP" looks great, especially as UE5 seems to introduce no new BRDFs whatsoever despite switching to a visibility buffer.

    Still, I know global illumination is "on the roadmap". I don't know what, if anything, they're doing for geometry, though. All that being said, I'm looking forward to Baldur's Gate 4 on HDRP... you know, in like 5 years' time :razz:. Though maybe closer; ECS seems straight up the alley of a Cities Skylines sequel, especially considering the transportation simulation was the only really interesting sim part of that game.
     
    iroboto likes this.
  4. Warrick

    Newcomer

    Joined:
    Jun 5, 2003
    Messages:
    33
    Likes Received:
    37
    Location:
    Hong Kong
    I would have thought Unity would aim to add a similar HLOD mesh scheme designed specifically around mesh shaders and hardware ray tracing. Nanite is actually a pretty straightforward HLOD system to implement, with the LOD stitching and selection being the tricky part (there is a whole history of HLOD stitching options out there). Doubtless someone will soon look at the UE5 shaders and re-implement them on the Unity Asset Store :p

    It wouldn't surprise me if Nvidia GameWorks and Simplygon provide an equivalent soon enough. I suspect Nvidia will implement a skinned variant faster than Epic (skinning complicates LOD selection and bounds for hierarchical culling) and eventually expose extensions to supplement this. Nvidia will surely iterate quickly to iron out the kinks in that flow of rigid/skinned mesh HLOD (mesh shaders/compute) to BVH build and ray trace.

    On a tangent, it could be interesting to consider how and when the Hull/Domain/Geometry shader hardware will be ripped out and at least partially emulated, so that die area can instead be devoted to compute/mesh shaders/ray tracing (I am not sure if it is significant die area, though).
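    To illustrate "the tricky part" of LOD selection mentioned above, here is a minimal hypothetical sketch - the error metric, projection model, and thresholds are all assumptions, not Nanite's or any engine's actual code. The idea: pick the coarsest cluster LOD whose simplification error projects to less than a pixel budget.

```cpp
#include <cmath>

// One LOD of a mesh cluster: how much world-space error the simplification
// introduced, plus a bounding radius (usable for hierarchical culling).
struct ClusterLod {
    float geometricError;  // world-space error of this LOD
    float boundsRadius;    // bounding-sphere radius
};

// Projected size in pixels of a world-space length at distance 'dist',
// for a camera with the given vertical FOV and viewport height.
float projectToPixels(float worldSize, float dist, float fovY, float viewportH) {
    return worldSize * viewportH / (2.0f * dist * std::tan(fovY * 0.5f));
}

// LODs are ordered finest (index 0) to coarsest (last). Walk from the
// coarsest down and return the coarsest LOD whose projected error stays
// under the pixel threshold.
int selectLod(const ClusterLod* lods, int count, float dist,
              float fovY, float viewportH, float maxErrorPx) {
    for (int i = count - 1; i >= 0; --i) {
        if (projectToPixels(lods[i].geometricError, dist, fovY, viewportH) <= maxErrorPx)
            return i;
    }
    return 0;  // nothing coarse enough is acceptable: use the finest LOD
}
```

    Stitching between neighboring clusters at different LODs is the genuinely hard part this sketch leaves out, as the post notes.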
     
    iroboto likes this.
  5. Warrick

    Newcomer

    Joined:
    Jun 5, 2003
    Messages:
    33
    Likes Received:
    37
    Location:
    Hong Kong
    I think Epic will hit back somewhat at ECS within the next year with the work they have been doing with Intel ISPC and their UnrealScript replacement. But yes, Unreal Engine internally is an old, creaking architecture with Nanite/Lumen lipstick, pitted against Unity trying to three-point-turn its Titanic of a ship with a habitually late ECS.
     
    iroboto likes this.
  6. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    13,566
    Likes Received:
    16,623
    Location:
    The North
    If you skip the UnrealScript UI bit and go with standard C++ programming, is the core engine still largely single-threaded? Is there no ECS-type package, existing or in the works, available on the UE marketplace?
     
    BRiT and Warrick like this.
  7. Lurkmass

    Regular Newcomer

    Joined:
    Mar 3, 2020
    Messages:
    350
    Likes Received:
    392
    ISPC is just a band-aid and doesn't obsolete the need for a good data-oriented design as seen in Unity DOTS/ECS. Rebuilding Unity with a native ECS architecture was always going to be a multi-year effort, so I'm curious (maybe skeptical) as to how Epic Games will have an equivalent response within the next year when Unity Technologies has had a good couple of years' head start in this area.

    Even Intel, who originally authored ISPC, recommends that UE's animation system (over a decade-old code) be rewritten with a data-oriented design. UE's game framework needs tons of deep changes to its codebase for it to remain scalable in the future, because its current architecture won't keep up ...
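    The data-oriented point above can be made concrete with a toy sketch - hypothetical code, not Unity's or Epic's - contrasting the object-style array-of-structs layout with the struct-of-arrays layout that ECS-style designs favor:

```cpp
#include <vector>
#include <cstddef>

// Array-of-structs: each entity's fields are interleaved, so a system
// that only updates positions still drags velocity/health through cache.
struct EntityAoS {
    float px, py, pz;
    float vx, vy, vz;
    int   health;
};

// Struct-of-arrays: each component lives in its own tightly packed array,
// the layout a data-oriented/ECS design prefers for cache- and SIMD-friendly
// iteration.
struct WorldSoA {
    std::vector<float> px, py, pz;
    std::vector<float> vx, vy, vz;
    std::vector<int>   health;
};

// A "movement system" touches only the position/velocity streams;
// the health array is never loaded.
void integrate(WorldSoA& w, float dt) {
    for (std::size_t i = 0; i < w.px.size(); ++i) {
        w.px[i] += w.vx[i] * dt;
        w.py[i] += w.vy[i] * dt;
        w.pz[i] += w.vz[i] * dt;
    }
}
```

    Tight loops over flat arrays like `integrate` are also exactly what auto-vectorizing compilers such as Burst or ISPC digest well, which is why the two topics keep coming up together.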
     
    iroboto, BRiT and Warrick like this.
  8. Warrick

    Newcomer

    Joined:
    Jun 5, 2003
    Messages:
    33
    Likes Received:
    37
    Location:
    Hong Kong
    Unreal does have multiple threads in play (rendering is more multi-threaded than it used to be) and a basic job system, but nothing to the extent of what I would classify as a fully jobified modern engine architecture - or even close to what Naughty Dog was using over 6 years ago, for example. There Unity is definitely ahead of Unreal for now. The UE5 code still very much looks like UE2 code, and its layout is what you would expect of a code base that has evolved over so many years.

    It also, of course, has been more or less tightly coupled to one specific game type, which has caused friction if you needed to go against that. It's not easy to ditch Unreal's entity system and just go with its renderer, for example, plus production/management wouldn't let that happen, as it would break the illusion of a project seemingly being 90% done by dropping some shiny assets around a level :p

    I dunno if there is an 'ECS' on Unreal's marketplace, but I don't really see what that would get you over simply using C++ with Unreal if the engine architecture hasn't changed to suit.
     
    iroboto and BRiT like this.
  9. Warrick

    Newcomer

    Joined:
    Jun 5, 2003
    Messages:
    33
    Likes Received:
    37
    Location:
    Hong Kong
    I totally agree that their architecture needs deep reworking that will take time. But no, I wouldn't say a new UnrealScript based around ISPC is a band-aid. It's partly equivalent to what Unity is trying to do with Burst, but you also have to factor in Epic's metaverse aspirations and their better history with the networking plus scripting/blueprint side of things (and overall they do seem to execute and finish more efficiently than Unity). I think that is more their priority, it is foundational, and they don't care so much about parity with Unity ECS/DOTS marketing hype, which mixes together quite a few things, isn't finished, and indeed isn't that loved by every Unity developer.

    So I don't see them needing an exact equivalent in response, even if their animation system doesn't scale as well in the short term. I think they will provide a better 'forward-looking, metaverse-suited' API via their new language and blueprints, make it a bit easier to write your own game-specific ECS/DOTS-like systems on top via that language if you want, and rewrite more internals, such as the animation system, underneath in stages over the longer term.

    Apologies if that went a bit off the thread's topic!
     
    iroboto and BRiT like this.
  10. Frenetic Pony

    Regular Newcomer

    Joined:
    Nov 12, 2011
    Messages:
    742
    Likes Received:
    419
    Maybe this could be spun off into its own thread?

    Regardless, I have a hard time seeing the "metaverse" stuff taking off in the way people seem to hope. It was the brainchild of a sci-fi author 30 years ago, before the internet as we know it existed, and it feels like it's popular solely because a ton of programmers and tech people happened to read about it in Snow Crash.

    A "metaverse" is nothing but an OS with a crappy interface. Everyone and their mom without a foundational consumer OS - e.g. everyone except MS, Google, and Apple - is trying to build some sort of crappy "metaverse" thing, whether it be a super app or a game within a game or... you're just adding an extra stack on top of an OS and hoping no one will notice; an extra stack that's even more useless, walled-garden, and crappy than the current OSes are.

    If you want to make a game engine, make a bloody game engine. If you want to make it easy to use your game engine, then make it easy to use. Requiring that it's in a "metaverse" is just the same shitty tactics the OS guys already use; it adds exactly zero benefit to the user and is done solely for the edifice of business people. Roblox doesn't scale beyond kids, because they realize "oh shit, there's other, better things beyond Roblox and it's easier to get to them!" Second Life already failed trying to target older audiences. The only reason the Chinese "super apps" work is because basic OS functions are censored in China and a handful of companies were able to scale to monopolies quickly, much like Facebook.

    The metaverse stuff for UE5 just feels like an updated UE4 all over again: nobody at the helm deciding what it is UE should actually be good at. That Lumen and Nanite are very clearly most beneficial for the sorts of higher-end titles that used UE4 does suggest to me that its creaky parts desperately need replacement just as much, but who knows if their entirely unfocused business model, which seems to run mostly off Fortnite profits, will follow this need.
     
  11. JoeJ

    Veteran Newcomer

    Joined:
    Apr 1, 2018
    Messages:
    1,306
    Likes Received:
    1,567
    I'm afraid of dying, maybe in 20 years, and still not understanding what 'metaverse' tries to mean :)
     
  12. MfA

    MfA
    Legend

    Joined:
    Feb 6, 2002
    Messages:
    7,347
    Likes Received:
    688
    Second Life succeeded beyond any reasonable expectation given its god awful UI and LL incompetence after the first few years.
     
    Warrick and Frenetic Pony like this.
  13. Warrick

    Newcomer

    Joined:
    Jun 5, 2003
    Messages:
    33
    Likes Received:
    37
    Location:
    Hong Kong
    More Lumen, but nothing really that isn't already known by now:
     
    Krteq likes this.
  14. Frenetic Pony

    Regular Newcomer

    Joined:
    Nov 12, 2011
    Messages:
    742
    Likes Received:
    419
    What I don't get is the local grid probe for distant lighting. Oh, the idea is sound, but uhhh, guys, you have a single, low-res SDF to trace through. Where's the cone tracing and cone sharing? Oh, it's probably for the fog, so... hmmm. Still, where the cone tracing went is definitely a question.

    Regardless, it does look like they're getting a lot of energy loss somehow; you can see it in the archviz house, in that room that's completely dark and needs the emissive drapes, whereas in real life and in a proper path tracer it should be pretty well lit up.

    Which is why I thought they had few spatial bounces; but they do indeed have recursive surface caching with visibility somehow - I guess that'll be explained at SIGGRAPH. Maybe it's the caching bugs and noise they showed off a lot. I'm also not sure this caching scheme will work at all with foliage. The energy loss problem is recursive and exponential, which would create totally unrealistic areas of darkness in trees, and already does so in the stalactites, which you can see are pitch black despite being incredibly spatially close to a giant hole in the ceiling and relatively super bright sunshine. Once you go with recursion, problems get amplified recursively too!

    But even if that's fixable, the heavy reliance on temporal caching all around, and the weird way the caches are generated with cards, don't lend themselves to a bunch of moving branches. But you can't ship good-looking games in a forest without that either; you'll get a ton of missing reflections and light leaks and so on. The end results are still great - on a 3090 cranked up to max :p. Hopefully these issues will get fixed sooner rather than later.
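    For reference, the kind of SDF cone trace being asked about is usually a cheap occlusion march: sphere-trace the ray and track the minimum ratio of scene distance to the cone's radius at each step. This is a generic sketch (the scene here is one hard-coded sphere), not Lumen's actual code.

```cpp
#include <algorithm>
#include <cmath>
#include <functional>

// A scene SDF maps a world position to the signed distance to the
// nearest surface (negative inside geometry).
using Sdf = std::function<float(float, float, float)>;

// Returns 1 for an unoccluded cone, 0 for a fully blocked one, with a
// soft falloff in between. tanHalfAngle controls the cone width.
float coneOcclusion(const Sdf& scene,
                    float ox, float oy, float oz,   // ray origin
                    float dx, float dy, float dz,   // unit direction
                    float tanHalfAngle, float maxT) {
    float occ = 1.0f;
    float t = 0.1f;                     // small offset to leave the surface
    for (int i = 0; i < 64 && t < maxT; ++i) {
        float d = scene(ox + dx * t, oy + dy * t, oz + dz * t);
        if (d < 0.0f) return 0.0f;      // inside geometry: fully occluded
        float coneRadius = t * tanHalfAngle;
        occ = std::min(occ, d / std::max(coneRadius, 1e-4f));
        t += std::max(d, 0.01f);        // sphere-trace step
    }
    return std::clamp(occ, 0.0f, 1.0f);
}

// Example scene: a unit sphere at the origin.
float sphereSdf(float x, float y, float z) {
    return std::sqrt(x * x + y * y + z * z) - 1.0f;
}
```

    One march per cone, amortized over a whole bundle of rays, is exactly the cost argument for cone tracing a single low-res SDF.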
     
    #274 Frenetic Pony, Jun 11, 2021
    Last edited: Jun 11, 2021
    milk likes this.
  15. JoeJ

    Veteran Newcomer

    Joined:
    Apr 1, 2018
    Messages:
    1,306
    Likes Received:
    1,567
    I do not get anything. Screen-space radiance cache, cards cache, volume grid of probes cache. Rasterizing geometry to a UV atlas of cards, cards being limited to convex objects. SS tracing, local SDF tracing, global SDF tracing, RTX tracing.
    If Nanite were like this, they would need a unique algorithm for every level of detail, and a bunch of developers to maintain each of those. Plus another team gluing all that together, likely without a single person having oversight of how all this precisely works.
    Then there is the inability to scale down to less powerful platforms, which is no wonder.
    Totally broken. Can't be fixed. Has no future. But of course they can keep optimizing it forever. (I might be wrong, because I didn't get it.)
     
    milk likes this.
  16. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    2,414
    Likes Received:
    1,963
    Location:
    msk.ru/spb.ru
    I think it will be reworked into a leaner GI approach down the line, but they'll have to drop all but DX12U h/w support for that. Maybe it will be a separate quality tier.
     
    milk likes this.
  17. milk

    milk Like Verified
    Veteran Regular

    Joined:
    Jun 6, 2012
    Messages:
    3,735
    Likes Received:
    3,828
    Funny you say that, as that is the part that feels the least objectionable to me. Although that's coming from somebody who is drunk, not a dev, and hasn't watched the video yet (will do it now), but I just can't keep my impetus for ill-informed opinion-spilling at bay...
    Anyway, from what I gathered so far, the radiance (or irradiance, whatever the proper term for outgoing light is) is stored in a mix of a probe grid and a surface cache consisting of textured quads "boxing out" models in a very approximate way. If anything about Lumen is messy, that seems to me to be the most worthy source of complaint. For all that Nanite is doing to generalize complex problems and streamline them from a production POV, that part of Lumen looks to me to be the most hacky and manual-tweak-worthy one.

    I get that the engine probably tries to automatically find the best convex hull for each object, but even then I suppose it is often gonna do a bad enough job in the general case that dedicated devs will still have to massage the system and work around it aplenty to avoid a lot of the shortcomings.

    I think they are on the right track in using SDFs for visibility/occlusion. It seems to me like one of the best "bang-for-the-buck" secondary scene representations for random-access ray tracing and cone tracing. But that only solves occlusion; you still need some other representation to encode the outgoing light.

    For those purposes, I'm surprised Epic didn't rely entirely on a voxel-grid probe system. Even if it does not bring the best accuracy for the budget, it seems to me like the most consistent approach from a performance/memory-budget point of view, which I assume are the preferred compromises for a commercialized engine that is trying to serve as many customers as possible.

    The textured-quad-based surface caching just seems like such a hacky and unwieldy solution to me.
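    A voxel-grid probe system of the kind described could be as simple as a trilinearly interpolated grid. Here is a hypothetical sketch with one irradiance scalar per probe (real probes store SH coefficients or octahedral maps, but the lookup structure is the same); all names and sizes are assumptions:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// A regular grid of light probes with trilinear sampling.
struct ProbeGrid {
    int nx, ny, nz;                 // probe counts per axis
    float cell;                     // world-space probe spacing
    std::vector<float> irradiance;  // nx*ny*nz values, x-major

    float at(int x, int y, int z) const {
        return irradiance[(std::size_t)(z * ny + y) * nx + x];
    }

    // Trilinearly blend the 8 probes surrounding a world-space point.
    float sample(float wx, float wy, float wz) const {
        float gx = wx / cell, gy = wy / cell, gz = wz / cell;
        int x0 = (int)std::floor(gx), y0 = (int)std::floor(gy), z0 = (int)std::floor(gz);
        float fx = gx - x0, fy = gy - y0, fz = gz - z0;
        auto clampi = [](int v, int hi) { return v < 0 ? 0 : (v > hi ? hi : v); };
        int x1 = clampi(x0 + 1, nx - 1), y1 = clampi(y0 + 1, ny - 1), z1 = clampi(z0 + 1, nz - 1);
        x0 = clampi(x0, nx - 1); y0 = clampi(y0, ny - 1); z0 = clampi(z0, nz - 1);
        float c00 = at(x0, y0, z0) * (1 - fx) + at(x1, y0, z0) * fx;
        float c10 = at(x0, y1, z0) * (1 - fx) + at(x1, y1, z0) * fx;
        float c01 = at(x0, y0, z1) * (1 - fx) + at(x1, y0, z1) * fx;
        float c11 = at(x0, y1, z1) * (1 - fx) + at(x1, y1, z1) * fx;
        float c0 = c00 * (1 - fy) + c10 * fy;
        float c1 = c01 * (1 - fy) + c11 * fy;
        return c0 * (1 - fz) + c1 * fz;
    }
};
```

    The constant-cost appeal is visible here: storage and sample cost depend only on grid resolution, never on scene complexity - the flip side being light leaking through thin walls, which is what the surface cache tries to avoid.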
     
    JoeJ likes this.
  18. milk

    milk Like Verified
    Veteran Regular

    Joined:
    Jun 6, 2012
    Messages:
    3,735
    Likes Received:
    3,828
    Like, I figure - and I am the first to admit this is all armchair hypothesis-making - that with the SDF occlusion, one can go a long way with GI even if the irradiance is stored at low resolution. What's more important to be high-detail is the light occlusion; the bounced lighting, I think, can be way blurrier most of the time without horrible results (of course, edge cases always exist).

    They already have a damn probe grid anyway; can't they perfect that one and expand on it while dropping the surface quads? That would save some memory and processing to double down on the volumetric solution.

    Just inject reflective shadow maps into the volume, and let recursive temporal accumulation do its magic to approximate many bounces.
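    That "recursive temporal accumulation" is, at its core, an exponential moving average per probe or texel - a sketch, with the blend weight as an assumed parameter:

```cpp
// Blend each frame's noisy lighting estimate into a history value.
// Over many frames the history converges toward the mean of the input,
// approximating the effect of many samples (and, since the injected
// lighting itself reads last frame's result, many bounces) over time.
float accumulate(float history, float newSample, float alpha) {
    // alpha = per-frame blend weight; e.g. 0.1 keeps 90% of history.
    // Small alpha means less noise but more lag (ghosting) under motion.
    return history + alpha * (newSample - history);
}
```

    The lag-versus-noise trade-off in `alpha` is exactly the ghosting/disocclusion problem that makes heavy temporal caching risky for moving foliage, per the earlier post.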
     
  19. JoeJ

    Veteran Newcomer

    Joined:
    Apr 1, 2018
    Messages:
    1,306
    Likes Received:
    1,567
    I assume they project the ray hit point onto the card cage of the object to look up (or inject) radiance from the texels of the card.
    Not that bad - much less memory than having another volume beside the SDF for that. The resulting triplanar projection causes some error on concave shapes, but could be acceptable.
    ... just guessing. With that in mind, I don't get why they need to rasterize low-LOD versions into that kind of UV atlas (probably representing the card textures), as you will see in the video.
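    The triplanar projection being guessed at typically blends the three axis-aligned projections by how well the surface normal faces each axis - a generic sketch, with the sharpening exponent as an arbitrary assumption:

```cpp
#include <cmath>

// Blend weights for the three axis-aligned card projections.
struct Weights { float x, y, z; };

// Weight each projection by the normal's alignment with that axis;
// a higher exponent sharpens the transition between projections.
Weights triplanarWeights(float nx, float ny, float nz) {
    float wx = std::pow(std::fabs(nx), 4.0f);
    float wy = std::pow(std::fabs(ny), 4.0f);
    float wz = std::pow(std::fabs(nz), 4.0f);
    float sum = wx + wy + wz;
    return { wx / sum, wy / sum, wz / sum };  // normalized blend weights
}
```

    The concave-shape error follows directly from this: a projection ray can pass through other parts of the object before reaching the card, so two distinct surfaces map to the same texel.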
     