DirectX Ray-Tracing [DXR]

Discussion in 'Rendering Technology and APIs' started by BRiT, Mar 19, 2018.

  1. jlippo

    Veteran Regular

    Joined:
    Oct 7, 2004
    Messages:
    1,440
    Likes Received:
    565
    Location:
    Finland
    Not really sure what they ray trace in that demo, certainly not reflections.
    Most demos seemed to be 1080p; not really a surprise if one-sample ambient occlusion on a single Volta is ~5ms.
    For games, I wouldn't be surprised if the tracing resolution were half for most FX.
     
    Shortbread likes this.
  2. eloyc

    Veteran Regular

    Joined:
    Jan 23, 2009
    Messages:
    2,266
    Likes Received:
    1,377
    Just lighting.

    And I agree with @Shortbread . The demo doesn't seem to properly "demo" what it's supposed to showcase; it's not impressive. The interesting part is that it's a new Metro game and that it seems we'll have more and more ray-tracing methods in games.

    @mods: couldn't we retitle this thread so that it's a ray-tracing thread, not just DirectX?
     
    Cyan and Shortbread like this.
  3. ultragpu

    Legend Veteran

    Joined:
    Apr 21, 2004
    Messages:
    6,219
    Likes Received:
    2,263
    Location:
    Australia
    Really need a before-and-after comparison to see the difference; honestly, not a good candidate for an RTX showcase. Games with tons of reflections and wetness would surely be more impressive. Then again, RT is probably not worth it altogether, considering the unhealthy amount of compute it requires, which would otherwise grant you tons of GPU particles, fluid dynamics, and proper hair and cloth simulation, all of which are much more visually prominent to players.
     
    pharma and milk like this.
  4. eloyc

    Veteran Regular

    Joined:
    Jan 23, 2009
    Messages:
    2,266
    Likes Received:
    1,377
    I agree, as I said in my previous post.
     
  5. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,278
    Likes Received:
    3,524
    It's just that static environments are not a good candidate to showcase GI; precalculated GI achieves the same thing, or actually even better. They should have selected a more dynamic environment with moving spotlights, reflective surfaces, and shadow interaction.
     
    Shoujoboy, Shortbread, OCASM and 2 others like this.
  6. hoom

    Veteran

    Joined:
    Sep 23, 2003
    Messages:
    3,028
    Likes Received:
    554
    So when can I expect an actual demo I can run on my own hardware?

    An official MS (presumably vendor-neutral) API is good, and mixed raster/ray trace is cool to the extent it can provide various effects noticeably better/cheaper than current raster tricks/approximations.
    But I find the fullscreen raytrace demo vids pretty underwhelming :-|

    We've seen realtime raytracing claimed to be the future & available to consumers Real Soon Now™ year after year as shown by various demo vids posted, yet still we're seeing demos that require $150,000 hardware & look pretty janky (at least in parts), especially so if they're only running @1080p.
     
    eloyc likes this.
  7. OCASM

    Regular Newcomer

    Joined:
    Nov 12, 2016
    Messages:
    921
    Likes Received:
    874
    Yeah, I was thinking that doing the actual ray tracing at lower resolutions would yield a very significant speedup. For specular you could run it at half res, and for diffuse you could probably get away with running it at quarter res, or even less for bounce lighting.
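
    As a rough back-of-the-envelope sketch of how much reduced-resolution tracing saves, here's a tiny Python calculation. It assumes 1 ray per pixel per effect and a 1920x1080 target; the effect list and scale factors are illustrative assumptions, not figures from any shipping renderer.

    WIDTH, HEIGHT = 1920, 1080
    native_rays = WIDTH * HEIGHT  # rays/frame for one effect traced at full res, 1 rpp

    # Hypothetical per-effect tracing scales (1.0 = native, 0.5 = half res per axis, ...).
    effects = {
        "specular reflections": 0.5,    # half res per axis -> 1/4 of the rays
        "diffuse GI":           0.25,   # quarter res       -> 1/16 of the rays
        "bounce lighting":      0.125,  # eighth res        -> 1/64 of the rays
    }

    for name, scale in effects.items():
        rays = int(WIDTH * scale) * int(HEIGHT * scale)
        print(f"{name:22s}: {rays:>9,d} rays/frame "
              f"({rays / native_rays:.2%} of native, ~{native_rays / rays:.0f}x fewer)")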
     
  8. milk

    milk Like Verified
    Veteran Regular

    Joined:
    Jun 6, 2012
    Messages:
    3,412
    Likes Received:
    3,271
    A full-screen ray trace with a meagre one ray per pixel, when properly filtered, is gonna look at least as good as a half-res one with four rays per sample (which accounts for 4 final screen pixels). So in that light, full res with fewer rays and more filtering is always best. It gives a broader spatial distribution of starting points for the rays.
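
    A minimal Python sketch of that spatial-distribution argument, assuming an 8x8 screen tile, simple random jitter, and a fixed seed (all arbitrary, not from any real renderer). Both schemes fire 64 rays over the tile, but with one jittered ray per full-res pixel every screen pixel is guaranteed a ray origin, whereas four randomly placed rays per half-res pixel only stratify per 2x2 block.

    import random

    random.seed(1)
    TILE = 8  # 8x8 full-res pixel tile

    def full_res_1rpp():
        # One ray origin jittered inside every full-res pixel.
        return [(x + random.random(), y + random.random())
                for y in range(TILE) for x in range(TILE)]

    def half_res_4rpp():
        # Four ray origins placed randomly inside every half-res pixel
        # (a 2x2 block of full-res pixels); no sub-stratification here.
        pts = []
        for y in range(0, TILE, 2):
            for x in range(0, TILE, 2):
                for _ in range(4):
                    pts.append((x + 2 * random.random(), y + 2 * random.random()))
        return pts

    for name, pts in (("full res, 1 rpp", full_res_1rpp()),
                      ("half res, 4 rpp", half_res_4rpp())):
        covered = {(int(px), int(py)) for px, py in pts}
        print(f"{name}: {len(pts)} rays, {len(covered)}/{TILE * TILE} "
              f"full-res pixels contain at least one ray origin")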
     
  9. OCASM

    Regular Newcomer

    Joined:
    Nov 12, 2016
    Messages:
    921
    Likes Received:
    874
    You could also use 1spp and filtering for the lower-resolution buffers. Sure, there would be some quality loss, but how noticeable could it be, especially for diffuse bounced lighting? It would still be far superior to current game GI techniques.
     
  10. Shortbread

    Shortbread Island Hopper
    Veteran

    Joined:
    Jul 1, 2013
    Messages:
    4,093
    Likes Received:
    2,316
    This is the thing that kind of worries me about real-time ray tracing not being implemented correctly: that shadowing (especially multilayered) will not react appropriately between multiple lighting sources. I still haven't witnessed any proper real-time penumbra and antumbra shadowing in games, just some odd techniques or approximations of how it should look. I would like to see characters, NPCs, cars, buildings, and so on, with actual proper shadowing down the road.
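
    As an aside, a toy 2D Python sketch of why ray tracing gives penumbrae essentially for free: estimate the visibility of an area light by casting several shadow rays toward random points on it, and a fractional result is the penumbra. The light, occluder, shade points, and sample count below are made-up illustration values.

    import random

    random.seed(7)

    LIGHT = ((-1.0, 10.0), (1.0, 10.0))   # segment: an area light 2 units wide
    OCCLUDER = ((-0.5, 5.0), (0.5, 5.0))  # segment: a blocker halfway to the light

    def segments_intersect(p1, p2, p3, p4):
        """True if segment p1-p2 strictly crosses segment p3-p4 (orientation test)."""
        def cross(o, a, b):
            return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
        d1, d2 = cross(p3, p4, p1), cross(p3, p4, p2)
        d3, d4 = cross(p1, p2, p3), cross(p1, p2, p4)
        return d1 * d2 < 0 and d3 * d4 < 0

    def light_visibility(point, samples=64):
        (ax, ay), (bx, by) = LIGHT
        hits = 0
        for _ in range(samples):
            t = random.random()
            target = (ax + t * (bx - ax), ay + t * (by - ay))  # random point on the light
            if not segments_intersect(point, target, *OCCLUDER):
                hits += 1
        return hits / samples

    for x in (-2.0, -0.6, 0.0, 0.6, 2.0):
        v = light_visibility((x, 0.0))
        region = "umbra" if v == 0 else "lit" if v == 1 else "penumbra"
        print(f"shade point x={x:+.1f}: visibility {v:.2f} ({region})")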
     
    #90 Shortbread, Mar 23, 2018
    Last edited: Mar 23, 2018
    OCASM and DavidGraham like this.
  11. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    14,162
    Likes Received:
    5,463
    Nice twitter thread about performance
     
  12. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    14,162
    Likes Received:
    5,463
  13. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,522
    Likes Received:
    15,977
    Location:
    Under my bridge
    Conceptually, quarter res with four rays per pixel, stochastically sampled and jittered each frame, should provide more information for reconstruction. Well, I guess jittering the single rays per frame at native res would accomplish the same sort of thing. Rasterising an ID buffer would provide decent object edges. The concern is material blurring across borders, I guess, but with suitable per-sample IDs you could reject the samples that come from the wrong surface. Probably.

    Let's be honest - there are far smarter people tackling this than me!
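
    A tiny Python sketch of that ID-rejection idea: when filtering a noisy traced buffer, only average neighbours whose rasterised object ID matches the pixel being reconstructed, so lighting doesn't bleed across object borders. The 4x4 buffers and the box kernel are made-up stand-ins, not any particular denoiser.

    # `ids` is a rasterised object-ID buffer; `traced` is noisy per-pixel lighting
    # from the ray pass (two objects, one brightly lit and one in shadow).
    ids = [
        [1, 1, 1, 2],
        [1, 1, 2, 2],
        [1, 2, 2, 2],
        [2, 2, 2, 2],
    ]
    traced = [
        [0.9, 0.8, 1.0, 0.1],
        [0.7, 0.9, 0.2, 0.1],
        [0.8, 0.2, 0.1, 0.2],
        [0.1, 0.2, 0.1, 0.1],
    ]

    def filter_pixel(x, y, radius=1):
        """Box filter that rejects neighbours belonging to a different object ID."""
        total, count = 0.0, 0
        for ny in range(max(0, y - radius), min(len(ids), y + radius + 1)):
            for nx in range(max(0, x - radius), min(len(ids[0]), x + radius + 1)):
                if ids[ny][nx] == ids[y][x]:  # reject samples from the wrong surface
                    total += traced[ny][nx]
                    count += 1
        return total / count

    for y in range(4):
        print([round(filter_pixel(x, y), 2) for x in range(4)])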
     
    Cyan likes this.
  14. Jupiter

    Veteran Newcomer

    Joined:
    Feb 24, 2015
    Messages:
    1,495
    Likes Received:
    1,045
    Read the comment a few posts above. Four Tesla V100s wouldn't cost $150,000. In the price comparison a V100 was listed at $10,000, so four would be around $40,000, still significantly cheaper than the $150,000.

    Anyway, I was right that the $150,000 figure was way too much.
     
  15. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,522
    Likes Received:
    15,977
    Location:
    Under my bridge
    Which demo is that? Someone was saying that the original SEED demo was run on IBM Power7s, according to 'developers at the show talking on Discord.'
     
  16. BRiT

    BRiT Verified (╯°□°)╯
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    15,568
    Likes Received:
    14,149
    Location:
    Cleveland
    Maybe they were counting the projected price with crypto-mining market impacts?
     
    digitalwanderer, nutball and Malo like this.
  17. OCASM

    Regular Newcomer

    Joined:
    Nov 12, 2016
    Messages:
    921
    Likes Received:
    874
    Looking at the PICA PICA slides, is there any point in doing AO when you're already doing raytraced diffuse GI?
     
  18. Ike Turner

    Veteran Regular

    Joined:
    Jul 30, 2005
    Messages:
    1,969
    Likes Received:
    1,993
    "Reflections" which is the ILMxLabs/Epic demo = 4x Voltas. Pica Pica (SEED) can run on 1 Volta..but apparently Nvidia instead on supplying DGX-1 stations for GDC..
     
    pharma, Cyan and OCASM like this.
  19. milk

    milk Like Verified
    Veteran Regular

    Joined:
    Jun 6, 2012
    Messages:
    3,412
    Likes Received:
    3,271
    That's exactly my point. 4 rpp at 1/2^2 res vs. 1 rpp at full res still provides the same number of rays for every screen pixel. The only difference is that in the second case your rays are more evenly distributed spatially, which is more info for your filtering (it will just need a sampling area twice as wide as the quarter-res buffer's filter to provide similar results).
    Spatial and temporal jittering go without saying, of course.
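
    A quick numerical sketch of that footprint equivalence, assuming square box kernels and half res per axis (i.e. a quarter of the pixels) for the low-res case; the kernel widths are just illustrative. Both configurations gather the same number of rays per reconstructed pixel over the same screen area, with the full-res kernel twice as wide in its own buffer's pixels.

    FULL_W, FULL_H = 1920, 1080

    # (buffer scale per axis, rays per pixel, kernel width in that buffer's pixels)
    configs = ((0.5, 4, 4), (1.0, 1, 8))

    for scale, rpp, width in configs:
        buf_w, buf_h = int(FULL_W * scale), int(FULL_H * scale)
        taps = width * width              # kernel taps in that buffer
        rays_gathered = taps * rpp        # rays contributing to one filtered pixel
        screen_width = width / scale      # kernel width measured in screen pixels
        print(f"{buf_w}x{buf_h} @ {rpp} rpp, kernel {width} wide: "
              f"{rays_gathered} rays per filtered pixel, "
              f"~{screen_width:.0f} screen pixels across")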
     
  20. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,522
    Likes Received:
    15,977
    Location:
    Under my bridge
    Technically there's nothing stopping you from offsetting a ray's start. You could have a 4x4-pixel buffer and sample at each pixel's centre. You could get the same sampling from a 2x2-pixel buffer with four rays per pixel, each offset by a 'half pixel': (-0.25, -0.25), (0.25, -0.25), (-0.25, 0.25), (0.25, 0.25).

    Of course, that ends up being the same number of rays, so it doesn't make any difference to the ray-tracing effort! It'd just save some RAM storing the buffer, but you'd lose fidelity too. Makes zero sense - just render 'native pixels' and do whatever you want with the raw per-pixel resolution data.

    I think the real take-home is that the concept of pixels is even more meaningless in ray tracing. You have a set of rays cast into the scene and mapped onto a 2D grid of 'pixels', but each sample can originate from anywhere and in any direction (VR Picasso, here we come!). A super funky sampling scheme might cast 1 million rays per frame from uniformly random locations to produce some sampling data and transform that into a 1080p or 4K 2D display buffer using ML'd reconstruction. Before the reconstruction AI turns on the user and kills them.

    It would enable things like true fisheye, though. It should be better for VR projections too, as it's truer to the optics.
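
    A toy Python sketch of decoupling ray generation from a fixed projection: the same continuous (sub-)pixel position, jittered however you like, can be pushed through a standard pinhole mapping or an equidistant fisheye mapping. The camera parameters and the fisheye model are illustrative assumptions.

    import math
    import random

    WIDTH, HEIGHT, FOV_DEG = 1920, 1080, 90.0

    def pinhole_ray(px, py):
        """Perspective ray direction for a continuous pixel position (camera looks down -z)."""
        aspect = WIDTH / HEIGHT
        tan_half = math.tan(math.radians(FOV_DEG) / 2)
        x = (2 * px / WIDTH - 1) * tan_half * aspect
        y = (1 - 2 * py / HEIGHT) * tan_half
        length = math.sqrt(x * x + y * y + 1)
        return (x / length, y / length, -1 / length)

    def fisheye_ray(px, py, max_angle=math.pi / 2):
        """Equidistant fisheye: distance from the image centre maps linearly to angle."""
        nx = 2 * px / WIDTH - 1
        ny = 1 - 2 * py / HEIGHT
        r = math.hypot(nx, ny)
        if r > 1:
            return None                   # outside the circular fisheye image
        theta = r * max_angle             # angle from the optical axis
        phi = math.atan2(ny, nx)
        return (math.sin(theta) * math.cos(phi),
                math.sin(theta) * math.sin(phi),
                -math.cos(theta))

    random.seed(0)
    for _ in range(3):
        # Jittered sub-pixel positions around a pixel centre, in the spirit of the
        # (+/-0.25, +/-0.25) offsets above; any other origin pattern works too.
        px = 960.5 + random.random() - 0.5
        py = 540.5 + random.random() - 0.5
        print("pinhole:", tuple(round(c, 3) for c in pinhole_ray(px, py)),
              "fisheye:", tuple(round(c, 3) for c in fisheye_ray(px, py)))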
     
    jlippo and milk like this.