GPU Ray Tracing Performance Comparisons [2021] *spawn*

Discussion in 'Architecture and Products' started by DavidGraham, Mar 29, 2021.

  1. JoeJ

    Veteran

    Joined:
    Apr 1, 2018
    Messages:
    1,523
    Likes Received:
    1,772
    A bigger cache is what RT needs to help with its random-access problem; it's not a 'rasterization' feature. And doubling FP32 and improving RT cores does not give us more compute units.
    My impression is 'NV invests in specialization, AMD invests in generalization', which is perhaps the opposite of yours.
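    The cache point can be sketched with a toy simulation (the LRU model, node counts, and 'sort rays by region' proxy are all illustrative assumptions, not any real GPU's behavior): incoherent secondary rays thrash a small cache, while grouping nearby rays together restores locality.

    ```python
    # Toy illustration: secondary rays touch BVH nodes in near-random order,
    # so a small cache thrashes; binning rays so that nearby rays run
    # back-to-back restores locality. Numbers are arbitrary.
    import random

    def cache_misses(accesses, cache_lines=64):
        """Count misses for a simple LRU cache over a stream of node IDs."""
        lru = []
        misses = 0
        for node in accesses:
            if node in lru:
                lru.remove(node)       # hit: refresh recency
            else:
                misses += 1            # miss: fetch and possibly evict
                if len(lru) == cache_lines:
                    lru.pop(0)
            lru.append(node)
        return misses

    random.seed(0)
    # 4096 rays, each assumed to touch the BVH node covering its region.
    rays = [random.randrange(1024) for _ in range(4096)]

    incoherent = cache_misses(rays)        # rays in arrival order
    coherent = cache_misses(sorted(rays))  # rays binned by region first

    assert coherent < incoherent
    ```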

    uuuhhh... this really does not sound popular ;) I'd add to this: even with RTX turned on, lighting still is not dynamic in the vast majority of titles. So the arguments that followed are more about promised advantages than about what we actually got.
     
    w0lfram likes this.
  2. w0lfram

    Regular

    Joined:
    Aug 7, 2017
    Messages:
    304
    Likes Received:
    59
    I think a good many of you are not being objective.

    We have lights that cast shadows and move that are not ray traced. We do not need ray-traced light to have realistic lighting. Again, what RAY TRACING allows is for game developers to make games EASIER... and not have to spend so much time on baked-in lighting.

    Ray tracing doesn't add to the realism of games... because not all lights in games are traced...! (Leaving gaps in the realism).



    Today (2021), Ray tracing and ray traced reflections are just a novelty. They do not do anything for the Gamer, except sap performance...
     
  3. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,109
    Location:
    New York
    It is objectively true that RT produces the most accurate lighting in games. So you must mean that you're happy with the limitations of alternative methods (SSR, irradiance fields, Lumen). That's fine but people have different standards.

    So ray tracing will only be useful once everything is raytraced? And we should be happy with lesser methods until then?
     
    Florin, HLJ, PSman1700 and 1 other person like this.
  4. w0lfram

    Regular

    Joined:
    Aug 7, 2017
    Messages:
    304
    Likes Received:
    59

    It is objectively true that ray tracing saps considerable performance... and not all the lighting in those games is being ray traced, so even those games are NOT realistic....!

    End of story....



    BTW, I am happy with the PERFORMANCE that non-RT lighting gives me....
     
  5. Arwin

    Arwin Now Officially a Top 10 Poster
    Moderator Legend

    Joined:
    May 17, 2006
    Messages:
    18,761
    Likes Received:
    2,639
    Location:
    Maastricht, The Netherlands
    The DF video on Doom shows how a combination of RT and SSR currently produces results that either could not achieve on their own. There will be a lot of gray area and lack of absolute truths for the foreseeable future.
     
    HLJ, Lightman, CarstenS and 2 others like this.
  6. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,109
    Location:
    New York
    It's fantastic that you're satisfied with current lighting in games. We should all be so lucky :)

    Is the need for SSR due to the relatively large steps used in ray marching, or some more fundamental limitation of RT?
     
  7. Subtlesnake

    Regular

    Joined:
    Mar 18, 2005
    Messages:
    347
    Likes Received:
    126
    The same argument would imply that pre-computed lighting doesn't add to the realism of games either – since it doesn't capture the contribution of moving objects to the propagation of light. So why waste time with all that pre-computation in the first place? Just have flat, artificial lighting on everything. And why bother trying to shadow all these dynamic light sources if they already looked ok without shadows in '90s games?
     
  8. Subtlesnake

    Regular

    Joined:
    Mar 18, 2005
    Messages:
    347
    Likes Received:
    126
    It's a reductio ad absurdum. The point is that it's ridiculous to think that just because a technique doesn't achieve complete realism in a given scene, it can't add to the realism of that scene. That's literally the history of computer graphics – a series of ever more realistic approximations.

    So now in Spider-Man where you can see the world moving in the glass you are climbing up, instead of a crude static approximation via cube maps, do we think "oh, this isn't any more realistic at all, because there is some barely perceptible reduction in geometric detail due to simplification of the BVH structure"? Or do we think "wow, now I can see the pedestrians and the cars and what looks like a living city"?
     
    Newguy, pharma, HLJ and 9 others like this.
  9. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,240
    Likes Received:
    3,393
    It's a result of rasterization performance optimisations. RT can't see things which are "faked" - done with transparent textures or in screen space. SSR thus helps, since it runs in screen space and sees all these "fakes". It is a matter of performance for the most part.
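    A minimal sketch of that hybrid idea, with all names and the dictionary "scenes" purely hypothetical: the RT scene contains only real geometry, so the resolver prefers a screen-space sample (which includes the rasterized fakes) and falls back to tracing only when the reflected point is off screen.

    ```python
    # Hypothetical hybrid resolver: the RT scene only contains "real" geometry,
    # while the screen buffer also shows rasterized fakes (billboards, decals,
    # alpha-card foliage). SSR can reflect anything visible on screen; RT
    # covers what isn't.
    def resolve_reflection(uv, screen_buffer, rt_scene):
        # Prefer SSR: the reflected point is on screen, so reuse its shaded
        # color, fakes included.
        if uv in screen_buffer:
            return screen_buffer[uv]
        # Fall back to tracing the (fake-free) RT scene for off-screen hits.
        return rt_scene.get(uv, "sky")

    screen = {(3, 4): "billboard_tree"}   # a screen-space 'fake'
    world = {(9, 9): "building"}          # real geometry in the BVH

    assert resolve_reflection((3, 4), screen, world) == "billboard_tree"
    assert resolve_reflection((9, 9), screen, world) == "building"
    ```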
     
    Arwin, HLJ, Putas and 3 others like this.
  10. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,109
    Location:
    New York
    Oh yes, that's right. I don't see that changing anytime soon.
     
    pharma and HLJ like this.
  11. PSman1700

    Legend

    Joined:
    Mar 22, 2019
    Messages:
    7,118
    Likes Received:
    3,088
    Looking at just Minecraft RT.....

    Most of these Minecraft RT videos rack up high view counts, millions in many cases, not to mention the comments on how blown away people are. The general public is impressed, to say the least.
    And that's just Minecraft. CP2077 has shown five different RT methods in use at once, all impressively done on today's hardware.

    RT is the real game-changer, along with AI/ML deep-learning reconstruction to gain enormous performance boosts with no IQ loss, even better IQ in some cases. RTX IO/DirectStorage is coming along nicely too.
     
    pharma, HLJ and xpea like this.
  12. milk

    milk Like Verified
    Veteran

    Joined:
    Jun 6, 2012
    Messages:
    3,977
    Likes Received:
    4,101
    Would you not say The Last of Us II is realistic?
     
    Scott_Arm likes this.
  13. neckthrough

    Newcomer

    Joined:
    Mar 28, 2019
    Messages:
    138
    Likes Received:
    388
    Every single scene in that game is meticulously hand-authored. I believe most of the lighting is baked-in. So this is exactly what @trinibwoy was talking about, i.e., yes you can bake lightmaps on everything and make it look great but you lose dynamism. And so you can't have dynamic time-of-day. Furthermore, even with a fixed time of day you can't bake lighting on unpredictably-moving objects, so you need to resort to probes which have their own caveats (light leaks). So yes, despite all the effort from an AAAA studio (if there ever was one) the game still has inconsistent lighting on dynamic objects.

    Contrast this with Metro EE (from a much smaller and lower-funded studio), which basically discarded all the fakery and replaced it with RT light sources, and you end up with much more consistent-looking lighting throughout the game. They just don't have ND's art budget. RT doesn't remove the need for a lighting director; it just removes the need for meticulous hand-placement of fake lighting and careful use of fragile heuristics to realize the director's vision.
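    The probe light-leak caveat can be shown in one dimension (a deliberately simplified sketch; real engines interpolate a 3D probe grid): linear interpolation between a lit probe and a dark probe ignores the wall between them, so a point just behind the wall still receives light.

    ```python
    # Toy 1D illustration of probe light leaking: two irradiance probes
    # straddle a wall; a shaded point on the dark side still picks up light
    # from the lit probe because plain interpolation ignores the occluder.
    def lerp(a, b, t):
        return a + (b - a) * t

    probe_lit = 1.0    # irradiance at x=0 (lit room)
    probe_dark = 0.0   # irradiance at x=1 (dark room)
    wall_x = 0.5       # opaque wall between the probes

    x = 0.55                                  # just behind the wall, dark side
    leaked = lerp(probe_lit, probe_dark, x)   # 0.45, though it should be ~0

    assert leaked > 0.4   # substantial light leaks through the wall
    ```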
     
    DegustatoR, xpea, pharma and 5 others like this.
  14. neckthrough

    Newcomer

    Joined:
    Mar 28, 2019
    Messages:
    138
    Likes Received:
    388
    The disparaging of progress towards real-time RT is truly astonishing. And these "all-or-nothing", "it's not as performant as the traditional path so it shouldn't even exist" arguments are just garbage. Yes, RT is expensive and needs hardware that's different from the way we've been building GPUs. It needs painful changes to lighting engines, especially during these intermediate years when we'll have to delicately navigate the balance between simulation and fakery. It hurts Nvidia, AMD, Intel, Epic, Microsoft, Sony. And yet, they have all decided to move in this direction. Why? Because the experts running the show aren't luddites who would be excited getting out of bed at the prospect of building the next thing that will just do the same old, same old but 35% faster.

    Are the current hardware architectures the be-all-end-all solution? Of course not. They will evolve. But we've taken the first steps and there's no looking back despite the frothing hysteria from those that are offended at the perf cost. Just turn the damn feature off -- while you still can.
     
    Newguy, milk, DegustatoR and 12 others like this.
  15. JoeJ

    Veteran

    Joined:
    Apr 1, 2018
    Messages:
    1,523
    Likes Received:
    1,772
    I don't see customer feedback telling me the feature is not yet convincing as 'garbage'. Instead of depicting those customers as hysterical fools and ignoring them, we had better try to improve RT so they are happy with its win-to-cost ratio. Claiming self-appointed expertise won't convince anybody.
    Telling them to turn RT off is no solution, just ignorance, and pointing towards evolving hardware amounts to admitting incompetence. Sorry, I'm not impressed by your arguments. Note I'm assuming developers would communicate it somehow this way, which would be very bad.

    So, besides my critique of the API limitations, there is also the question of how to use RT for the most benefit.
    To me, it falls into three groups:
    * Full RT lighting, all dynamic, GI + reflections + shadows: Exodus is the only example (ignoring Minecraft or Quake). IMO that's impressive, something new, and worth the high perf. cost.
    * Many effects: Control, CP2077. GI still mostly static. IMO it looks better, but not really enough to be worth the cost. I'm totally not convinced and would indeed turn off RT after checking it out.
    * Single effect: Eternal or CoD. Just reflections or soft shadows. The hit on perf. is small. IMO that's good, and I would happily enjoy RT in those titles.
    Then we've had some mediocre titles like Godfall or RE8. I only mention them to say I do not count them in the third category. I guess we all agree that's not what we want.

    To me it's interesting that I have this 'valley of disappointment' in the middle. And this second category is also the most commonly used. Thus my overall feeling about RT in games so far is disappointment, although it would not have to be. (Really meaning just the visual results, not the API stuff.)
    I guess my opinion is in the minority here? Reaction to Control / CP was overall very positive, and I can relate. But looking back it feels like: 'Bolted on too much, not really getting the best of both worlds.'
     
    milk and techuse like this.
  16. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,109
    Location:
    New York
    Of course RT will improve, just like every other aspect of 3D rendering has improved with time. That’s not the issue.

    For those who think RT isn’t worth the performance hit they should simply not use it. For those who think current lighting methods are good enough they can turn it off. That’s why options are great. Literally nobody has had RT forced on them in any game so far.

    There does seem to be a bit of hysteria around the limitations of the first iteration of an advanced feature that can be turned off if needed. Which I find ironic because I think we should be celebrating the impressive adoption rate of a long desired capability that fundamentally improves the rendering pipeline and artist workflows. It’s impressive because we’ve had RT hardware on the market for less than 3 years which is nothing in terms of game development timeframes. With consoles now in the mix we could be in for a treat this generation. So yeah I don’t understand the Debbie Downers either.
     
  17. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,708
    Likes Received:
    2,132
    Location:
    London
    What's numbing is that GPU compute has only advanced by a factor of about 10 over the past 10 years:

    AMD Radeon HD 6970 Specs | TechPowerUp GPU Database
    2.7 TFLOPS FP32

    AMD Radeon RX 6900 XT Specs | TechPowerUp GPU Database
    23 TFLOPS FP32

    Don't look at bandwidth growth or you'll cry.
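    As a quick sanity check on those numbers (using only the two TFLOPS figures quoted above), the HD 6970 to RX 6900 XT jump works out to roughly 8.5x over ten years, about 24% per year:

    ```python
    # Sanity check on the "10x in 10 years" figure: HD 6970 (2.7 TFLOPS FP32)
    # to RX 6900 XT (23 TFLOPS FP32) over roughly a decade.
    growth = 23 / 2.7
    years = 10
    cagr = growth ** (1 / years) - 1   # compound annual growth rate

    assert 8 < growth < 9              # ~8.5x total, a bit under 10x
    assert 0.23 < cagr < 0.25          # roughly 24% per year
    print(f"{growth:.1f}x total, {cagr:.0%}/yr")
    ```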

    I believe Ampere represents the easy part of the curve in improving ray traced performance. There's nearly no growth left in terms of BVH storage efficiency (unless NVidia hasn't implemented the compression techniques first described years ago) and VRAM bandwidth is going nowhere fast, too. Compute is looking like it has a miserable growth curve from here.
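    For reference, one family of published BVH compression techniques stores child bounding boxes as low-bit offsets within the parent's bounds; this sketch (8-bit quantization, conservative rounding, all parameters illustrative) shows the idea:

    ```python
    # Sketch of one published BVH compression idea: store child AABBs as 8-bit
    # offsets inside the parent's box instead of full 32-bit floats per axis,
    # rounding outward so the quantized box still conservatively encloses the
    # child (a miss is never introduced, only slightly fatter boxes).
    import math

    def quantize_aabb(parent_min, parent_max, child_min, child_max, bits=8):
        scale = (2 ** bits) - 1
        extent = parent_max - parent_min
        qmin = math.floor((child_min - parent_min) / extent * scale)  # round down
        qmax = math.ceil((child_max - parent_min) / extent * scale)   # round up
        return qmin, qmax   # two bytes instead of two floats (per axis)

    def dequantize(parent_min, parent_max, qmin, qmax, bits=8):
        scale = (2 ** bits) - 1
        extent = parent_max - parent_min
        return (parent_min + qmin / scale * extent,
                parent_min + qmax / scale * extent)

    qmin, qmax = quantize_aabb(0.0, 100.0, 12.3, 45.6)
    lo, hi = dequantize(0.0, 100.0, qmin, qmax)
    assert lo <= 12.3 and hi >= 45.6   # conservative: never clips the child
    ```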

    So, welcome to the next ten years of kludges to make RT work better than it currently does.

    I wonder how much of Nanite was possible 10 years ago...
     
    Lightman and Man from Atlantis like this.
  18. Lurkmass

    Regular

    Joined:
    Mar 3, 2020
    Messages:
    565
    Likes Received:
    711
    I think a lot of GPUs during that time lacked 64-bit atomics, which are needed for applying depth testing to Nanite geometry ...
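    The trick in question: pack depth into the high bits of a 64-bit word and a visibility payload (e.g. a triangle or cluster ID) into the low bits, so a single atomic max per pixel both depth-tests and writes. A small Python stand-in for the hardware atomic (the exact packing layout here is illustrative):

    ```python
    # Sketch of the 64-bit-atomic depth trick used by software rasterizers:
    # depth goes in the high 32 bits, a payload (triangle/cluster ID) in the
    # low 32 bits, and one atomic max per pixel resolves visibility locklessly.
    def pack(depth_bits, payload):
        return (depth_bits << 32) | payload

    framebuffer = [0]   # one pixel, cleared to 0 (farthest)

    def atomic_max_write(pixel, depth_bits, payload):
        # Stand-in for a hardware InterlockedMax on a 64-bit word.
        framebuffer[pixel] = max(framebuffer[pixel], pack(depth_bits, payload))

    # Two triangles cover the pixel; larger depth_bits = closer (reversed-Z style).
    atomic_max_write(0, depth_bits=100, payload=7)    # far triangle, ID 7
    atomic_max_write(0, depth_bits=250, payload=42)   # near triangle, ID 42

    assert framebuffer[0] & 0xFFFFFFFF == 42   # nearest triangle's ID survives
    assert framebuffer[0] >> 32 == 250         # along with its depth
    ```

    Because the comparison covers the whole 64-bit word and depth occupies the high bits, the max is decided by depth first, which is exactly the depth test; that is why the 64-bit atomic (rather than two separate 32-bit writes) matters.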
     
    PSman1700 likes this.
  19. PSman1700

    Legend

    Joined:
    Mar 22, 2019
    Messages:
    7,118
    Likes Received:
    3,088
    God, you must hate the new consoles then, with the huge performance impact RT has there and its minimal visual return, holy crap. I'm glad most of the population appreciates ray tracing lol.

    We're at 36 TFLOPS of raw compute now, with AMD's next going much higher than that. Bandwidth ain't a problem anywhere, and we have SSDs now.
     
  20. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,708
    Likes Received:
    2,132
    Location:
    London
    I'm sure there are more aspects of Nanite's specific algorithm that weren't possible then. That's why I qualified my question with "how much of" ;)

    DirectCompute was relatively primitive back then, and geometry was stuck in the tessellation wars, with the geometry shader being a disaster ("just don't use it").

    So, much hackery required.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.