GPU Ray Tracing Performance Comparisons [2021] *spawn*

Discussion in 'Architecture and Products' started by DavidGraham, Mar 29, 2021.

  1. DegustatoR

    DegustatoR Veteran

    A) MS and Sony "stick" to what is cheaper generally.
    B) RDNA2 does not "repurpose TMUs for RT acceleration".
    C) Tensor cores have nothing to do with RT. And if you meant RT cores, then RDNA2 has them too, just cut down to hardware intersection testing only, with the BVH traversal loop left to the shader cores, whereas Turing's RT cores handle both traversal and intersection evaluation. Which is coincidentally the main reason for A), and for RDNA2's RT h/w being considerably slower in practice than that of Turing and Ampere.

    With how the power situation is evolving in general, I'll be rather surprised if AMD doesn't adopt an approach similar to Nvidia's around RDNA3 or 4. Ampere is currently beating RDNA2 in RT power efficiency several times over.
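    [Editor's note] For readers unfamiliar with the traversal/intersection split being described, here is a toy sketch in Python. The node layout and all names are invented for illustration: the `while` loop in `traverse` is the kind of work RDNA2 leaves to ordinary shader code, with only the per-node box tests (`hit_aabb` here) offloaded to its Ray Accelerators, while Turing/Ampere RT cores run the whole loop in fixed-function hardware.

    ```python
    # Toy BVH traversal sketch (illustrative only, not any vendor's actual
    # algorithm). On RDNA2 the loop below runs as shader code and only the
    # intersection test is hardware-accelerated; on Turing/Ampere the RT
    # core executes the entire loop.

    def hit_aabb(origin, inv_dir, lo, hi):
        """Slab test: does the forward ray hit the axis-aligned box [lo, hi]?"""
        tmin, tmax = 0.0, float("inf")
        for axis in range(3):
            t0 = (lo[axis] - origin[axis]) * inv_dir[axis]
            t1 = (hi[axis] - origin[axis]) * inv_dir[axis]
            if t0 > t1:
                t0, t1 = t1, t0
            tmin, tmax = max(tmin, t0), min(tmax, t1)
        return tmin <= tmax

    def traverse(nodes, origin, direction):
        """Stack-based BVH traversal; returns ids of leaf boxes the ray hits."""
        inv_dir = tuple(1.0 / d if d != 0.0 else float("inf") for d in direction)
        stack, hits = [0], []
        while stack:                       # this loop is the shader-side work on RDNA2
            node = nodes[stack.pop()]
            if not hit_aabb(origin, inv_dir, node["lo"], node["hi"]):
                continue                   # the test itself is what RDNA2 accelerates
            if "children" in node:
                stack.extend(node["children"])
            else:
                hits.append(node["leaf_id"])
        return hits
    ```

    Running the loop per ray in shader code costs instruction issue, register pressure, and divergence on every iteration, which is one plausible reading of why the fixed-function approach pulls ahead in practice.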
     
    PSman1700 likes this.
  2. iroboto

    iroboto Daft Funk Legend Subscriber

    Yeah, I think it's safe to say all major companies have tensor cores, right down to Google. AMD has them too; just because they aren't in mainstream consoles doesn't mean they don't exist. And MS and Sony still went forward with ML-based features on their compute units, which is a compromise compared to dedicated silicon.

    To be fair, MS stripped away some ROPs and Sony did away with some FPU on the CPU as well; silicon was clearly at a premium for both companies.

    RT on mobile is driven by PowerVR, and as you say, we will see RDNA2 on mobile.

    The reason we don't see it prominently in cheaper devices is that it isn't yet cheap enough to come down to that tier. But given a long enough period, the technologies that drive the high end are eventually brought to the low end.

    RT methods have to compete with 20+ years of T&L tooling and skill sets. It's going to be a while before they become mainstream, simply because games still need to be made and deadlines met. You can't pay a studio to sit there and just learn and solve problems; they are ultimately required to deliver a product too.

    ML techniques as they exist today are significantly sped up by tensor cores; once ML techniques are fully integrated into software stacks, tensor accelerators will likely be the solution here.

    We see AI accelerator chips in a variety of phones and mobile devices today because they have a greater need for them. In games, ML is perhaps still largely in its infancy, but over time I expect to see ML arrive there too, and not just in the form of super resolution.
     
    PSman1700, pjbliverpool and pharma like this.
  3. neckthrough

    neckthrough Newcomer

    You are inserting these adjectives to artificially create an insurmountable wall. Why does it have to be universal? Why not a majority? What is the definition of substantial? Or will you keep redefining it so that it never passes your test?
    A weasel adjective. As others have observed, you can get 4K/60 fps with a 3080 in Control and other titles. You can get 1440p/60 fps in Spider-Man on a freaking $500 console.
    Another weasel adjective. What is affordable? Are you complaining about pandemic/crypto prices? Yes, they suck. But that's a universally sucky story and has nothing to do with the narrative you're trying to weave.
     
    PSman1700, DegustatoR, pharma and 2 others like this.
  4. techuse

    techuse Veteran

    Quake 2 isn't very compelling outside of being cool from a tech standpoint. Metro EE is a huge improvement over the initial RT implementation, and it makes for the only truly compelling use of RT so far, IMO. Control is much more on the side of image refinement than anything else.
     
  5. OlegSH

    OlegSH Regular

    Of course he has to be when his words are referenced as ground truth.
    Has he already proved these theses with real facts? Is he knowledgeable enough to even do proper testing of RT quality? Did he test proper scenes at all? Does he know what a ground-truth image should look like? Did he compare against ground truth? If not, did he explain the theory and math behind his conclusions?
    The answer to these questions would be no, so using his words as arguments is simply disrespectful to the people here.

    Of course people are free to do whatever they want, but imagine the utterly silly situation where someone criticized a whole scientific field with arguments taken from entertainment-style YouTube content. Would you expect the scientific community to give up and accept the criticism?

    These images must be made in the proper places, must be compared to a reference, or must be explained theoretically; that's how the educational process works.

    It seems you didn't get the joke.

    To be quoted and referenced, he has to be an expert, and his defence had better be good.

    Honestly, I'm not going to waste even more time explaining obvious stuff.
     
    PSman1700 and pharma like this.
  6. Wesker

    Wesker Regular

    Come off it. You're far too intelligent to actually believe this.
     
  7. DegustatoR

    DegustatoR Veteran

    The 2060S was very affordable for about two years, and it can do RT at 60 fps just fine in most RT titles.
    He is just peddling an agenda which has nothing to do with reality.

    I don't get the distinction between being "cool" and "compelling".

    Just no. EE's improvements over the original RTGI implementation are substantially smaller than the difference between the non-RT and RTGI graphics in the original release.

    Again, I don't think I understand the point. RT is "image refinement" even when doing full-on path tracing.

    There have been compelling RT implementations for years now, and HUB starting to use RT in their regular benchmarks is nothing but an admission of that on their part.
    Nothing has changed in the RT landscape with the release of Metro EE and F1 2021.

    Believe what? I've heard him say that many times in different videos on their channel. And it was rather funny when he suddenly started saying the opposite a couple of months ago.
     
    PSman1700 likes this.
  8. Dampf

    Dampf Regular

    They want to keep their 5700XT fanbase happy. That's why they downplay RT at every opportunity they get and also ignore the future possibilities of DX12 Ultimate. They heavily recommended the 5700XT over Turing in 2019, so if they admitted that the 5700XT was already outdated at release, especially for true next-gen games, they would lose face in front of their audience.

    Look at the 5700XT hype comments in their Far Cry 6 video and you have solid proof of my theory. It's the only explanation: they know the 5700XT is very popular with their fanbase. This has nothing to do with Ampere or RDNA2.

    But it's probably wrong to generalize about the whole HW Unboxed team. Tim is pretty objective; I feel this is coming mostly from Steve.
     
    Last edited: Oct 13, 2021
    PSman1700, pharma and DavidGraham like this.
  9. troyan

    troyan Regular

    You have never played any games with ray tracing, huh? "Deliver Us the Moon" is 18 months old and has one of the best ray tracing implementations. Ray tracing transforms the visuals and solves one of the biggest problems for such games: missing reflections.
     
    PSman1700 likes this.
  10. PSman1700

    PSman1700 Legend

    2077 with all five ray tracing modes, on a rainy night in Night City, is probably the most impressive thing I have seen.
     
    pjbliverpool likes this.
  11. trinibwoy

    trinibwoy Meh Legend

    Shadows, fog, particles, volumetric lighting, shaders, tessellation, ambient occlusion, reflections, DOF, AA are all refinements. They all add to the final result.

    The irony is that one of the main reasons people don't fully appreciate RT yet is that games aren't designed with RT as a baseline. If you look at a scene with one or two shadow casting lights it's going to be hard to see where RT helps because advanced shadow mapping (and caching) techniques look really good. But games don't include scenes with many dynamic shadow casting lights because shadow mapping would be laughably slow. So when people say RT shadows are blah it's understandable because they have no frame of reference for what real-time shadows "should" look like in more complex scenes.
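    [Editor's note] The scaling argument above can be put in back-of-the-envelope numbers. The sketch below is a made-up cost model, not a measurement (every constant is an invented illustrative figure): shadow mapping re-renders scene depth once per cube face per shadow-casting point light, while a ray-traced shadow pass is budgeted in rays per pixel and samples lights stochastically, so its cost stays roughly flat as lights are added.

    ```python
    # Illustrative cost model for shadow maps vs. ray-traced shadows.
    # All constants are made-up numbers chosen only to show the scaling shape.

    def shadow_map_ms(num_lights, ms_per_depth_pass=0.8, faces_per_light=6):
        """Estimated ms spent on shadow maps: each point light needs its own
        depth render per cube face, so cost grows linearly with light count."""
        return num_lights * faces_per_light * ms_per_depth_pass

    def rt_shadow_ms(num_lights, ms_per_sample_pass=2.5, samples_per_pixel=1):
        """Estimated ms for ray-traced shadows: the budget is rays per pixel
        (each ray samples one light), so cost is flat in light count."""
        return samples_per_pixel * ms_per_sample_pass

    for n in (2, 8, 32):
        print(n, "lights:", shadow_map_ms(n), "ms (maps) vs", rt_shadow_ms(n), "ms (RT)")
    ```

    With these toy numbers the shadow-map path already blows a 16.6 ms frame budget well before 32 lights, while the RT path does not move, which is exactly why scenes with many dynamic shadow-casting lights never shipped under the old approach.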
     
  12. Svensk Viking

    Svensk Viking Regular

    I have not seen anyone referencing his words as ground truth. What I do see, however, is that every time someone links to his videos, the pro-ray-tracing camp tries to invalidate them, because they believe they are The Authority deciding what settings people are allowed to use; yet multiple tests and diverse sources are actually good for making decisions and ought to be encouraged. People are free to choose their settings in PC games, and if one or two reviewers decide not to use ray tracing because the gains are too small for the performance cost, I do not see why it is so hard to simply accept their existence and get on with life, especially when there are multiple other sites that do test with ray tracing as standard.

    To be frank, Hardware Unboxed has probably been getting a lot more attention on here than they otherwise would because of you guys' obsession with them.

    It is hilarious that you are actually prepared to throw away decades' worth of benchmarks if the reviewers cannot show proof that they are actual experts.

    Their videos are not made for presentation at tech conferences or for any kind of expert audience.
    They are directed towards gamers wanting to see benchmarks and performance for various games and graphics cards. The average Joe looks at the comparisons and the performance, and no theory or math can save a product if they are not impressed by what they see.

    But yeah, I do not want to waste more time on this specific ridiculous criterion either.
     
  13. techuse

    techuse Veteran

    Control with RT offers minor improvements while slashing performance in half. Metro EE looks entirely different while costing a smaller amount of performance. It is not offering up a similar image with some refinement here and there.
     
  14. trinibwoy

    trinibwoy Meh Legend

  15. iroboto

    iroboto Daft Funk Legend Subscriber

    20 years of training gamers on static lighting and static shadows.
    People are just used to it because they've never seen the alternative, and I don't blame them; they've never seen what they've never seen. Minecraft RT is about the only title to portray this correctly, and most people will not play it.

    Every game leading up to now including even some RT titles have light sources in the most awkward of places.

    Find a hidden passageway? No worries, we've conveniently left candles lit for you to rummage about.
    Find a hidden tomb that's over 1000 years old? We're still keeping fires burning here and there, just for you!

    Whenever you have to make a compromise for baked lighting, you get the most awkward of setups. Game designs built around graphics all start to feel the same because, ultimately, lighting forces the game to be played in a very specific way. All scenes need to be lit, even the dark ones. Collapsing environments and buildings that let light fill a dark room? Never going to happen.

    Gamers have been trained to ignore these things and instead focus on texture detail, model and geometry density, and resolution as the indicators of higher quality. Lighting, shadows, and physically based rendering were the major items of last generation, but dynamic lighting/GI and shadows are never something people bring up. And that is what this coming generation will be about.

    When we finally let go of baked lighting, we're going to see some really different gameplay scenarios. People should be excited for what RT brings to the table, and it's going to be a lot more than just 'marginal improvements on existing ultra settings'.
     
    PSman1700, trinibwoy, xpea and 3 others like this.
  16. neckthrough

    neckthrough Newcomer

    That second shot is the sauce. I've experimented with RT on and off in Control quite a bit, and I think that shot objectively captures the improvements that RT brings to this game. Subjectively, though, I don't know if it communicates the sum-of-its-parts effect to someone who's never experienced it in motion. To me, playing with RT off feels like playing a very good-looking video game. Playing with RT on feels like guiding a toy character through a physical miniature world. The stable reflections on vinyl floors and wooden walls, the contact shadows that ground objects to surfaces, and Jessie's self-shadowing just make everything feel *solid*, not like a bunch of texture-mapped triangles. And all that at 4K/60 fps.
     
  17. Davros

    Davros Legend

    You know I actually prefer the look of the original
     
  18. DegustatoR

    DegustatoR Veteran

    PSman1700 likes this.
  19. DavidGraham

    DavidGraham Veteran

    It's understandable that some regular, run-of-the-mill dude will often not give a flying fuck about graphical progress and enhancements, but it's not understandable when the tech elite and graphics enthusiasts do the exact same thing on a forum dedicated to graphics and tech.

    But I guess that's what happens when you affiliate yourself with a certain vendor: you convince yourself that RT brings nothing to the table so that you can buy current/previous-gen AMD hardware and not feel bad about missing out.
     
    DegustatoR, OlegSH and PSman1700 like this.
  20. DavidGraham

    DavidGraham Veteran

    No, it's because he has ruined the reputation of a once venerable site: TechSpot.

    People, not tech reviewers, are the ones who represent products and what they are capable of.

    There is a standard called DX12U, and with it come new graphics features such as ray tracing. Not testing your new hardware against the new graphics standard is the most stupid and misleading thing I have seen in tech journalism in decades.

    HUB had no problem using early DX12 tests in their reviews despite those tests offering nothing new over DX11, not even performance enhancements (on the contrary, DX12 often made performance worse), but now that DX12 has new graphics features, they don't test them because it "doesn't matter"!! That's not tech journalism, that's hypocrisy.

    And it has come back to bite them in the ass: now they have to jump through hoops to justify favoring the feature-poor RDNA1 GPUs over Turing, to save face in front of an audience who followed their words without much thought.
     
    Last edited: Oct 13, 2021
    Rootax, OlegSH and PSman1700 like this.