GPU Ray Tracing Performance Comparisons [2021-2022]

God, you must hate the new consoles then, given the huge performance impact RT has there and the minimal visual return, holy crap. I'm glad most of the population appreciates ray tracing lol.
For those who think RT isn’t worth the performance hit they should simply not use it. For those who think current lighting methods are good enough they can turn it off. That’s why options are great. Literally nobody has had RT forced on them in any game so far.

There does seem to be a bit of hysteria around the limitations of the first iteration of an advanced feature that can be turned off if needed. Which I find ironic because I think we should be celebrating the impressive adoption rate of a long desired capability that fundamentally improves the rendering pipeline and artist workflows. It’s impressive because we’ve had RT hardware on the market for less than 3 years which is nothing in terms of game development timeframes. With consoles now in the mix we could be in for a treat this generation. So yeah I don’t understand the Debbie Downers either.

Guys, no need to invent a minority of haters, Debbie Downers, hysterics, etc. You can just continue your sycophancy by joining a made-up hype train of success and self-congratulate to appear smart, without offending those who just keep thinking for themselves, or dare to disagree in a discussion forum.
Just wake me up if you have something to say for real.
 
* Many effects: Control, CP 2077. GI still mostly static. IMO looks better, but not really enough to be worth the cost. I'm totally not convinced and would indeed turn off RT after checking it out.
Control GI isn't static, albeit it's not using RT AFAIK; CP2077 GI is dynamic and can use RT. Both provide a sizeable visual upgrade with RT over the non-RT paths, enough for people to consider using upscaling options like DLSS instead of turning RT off. So you're definitely in a minority of those who are not convinced by these titles. RT in Control in particular is great.

* Single effect: Eternal or CoD. Just reflections or soft shadows. Hit on perf. is small. IMO that's good and i would happily enjoy RT in those titles.
Hit on perf is between 30 and 50% in such titles, and it tends to be even higher on AMD h/w. Not sure how this is "small". Their performance in general is very high though, since both are targeting 60 fps on XBO/PS4, and thus even with such a hit they remain rather performant with RT.
For Eternal specifically, RT is optimized to have a smaller hit by cutting off RT reflections based on material roughness. Remove that and the hit would be even bigger.
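To make that cutoff concrete, here's a rough sketch of what such a roughness-based cull might look like (not id Software's actual code - the threshold and names are made up for illustration):

```cpp
// Conceptual sketch of a roughness cutoff for RT reflections: only
// sufficiently glossy surfaces spawn reflection rays, everything else
// falls back to cheaper methods (cubemaps, SSR).
struct SurfaceSample {
    float roughness;      // 0 = mirror, 1 = fully diffuse
    bool  screenSpaceHit; // SSR already found a usable reflection
};

// Hypothetical threshold; a real engine would tune this per quality preset.
constexpr float kRoughnessCutoff = 0.3f;

bool shouldTraceReflectionRay(const SurfaceSample& s) {
    if (s.roughness > kRoughnessCutoff)
        return false;   // too rough: use cubemap / prefiltered probe instead
    if (s.screenSpaceHit)
        return false;   // SSR already resolves this pixel, skip the ray
    return true;        // glossy and not resolvable on screen: trace
}
```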
I'm also unsure how these games can be enjoyed with RT while the likes of Control or CP2077 can't - they have reflections and shadows done with RT too, and you can enable only those RT effects if you so prefer.

Many weird takes all around really. It is looking a lot like people just don't like RT for whatever reason and try to think of some arguments why. I'd imagine that it's as simple as "I don't have h/w powerful enough to run it so it must be bad".

Guys, no need to invent a minority of haters, Debbie Downers, hysterics, etc. You can just continue your sycophancy by joining a made-up hype train of success and self-congratulate to appear smart, without offending those who just keep thinking for themselves, or dare to disagree in a discussion forum.
Just wake me up if you have something to say for real.
So we can say something which paints current gen RT as bad but not something which makes it look good?
 
Guys, no need to invent a minority of haters, Debbie Downers, hysterics, etc. You can just continue your sycophancy by joining a made-up hype train of success and self-congratulate to appear smart, without offending those who just keep thinking for themselves, or dare to disagree in a discussion forum.
Just wake me up if you have something to say for real.

All i've got to say is, you're in the minority with your thoughts on ray tracing for sure. PC and console users embrace the technology; Intel, Sony, Nvidia, MS, AMD etc. all have some form of (hardware) RT support, and almost ALL new/modern next gen games have it (as an option or not). The general public loves it, as evidenced by the amount of views and positive reactions across the board.
Samsung's next Galaxy smartphone is said to have hardware ray tracing...

And to clarify, yes, ray tracing has great visual return and enables features and effects just not possible with last gen tech. CP2077, Rift Apart, Minecraft RT, and many many more - i really think the implementations are worth it there; minimal in some cases, but they still enhance graphics enough to call it well worth it.
 
Control GI isn't static, albeit it's not using RT AFAIK; CP2077 GI is dynamic and can use RT.
Control is static. RT is used to add close range dynamics, but overall falls back to the non-RT approach.
CP seems very similar. For distant GI i saw no difference in videos, but i did not follow what the devs have revealed about it.

Hit on perf is between 30 and 50% in such titles, and it tends to be even higher on AMD h/w. Not sure how this is "small".
Doom runs 60fps on 2060 (iirc at 1080p). So yes the cost is still big, but going from 180 fps down to 120 on beefy GPU... who cares? So as a player i do not feel any slow down, which is great.
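Some back-of-the-envelope math (my own numbers, extrapolated from that 180 -> 120 example) on why the same RT cost feels harmless at high frame rates but stings at 60 fps:

```cpp
// The RT cost in milliseconds stays roughly constant per frame, so it reads
// very differently depending on the baseline frame rate.
#include <cstdio>

int main() {
    const double rt_cost_ms = 1000.0 / 120.0 - 1000.0 / 180.0; // ~2.8 ms (the 180 -> 120 case)
    const double baselines[] = {180.0, 120.0, 60.0};
    for (double base_fps : baselines) {
        double with_rt_fps = 1000.0 / (1000.0 / base_fps + rt_cost_ms);
        std::printf("%5.0f fps baseline -> %5.1f fps with the same ~%.1f ms of RT work\n",
                    base_fps, with_rt_fps, rt_cost_ms);
    }
    return 0;
}
// Prints roughly: 180 -> 120, 120 -> 90, 60 -> 51. The same cost is barely
// felt on a beefy GPU but eats about a third of a 60 fps frame budget.
```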

I'm also unsure how these games can be enjoyed with RT while the likes of Control or CP2077 can't - they have reflections and shadows done with RT too, and you can enable only those RT effects if you so prefer.
Yes, i can use gfx options to adjust perf / visual gains to my needs and HW. But psychologically this also involves some 'arrgh - my GPU is too slow, damn!', so disappointment. And i'm really lazy at tuning gfx options.
It's really a minor issue, but together with the above it's reason enough that i'm more convinced about Doom or CoD. Even if most people seem disappointed about Doom having 'just reflections', i like it because i think they made sure to do what's possible without compromising gameplay.


Many weird takes all around really. It is looking a lot like people just don't like RT for whatever reason and try to think of some arguments why.
I thought we were beyond that already; maybe we are not.
Well, for me it's pretty difficult. I've worked on realtime raytracing for >10 years, so i'm not one who just wants to talk RT down. And while i'm very critical of current HW RT, this goes hand in hand with proposing / requesting improvements. So i'm no fan, but i'm interested and willing to use it myself.
And still, people assume i just hate it, or that i'm conservative, against progress, etc. That's really annoying, so i get pretty pissed and end up being offensive myself.
Though, that's expected and i could deal with it.
The really bad thing is this: People just sit there, claiming RT will improve. Next gen HW and NV experts will solve all those 'minor' problems. But this is bullshit. We want to solve problems now and on current hardware. Praising how awesome current RT games are hinders RT progress much more than my rant on how limited it is.

So we can say something which paints current gen RT as bad but not something which makes it look good?
Sure. I can say Exodus is great and the first game with dynamic GI. That's awesome, and i've said so before.
But as a developer it isn't my job to praise, because this won't give me (or us) anything. It's my job to point out what's wrong and what should change, from my perspective.
Maybe i should just focus on that and reduce comments on Nvidia's marketing promises, or RT costs being just big, GPUs being expensive, etc. I'd appear less of a hater then, i guess, but i want some fun too :D
 
All i've got to say is, you're in the minority with your thoughts on ray tracing for sure. PC and console users embrace the technology; Intel, Sony, Nvidia, MS, AMD etc. all have some form of (hardware) RT support, and almost ALL new/modern next gen games have it (as an option or not).
Agree on RT being here to stay. I've said so in one of my first posts.
But we both make assumptions about minorities and majorities. To improve those assumptions towards educated guesses, any opinion has its weight and has to be summed up into an objective average. It's not about agreement or right / wrong.
Outside of tech sites, i see a lot of people not being impressed by RT, even being disappointed. This does not mean i think RT is useless or should be removed. It just means there is some work to do.

You know, usually we would not need to argue. All this is obvious. Differing opinions, first APIs not being ideal, etc., it's all expected.
The reason this discussion is so heated is NV marketing, which is a dictatorship by a self-appointed innovation leader. Some swallow that, others don't. Arguing is unavoidable with that background. They are guilty! :D
 
The reason this discussion is so heated is NV marketing, which is a dictatorship by a self-appointed innovation leader. Some swallow that, others don't. Arguing is unavoidable with that background. They are guilty! :D

Lol, how you come to that conclusion is beyond me. If only NV were supporting ray tracing, yeah, you could be worrying about that. But now everyone's supporting it and enabling it.

I don't see anything on consoles which makes us stuck. I only see that on PC.

How's that even possible? The consoles literally have weak RT abilities, not even matching what we got in 2018, eating into their performance while having limited visual return. And that's for the coming seven whole years.
 
Guys, no need to invent a minority of haters, Debbie Downers, hysterics, etc. You can just continue your sycophancy by joining a made-up hype train of success and self-congratulate to appear smart, without offending those who just keep thinking for themselves, or dare to disagree in a discussion forum.
Just wake me up if you have something to say for real.

I assume you meant that as a compliment :) I'm definitely a sycophant for unlocking progress in 3D rendering and starting to move beyond current limitations. I'm also a sycophant for higher texture resolutions, higher geometry detail, better AI and better physics. It's a lot more fun than being in the peanut gallery.

You're right that we don't need to invent a demographic that is already so well represented.
 
Lol, how you come to that conclusion is beyond me.
I'm sure it is.
How's that even possible? The consoles literally have weak RT abilities, not even matching what we got in 2018, eating into their performance while having limited visual return. And that's for the coming seven whole years.
To me as a developer, performance does not matter. It is what it is; i measure it and see what i can do with it.
But restrictions do matter. Knowing i could get rid of BVH building times, knowing i could support LOD for geometry, but the API does not let me do any of this - that sucks. It sucks because i know it could be much more efficient, but my hands are tied due to some stupid software decisions of others.
On consoles, no such issues. Their HW may be weaker (and cheaper), but i can implement efficient solutions on them, likely more than compensating for the perf. difference.
Thus we are stuck, not them.
I'm definitely a sycophant for unlocking progress in 3D rendering and starting to move beyond current limitations. I'm also a sycophant for higher texture resolutions, higher geometry detail, better AI and better physics.
Yeah, but applauding won't help with achieving goals. Criticizing may.

Well, i better do some work. Sorry for the heat again, and peace! ;)
 
Control is static. RT is used to add close range dynamics, but overall falls back to the non-RT approach.
I believe that Northlight GI has been dynamic since Quantum Break, but it doesn't use RT and relies on the typical probes approach. RT is used for local light bounces in Control though, so it's a hybrid.
But in the case of Control at least, the global lighting itself is fairly static since the game is 100% indoors. The system is dynamic though, if QB is any indication.

CP seems very similar. For distant GI i saw no difference in videos, but i did not follow what the devs have revealed about it.
CP2077 has an option of doing GI fully with RT at the "psycho" detail level, and it does produce some nice results. Otherwise it's the same probes approach as everywhere else.
It is also fully dynamic in both approaches since the game has a day-night cycle. Not sure why you think that it's not.
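For those who haven't seen "the probes approach" spelled out, a minimal sketch of the idea (illustrative names, a plain uniform grid, no bounds checks - real engines add smarter probe placement, occlusion handling, etc.):

```cpp
#include <cmath>

// Probe-based GI sampling: irradiance is stored at points of a regular grid
// and each shaded point blends the 8 surrounding probes.
struct Vec3 { float x, y, z; };

static Vec3 lerp(Vec3 a, Vec3 b, float t) {
    return {a.x + (b.x - a.x) * t, a.y + (b.y - a.y) * t, a.z + (b.z - a.z) * t};
}

struct ProbeGrid {
    Vec3  origin;      // world position of probe (0,0,0)
    float spacing;     // distance between neighbouring probes
    int   nx, ny, nz;  // grid dimensions
    Vec3* irradiance;  // one RGB irradiance value per probe, flattened

    Vec3 probe(int x, int y, int z) const {
        return irradiance[(z * ny + y) * nx + x];
    }

    // Trilinear blend of the 8 probes around point p (no bounds checks).
    Vec3 sample(Vec3 p) const {
        float gx = (p.x - origin.x) / spacing;
        float gy = (p.y - origin.y) / spacing;
        float gz = (p.z - origin.z) / spacing;
        int x0 = (int)std::floor(gx), y0 = (int)std::floor(gy), z0 = (int)std::floor(gz);
        float fx = gx - x0, fy = gy - y0, fz = gz - z0;
        Vec3 c00 = lerp(probe(x0, y0,     z0    ), probe(x0 + 1, y0,     z0    ), fx);
        Vec3 c10 = lerp(probe(x0, y0 + 1, z0    ), probe(x0 + 1, y0 + 1, z0    ), fx);
        Vec3 c01 = lerp(probe(x0, y0,     z0 + 1), probe(x0 + 1, y0,     z0 + 1), fx);
        Vec3 c11 = lerp(probe(x0, y0 + 1, z0 + 1), probe(x0 + 1, y0 + 1, z0 + 1), fx);
        return lerp(lerp(c00, c10, fy), lerp(c01, c11, fy), fz);
    }
};
```

Being "dynamic" in this scheme just means the irradiance stored in the probes gets refreshed over time (e.g. as the time of day changes) rather than being baked once.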

Doom runs 60fps on 2060 (iirc at 1080p). So yes the cost is still big, but going from 180 fps down to 120 on beefy GPU... who cares? So as a player i do not feel any slow down, which is great.
Who cares? The same people who are saying that the performance hit from RT is too high? It's still there after all.

Yes, i can use gfx options to adjust perf / visual gains to my needs and HW. But psychologically this also involves some 'arrgh - my GPU is too slow, damn!', so disappointment. And i'm really lazy at tuning gfx options.
It's really a minor issue, but together with the above it's reason enough that i'm more convinced about Doom or CoD. Even if most people seem disappointed about Doom having 'just reflections', i like it because i think they made sure to do what's possible without compromising gameplay.
CoD's usage of RT is borderline invisible in most cases so I definitely wouldn't consider that a good example of RT or something which people would point to as an example of such.
Your idea here basically seems to be that RT in these titles is good because there's not much of it happening. Taken to its extreme, that approach would say RT is best when it doesn't exist.

The really bad thing is this: People just sit there, claiming RT will improve. Next gen HW and NV experts will solve all those 'minor' problems. But this is bullshit. We want to solve problems now and on current hardware. Praising how awesome current RT games are hinders RT progress much more than my rant on how limited it is.
I mean wanting things is nice and all but here's some perspective on how things are going for you.
A couple of years ago we thought that RT wouldn't even be a thing on the new consoles. Yet it is.
3-4 years ago people were seriously expecting Nv to drop RT h/w from their next GPU family. All GPU vendors are adding it instead, mobile SoCs included.
Solving problems takes time and mind share. I doubt that you or any of us here is smarter than Nv or AMD engineers who plan for future RT h/w evolution, so the idea of suggesting to them something which they didn't think about themselves yet sounds rather self-indulgent.
Your doubts about RT h/w improving are weird as well - it's as if someone back in TNT/Rage128 times doubted that GPUs would improve further.
What we have now with RT is miles better than anyone has thought it would be just a couple of years ago. No reason to think that whatever issues we have with RT now won't be solved in another couple of years down the line.
Well, that performance hit will always be there and I'd actually expect it to become even bigger. Because modern 3D graphics do have a huge performance hit when compared to the likes of Quake and Asteroids 3D - it's just how things work.
 
I'm sure there are more aspects of Nanite's specific algorithm that weren't possible then. That's why I qualified my question with "how much of" ;)

DirectCompute was relatively primitive back then, and geometry was stuck in the tessellation wars, with the geometry shader being a disaster ("just don't use it").

So, much hackery required.

The most unique implementation detail about Nanite's renderer compared to other visibility buffer implementations is that there are "3 splits" to its shading pipeline ...

In a traditional deferred shading pipeline, we usually have the lighting pass separated from the geometry pass, so we pay overdraw for attribute interpolation and material shading in this case.

With other visibility buffer pipelines, it's common to see a visibility pass with a coupled material and lighting pass, which isn't all that conceptually far off from forward shading, so we can still realistically apply MSAA and transparencies at a relatively cheap cost since we have a thin G-buffer.

Nanite tries to evolve deferred rendering in its natural direction with the visibility buffer, so that we can defer the attribute interpolation/material shading pass along with the lighting pass. Generate the visibility buffer to avoid overdraw during material shading, but keep the G-buffer so that we can create specialized shaders to control register pressure, which will further minimize our shading cost! The downside is that MSAA or transparencies become very expensive in this scenario compared to other visibility buffer pipelines because we need a fat G-buffer as well ...
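A very rough conceptual outline of those three splits, to make the above concrete (this is not UE5 source - the struct layouts and pass names are just illustrative assumptions):

```cpp
#include <cstdint>

// Pass 1 - visibility: each pixel stores only which triangle of which
// instance/cluster is visible, packed together with depth so a single
// 64-bit atomic max can resolve depth and visibility at once.
struct VisibilityPixel {
    uint64_t packed; // e.g. [depth : 32][cluster/instance id + triangle id : 32]
};

// Pass 2 - material/attribute shading: for every pixel, fetch the winning
// triangle, interpolate its attributes and run the material, writing a
// conventional (fat) G-buffer. Exactly one material evaluation per pixel
// (no overdraw), and pixels can be binned per material so each specialized
// shader keeps register pressure low.
struct GBufferPixel {
    float normal[3];
    float baseColor[3];
    float roughness;
    float metallic;
};

// Pass 3 - lighting: reads only the G-buffer, exactly like classic deferred
// shading. The fat G-buffer is what makes MSAA / transparencies expensive
// relative to thin-G-buffer visibility pipelines.
void visibilityPass(/* rasterize clusters -> one VisibilityPixel per pixel */);
void materialPass(/* VisibilityPixel -> GBufferPixel, one dispatch per material bin */);
void lightingPass(/* GBufferPixel + lights -> final color */);
```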
 
The reason this discussion is so heated is NV marketing, which is a dictatorship by a self-appointed innovation leader. Some swallow that, others don't. Arguing is unavoidable with that background. They are guilty! :D

I really hope it isn’t true that people are hating on RT as a proxy for hating on Nvidia. RT existed long before Nvidia did and is bigger than any one company. Yes Nvidia were first out the gate with hardware support but so what. That ship has sailed and now everybody and their dog has access to RT hardware. That’s like hating on higher resolution textures just because some company you don’t like sold cards with more vram. But maybe you’re right. It would explain some of the strange sentiment when it comes to RT in games.
 
I really hope it isn’t true that people are hating on RT as a proxy for hating on Nvidia. RT existed long before Nvidia did and is bigger than any one company. Yes Nvidia were first out the gate with hardware support but so what. That’s like hating on higher resolution textures just because some company you don’t like sold cards with more vram. But maybe you’re right. It would explain some of the strange sentiment when it comes to RT in games.

You would be naive if you think the brand has nothing to do with it.
It has been very obvious since Turing.

RT will only be "okay" for some...if AMD beats NVIDIA's performance.
RT will only be "okay" for some...if the consoles beat the PC.

And then we have the other end.
Some will look at console RT...and declare games better looking than anything on PC (look at posts about Spiderman MM)...despite the glaringly lower IQ compared to e.g. CP2077.

All to do with brand and nothing about the tech.

You even see people hating DXR...because they confuse it with the RTX label...and even call it RTX, not DXR.

It is all about the brand.
 
CP2077 has an option of doing GI fully with RT at the "psycho" detail level, and it does produce some nice results. Otherwise it's the same probes approach as everywhere else.
It is also fully dynamic in both approaches since the game has a day-night cycle. Not sure why you think that it's not.
Dynamic time of day is not yet 'dynamic GI' to me. For that, all dynamic geometry would also need to be supported. For an indoor game like Control this would make a big difference, e.g. because of doors opening and closing. IIRC, QB also was static in that sense, using an octree of probes, so the probes could not move with a dynamic object. IDK if they still use a similar system for Control. But playing the non-RT version some funny things did happen, e.g. destroying furniture but its baked shadows still stuck to the floor.

Who cares? The same people who are saying that the performance hit from RT is too high? It's still there after all.
If gamers can play at 60 fps, on a GPU they can afford, with acceptable resolution / IQ, then they have no point anymore in saying RT is too slow. But i can still complain if i think API restrictions prevent efficiency.
Sometimes i also have the impression people complain about RT just to complain, but that rarely happens on a forum like this.

CoD's usage of RT is borderline invisible in most cases so I definitely wouldn't consider that a good example of RT or something which people would point to as an example of such.
Your idea here basically seems to be that RT in these titles is good because there's not much of it happening. Taken to its extreme, that approach would say RT is best when it doesn't exist.
Yeah, my own impressions here are surprising to myself, thus i ask.
CoD just looks really good to me because i like baked high res lightmaps. Same for games like TLoU2 or HL Alyx. Besides the static limitation there is not much left to improve visually, so soft shadows are a good choice. Subtle, but it's something i have not seen before in games, and perf. is fine, so i like it.
Exodus in comparison does not have this accuracy of baked lightmaps, but it's fully dynamic. For that i'll accept a noticeable frame drop or resolution decrease, and i like it too.
Control / CP look good and advanced even without RT. I know it's a lot of hacks and tricks and i also can see that. And RT just adds more hacks and tricks, together with dropping fps. It's not really progress towards 'get rid of fakery', which is what i expect to get from RT.

I doubt that you or any of us here is smarter than Nv or AMD engineers who plan for future RT h/w evolution
I think everybody is smart just within some narrow space. So without discussion and contribution from many people it's easy to overlook / underestimate something which might turn out to be important shortly after. With DXR this has happened in terms of LOD. Getting this right before AMD and Intel joined in would have been easier than doing so after that.
Your doubts about RT h/w improving are weird as well - it's as if someone back in TNT/Rage128 times doubted that GPUs would improve further.
It's because of you guys here! My critique of missing LOD support is true without doubt, and i explain it in detail again and again, and still it feels like nobody else has any interest in LOD support at all. Thus my optimism that i'll ever get it shrinks, because if nobody requests it, there's no reason to open up the black box.
Well, that performance hit will always be there and I'd actually expect it to become even bigger. Because modern 3D graphics do have a huge performance hit when compared to the likes of Quake and Asteroids 3D - it's just how things work.
No problem with that. There are two related perf problems: 1. Tracing rays. In the case of NV this is full HW, so not our concern. 2. BVH management. This is a problem, because we have no way to control it. Just turning on RT support has a high cost, even if we manage to get nice stuff from tracing only a few rays after that. This is about our data, our scenes. And we need options to precompute and adapt detail dynamically so we gain control over performance and achieve proper scaling. We also have an interest in spatial lookups for dozens of other reasons besides RT.
Until we get this, any discussion about 'bad RT perf.' is totally pointless. To one half it's like discussing which graphics driver can make better guesses at lottery numbers from a set of inputs which can't even be reduced to what's actually needed.
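To illustrate what "no way to control it" means in practice, here's roughly the shape of a DXR bottom-level build from the application side (just a sketch, error handling and resource creation omitted, buffer handles assumed to exist elsewhere) - you hand the driver raw triangles, learn only a size, and get back an opaque blob:

```cpp
#include <d3d12.h>

void BuildBlas(ID3D12Device5* device, ID3D12GraphicsCommandList4* cmdList,
               D3D12_GPU_VIRTUAL_ADDRESS vertices, UINT vertexCount, UINT vertexStride,
               D3D12_GPU_VIRTUAL_ADDRESS indices, UINT indexCount,
               D3D12_GPU_VIRTUAL_ADDRESS scratch, D3D12_GPU_VIRTUAL_ADDRESS result)
{
    // Describe the raw triangle data; the BVH over it is built by the driver.
    D3D12_RAYTRACING_GEOMETRY_DESC geom = {};
    geom.Type = D3D12_RAYTRACING_GEOMETRY_TYPE_TRIANGLES;
    geom.Triangles.VertexBuffer.StartAddress = vertices;
    geom.Triangles.VertexBuffer.StrideInBytes = vertexStride;
    geom.Triangles.VertexCount = vertexCount;
    geom.Triangles.VertexFormat = DXGI_FORMAT_R32G32B32_FLOAT;
    geom.Triangles.IndexBuffer = indices;
    geom.Triangles.IndexCount = indexCount;
    geom.Triangles.IndexFormat = DXGI_FORMAT_R32_UINT;

    D3D12_BUILD_RAYTRACING_ACCELERATION_STRUCTURE_INPUTS inputs = {};
    inputs.Type = D3D12_RAYTRACING_ACCELERATION_STRUCTURE_TYPE_BOTTOM_LEVEL;
    inputs.Flags = D3D12_RAYTRACING_ACCELERATION_STRUCTURE_BUILD_FLAG_PREFER_FAST_TRACE;
    inputs.DescsLayout = D3D12_ELEMENTS_LAYOUT_ARRAY;
    inputs.NumDescs = 1;
    inputs.pGeometryDescs = &geom;

    // All the app ever learns about the BVH is how big it may get.
    D3D12_RAYTRACING_ACCELERATION_STRUCTURE_PREBUILD_INFO prebuild = {};
    device->GetRaytracingAccelerationStructurePrebuildInfo(&inputs, &prebuild);

    // The driver builds an opaque blob at 'result'. Its internal layout is
    // hidden, so the app cannot author it offline, stream pieces of it, swap
    // LOD levels inside it, or reuse it for general spatial queries - which
    // is exactly the restriction complained about above.
    D3D12_BUILD_RAYTRACING_ACCELERATION_STRUCTURE_DESC build = {};
    build.Inputs = inputs;
    build.ScratchAccelerationStructureData = scratch;
    build.DestAccelerationStructureData = result;
    cmdList->BuildRaytracingAccelerationStructure(&build, 0, nullptr);
}
```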
 
Yeah, but applauding won't help with achieving goals. Criticizing may.

Yes, we need healthy criticism so that limitations are addressed over time. We also need appreciation so that further investment is encouraged. Most importantly it’s not that much fun to stand by and criticize a less than perfect solution without taking a moment to enjoy the incremental benefits along the way.
 
I think everybody is smart just within some narrow space. So without discussion and contribution from many people it's easy to overlook / underestimate something which might turn out to be important shortly after. With DXR this has happened in terms of LOD. Getting this right before AMD and Intel joined in would have been easier than doing so after that.
I disagree. It is better to get the widest possible support of the base level before introducing forks in the standard which will be supported by the newest h/w only.
I'm sure that MS and all IHVs think similarly, because otherwise we'd have gotten more than DXR 1.1 (compatible with 1.0 h/w) in this generation already.
Those who want more than the standard allows for can either resort to compute for now (see UE5) or wait till the standard will evolve.
This doesn't mean that the standard is bad though because without it we'd be looking at probes and SSR but in higher resolution now.
RT even in DXR 1.0 form is a disruptive change to graphics evolution, and I'm pretty positive that this will be really apparent by the end of this console generation, when most games will be using it in their rendering with no fallbacks for non-RT h/w.

My critique of missing LOD support is true without doubt, and i explain it in detail again and again, and still it feels like nobody else has any interest in LOD support at all.
I believe there were several replies to you from those who don't see LOD support as important as you do - especially since it would require a huge rework of current BVH building and processing, possibly resulting in a complete loss of the performance advantages we have in presently available RT h/w.
What's the point of a feature which adds flexibility while simultaneously removing the performance advantage to a point where a pure compute based approach would show similar (unusable for real time graphics) results? You can just as well use compute now on any GPU currently available.
I'm sure we'll get there eventually with RT h/w as well when h/w will become both faster and smarter.

2. BVH management. This is a problem, because we have no way to control it. Just turning on RT support has a high cost, even if we manage to get nice stuff from tracing only a few rays after that. This is about our data, our scenes. And we need options to precompute and adapt detail dynamically so we gain control over performance and achieve proper scaling. We also have an interest in spatial lookups for dozens of other reasons besides RT.
Again, why do you think that such developer control of BVH structures would actually result in speedups compared to what we have now? We don't have any data yet to base this assumption on. Your idea that more flexibility here will result in higher overall performance may be right - but it also may be wrong. I'd assume that if there were such low-hanging fruit it would have been included in DXR from the start, and the fact that it's not makes me think that the benefits you're implying simply aren't there to begin with.
But we'll see how things will evolve over the next several years.
 
I wonder how much of Nanite was possible 10 years ago...
The 64-bit atomics mentioned in the answer only affect achieving 'insane detail'. But the other important advantage of 'scaling down' while using larger triangles would have been possible.
Traversing a hierarchy and generating indirect draws for the HW rasterizer is not really a big load on compute, and then no visibility buffer is needed. The heavy work of generating triangle clusters is in the preprocessing tools.
Though i'm not sure how much of a win it would have been in which cases, and larger triangles likely also mean artifacts, e.g. visible seams on UV chart boundaries.
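The "traverse a hierarchy on compute, then emit indirect draws for the HW rasterizer" part really is a thin layer on the submission side. A sketch of that half, using the D3D12 API purely for illustration (a D3D11-era equivalent like DrawIndexedInstancedIndirect could play the same role); the command signature and buffers are assumed to be created elsewhere:

```cpp
#include <d3d12.h>

// A culling compute pass walks the cluster hierarchy and appends one
// D3D12_DRAW_INDEXED_ARGUMENTS record per visible cluster into argsBuffer,
// plus the visible count into countBuffer. ExecuteIndirect then submits
// them all to the HW rasterizer in a single call.
void DrawVisibleClusters(ID3D12GraphicsCommandList* cmdList,
                         ID3D12CommandSignature* drawIndexedSignature, // draw-indexed command signature
                         ID3D12Resource* argsBuffer,   // filled by the culling compute pass
                         ID3D12Resource* countBuffer,  // number of visible clusters
                         UINT maxClusterCount)
{
    // Each argument record was written by the compute shader as:
    //   { indexCountPerCluster, 1, firstIndexOfCluster, baseVertex, clusterId }
    cmdList->ExecuteIndirect(drawIndexedSignature, maxClusterCount,
                             argsBuffer, 0, countBuffer, 0);
}
```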
 
What's numbing is that GPU compute has only advanced by a factor of about 10 over the past 10 years:

AMD Radeon HD 6970 Specs | TechPowerUp GPU Database
2.7 TFLOPS FP32

AMD Radeon RX 6900 XT Specs | TechPowerUp GPU Database
23 TFLOPS FP32

Don't look at bandwidth growth or you'll cry.

I believe Ampere represents the easy part of the curve in improving ray traced performance. There's nearly no growth left in terms of BVH storage efficiency (unless NVidia hasn't implemented the compression techniques first described years ago) and VRAM bandwidth is going nowhere fast, too. Compute is looking like it has a miserable growth curve from here.

So, welcome to the next ten years of kludges to make RT work better than it currently does.

I wonder how much of Nanite was possible 10 years ago...
Yes, it's very frustrating to witness Moore's law finally tapering out just when we're discovering amazing new things to do with all those transistors (this is even more true in the ML space with its insatiable appetite for both compute and bandwidth).

The "solution" is basically looking at the problem as a full stack instead of relying on the foundries to give a magical perf/$ bump every couple years. I.e., a mix of process/packaging (3D stacking), architectural specialization (more "xx-cores"), architectural optimizations to reduce the cost of data movement, and developers relying more and more on heavily vendor-optimized software libraries. Unlike Moore's Law in its prime, this is not an infinite well that keeps giving, but there's a lot of water there that we haven't yet pumped out.

As long as there's a hunger for compute we'll find a way to make it happen.
 