GPU Ray Tracing Performance Comparisons [2021-2022]

He doesn't find the current implementations very compelling outside of a select few titles, which is very reasonable.

It's reasonable because it's the hard truth. ¯\_(ツ)_/¯
And despite that, he still recommends an RTX 3080 over an RX 6800XT if you can find them at similar prices.
 
He doesn't find the current implementations very compelling outside of a select few titles
Is he a graphics quality expert or a game developer?
He is not, so why should anybody care what he thinks on that matter?

Apparently, he had no moral dilemmas about using DX12 games for reviews back in 2016 / 2017, even though the only "benefit" for NVIDIA users was mostly lower performance compared with DX11 without any quality improvements. But now he doesn't find the current implementations very compelling outside of a select few titles. How funny.
 
Is he a graphics quality expert or a game developer?
He is not, so why should anybody care what he thinks on that matter?

You do not have to be a graphics quality expert or a game developer to compare a scene with two different settings and tell your viewers which one you prefer and why. The video listed above has 207 thousand views and their channel has 819 thousand subscribers, so "anyone" is apparently more people than you seem to think.

If you actually required reviewers to show legitimate qualifications, the risk is your list of approved reviewers would be short.

No idea why the local RT proponents are so sensitive about one reviewer stepping out of line that they derail the topic all the time.

I also do not get why some users get so upset with one specific reviewer. Choices are good and today you have a great chance of finding your favourite reviewer focusing on the aspects you yourself consider most important.

One of the worst things that could happen, IMO, is if all reviewers just decided to do identical tests. Or forums getting their own committees vetoing which reviews are valid or not, which is the feeling I sometimes get is occurring in these types of threads.
 
One of the worst things that could happen, IMO, is if all reviewers just decided to do identical tests. Or forums getting their own committees vetoing which reviews are valid or not, which is the feeling I sometimes get is occurring in these types of threads.

The problem with HBU is that they complained publicly about Nvidia not supplying them with review cards anymore because they didn't put enough effort into covering ray tracing and DLSS.
Nobody has a problem with reviewers buying the hardware themselves.
 
The problem with HBU is that they complained publicly about Nvidia not supplying them with review cards anymore because they didn't put enough effort into covering ray tracing and DLSS.
Nobody has a problem with reviewers buying the hardware themselves.

In all honesty, I do not see why that is a problem at all. They raised the alarm so people became aware of Nvidia trying to influence published benchmarks behind the scenes. If other sites went along with Nvidia's wishes, however, that greatly reduces their credibility.
A key aspect of third-party reviews is that they are independent of the company behind the product.
 
And yet he didn't use the better implementations like Battlefield 5, Control or Metro EE for the raytracing section in the 6800XT review

How dare they not even bother to travel forward in time to show in a November 2020 video review the results for a title that wouldn't come out until April 2021!




Is he a graphics quality expert or a game developer?
He is not, so why should anybody care what he thinks on that matter?
Yes, they're so irrelevant, them and their 819 000 subscribers (and rising).


Or Hardware Canucks saying the exact same thing with over 1 680 000 subscribers:
All this to say is that ray tracing is far from perfect, but we are starting to see traces of its future potential, like the LEGO where everything is ray traced, it's absolutely stunning, but the performance is so bad that there's absolutely no way that you can do anything more complex than like a few blocks of LEGO on the screen.




Or Linus Tech Tips with a measly 14 000 000 subscribers

The overall trend from our test is pretty simple: the people with knowledge of 3D rendering, and especially with first-hand experience gaming with RTX at home, were way more likely to nail it. As for everyone else, telling the difference isn't easy. The truth of the matter is that with current titles that support ray tracing, it's pretty difficult to tell whether it's on or off without pixel peeping.
Even when we told people to look more closely, it was extremely challenging and the results were quite similar.
It's clear from our experiment today that, in the right hands, rasterization is still a potent tool and this industry transition is going to take many years to complete. So maybe don't feel too bad if you still haven't managed to get your hands on a newfangled RTX GPU.


But what do these popular outlets' opinions, reaching a combined audience of over 16 million subscribers, matter to anyone? They're not game developers nor tech experts!
And I'm not even counting all the non-tech, massively popular videogame influencers who constantly tell people to turn all raytracing off because it kills performance.



I think B3D users should make a better effort to break out of some of the echo chambers created here. Those have formed in no small part thanks to DF's videos - though it's obviously not their fault for reporting on what they do best.
Real-time raytracing, in its current and short-term future form, has not had the impact on the general gaming population that some have convinced themselves it has.
 
1 - Define "great use".
2 - At what rendering resolution and what base generation/type of games? RTX Minecraft and RTX Quake may be nice demos, but they aren't exactly why people are spending money on new gaming systems.
3 - Using hardware that is accessible / owned by what percentage of a multiplatform game's total addressable market?

Is a 2060 at 1080p (or 1440p with DLSS) accessible enough? Or should we only look at 4K ultra to decide what’s playable?

https://www.pcgamesn.com/control/nv...formance-benchmarks#nn-graph-sheet-1440p-dlss
 
He doesn't find the current implementations very compelling outside of a select few titles, which is very reasonable.
Ugh.
Fully path-traced, real-time Q2RTX isn't compelling?
Metro Exodus's real-time RTGI wasn't compelling for him but suddenly became so in EE? Why, I wonder?
Control's top-notch RT implementation wasn't compelling either?
I mean, come the fuck on. The guy is moving his own goalposts now because they were blatantly wrong in the first place; it's clear as day.

It's also funny how FC6's borked RT implementation is being dismissed by them with, apparently, "AMD's RT h/w simply isn't up to par, so what can you do, move on, nothing to see here".
And this after they've spent several years describing to us in great detail how Metro Exodus's RTGI "isn't very compelling".
Jeez.
 
Ugh.
Fully path-traced, real-time Q2RTX isn't compelling?
Metro Exodus's real-time RTGI wasn't compelling for him but suddenly became so in EE? Why, I wonder?
Control's top-notch RT implementation wasn't compelling either?
I mean, come the fuck on. The guy is moving his own goalposts now because they were blatantly wrong in the first place; it's clear as day.

It's also funny how FC6's borked RT implementation is being dismissed by them with, apparently, "AMD's RT h/w simply isn't up to par, so what can you do, move on, nothing to see here".
And this after they've spent several years describing to us in great detail how Metro Exodus's RTGI "isn't very compelling".
Jeez.
They can't hold that opinion forever. As long as the limits of graphical prowess are being pushed, more power is required. Physics is hitting a wall with the amount of power draw a chip can allow, and you can only rely on general compute power for so long before you need dedicated accelerators to pick up the heavier loads. I know a lot of people don't think Turing and Ampere are big deals, but I think over time the market is headed that way regardless. Perhaps Nvidia were a little early to market, but honestly I don't see a way forward by going backwards.

All companies are moving back towards accelerators, provided people want to see better graphics, and people will eventually have to come to terms with that over time: the better accelerators will have the best performance in games.
 
You do not have to be a graphics quality expert or a game developer to compare a scene with two different settings and tell your viewers which one you do prefer and the reasons for it.
In order to influence and educate people he has to be an expert in the areas where he makes his claims; otherwise he will simply misinform his audience. He is obviously not an expert, so his claims regarding RT's relevance are irrelevant.

The video listed above has 207 thousand views and their channel has 819 thousand subscribers, so "anyone" is apparently more people than you seem to think.
Very strong argument, but PewDiePie has 100 million of them; by your criteria he is a stronger expert than Steve, so PewDiePie's opinion on RT must be ultimate and indisputable.
 
In order to influence and educate people he has to be an expert in the areas where he makes his claims; otherwise he will simply misinform his audience. He is obviously not an expert, so his claims regarding RT's relevance are irrelevant.

He does not have to be an expert in anything; he is free to show his videos and make his arguments, and the people watching are free to agree or not. The images for comparison are there, the performance figures are there, and that is all you need as a gamer to estimate what you prefer to use. Just like it has been for every single benchmark throughout the ages.

Are you actually prepared to write off, as useless, all reviews throughout the decades written by people who cannot show you formal qualifications?

An engineer being able to coat it in all kinds of fine words is not worth a dime if the people looking at the end product are not impressed.

The story is always the same; just go back in history: developers claim DX11 is much better, they claim DX12 is better, they claim RTX is much better, the first benchmarks are released, and the forums fill up with topics stating they are just performance hogs.

Very strong argument, but PewDiePie has 100 million of them; by your criteria he is a stronger expert than Steve, so PewDiePie's opinion on RT must be ultimate and indisputable.

I have never even claimed that a reviewer needs to be an expert; that argument is fully on you, and I wholeheartedly find it utterly silly. However, your argument about "anyone" not caring was clearly false.
I got curious though, since I am not a follower of PewDiePie: is your whole argument about PewDiePie loving raytracing based on Minecraft RTX, or has he made a lot of videos proclaiming its greatness in other games too?

The only ones with the weight to push RT as ultimate and indisputable are the studios, when they decide what minimum feature sets they are going to support. And right now, the only games that actually require DXR are still just updated versions of old games. We even saw raytracing removed in the last Battlefield game.



How dare they not even bother to travel forward in time to show in a November 2020 video review the results for a title that wouldn't come out until April 2021!

I think B3D users should make a better effort to break out of some of the echo chambers created here. Those have formed in no small part thanks to DF's videos - though it's obviously not their fault for reporting on what they do best.
Real-time raytracing, in its current and short-term future form, has not had the impact on the general gaming population that some have convinced themselves it has.

If there is one thing I think ought to help settle this debate for the near future, it would be Microsoft and Sony, or the individual studios, publishing statistics on how many of their players actually use raytracing. On the various console boards, I get the impression that a lot end up playing in Performance mode.
 
If there is one thing I think ought to help settle this debate for the near future, it would be Microsoft and Sony, or the individual studios, publishing statistics on how many of their players actually use raytracing. On the various console boards, I get the impression that a lot end up playing in Performance mode.
Engines and tooling are far from being able to ship RT-only titles. To date we only have a few titles that don't work if RT isn't on, and those do run on weaker console hardware.
A lot of the arguments against RT (wrt what I've read in these threads) don't take into account the inertia of engines. RT-only lighting is a massive switch-over: content creation tools, pipelines, lighting artists etc. all need an overhaul. Further, other features are being added simultaneously, like better texture streaming via IO, new features added to GPUs, etc.

It's a long process and it has really only begun. All the games that managed to ship with some form of RTGI have completed the first step towards an RT-only future. Their next title would strip away the traditional T&L method - we've seen 4A Games do this first, but I suspect it will not be long before this becomes the normal transitional process for a lot of studios.
 
Is a 2060 at 1080p (or 1440p with DLSS) accessible enough? Or should we only look at 4K ultra to decide what’s playable?

https://www.pcgamesn.com/control/nv...formance-benchmarks#nn-graph-sheet-1440p-dlss
I finished Control. You think Control is a game-changer in visual immersion?!
Can you compare Control with maxed-out RT to, e.g., Demon's Souls (2020)?


We're still missing what it takes for RT to truly take off.

Here's the case with real-time raytracing. There are 3 important factors:

- Raytracing being used at a level that is universally perceivable as being substantially better than a rasterization trick (and without sacrificing everything else, like you see in e.g. LEGO or RTX Minecraft)

- Getting good enough performance with raytracing

- Using affordable hardware

In 2021, you pick two. You can't pick three. I hope you don't think Control is where you get all three.

Given how we're in Q4 2021 and the next generation of Nvidia and AMD videocards are expected to cost an arm and a leg, picking all three won't happen in 2022 either, and I doubt 2023 will be much better.


Now I’m really curious to know which graphics tech you think has been more transformative in a similar timeframe (3 yrs).

RT is up there with 3D acceleration and unified shaders.
Texture filtering, pixel shaders and then unified shaders (which eventually gave way to compute shaders). Next one for me is definitely virtualized geometry for "unlimited" geometry detail, without a shred of doubt.
Raytracing is super cool but it's just not feasible to use on a large scale of hardware in a meaningful way, so I think RT's "transformative 3 years" won't start until 2023.



They can't hold that opinion forever. As long as the limits of graphical prowess are being pushed, more power is required.
Of course they can't and they won't!
All those videos I shared from the past 6 months present opinions based on what exists now and what is expected up to 2 years from now. They all say that in some form or another.



All companies are moving back towards accelerators, provided people want to see better graphics, and people will eventually have to come to terms with that over time: the better accelerators will have the best performance in games.
All companies? Intel and Nvidia are, for their dGPUs.
But Microsoft and Sony, whose consoles move the majority of AAA game sales at launch, decided to stick with an architecture that repurposes the TMUs for RT acceleration and doesn't use dedicated tensor cores.
The first RT implementation in a smartphone is apparently coming in the form of the Exynos 2200, which uses RDNA2 just like the consoles. The first RT implementation in a handheld console is arguably the Steam Deck, with another RDNA2 GPU.
Is Apple expected to use exclusively dedicated RT units in their future iGPUs? Is there a chance Nintendo will actually pay for the footprint of a SoC with dedicated RT units for their next handheld?
 
I finished Control. You think Control is a game-changer in visual immersion?!
Can you compare Control with maxed out RT with e.g. Demon's Souls 2020?

You asked for an example of a game with a great RT implementation that runs well on accessible hardware and I provided one.

We're still missing what it takes for RT to truly take off.

We’re not missing anything. Just like every other feature, adoption increases over time. The notion that a new 3D feature needs to be everywhere overnight in order to be successful is pure fantasy. That has never happened.

In 2021, you pick two. You can't pick three. I hope you don't think Control is where you get all three.

Clearly this is false. There are several examples of RTAO and RT reflections being obviously superior to the older methods while remaining playable on mid-tier hardware.

Texture filtering, pixel shaders and then unified shaders (which eventually gave way to compute shaders). Next one for me is definitely virtualized geometry for "unlimited" geometry detail, without a shred of doubt.
Raytracing is super cool but it's just not feasible to use on a large scale of hardware in a meaningful way, so I think RT's "transformative 3 years" won't start until 2023.

RT isn’t on quite the same level as shading but is right up there.
 
I finished Control. You think Control is a game-changer in visual immersion?!
Can you compare Control with maxed-out RT to, e.g., Demon's Souls (2020)?


We're still missing what it takes for RT to truly take off.

Here's the case with real-time raytracing. There are 3 important factors:

- Raytracing being used at a level that is universally perceivable as being substantially better than a rasterization trick (and without sacrificing everything else, like you see in e.g. LEGO or RTX Minecraft)

- Getting good enough performance with raytracing

- Using affordable hardware

DLSS support in general has gotten better, but I always find the examples of games released first, with the DLSS patch arriving half a year later, get excused a lot.
DLSS, not RT, is the one feature I would definitely tell people to try if it's supported in their games, however.
 