GPU Ray Tracing Performance Comparisons [2021-2022]

I'm sure it is.
To me as a developer, performance does not matter. It is what it is; I measure it and see what I can do with that.
But restrictions do matter. Knowing I could get rid of BVH building times, knowing I could support LOD for geometry, but the API does not let me do any of this, sucks. It sucks because I know it could be much more efficient, but my hands are tied due to some stupid software decisions of others.
On console, no such issues. Their HW may be weaker (and cheaper), but I can implement efficient solutions on it, likely more than compensating for the perf. difference.
Thus it is we who are stuck, not they.

The consoles have a much less capable ray tracing solution; if anything, it's them holding back what would be possible on modern PC hardware. In every way.
I keep hearing PC isn't flexible and all, but that's not really true; even NV's solution actually is 'flexible'. And RDNA2+ exists in the PC space too, you know, and since it's in the consoles as well, anyone's free to utilize it to the max. Thing is, AMD PC GPUs are already almost three times as capable.

You would be naive to think the brand has nothing to do with it.
It has been very obvious since Turing.

RT will only be "okay" for some... if AMD beats NVIDIA's performance.
RT will only be "okay" for some... if the consoles beat the PC.

And then we have the other end.
Some will look at console RT... and declare games better looking than anything on PC (look at the posts about Spiderman MM)... despite the glaringly lower IQ compared to e.g. CP2077.

All to do with brand and nothing about the tech.

You even see people hating DXR... because they confuse it with the RTX label... and even call it RTX, not DXR.

It is all about the brand.

I strongly doubt it's mostly PC IHV fans struggling along here. From what I can spot, it's mostly PS5 users appearing in the PC tech threads.
 
If gamers can play at 60 fps, on a GPU they can afford, with acceptable resolution / IQ, then they have no more grounds to say RT is too slow. But I can still complain if I think API restrictions prevent efficiency.
Sometimes I also have the impression people complain about RT just to complain, but that rarely happens on a forum like this.
Clearly playing devil's advocate here, but:

Is RT, in your mind, feasible if, for example, gamers
- use upscaling-tech to get that 60 fps?
- use non-maxed settings other than RT?
- have a non-UHD display?
 
60 fps isn't the upper limit it was several years ago. We're already at 360 Hz displays, and 240 Hz displays are becoming somewhat widespread. Most PC gamers would say that they vastly prefer 120+ fps to 60. So why use 60 and not 30? 30 is the line which defines playability, after all; everything above that is just comfort.
 
Yes, it's very frustrating to witness Moore's law finally tapering out just when we're discovering amazing new things to do with all those transistors (this is even more true in the ML space with its insatiable appetite for both compute and bandwidth).

The "solution" is basically looking at the problem as a full stack instead of relying on the foundries to give a magical perf/$ bump every couple years. I.e., a mix of process/packaging (3D stacking), architectural specialization (more "xx-cores"), architectural optimizations to reduce the cost of data movement, and developers relying more and more on heavily vendor-optimized software libraries. Unlike Moore's Law in its prime, this is not an infinite well that keeps giving, but there's a lot of water there that we haven't yet pumped out.

As long as there's a hunger for compute we'll find a way to make it happen.
A 10 TFLOPS GPU has ~20,000 FLOPS per pixel at 4K 60 fps, or ~10,000 shader instruction cycles if each FMA counts as two FLOPS. Not to mention all the free math that comes with texture sampling and filtering.
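That per-pixel budget is easy to sanity-check with back-of-envelope arithmetic; the numbers below simply restate the claim above, assuming 4K means 3840x2160 and that an FMA counts as two FLOPS:

```python
# Back-of-envelope check of the FLOPS-per-pixel budget quoted above.
flops_per_second = 10e12             # a 10 TFLOPS GPU
width, height, fps = 3840, 2160, 60  # 4K at 60 fps

pixels_per_second = width * height * fps
flops_per_pixel = flops_per_second / pixels_per_second
print(flops_per_pixel)               # roughly 20,000 FLOPS per pixel

# If each FMA instruction counts as 2 FLOPS, that leaves roughly
# 10,000 instruction slots per pixel per frame:
print(flops_per_pixel / 2)
```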

To me it's insanely wasteful how games have used GPUs these past ten years.

At least, with ray tracing, we're seeing developers realise from day one that they cannot brute force their way to pretty. They have no choice but to be smart because brute force is a couple of orders of magnitude too slow.

But if developers end up all just re-using the same few ray tracing algorithms, they'll follow a course similar to that of deferred rendering (shading), which has gone practically nowhere in 15 or so years. This was 2004:


via: The Early History of Deferred Shading and Lighting - Rich Geldreich (google.com)

At least NVidia is being aggressive with pure real time ray tracing research.
 
The consoles have a much less capable ray tracing solution; if anything, it's them holding back what would be possible on modern PC hardware. In every way.
I keep hearing PC isn't flexible and all, but that's not really true; even NV's solution actually is 'flexible'. And RDNA2+ exists in the PC space too, you know, and since it's in the consoles as well, anyone's free to utilize it to the max. Thing is, AMD PC GPUs are already almost three times as capable.



I strongly doubt it's mostly PC IHV fans struggling along here. From what I can spot, it's mostly PS5 users appearing in the PC tech threads.

It's about the API/DXR.
 
The consoles have a much less capable ray tracing solution; if anything, it's them holding back what would be possible on modern PC hardware. In every way.
I keep hearing PC isn't flexible and all, but that's not really true; even NV's solution actually is 'flexible'. And RDNA2+ exists in the PC space too, you know, and since it's in the consoles as well, anyone's free to utilize it to the max. Thing is, AMD PC GPUs are already almost three times as capable.
Even Series S can run Exodus Enhanced, AFAIK? Obviously it's fast enough even for the most advanced RT showcase we have.
Why do people hope for RT coming to consoles for adoption, then they get it, and then they complain about 'bad performance'? What is a real-world example of consoles holding back the PC master race at the beginning of a new console generation, where 1. console specs are higher than average PC specs, and 2. every console has RT support but only 15% of PCs have it right now?
What matters is RT support, yes or no, not how fast it is. Scaling to various performance targets is nothing new to developers.

And no, I cannot utilize RDNA2 RT to the max on PC. I cannot call intersection instructions directly, only indirectly through DXR. Thus I cannot implement custom traversal code, which is possible (and has already been utilized) on consoles. Notice this is another point, besides BVH access, which is also only possible on consoles. So on console I could optimize to the max. On PC I can't use RT at all in my case of geometry with fine-grained LOD.
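For readers unfamiliar with the restriction being described: DXR exposes ray traversal as a fixed black box, so a developer cannot insert logic into the traversal loop itself. The sketch below is a hypothetical, heavily simplified CPU-side illustration (not any real API) of what a writable traversal loop enables, here stopping at interior nodes once they fall below an LOD cutoff:

```python
# Hypothetical illustration, NOT a real API: a hand-written BVH traversal
# loop that treats sufficiently small interior nodes as leaves, giving a
# per-ray LOD cut. DXR hides this loop inside TraceRay, which is exactly
# the restriction described above.

class Node:
    def __init__(self, aabb_min, aabb_max, children=None, tris=None):
        self.aabb_min, self.aabb_max = aabb_min, aabb_max
        self.children = children or []  # non-empty for interior nodes
        self.tris = tris or []          # non-empty for leaves

def node_extent(n):
    """Largest side length of the node's bounding box."""
    return max(hi - lo for lo, hi in zip(n.aabb_min, n.aabb_max))

def ray_hits_aabb(origin, inv_dir, n):
    """Standard slab test (degenerate 0 * inf cases ignored for brevity)."""
    t_near, t_far = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, n.aabb_min, n.aabb_max):
        t0, t1 = (lo - o) * inv, (hi - o) * inv
        t_near = max(t_near, min(t0, t1))
        t_far = min(t_far, max(t0, t1))
    return t_near <= t_far

def trace_with_lod(root, origin, direction, lod_cutoff):
    """Collect hit nodes, stopping at interior nodes smaller than the cutoff."""
    inv_dir = tuple(1.0 / d if d != 0.0 else float("inf") for d in direction)
    stack, hits = [root], []
    while stack:
        n = stack.pop()
        if not ray_hits_aabb(origin, inv_dir, n):
            continue
        if n.tris or node_extent(n) < lod_cutoff:
            hits.append(n)  # real leaf, or interior node used as coarse proxy
        else:
            stack.extend(n.children)
    return hits
```

On a real GPU this loop would live in a compute shader over the BVH nodes; the point of the post is that DXR does not let you write or even read those nodes, so no early-out of this kind is expressible.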

You make strawman phrases ignoring all those facts. If NV released some RT extensions demonstrating dynamic LOD and did some marketing, you would praise it. As long as they don't, you ignore the topic altogether. Other platforms which already fixed those problems are 'holding back'.
What's your personal advantage in this? Fun igniting PC vs. console flame wars? Self-confirmation that paying 1200 for a GPU was worth it? The truth is: those 500-buck boxes have way better RT support than we have on PC, no matter the vendor's brand. Higher perf can't fix the limitations, but lower perf can be scaled in many ways.

Clearly playing devil's advocate here, but:

Is RT in your mind feasible for example, if gamers
- use upscaling-tech to get that 60 fps?
- use non-maxed settings other than RT?
- have a non-UHD display?

Yes to all of that. To me, things like DLSS are very welcome to put an end to the resolution craze, which eats up all our HW improvements for little benefit.
Personally I still play at 1080p on a 1440p screen, because it helps against the artificial sharpness of computer graphics. I would not use upscaling to improve fps; I would just lower resolution and be fine with that.
Max settings rarely give me better image quality. Sometimes I even reduce texture resolution if it feels too high in comparison to the geometry resolution. (Doesn't happen anymore recently, but some years ago pretty often.)
Quality slider for AO? No thanks. But I'm glad to turn it off completely in some cases. Black bands in corners are just ugly.
Sadly, max settings never allow us to fix any issues, e.g. the rim lighting effect from reflection probes missing proper occlusion, which was the main artifact of last gen to me.

I did not want to say RT isn't feasible for gamers yet. But because it's costly, we need to find sweet spots. And maybe those spots are not in the middle where I would expect them, but work better with either subtle or full-blown improvements. Not sure; it's just some observation from recent titles.
RT not being feasible for my development yet is another story, but I'm unable to keep myself from slipping onto this topic again and again...
 
Yes to all of that. To me, things like DLSS are very welcome to put an end to the resolution craze, which eats up all our HW improvements for little benefit.
Personally I still play at 1080p on a 1440p screen, because it helps against the artificial sharpness of computer graphics. I would not use upscaling to improve fps; I would just lower resolution and be fine with that.
Max settings rarely give me better image quality. Sometimes I even reduce texture resolution if it feels too high in comparison to the geometry resolution. (Doesn't happen anymore recently, but some years ago pretty often.)
Quality slider for AO? No thanks. But I'm glad to turn it off completely in some cases. Black bands in corners are just ugly.
Sadly, max settings never allow us to fix any issues, e.g. the rim lighting effect from reflection probes missing proper occlusion, which was the main artifact of last gen to me.

I did not want to say RT isn't feasible for gamers yet. But because it's costly, we need to find sweet spots. And maybe those spots are not in the middle where I would expect them, but work better with either subtle or full-blown improvements. Not sure; it's just some observation from recent titles.
RT not being feasible for my development yet is another story, but I'm unable to keep myself from slipping onto this topic again and again...
Thank you for explaining. To keep it short: I agree with many of your points, except for texture sharpness (just a notch away from artifacting) and AO, which I (personal preference) do like very much.
 
I disagree. It is better to get the widest possible support at the base level before introducing forks in the standard which would be supported only by the newest h/w.
I'm sure that MS and all IHVs think similarly, because otherwise we'd have gotten more than DXR 1.1 (compatible with 1.0 h/w) in this generation already.
Those who want more than the standard allows can either resort to compute for now (see UE5) or wait until the standard evolves.
Yep, likely I'd end up using HW RT for geometry around the camera at constant LOD, and implementing compute tracing of a surfel hierarchy with LOD after some distance.
That's a lot of work for a hack which hopefully remains temporary. So I'd better wait...
I believe there were several replies to you from those who don't see LOD support as being as important as you do, especially since it would require a huge rework of current BVH building and processing, possibly resulting in a complete loss of the performance advantages we have in presently available RT h/w.
What's the point of a feature which adds flexibility while simultaneously reducing the performance advantage to a point where a pure compute-based approach would show similar (unusable for real-time graphics) results? You could just as well use compute now on any GPU currently available.
I'm sure we'll get there eventually with RT h/w as well, when the h/w becomes both faster and smarter.
What you say is again true for the general case, or let's call it 'state of the art for last gen'.
Assuming Nanite becomes the new standard, meaning other engines add a form of continuous LOD support too, it all becomes completely different:
We need hierarchies over geometry. We will precompute them offline and stream them. In my case, this hierarchy is built with RT in mind, because I primarily use it for GI, which uses a form of RT. Nanite with its mesh clusters is not built for RT, but even that looks very similar to a good BVH.
There is no chance the gfx driver can build a BVH of similar quality in just a millisecond. Thus, even if we do not respect specific HW details, chances are that converting our offline BVH to the GPU format gives higher or at least equal tracing performance.
The assumption that only NV can build a good BVH for NV hardware is unfounded. If we want, we can do better because we work offline, we can remove BVH building costs, and we can even improve tracing performance. I'm pretty sure of that, though of course it's work and will have complications.
Most important: future faster / smarter HW is not relevant. It's a pure software topic. In the worst case we get HW BVH builders which would be underutilized in my scenario, but likely I'd still build faster than those units could, because it's only a conversion of offline data, not a full build.
Thus, HW builders may turn out worthless and should not be added at all. The risk that RT evolves in the wrong direction is high, and 'waiting for flexibility' increases this risk. So I express my arguments aggressively, because I think it's urgent.
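The offline-vs-runtime split argued for above can be sketched in a toy model (all names hypothetical, heavily simplified): the expensive, quality-focused build runs once offline, and at load time only a cheap linear flattening pass converts the tree toward a flat, GPU-style node array instead of rebuilding per frame:

```python
# Toy model of the offline-vs-runtime argument (hypothetical, simplified).
# The costly build happens once offline; the load-time "conversion" is a
# single linear pass over the nodes, not a full rebuild.

def build_bvh(tris, leaf_size=2):
    """Offline step: recursive median split over triangle centroids."""
    if len(tris) <= leaf_size:
        return {"tris": tris}
    axis = 0  # a real builder would pick the longest axis or use SAH
    tris = sorted(tris, key=lambda t: sum(v[axis] for v in t) / 3.0)
    mid = len(tris) // 2
    return {"left": build_bvh(tris[:mid], leaf_size),
            "right": build_bvh(tris[mid:], leaf_size)}

def flatten(node, out=None):
    """Load-time step: linearize into a flat array. The left child always
    sits at index + 1, so only the right child's index is stored."""
    if out is None:
        out = []
    idx = len(out)
    out.append(None)  # reserve the slot, fill it after children are placed
    if "tris" in node:
        out[idx] = {"leaf": True, "tris": node["tris"]}
    else:
        flatten(node["left"], out)
        right_idx = len(out)
        flatten(node["right"], out)
        out[idx] = {"leaf": False, "right_child": right_idx}
    return out
```

The point is only the cost asymmetry: `flatten()` is linear in the node count, while `build_bvh()` pays sorting work at every level, which is what moving the build offline saves at runtime.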

Looking at those things from the perspective of the current state of the art, my arguments surely appear far-fetched, exotic and not urgent.
But considering how important spatial acceleration structures are for any kind of point and range queries, used for physics, AI, audio, etc., it appears just stupid to hide such structures.
They must be accessible for both reads and writes, otherwise this technology is just ridiculous. Not in the long run, but already now, at the current day.

Again, why do you think that such developer control over BVH structures would actually result in speedups compared to what we have now?
I think I've explained this once more just above. Offline vs. runtime.

I'd assume that if there were such low-hanging fruit, it would have been included in DXR from the start, and the fact that it is not makes me think the benefits you're implying simply aren't there to begin with.
Which is why I'm so happy UE5 came along to illustrate the problem in practice. LOD is not a low-hanging fruit, and it was ignored for decades because GPU brute-force power allowed us to do so.
However, there are other applications of BVH as well; it's not only about LOD. By standardizing a BVH API interface, HW vendors lose some flexibility on their side. They cannot use a kd-tree, for example. But I think BVH has already been the RT standard for everybody for many years, and developers need flexibility too. Differences like branching factor and compression schemes seem the only problem, which is solvable. But I'd be happy with vendor extensions too, for a start.
 
Even Series S can run Exodus Enhanced, AFAIK?

Sure it can, at a much lower resolution, for starters.

Obviously it's fast enough even for the most advanced RT showcase we have.

Yeah, let's start talking about last-gen games...
That's your humble opinion though. It's impressive, for sure, but to say the console version is the most advanced RT showcase we have? Nah. The PC version perhaps, which runs at higher settings, resolution etc. Even then, I'd go for CP2077 being the most advanced showcase (even DF thinks so).

Why do people hope for RT coming to consoles for adoption, then they get it, and then they complain about 'bad performance'?

Perhaps they hoped for something much more performant; back in the pre-spec-release days we had no idea what kind of ray tracing to expect in the consoles. Perhaps they wanted Ampere-like ray tracing performance.
Yet now AMD's dGPUs are trying to compete with Turing from 2018 in the ray tracing performance department, and those are AMD PC GPUs, which pack a lot more power to begin with to cope with ray tracing in games. The consoles are quite a bit below what we already got in 2018.

A game going for true next generation will have its RT limited to mere upscaled reflections (Rift Apart), and yeah, I can agree it's not much, but it's something, and it was well worth implementing there as well, anyway.

What is a real-world example of consoles holding back the PC master race at the beginning of a new console generation, where 1. console specs are higher than average PC specs, and 2. every console has RT support but only 15% of PCs have it right now?

Nope. Not even close. There are close to 100 million PS4 users out there; not many millions have a PS5 even now, six months after release. There are many more people owning RTX GPUs than there are next-gen console users, let alone just the PS5. 15% of all gaming PCs is actually more than there are next (or current) gen consoles out there.
Heck, RTX GPUs alone are in the 20 million range, and that's without taking AMD RDNA2 GPUs into consideration.

Consoles are both weaker in ray tracing (and basically everything else), and there exist fewer of them as well. They're holding back quite a lot, as usual, and that's just at the start.
See back when Crysis made its debut, when you're actually pushing the PC as the main platform, or nowadays CP2077. Let's see how that runs on your PS5.

What matters is RT support, yes or no, not how fast it is.

The PS2 supported RT... haha. Obviously, speed does matter. Not that support is a problem, as about every AAA game starting from 2018 has had some kind of RT support on PC. PC users were way ahead in the RT game here.

And no, I cannot utilize RDNA2 RT to the max on PC. I cannot call intersection instructions directly, only indirectly through DXR. Thus I cannot implement custom traversal code, which is possible (and has already been utilized) on consoles. Notice this is another point, besides BVH access, which is also only possible on consoles. So on console I could optimize to the max. On PC I can't use RT at all in my case of geometry with fine-grained LOD.

It's not about what you can do, because that's totally irrelevant to me; I don't care a single bit what you can do or not. It's what the devs can do. If RDNA2 on PC is so restricted due to an API, then why aren't we seeing it utilized on the PS5 for starters? I'm sure talented studios like ND would have come up with something that wouldn't be possible on PCs due to those restrictions, right?

Self-confirmation that paying 1200 for a GPU was worth it? The truth is: those 500-buck boxes have way better RT support than we have on PC, no matter the vendor's brand. Higher perf can't fix the limitations, but lower perf can be scaled in many ways.

Lol, what I paid for a GPU back in 2018 has no relevance today. A 2060 will do better than the PS5 anyway, as has been proven. Nah, they don't have better RT support than the PC; there are actually more RT titles on PC now than there are on consoles, and that won't change. Not to mention that the RT in PC games is actually superior.
Things can certainly be scaled in many ways; the PS2 could do ray tracing too, and so could many other older machines.
 
PC master race

You make strawman phrases ignoring all those facts. If NV released some RT extensions demonstrating dynamic LOD and did some marketing, you would praise it.

Fun igniting PC vs. console flame wars? Self-confirmation that paying 1200 for a GPU was worth it?

Holy crap, tone down that shit, because this kind of behaviour won't last. You will be taken care of by one of the garbage men around here sooner or later. I'll hold back for now, but offending others will only escalate things and derail topics. Just keep on topic with the technical discussions. Not what you're doing.
 
Sure it can, at a much lower resolution, for starters.
Of course. It's a 4 TF GPU. Lower res, lower fps, but it proves my point that devs can scale to various performance targets, and in the case of Exodus there seems to be no compromise of features. Enough proof to ignore titles where the S has no RT support and only the X has.

Yeah, let's start talking about last-gen games...
That's your humble opinion though. It's impressive, for sure, but to say the console version is the most advanced RT showcase we have? Nah. The PC version perhaps, which runs at higher settings, resolution etc. Even then, I'd go for CP2077 being the most advanced showcase (even DF thinks so).
I see no difference between the PC and console versions of Exodus.
But interesting that CP is perceived as more advanced. Will watch that DF video...
15% of all gaming PCs is actually more than there are next (or current) gen consoles out there.
Does not matter to devs. 15% means we cannot make a game which requires RT yet. But we can do so for a next-gen console exclusive.
The rest is about comparing discrete GPUs which cost the same as a whole console. Getting more perf for more money is the least we can expect, but factoring in prices, consoles are more optimized for games than PCs. Or do you think Cerny & co just order from Newegg to build their stuff?
It's not about what you can do, because that's totally irrelevant to me; I don't care a single bit what you can do or not. It's what the devs can do.
I do the same work engine developers do. Whether you believe I'm capable, or whether my concerns also matter to others, is up to your understanding of what I try to say. And I don't care about that either, since you're an end user, I guess. Just, this is not an end-user forum, so tech topics are looked at in detail, and marketing promises are identified as just that.
PC users were way ahead in the RT game here.
Nice use of past tense :D But seriously, I use consoles to illustrate that flexibility is possible, that's all. I'm no console developer / warrior / user myself.
A 2060 will do better than the PS5 anyway, as has been proven.
In terms of RT? What's the proof then? However, I predicted next-gen RT perf would be at this level before launch, and I was right. Chances are I'm right about some more things too.
Holy crap, tone down that shit
Scroll up and see which of us two started it.
 
Of course. It's a 4 TF GPU. Lower res, lower fps, but it proves my point that devs can scale to various performance targets, and in the case of Exodus there seems to be no compromise of features.

The features are there, but at much lower resolutions, which does impact RT performance cost a lot. Also, scaling exists on PC as well; it's in its very nature there.

I see no difference between the PC and console versions of Exodus.
But interesting that CP is perceived as more advanced.

That's your view on it. Also, Exodus is clearly a last-generation game with RT bolted on. CP2077 is more of a next-generation game (ranked number 1 on DF's list for graphics). Let's see how CP2077 runs on your PS5 with all five RT modes set to Ultra or even Psycho, while not going below 1080p or something.

Does not matter to devs. 15% means we cannot make a game which requires RT yet. But we can do so for a next-gen console exclusive.
The rest is about comparing discrete GPUs which cost the same as a whole console. Getting more perf for more money is the least we can expect, but factoring in prices, consoles are more optimized for games than PCs. Or do you think Cerny & co just order from Newegg to build their stuff?

Clearly it does, since RT support on PC is great, with all games that have RT on consoles having it on PC as well, but with the ability to scale well beyond what the consoles are doing.
Those 15% still equal many more than the available PS5 userbase now. What next-gen console exclusive 'requires RT'? PC gamers had ray tracing in their games some three years before consoles even saw the light of day.
What does cost have to do with this discussion? There's obviously a large market for RT-capable GPUs, seeing the numbers (outnumbering those of the consoles).
'Consoles are optimized for games more so than PCs'... quite amazing that the PC, which is not so optimized for it, is doing games a lot better. I don't care one bit what Cerny has ordered; it has nothing to do with this discussion, again.
You're changing goals; instead of technical RT discussions you're wandering off the path to 'PCs aren't optimized' and how many have RT-capable GPUs, etc.

I do the same work engine developers do. Whether you believe I'm capable, or whether my concerns also matter to others, is up to your understanding of what I try to say. And I don't care about that either, since you're an end user, I guess. Just, this is not an end-user forum, so tech topics are looked at in detail, and marketing promises are identified as just that.

If you're all that capable, then honestly, what are you doing here platform warring? Because that's what this is. I highly doubt ND, CD Projekt or Rockstar devs that work with RT are hanging around on forums stating how bad the PC is.

Nice use of past tense :D But seriously, I use consoles to illustrate that flexibility is possible, that's all. I'm no console developer / warrior / user myself.

You're honestly coming across as one, and if you're no console developer, isn't it a bit of a stretch to compare PC vs. console RT development, especially this early in the generation?

In terms of RT? What's the proof then? However, I predicted next-gen RT perf would be at this level before launch, and I was right. Chances are I'm right about some more things too.

Turing is more performant in ray tracing than the PS5's GPU; see the DF Doom analysis using an RTX 2060.

Scroll up and see which of us two started it.

I don't call people peasants, or claim shills, etc. I think if we keep that stuff away, things won't escalate as easily. It's just hardware, plastic boxes and circuit boards.
 
I don't call people peasants, or claim shills, etc.
Don't put words in my mouth; I did not say any of that. I doubt somebody would hire you for shilling.
Also, scaling exists on PC as well; it's in its very nature there.
Neither did I put the term scaling into a 'console context'. Of course we need it on PC much more than on console, and reading further through the post I see you just don't get my message in general.
Exodus is clearly a last-generation game with RT bolted on. CP2077 is more of a next-generation game
No. Exodus has completely replaced its lighting pipeline with RT. It's not bolted on anymore.
CP looks bolted on to me, even after watching the DF video. I don't think RT replaces the non-RT GI solution entirely; the differences are too small. But IDK, I only know what Exodus is doing.
Both are prev-gen games, and both have (or will have) next-gen versions. You say CP is 'more next gen' without any argument to back up this personal impression, as usual. Which is one typical behavior that identifies a 'strawman'.
NV marketing claims or DF opinions are no arguments, for example. But you constantly present them as such, thus I called you that.
The comment that made me lose politeness was this:
God, you must hate the new consoles, the huge performance impact RT has there and the minimal visual return of it, holy crap. I'm glad most of the population appreciates ray tracing lol.
Maybe I got it wrong, but it makes assumptions about my intent and about what the majority of the population thinks, and uses some strong language. I don't know why, but it's just bullshit and does not add anything to the discussion.
So your advice of 'keeping that stuff away' is only as good as your own adherence to it, no?
I guess we're done with that little dispute and can just accept each other's opinions as they are.
 
I doubt somebody would hire you for shilling.

I doubt anyone would hire you as a dev when you complain about the ray tracing implementation with the largest userbase. Probably the reason why you're spending your time here.

No. Exodus has completely replaced its lighting pipeline with RT. It's not bolted on anymore.
CP looks bolted on to me, even after watching the DF video. I don't think RT replaces the non-RT GI solution entirely; the differences are too small. But IDK, I only know what Exodus is doing.
Both are prev-gen games, and both have (or will have) next-gen versions. You say CP is 'more next gen' without any argument to back up this personal impression, as usual. Which is one typical behavior that identifies a 'strawman'.
NV marketing claims or DF opinions are no arguments, for example. But you constantly present them as such, thus I called you that.
The comment that made me lose politeness was this:

Yes, but Metro is a last-generation game. There's a reason why Rift Apart only has upscaled RT reflections, at a cost. CP2077 is next-gen enough for me, for now. There's nothing else really looking superior technically.
Yeah, I can refer to DF or Nvidia; no idea why you think your comments have more value.

Maybe I got it wrong, but it makes assumptions about my intent and about what the majority of the population thinks, and uses some strong language. I don't know why, but it's just bullshit and does not add anything to the discussion.
So your advice of 'keeping that stuff away' is only as good as your own adherence to it, no?
I guess we're done with that little dispute and can just accept each other's opinions as they are.

Just keep your discussions on point and don't dive into PC master race and NV fanboy arguments, because they obviously have nothing to do with this topic.
It is what it is: consoles are way behind in ray tracing.
 