GPU Ray Tracing Performance Comparisons [2021-2022]

Control just isn't a great looking game.

Personal opinion. On the other hand, Digital Foundry thought it was a great looking game. From my perspective, I think the Demon's Souls remaster isn't all that great looking a game, especially not in the context of next generation.

9th-gen console hardware (i.e. similar to medium-high PC specs)

I wouldn't say medium/high range at all. The 5700XT/6600XT is pretty much low-to-mid range hardware in terms of GPU. The CPU won't fare much better, I'm afraid, given the clocks, the old architecture and the latest information we've gotten regarding cut-downs.
Again, the 5700XT isn't much faster than a 2070S though; it's quite close to it in general performance. Which puts the PS5 at around 2070/2070S performance when there's no ray tracing going on, of course.

Top 3 goes to

It's just another poll though. The Last of Us Part II shouldn't even be on that list. DF's picks were a lot different and more in line with what I have seen on social media (but again, polls are just that; it depends on where you start them).

You have to take that poll with a pinch of salt though.
E.g., if 400 of the 900 voters own RDR 2 (it was a very popular game) and only 100 own Control, then even if Control does look better it's going to struggle for votes. Plus, how many of the 100 played it on RT-capable hardware?
A pint, rather. This kind of poll is merely a vote for whether or not people know the game and, if so, how much those people like the game.

The problem isn't that such polls exist, but that they are being shared here and used as evidence of some sort.
 
I'm pretty sure your Unreal video didn't have transparent reflections showing reflected dynamic geometry while also showing the objects in the next room behind the glass
Sorry, what's dynamic about the geometry in that screenshot? The shape of the environment doesn't change. So you want an example of a reflective transparent surface with real-time reflections, like water? I'm pretty sure I could find an example not using RT.
Here's something similar.

Sorry, I don't agree with your premise that RT can't be faked.
 
Not at all. Everyone is entitled to their own taste in what overall look they like in a game. However, I would be really interested to hear from folks who would prefer missing or inconsistent lighting in their games regardless of the art style.



Yep, I'm struggling to understand the relevance to this topic.
RT reflections look better, of course; they just don't change a game's end visual impact that much in the current implementations, to my eyes. You can pick out specific scenes where the impact can be larger, but over the course of long play sessions it isn't something I'd yet turn on unless I had extra performance to spare. I'd much rather have 120+ fps. I feel the same about current uses of RT for shadows. Other limitations in visuals are profound enough that reflection/shadow hacks and issues aren't what generally stand out to me. This will of course change at some point, so maybe my opinion will change somewhat when we end this cross-gen period.
 
Sorry, I don't agree with your premise that RT can't be faked.
Okay. Let's talk about this fakery. But before that let's talk about what makes RT so expensive.

RT is just an accurate way of modeling light transport. But it's expensive as heck on vanilla GPU cores because the algorithm is (a) extremely divergent and (b) very sensitive to memory latency. It's divergent because if you shoot 1000 rays into a scene, it's very likely that they will start doing very different things depending on what objects they hit (or don't hit) and where they go after reflection. It's latency-sensitive because the sheer act of figuring out if you hit something involves walking through a data structure called a BVH sitting in GPU memory, and there's not much compute work to do in between steps of this walk. So the speed at which you walk through the data structure is determined by how quickly the memory can return the next element of the walk, but you can't get to the next step until you've finished the current one. This is a classic "pointer-chasing" workload in computer science. Oh, and the BVH walk itself is highly divergent as well.

Now, GPUs are built for extremely high parallelism on code that *doesn't* diverge (e.g., running the same pixel shader on thousands of pixels), and extremely high bandwidth to supply data to those shaders. But they are terrible at divergent, latency-sensitive code.
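To make the pointer-chasing point concrete, here's a minimal C++ sketch of a BVH traversal loop. The node layout and the `rayIntersectsBox`/`rayIntersectsTriangle` helpers are hypothetical (declared but not defined), not any real engine's code; the point is the structure of the loop, not the details. Every iteration has to fetch a node from memory before it can decide where to go next, and rays that hit different parts of the scene take different branches:

```cpp
struct Ray { float origin[3], dir[3]; };
struct Hit { float t; int triangle; };

struct BvhNode {
    float bboxMin[3], bboxMax[3];
    int   leftChild, rightChild;  // child indices, or -1 if this node is a leaf
    int   firstTri, triCount;     // triangle range if this node is a leaf
};

// Assumed helpers, declared but not defined here (details omitted for brevity).
bool rayIntersectsBox(const Ray& ray, const float bmin[3], const float bmax[3]);
bool rayIntersectsTriangle(const Ray& ray, int triIndex, Hit& hit);

bool traverse(const BvhNode* nodes, const Ray& ray, Hit& hit) {
    int stack[64];
    int stackPtr = 0;
    stack[stackPtr++] = 0;                   // start at the root node
    bool found = false;
    while (stackPtr > 0) {
        // Dependent memory load: nothing useful to compute until it arrives.
        const BvhNode& node = nodes[stack[--stackPtr]];
        if (!rayIntersectsBox(ray, node.bboxMin, node.bboxMax))
            continue;                        // rays in the same wavefront diverge here
        if (node.leftChild < 0) {
            // Leaf node: test the handful of triangles it contains.
            for (int i = 0; i < node.triCount; ++i)
                found |= rayIntersectsTriangle(ray, node.firstTri + i, hit);
        } else {
            // Interior node: push both children and keep chasing pointers.
            stack[stackPtr++] = node.leftChild;
            stack[stackPtr++] = node.rightChild;
        }
    }
    return found;
}
```

Dedicated RT units, roughly speaking, take over this loop (the box tests, triangle tests and traversal bookkeeping) in hardware, which is what the rest of this post is getting at.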

CPUs are actually *excellent* at divergent, latency sensitive code. But they just don't have enough parallelism. In theory, an array of 1000 CPU cores could be an awesome BVH traversal + object intersection engine but it would be too expensive and would need a personal power station to run. And that's why we need dedicated RT hardware to perform this task.

Okay, I now realize all that was a tangent, but what the heck, I felt like lecturing so you have to suffer.

Back to fakery. Yes, under certain constrained situations you can use a variety of more GPU-friendly approximations to avoid tracing rays through world space. For example, you can model reflections in screen space (a la SSR) instead of world coordinates, which is *much* cheaper and works beautifully as long as reflected objects are on screen. You can even render the entire scene twice for reflections as long as you're willing to pay 2x the render cost. You can use baked lightmaps (which are pre-rendered offline on CPUs for hours), which with sufficient resolution give you perfect GI as long as there are no dynamic objects or light sources. You can use probe-based GI for dynamic light sources as long as you're willing to tolerate light leakage. You can use SSAO to apply mascara over the leakage, and it'll look kinda okay as long as you don't look too closely.
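For a flavor of what "model reflections in screen space" means, here's a minimal CPU-side sketch of the core SSR march. It's illustrative only: real implementations live in shaders and add thickness tests, hierarchical tracing, edge fading and so on, and the `depthAt` lookup is an assumed stand-in for sampling the depth buffer:

```cpp
#include <functional>
#include <optional>

struct Vec2 { float x, y; };

// March a reflected ray in screen space (UV coordinates in [0,1]).
// depthAt: caller-supplied depth-buffer lookup (hypothetical).
std::optional<Vec2> marchScreenSpaceRay(Vec2 start, Vec2 step, float rayDepth,
                                        float depthStep, int maxSteps,
                                        const std::function<float(Vec2)>& depthAt) {
    Vec2 p = start;
    float d = rayDepth;
    for (int i = 0; i < maxSteps; ++i) {
        p.x += step.x; p.y += step.y;   // advance in screen space
        d += depthStep;                  // advance the ray's own depth
        if (p.x < 0.f || p.x > 1.f || p.y < 0.f || p.y > 1.f)
            return std::nullopt;         // ray left the screen: no data to reflect
        if (depthAt(p) < d)
            return p;                    // ray went behind the stored surface: treat as a hit
    }
    return std::nullopt;                 // nothing hit within the step budget
}
```

The failure mode falls straight out of the loop: as soon as the reflected object isn't in the depth buffer, whether it's off-screen or occluded, there's nothing to sample, which is exactly the constraint described above.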

All of these hacks/approximations/fakery "almost work" as long as you meet the constraints. If you can build your art around these constraints then it will look good. But the moment your content starts stressing or outright violating the constraints, the approximations break down. As our art becomes more detailed and photorealistic in other dimensions (denser geometry, physically based materials, more dynamism) it's becoming harder and harder for the approximations to keep up. The seams begin to show. Can you still in theory pile on a crap ton of approximations to desperately avoid the accurate simulation of light? Probably.

It's stupid though. Eventually it becomes cheaper to do the right thing, both in terms of human cost to develop the hacks and compute cost to run them all. And our hardware vendors have decided to pay the engineering and silicon cost of dedicated RT hardware to dramatically accelerate this eventuality.

I mean, it's hilarious to see people complaining about a 30-50% performance dip (especially when you have reconstruction techniques to get that back, and then some). The cost was considered to be 100x only a few years back. If you ever took a Computer Graphics undergrad course, you would know that real-time ray tracing is a 30-year-old dream for the field. Everyone knew the cost. Everyone knew a theoretical RT accelerator could narrow the gap, but researching, developing and productizing it into a profitable commercial product seemed unachievable. It wasn't just about the hardware; you need an entire ecosystem for the thing to survive.

But anyway, it happened. And all those people who did take that Computer Graphics 101 course and went on to have careers in game development houses, hardware vendors and platform vendors are now seeing this unfold. It's unreal.

Don't get me wrong, approximations will stay for a *long* time. But we will be slowly switching gears into *different* kinds of approximations. Instead of talking about screen space effects and cascaded shadow maps we'll be talking about denoising, importance sampling and better image reconstruction. And probably many other techniques that I'm not aware of. But these approximations are all centered around the fundamental operation of tracing some number of rays through world space, which is now achievable in real time. And unlike screen space hacks, they don't break down catastrophically but degrade gracefully.
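As a toy illustration of one of those "different approximations", importance sampling, here's a small self-contained C++ example (hypothetical numbers, nothing engine-specific). It estimates the integral of cos(x) over [0, pi/2] (exact value 1) with the same small sample count, once with uniform samples and once with samples drawn from a pdf shaped like the integrand; the importance-sampled estimate usually comes out closer, which is the same idea a path tracer uses when deciding where to spend its ray budget:

```cpp
#include <cmath>
#include <cstdio>
#include <random>

int main() {
    const double PI = 3.14159265358979323846;
    const int N = 64;                              // deliberately few samples
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> uni(0.0, 1.0);

    double uniformSum = 0.0, importanceSum = 0.0;
    for (int i = 0; i < N; ++i) {
        // Uniform: x ~ U(0, pi/2), estimator is (pi/2) * cos(x).
        double xu = uni(rng) * PI / 2.0;
        uniformSum += (PI / 2.0) * std::cos(xu);

        // Importance: pdf p(x) = (8/pi^2) * (pi/2 - x), which roughly follows
        // the shape of cos(x) on this interval; sampled via its inverse CDF.
        double u = uni(rng);
        double xi = (PI / 2.0) * (1.0 - std::sqrt(1.0 - u));
        double pdf = (8.0 / (PI * PI)) * (PI / 2.0 - xi);
        importanceSum += std::cos(xi) / pdf;       // estimator is f(x)/p(x)
    }
    std::printf("uniform estimate:    %.4f\n", uniformSum / N);
    std::printf("importance estimate: %.4f\n", importanceSum / N);
    return 0;
}
```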

These are good times.

(PS: just don't think about Moore's Law.)

(PPS: disclaimer: CG is not my primary field so I may have butchered a bunch of stuff. I beg forgiveness from the real gurus here.)

(PPPS: the cake is a lie.)
 
Eventually it becomes cheaper to do the right thing, both in terms of human cost to develop the hacks and compute cost to run them all. And our hardware vendors have decided to pay the engineering and silicon cost of dedicated RT hardware to dramatically accelerate this eventuality.
If I had to pick, I think this is the key point. Yes, you could probably fake everything without RT. But you have to meticulously build your scene around these fakes, which makes developing games with realistic graphics (there are other art styles where this doesn't matter!) more and more expensive. With robust RT, many if not most things just work: shadows, GI, reflections/refractions.

And given that we're just 1.5 generations into RT hardware on desktop graphics cards, the performance impact is pretty OK, I'd say. Remember where we started out with 3D games just before the turn of the millennium. We were not talking about 4K; Quake was deemed playable at 640x480 with 24-30 fps. Competitive gamers turned it down to 512x384 or 400x300, hoping to be able to spot the pixels that were their opponents. Before that, Ultima Underworld ran its 3D world inside a window of the full-screen GUI, a hybrid approach.

We're not there yet; RT hardware is not the be-all and end-all, and graphics cards are faster than ever at rasterization. Some are putting more emphasis on raster, some on rays. We've got more choices than we've had in a long time, and next year another big player will enter the market and possibly shake things up further.

Now we only need sane pricing again.
 
RT reflections look better, of course; they just don't change a game's end visual impact that much in the current implementations, to my eyes. You can pick out specific scenes where the impact can be larger, but over the course of long play sessions it isn't something I'd yet turn on unless I had extra performance to spare. I'd much rather have 120+ fps. I feel the same about current uses of RT for shadows. Other limitations in visuals are profound enough that reflection/shadow hacks and issues aren't what generally stand out to me. This will of course change at some point, so maybe my opinion will change somewhat when we end this cross-gen period.

You can say the same thing about pretty much any graphics feature. Very few graphics settings change a game's visual impact in isolation. It's incredible to me that people consider completely missing reflections to be no big deal, yet fuss over imperceptible increases in screen or texture resolution or framerate. Even better is the argument that it doesn't impact gameplay, as if other graphics settings do. It's an inconsistent position to hold IMO.
 
You can say the same thing about pretty much any graphics feature. Very few graphics settings change a game's visual impact in isolation. It's incredible to me that people consider completely missing reflections to be no big deal, yet fuss over imperceptible increases in screen or texture resolution or framerate. Even better is the argument that it doesn't impact gameplay, as if other graphics settings do. It's an inconsistent position to hold IMO.
Very few graphics settings cut performance in half or close to it. That level of performance penalty calls for a significant improvement to the net visual experience for me. Metro EE delivers that in spades to my eyes; it completely changes the visual experience. 60 to 120+ fps also completely changes the experience for me. I'm not sure what inconsistencies you are referring to. I have never been a proponent of the endless resolution chase, nor have I championed the incredibly marginal improvements of HD texture packs.
 
Very few graphics settings cut performance in half or close to it. That level of performance penalty calls for a significant improvement to the net visual experience for me. Metro EE delivers that in spades to my eyes; it completely changes the visual experience. 60 to 120+ fps also completely changes the experience for me. I'm not sure what inconsistencies you are referring to.

The performance cost is a valid concern. I feel the same way about native 4K, which at TV viewing distances adds almost nothing over 1440p or 1600p. It's a waste of rendering performance.

The inconsistency is in the standards for what should be considered material visual impact. It's one thing to say "oh, that's nice, but I turn it off because it's too expensive", which is a reasonable stance. It's another thing to argue that improvements in lighting, shadows and reflections are no big deal. That's the really confusing thing to hear on a graphics forum.
 
I mean, it's hilarious to see people complaining about a 30-50% performance dip (especially when you have reconstruction techniques to get that back, and then some).

Why is it hilarious that people are complaining about >50% performance dips at a time when GPU prices are soaring and the great majority of PC gamers can't find and/or afford a GPU powerful enough to cover that delta?

Let's even ignore the fact that DLSS is exclusive to one IHV (because some people love it when tech is not only monopolized but also segregated to higher price brackets, I guess) and only look at cards from that one IHV.
The RTX 2060 is going for >$650 right now. Is it so funny that people are choosing to stick with their Pascal and GTX Turing cards because they can't afford an upgrade to an RTX card?
 
A pint, rather. This kind of poll is merely a vote for whether or not people know the game and, if so, how much those people like the game.
The poll was created in a ResetEra post called "The Ultimate Graphics Poll".
People can watch videos and see screenshots of each game. It's the closest thing I found to gathering a general opinion on perceived graphics quality.

Plus that poll is from December 2020. How could Demon's Souls 2020 get 13% of the 900 votes if very few people had a PS5 at that point?
 
The poll was created in a ResetEra post called "The Ultimate Graphics Poll".
People can watch videos and see screenshots of each game. It's the closest thing I found to gathering a general opinion on perceived graphics quality.
I believe both that the poll was created in good faith and that you linked it in good faith. I just pointed out the possible skewing of such polls. Even when people have the chance and can actually be bothered to watch videos or look at screenshots, the fact alone that "Tetris Effect" has more votes than Watch Dogs Legion, Battlefield V, Ass. Creed Valhalla, COD Black Ops Cold War, Metro Exodus and Wolfenstein Youngblood combined shows that "best graphics" is a very subjective term at best. Comments in the ResetEra thread also point towards the participants having very heterogeneous backgrounds and standards to judge by, ranging from "out of the list, the only two I've played" through "best lip sync and facial animation" and "I hate XYZ's art direction and the attention to detail is just not there." to "on a beefy PC (if that counts)".

We did similar things where I worked years ago regarding the quality of our articles, and what we found was that, despite being asked to evaluate the articles themselves, people more or less rated whether or not they were interested in the particular topics when asked directly about it.

Plus that poll is from December 2020. How could Demon's Souls 2020 get 13% of the 900 votes if very few people had a PS5 at that point?
I was not debating its outdatedness. But to answer your question: through pre-selection bias, as it was one of the titles already presented in the opening post.

But it's an interesting poll to look at, for sure.
 
I feel the same way about native 4K, which at TV viewing distances adds almost nothing over 1440p or 1600p. It's a waste of rendering performance.
So I have a weird dilemma with this one. I'm happy with 1440p for games, but I need 4K for crisp text. Problem is, sub-native display resolution still doesn't look that good on LCDs**. I tried integer scaling from 1080p but it's a little too low. So basically I have to either render at 4K or have some good reconstruction up to 4K display resolution.

**I do wonder if I have a preconceived bias here. An A/B comparison between native 1440p and 1440p->4K bicubic upscale on otherwise identical monitors would be an awesome experiment. Not sure how I'd pull that off.
 
Why is it hilarious that people are complaining about >50% performance dips at a time when GPU prices are soaring and the great majority of PC gamers can't find and/or afford a GPU powerful enough to cover that delta?
Yeah I'm with you there. There's nothing funny about these prices. I'm hoping the madness normalizes within a few months. We're probably never going to get back to Pascal/Polaris price tiers but I have some hope we'll be able to buy xx70/x700XT grade cards at the $500 price point again.
 
Oh, I think rasterization “ultra” settings are almost always a waste too.
The thing is, you can use that 20% of fps to augment shadows with some RT; this gives shadow coverage for many more objects on screen, better shadow accuracy and resolution, and more shadow-casting lights on screen, for almost the same performance cost. Many RT games did exactly that.

RT is just the new Ultra setting for PC games, but this time it adds major visible differences. If fps is your concern, you can easily drop your game down to High, gain 20 to 30% fps with no noticeable loss to visuals, then activate RT on top of it and get a noticeable boost to image quality at a reduced fps cost, because part of the cost was already covered when you dropped from Ultra to High.
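A quick worked example of that budget math, with purely illustrative numbers (not measured from any particular game):

```cpp
#include <cstdio>

// Illustrative only: if Ultra runs at 60 fps, dropping to High at +25% gives
// 75 fps; enabling an RT effect that costs 30% then lands at 52.5 fps, only
// about 12% below the original Ultra figure, but now with the RT effect on top.
int main() {
    double ultraFps      = 60.0;
    double highFps       = ultraFps * 1.25;   // Ultra -> High: +25% fps
    double highPlusRtFps = highFps * 0.70;    // enable RT effect: -30% fps
    std::printf("Ultra: %.1f fps, High: %.1f fps, High+RT: %.1f fps\n",
                ultraFps, highFps, highPlusRtFps);
    return 0;
}
```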
 
Oh i think rasterization “ultra” settings are almost always a waste too.

I certainly concede that in many cases, enabling certain RT settings over a select number of "ultra" options will probably result in better IQ at iso framerate.

HardwareUnboxed made an excellent video on that, recently.
 
I certainly concede that in many cases, enabling certain RT settings over a select number of "ultra" options will probably result in better IQ at iso framerate.

HardwareUnboxed made an excellent video on that, recently.
Cannot stress this enough: don't just blindly push all sliders to the max. Use your brain and find your own optimal image quality settings. They might be very different from person to person.
 
Oh, I think rasterization “ultra” settings are almost always a waste too.

At least for now. No games so far really target 'true next-gen hardware'; once that happens I think there's quite a bit left in the tank for rasterization. But going forward we'll have to appreciate new rendering technologies (and we are doing that, even on low-end hardware like consoles and entry-level GPUs).
This entire generation will likely go for hybrid approaches though. And that's okay.

Yeah I'm with you there. There's nothing funny about these prices. I'm hoping the madness normalizes within a few months. We're probably never going to get back to Pascal/Polaris price tiers but I have some hope we'll be able to buy xx70/x700XT grade cards at the $500 price point again.

Imagine the sales numbers if higher-end GPUs were around the $500 mark again... (if production allows for that, of course). The current situation is affecting all ranges of hardware: low-end GPUs, mid, high, and consoles. I don't know what's worse, not being able to get anything at all or only being able to get it at fantasy prices.

The thing is, you can use that 20% of fps to augment shadows with some RT; this gives shadow coverage for many more objects on screen, better shadow accuracy and resolution, and more shadow-casting lights on screen, for almost the same performance cost. Many RT games did exactly that.

RT is just the new Ultra setting for PC games, but this time it adds major visible differences. If fps is your concern, you can easily drop your game down to High, gain 20 to 30% fps with no noticeable loss to visuals, then activate RT on top of it and get a noticeable boost to image quality at a reduced fps cost, because part of the cost was already covered when you dropped from Ultra to High.

The mods already spawned a thread around this: 'Certain users using Ultra settings are dumb'. That's for now, but things could look different when games target next-gen GPUs; Ultra settings could look a whole lot different from High or Very High.

Cannot stress this enough: don't just blindly push all sliders to the max. Use your brain and find your own optimal image quality settings. They might be very different from person to person.

True. For some games the difference isn't that huge, but for more next-gen oriented titles like Cyberpunk 2077 (even without RT enabled) I think Ultra does give extra graphics fidelity that might be worth it.
 