GPU Ray Tracing Performance Comparisons [2021-2022]

They really need to figure something out for ray tracing in Windows so it's not so CPU heavy. Not sure if there are really any avenues unless there's some breakthrough in BVH. Most people are going to buy low to mid-range CPUs, and ray tracing just has a massive performance hit on the CPU side. I'd expect most people to turn it off. It'll be hard for DXR to take off unless CPUs get massively better.
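(For context on where that CPU cost comes from: in DXR the BVH build itself executes on the GPU, but the engine still has to walk every RT-visible object each frame, rewrite its instance descriptor, and record the acceleration-structure build commands. A rough sketch of that per-frame loop - all struct and function names here are hypothetical, not any particular game's code:)

```cpp
// Hypothetical per-frame CPU-side work for DXR ray tracing (illustrative sketch only).
#include <d3d12.h>
#include <DirectXMath.h>
#include <vector>
#include <cstring>

struct RtInstance {                        // made-up engine-side bookkeeping
    DirectX::XMFLOAT3X4 worldTransform;    // updated every frame for dynamic objects
    D3D12_GPU_VIRTUAL_ADDRESS blas;        // this mesh's bottom-level acceleration structure
};

// Walks every RT-visible object, rewrites its TLAS instance descriptor, and records
// the top-level build. The build itself executes on the GPU, but this loop (plus
// BLAS refits for skinned meshes, omitted here) is pure CPU time every frame.
void RecordRtStructureUpdates(ID3D12GraphicsCommandList4* cmd,
                              const std::vector<RtInstance>& instances,
                              D3D12_RAYTRACING_INSTANCE_DESC* mappedInstanceDescs, // upload heap
                              D3D12_GPU_VIRTUAL_ADDRESS instanceDescsGpu,
                              D3D12_GPU_VIRTUAL_ADDRESS scratchGpu,
                              D3D12_GPU_VIRTUAL_ADDRESS tlasGpu)
{
    // CPU loop that scales with scene complexity - the part that shows up as
    // extra CPU load once RT is turned on.
    for (UINT i = 0; i < static_cast<UINT>(instances.size()); ++i) {
        D3D12_RAYTRACING_INSTANCE_DESC& d = mappedInstanceDescs[i];
        std::memcpy(d.Transform, &instances[i].worldTransform, sizeof(d.Transform));
        d.InstanceID = i;
        d.InstanceMask = 0xFF;
        d.InstanceContributionToHitGroupIndex = 0;
        d.Flags = D3D12_RAYTRACING_INSTANCE_FLAG_NONE;
        d.AccelerationStructure = instances[i].blas;
    }

    // Record the top-level acceleration structure build for the GPU to execute.
    D3D12_BUILD_RAYTRACING_ACCELERATION_STRUCTURE_DESC tlas = {};
    tlas.Inputs.Type = D3D12_RAYTRACING_ACCELERATION_STRUCTURE_TYPE_TOP_LEVEL;
    tlas.Inputs.Flags = D3D12_RAYTRACING_ACCELERATION_STRUCTURE_BUILD_FLAG_PREFER_FAST_TRACE;
    tlas.Inputs.DescsLayout = D3D12_ELEMENTS_LAYOUT_ARRAY;
    tlas.Inputs.NumDescs = static_cast<UINT>(instances.size());
    tlas.Inputs.InstanceDescs = instanceDescsGpu;
    tlas.ScratchAccelerationStructureData = scratchGpu;
    tlas.DestAccelerationStructureData = tlasGpu;
    cmd->BuildRaytracingAccelerationStructure(&tlas, 0, nullptr);
}
```

With thousands of dynamic instances, skinned-mesh BLAS refits and shader-table updates on top of this, that loop is roughly where RT-specific CPU time tends to pile up.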
 
Yeah but "low to mid-range" CPUs are 6C to 12C these days which are generally well above what you'd need for a game without RT. I don't think that this is a big issue.

That being said, let's see what next-gen GPUs will bring to the RT pipeline.
 

Yeah, that's the problem. You can get a low to mid-range CPU and get very high framerates in a lot of games, but you turn on RT and your performance tanks because the CPU can't handle it. So most people will just turn RT off. They could get a 4080 and their RT performance wouldn't be any higher if nothing on the CPU side changes.
 
I honestly can't think of any game with RT where performance "tanks" because of the CPU. Even relatively old CPUs are enough for modern games with RT to hit 60+ fps. You're much more likely to be GPU limited when running with RT, and you would need a seriously old CPU (like a 4C 6000-series i5 or something) to be mostly CPU limited.
 

I've got an RTX 3080 and a Ryzen 5600X. Downloading Spider-Man now and I'll let you know. Digital Foundry showed it to be very CPU-limited on a Ryzen 3600X. Switching to a 12900K with the same GPU basically doubled performance with ray tracing on. There are probably a lot of people with that class of CPU, and even if they use lower-end RTX cards they may be more CPU-limited than GPU-limited. Not sure.
 
This probably has more to do with the fact that this game's RT was developed for the PS5's 8C Zen 2 CPU. People seem to think for some reason that console CPUs won't be used to their fullest this generation, but they will, and the further we get from the previous gen the more often this will happen. It should eventually lead to cases where you will in fact need an 8C Zen 2 CPU to hit even 30 fps reliably.
 
The 3080 Ti is 45% faster than the 6900 XT at both 1440p and 2160p using max RT settings.


Don't tell NX Gamer, who claims you need a 3070 to match the RDNA2-based PS5's performance.
 
I've checked the game, and I'm absolutely GPU-limited on my 3080, even when using DLSS Performance to reach 4K (so 1080p native resolution) and even when running the PS5's Performance RT settings. I'm hitting 80-100 fps, which is ~2-2.5x more than what the PS5 shows here.

CPU load is impressive though - all 24 threads are at 35% minimum, with a bunch of them hitting 50-70%. Still, if I had to choose, I would opt to upgrade my 3080 to get better results here. The best-looking mode - native 4K + DLAA with maxed-out RT - runs at ~40 fps on the 3080, while the 80-100 fps I'm seeing at the top end of results is plenty for this game to be fully playable (I've beaten it at 40 fps on PS5, after all).
 
I'm at 1440p, and I can make the GPU hit 100% if I turn RT object distance down to around 4, but it's frequently less than 100%.

120 fps or bust. I seem to be jumping around the 90-120 range. Lots of settings to play around with. Curious to see what I can come up with for a 5600X.
 
So same performance range with half the cores? Doesn't look like it's a CPU limitation to me. Bus or memory bandwidth, maybe? @Dictator, does your ADL system use DDR5?
 
Digital Foundry showed it to be very CPU-limited on a Ryzen 3600X. Switching to a 12900K with the same GPU basically doubled performance with ray tracing on.
Aren't you missing half the context here?
DF also said that they got in touch and said they're specifically working to optimize and improve that, hopefully within the next couple of patches.

So I wouldn't use this as an indication of how bad BVH on PC is yet.

Truth is, RT on RTX is a massive step up compared to RDNA2-based hardware (including consoles).
Even taking the CPU into account.
 
Yeah but "low to mid-range" CPUs are 6C to 12C these days which are generally well above what you'd need for a game without RT. I don't think that this is a big issue.

That being said, let's see what next-gen GPUs will bring to the RT pipeline.
Low to mid-range CPUs are 4-6 cores, not 6-12 cores.
 
With 6C being ~$150 now, I beg to differ.
Sorry, I forgot the E-cores. If we count those, your estimate would be closer.
AMD's midrange is Ryzen 5, which is 6 cores. Intel's midrange is Core i5, which is 6 cores except for two models with 4 E-cores on top of those. (On desktop - mobile core counts differ.)
 
There are no E-cores in a 6C/12T 12400F. That's low end.
And Intel's midrange is 12C/20T right now.
AMD has 6 to 8 cores in the same segment, which is also more than 4.
Yes, 6C/12T, but you said 6-12C, which is different.
Low-end includes more than the top model; most of them are 4C.
 