GPU Ray Tracing Performance Comparisons [2021-2022]

BTW: ComputerBase has ray tracing numbers for The Callisto Protocol: https://www.computerbase.de/2022-12...bschnitt_rdna_3_in_aktuellen_neuerscheinungen
A 4090 loses 100 FPS at 4K, and the game performs worse than Cyberpunk 2077 with Psycho RT settings on a 4090...
What a complete and utter shitshow. Performance drops by 60-70% on every card, and it's even worse on NVIDIA hardware, where the RT impact should be much lower. The RT isn't even that good, yet it hurts performance more than Cyberpunk does with all effects enabled.
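For what it's worth, the 60-70% figure is just the usual relative fps drop from toggling RT on; here's a minimal sketch of that calculation with made-up example numbers, not ComputerBase's actual data:

Code:
# "RT cost" = share of performance lost when ray tracing is enabled.
# The fps numbers here are made up for illustration, not ComputerBase's data.
def rt_cost(fps_rt_off: float, fps_rt_on: float) -> float:
    """Fractional fps drop from enabling ray tracing."""
    return 1.0 - fps_rt_on / fps_rt_off

# e.g. a hypothetical card going from 120 fps to 40 fps loses ~67%
print(f"{rt_cost(120, 40):.0%} drop")  # -> "67% drop"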
 
Oh, so it seems the heavier the RT, the bigger the hit RDNA3 takes. I say that because in the previous round of reviews, RDNA3 was performing around the level of the 3090/3090Ti in RT, sometimes a little above. But when things get serious, it seems the situation reverses?
Yeah, the more you push the ray tracing, the further the 3090Ti pulls ahead of the 7900XTX, and consequently so do all the Ada GPUs.

You can see that in path-traced games (Quake 2 RTX, Minecraft RTX, Portal RTX), or in heavy RT titles (Dying Light 2, Cyberpunk 2077, Guardians of the Galaxy).
 
I can see Intel pushing AMD into 3rd place at this rate.

AMD just seem so inconsistent with their GPUs and architectures.

I was truly hoping RDNA3 would be their next RV770 moment.
Intel is impressive in RT. The A770 and A750 outperform the 3060 and even reach 3060 Ti levels of performance in heavy RT workloads. If they sort out their driver issues and continue improving, they could realistically leapfrog AMD.
 
I was talking about the gap between rasterisation performance and RT performance. The 7900 XTX is far behind the 4090 in rasterisation performance so no one expects it to compete in RT.
I was wondering yesterday how much of this can be attributed to 7900XTX hitting CPU limitations without RT? We know that 4090 hits them quite often even in 4K unless the game is really heavy on graphics or light on CPU.
 
Some more RT testing at 4K, from particularly unique/intense RT areas.

Fortnite HW Ray Tracing:
6950XT: 27
7900XTX: 39
4080: 45
3090Ti: 52
4090: 69

The 4080 is just 15% faster than 7900XTX, 3090Ti is 33% faster, and the 4090 is 75% faster.
The 3090Ti is also 92% faster than 6950XT, a bigger difference than 4090 vs 7900XTX!


Minecraft RTX:
6950XT: 16
7900XTX: 25
3090Ti: 36
4080: 44
4090: 61

The 3090Ti is 44% faster than the 7900XTX, the 4080 is 75% faster, and the 4090 is 2.4x faster.

Bright Memory Infinite:
6950XT: 13
7900XTX: 20
3090Ti: 22
4080: 27
4090: 40

The 3090Ti is 10% faster than the 7900XTX, the 4080 is 35% faster, and the 4090 is 2x faster.

Control:
6950XT: 23
7900XTX: 35
3090Ti: 41
4080: 45
4090: 68

The 3090Ti is 15% faster than 7900XTX, 4080 is 28% faster, and the 4090 is 94% faster.

Metro Exodus:
6950XT: 27
7900XTX: 42
3090Ti: 41
4080: 51
4090: 74

The 3090Ti is tied with the 7900XTX, 4080 is 20% faster, and the 4090 is 76% faster.


Cyberpunk 2077 (North Oak area):
6900XT: 10
7900XTX: 16
3090Ti: 22
4080: 26
4090: 38

The 3090Ti is 37% faster than the 7900XTX, the 4080 is 62% faster, and the 4090 is 2.35x faster.

Dying Light 2 (Pilgrim's Path):
6900XT: 11
7900XTX: 22
3090Ti: 26
4080: 31
4090: 47

The 3090Ti is 18% faster than the 7900XTX, the 4080 is 40% faster, and the 4090 is 2.15x faster.



Chernobylite:
6950XT: 18
7900XTX: 30
3090Ti: 35
4080: 42
4090: 62

The 3090Ti is 15% faster than the 7900XTX, the 4080 is 40% faster, and the 4090 is 2.05x faster.


Guardians of the Galaxy:
6950XT: 31
7900XTX: 47
3090Ti: 60
4080: 69
4090: 93

The 3090Ti is 25% faster than 7900XTX, 4080 is 45% faster, and the 4090 is 97% faster.
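If anyone wants to sanity-check the "X% faster" figures above, here's a rough Python snippet; the fps values are copied from a few of the 4K lists in this post (a subset of the games), and the rounding won't always land on exactly the same percentages quoted:

Code:
# Rough sanity check of the relative figures quoted above.
# fps values are copied from the 4K lists in this post (subset of games).
results = {
    "Fortnite HW RT": {"7900XTX": 39, "3090Ti": 52, "4080": 45, "4090": 69},
    "Minecraft RTX":  {"7900XTX": 25, "3090Ti": 36, "4080": 44, "4090": 61},
    "Control":        {"7900XTX": 35, "3090Ti": 41, "4080": 45, "4090": 68},
    "Guardians":      {"7900XTX": 47, "3090Ti": 60, "4080": 69, "4090": 93},
}

for game, fps in results.items():
    base = fps["7900XTX"]
    # lead over the 7900XTX in percent: fps_card / fps_base - 1
    leads = {card: f"+{round((f / base - 1) * 100)}%"
             for card, f in fps.items() if card != "7900XTX"}
    print(f"{game}: {leads}")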


What the heck is going on with Fortnite there? The 3090Ti is comfortably faster than the 4080 and not far off the 4090.

VRAM bandwidth bottleneck? Does NV have an optimised driver for Fortnite right now?

Between this and Callisto Protocol there's some pretty weird stuff going on right now. And it doesn't help Nvidia that those are arguably two of the biggest graphical showpieces available right now, with Fortnite also being a potential harbinger for many future games.

I wonder if there are any Matrix City Sample demos that we can compare with.
 
Here's another Fortnite comparison that looks more in line with expectations. Perhaps the earlier one was just a testing error?

The 4090 is clearly running into CPU limits here (they're using an i7-10700K and testing with TSR Performance enabled). Also interesting to see the difference here between HWRT and SWRT.


[attached benchmark chart: Slide16.JPG]
 
Ada is still losing more performance here when switching to h/w RT than either RDNA or even Ampere.
And the absolute fps here are also weird - the 4090 is certainly running CPU limited.
If anything, I'd say this benchmark confirms that there's some sort of issue at play here.
 
What the heck is going on with Fortnite there? The 3090Ti is comfortably faster than the 4080 and not far off the 4090.
I suspect they have the 3090Ti values switched with the 4080 values. At any rate, at 4K native HW RT the 4090 is commanding a huge lead of almost 2x over the 7900XTX.
 
The 3080Ti is 38% faster than the 6900XT in The Witcher 3 at max RT settings in 4K.

6900XT: 18 fps
3080Ti: 25 fps
4080: 36 fps
4090: 47 fps

 
47fps on a 4090? This seems a bit ridiculous. At least there's DLSS to save the day.

Edit: Seems it's like Callisto Protocol and Gotham Knights. NVIDIA cards lose an abnormal amount of fps when turning on RT. The 2080 Ti being matched by a 3060 Ti is strange. It's usually on par with a 3070.
 
The 2080 Ti being matched by a 3060 Ti is strange. It's usually on par with a 3070.
Not that strange; we've observed this in Quake 2 RTX, Minecraft RTX and Portal RTX, where the 3070 is anywhere from 15% to 50% faster than the 2080 Ti. The 3070 Ti is even faster. The more you push ray tracing, either through path tracing or through piling up ray traced effects, the more the 3070/3070 Ti comes out on top.
 
What a complete and utter shitshow. Performance drops by 60-70% on every card, and it's even worse on NVIDIA hardware, where the RT impact should be much lower. The RT isn't even that good, yet it hurts performance more than Cyberpunk does with all effects enabled.

The RT in Callisto is better than most in a way, because the art was actually made for it without going overboard. There's a tendency in games with RT so far to either crank stuff like reflectivity way too high just to show off that "hey, we have realtime reflections!!!" or to not change the art at all and have stuff look wrong, like bricks that are noticeably reflective somehow. Years from now, when people are over it being a novelty, Callisto will look far better than a lot of other recent games, much like how artists vastly overdid bloom or light shafts when those effects first came about.

That being said, the game does still appear f*ing borked from a performance perspective, probably from trying to meet an arbitrary deadline while shipping simultaneously on PC and across console generations. I'm glad someone at Avalanche convinced their publisher to delay Hogwarts' cross-platform release (the last-gen console versions come later now). Maybe the game will actually work right from the get-go.
 
NVIDIA's official benchmarks for Portal RTX show the 4080 being 2.4x faster than the 3090Ti, while the 4090 is 3.4x faster at native 2K! Guess the game is pushing every Ada RT advantage in the book!

3090Ti: 29 fps
4080: 71 fps
4090: 101 fps

 