GPU Ray Tracing Performance Comparisons [2021-2022]

Ray tracing hardware acceleration with alternative world space transforms - NVIDIA Corporation (freepatentsonline.com)

Accordingly, some embodiments provide for taking a scene and rays being cast at the scene by the application and, transparently to the developer of the application, improve the efficiency of ray tracing for the scene by transforming the rays to an alternate world space that reduces false positive hits with AABBs during traversal of the acceleration structure for that scene. As noted above, this optimization can be automatically and dynamically performed for selected individual frames or sequences of frames in a series of frames that are otherwise traversed by default in a world space.
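To make the idea concrete, here is a rough, self-contained C++ sketch (not NVIDIA's actual implementation; the geometry and numbers are made up): a standard slab-test ray/AABB check, plus a hypothetical 45° "alternate world space" rotation. Diagonal geometry that produces a fat, frequently false-positive AABB in the default world space becomes axis-aligned in the rotated space, so the same ray gets culled instead of producing a false hit.

```cpp
// Rough illustration only (not NVIDIA's implementation): axis-aligned bounding
// boxes fit diagonal geometry poorly in the default world space, so traversal
// in a rotated "alternate world space" can tighten the boxes and cut
// false-positive ray/AABB hits during BVH traversal.
#include <algorithm>
#include <array>
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
struct AABB { Vec3 lo, hi; };
using Mat3 = std::array<float, 9>;   // row-major 3x3

static Vec3 mul(const Mat3& m, Vec3 v) {
    return { m[0]*v.x + m[1]*v.y + m[2]*v.z,
             m[3]*v.x + m[4]*v.y + m[5]*v.z,
             m[6]*v.x + m[7]*v.y + m[8]*v.z };
}

// Fit an AABB around points after mapping them into the chosen space.
static AABB fit(const std::vector<Vec3>& pts, const Mat3& space, float pad) {
    AABB b{{1e30f, 1e30f, 1e30f}, {-1e30f, -1e30f, -1e30f}};
    for (Vec3 p : pts) {
        Vec3 q = mul(space, p);
        b.lo = { std::min(b.lo.x, q.x), std::min(b.lo.y, q.y), std::min(b.lo.z, q.z) };
        b.hi = { std::max(b.hi.x, q.x), std::max(b.hi.y, q.y), std::max(b.hi.z, q.z) };
    }
    b.lo = { b.lo.x - pad, b.lo.y - pad, b.lo.z - pad };
    b.hi = { b.hi.x + pad, b.hi.y + pad, b.hi.z + pad };
    return b;
}

// Classic slab test: the per-node check a BVH traversal performs.
static bool rayHitsAABB(Vec3 o, Vec3 d, const AABB& b) {
    float tmin = 0.0f, tmax = 1e30f;
    const float ov[3]{o.x, o.y, o.z}, dv[3]{d.x, d.y, d.z};
    const float lo[3]{b.lo.x, b.lo.y, b.lo.z}, hi[3]{b.hi.x, b.hi.y, b.hi.z};
    for (int a = 0; a < 3; ++a) {
        float inv = 1.0f / dv[a];
        float t0 = (lo[a] - ov[a]) * inv, t1 = (hi[a] - ov[a]) * inv;
        if (inv < 0.0f) std::swap(t0, t1);
        tmin = std::max(tmin, t0);
        tmax = std::min(tmax, t1);
        if (tmax < tmin) return false;
    }
    return true;
}

int main() {
    // A thin wall running diagonally across the scene (corners of a quad).
    std::vector<Vec3> wall{ {0,0,0}, {10,10,0}, {0,0,5}, {10,10,5} };

    Mat3 identity{1,0,0,  0,1,0,  0,0,1};
    // Hypothetical alternate world space: rotate 45 degrees about Z so the
    // wall becomes axis-aligned and its AABB collapses from 10x10x5 to ~0x14x5.
    const float c = std::cos(0.7853982f), s = std::sin(0.7853982f);
    Mat3 rot45{ c,-s,0,  s,c,0,  0,0,1 };

    AABB worldBox = fit(wall, identity, 0.05f);
    AABB altBox   = fit(wall, rot45,    0.05f);

    // A ray that misses the wall itself but pierces the fat world-space box.
    Vec3 o{8, 1, 10}, d{0, 0, -1};
    std::printf("default world space: AABB hit = %d (false positive)\n",
                rayHitsAABB(o, d, worldBox));
    std::printf("alternate world space: AABB hit = %d (culled)\n",
                rayHitsAABB(mul(rot45, o), mul(rot45, d), altBox));
}
```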
 
Updated ray tracing testing of the 3090Ti vs the 6900XT LC (liquid cooled and overclocked), done by PCGH. Note: testing is done using FSR Ultra Quality where applicable, and ReBAR is enabled. (A short sketch of how these percentage figures relate to raw FPS follows the list.)

Dying Light 2 (FSR) :
3090Ti is twice as fast as the 6900XT LC (200% relative performance) @ both 2160p and 1440p

Cyberpunk 2077 (FSR):
3090Ti is 80% faster @2160p, and 65% faster @1440p than 6900XT LC

Guardians Of The Galaxy:
3090Ti is 70% faster @2160p, and 65% faster @1440p than 6900XT LC

Lego Builder's Journey (FSR):
3090Ti is 70% faster @2160p, and 55% faster @1440p than 6900XT LC

Doom Eternal:
3090Ti is 60% faster @2160p, and 45% faster @1440p than 6900XT LC

Ghostwire Tokyo (FSR):
3090Ti is 50% faster @2160p, and 40% faster @1440p than 6900XT LC

Metro Exodus Enhanced:
3090Ti is 50% faster @2160p, and 38% faster @1440p than 6900XT LC

Riftbreaker (FSR):
3090Ti is 50% faster @2160p, and 60% faster @1440p than 6900XT LC

F1 2022 (FSR):
3090Ti is 35% faster @2160p, and 30% faster @1440p than 6900XT LC

Far Cry 6:
A draw

https://www.pcgameshardware.de/Gefo...UVP-Release-Benchmark-Specs-kaufen-1391700/2/
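As referenced above, here is a minimal sketch of how "X% faster" and "% relative performance" figures like these are typically derived from a review's average-FPS numbers. The FPS values below are made-up placeholders, not PCGH data.

```cpp
// Relating "X% faster" to "% of the other card's performance".
#include <cstdio>

int main() {
    float fps3090Ti = 60.0f;   // hypothetical 3090 Ti average FPS
    float fps6900XT = 33.0f;   // hypothetical 6900 XT LC average FPS

    float advantage = (fps3090Ti / fps6900XT - 1.0f) * 100.0f;  // "% faster"
    float relative  = (fps3090Ti / fps6900XT) * 100.0f;         // "% of the 6900 XT"

    // e.g. 60 vs 33 FPS -> ~82% faster, i.e. ~182% of the 6900 XT's performance.
    std::printf("%.0f%% faster (%.0f%% relative performance)\n", advantage, relative);
}
```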
 
Updated ray tracing testing of the 3090Ti vs the 6900XT (the regular one), done by TPU.
https://www.techpowerup.com/review/msi-geforce-rtx-3090-ti-suprim-x/34.html

Cyberpunk 2077:
3090Ti delivers 220% of the 6900XT's performance @2160p, and 200% @1440p (roughly twice as fast)

Control:
3090Ti is 90% faster @2160p, and 75% faster @1440p than 6900XT

Metro Exodus:
3090Ti is 70% faster @2160p, and 50% faster @1440p than 6900XT

Doom Eternal:
3090Ti is 65% faster @2160p, and 45% faster @1440p than 6900XT

Deathloop:
3090Ti is 55% faster @2160p, and 50% faster @1440p than 6900XT

Resident Evil Village:
3090Ti is 40% faster @2160p, and 25% faster @1440p than 6900XT

Watch Dogs: Legion:
3090Ti is 50% faster @2160p, and 25% faster @1440p than 6900XT

F1 2022:
3090Ti is 35% faster @2160p, and 25% faster @1440p than 6900XT

Far Cry 6:
A draw

We can corroborate that with testing from Sweclockers.
https://www.sweclockers.com/test/34...0-ti-snabbt-dyrt-och-laskigt-effekttorstigt/5

Metro Exodus:
3090Ti is 70% faster @2160p than 6900XT

Control:
3090Ti is 90% faster @2160p than 6900XT

Battlefield V:
3090Ti is 65% faster @2160p than 6900XT

Some more tests from KitGuru.
https://www.kitguru.net/components/...ss/nvidia-rtx-3090-ti-review-ft-msi-palit/22/
https://www.kitguru.net/components/...ss/nvidia-rtx-3090-ti-review-ft-msi-palit/21/
https://www.kitguru.net/components/...ss/nvidia-rtx-3090-ti-review-ft-msi-palit/20/

Cyberpunk 2077:
3090Ti delivers 200% of the 6900XT's performance @2160p and @1440p (twice as fast)

Metro Exodus Enhanced:
3090Ti delivers 220% of the 6900XT's performance @2160p (over twice as fast) and is 95% faster @1440p

Resident Evil Village:
3090Ti is 30% faster @2160p and @1440p than 6900XT


Some more tests from Golem.
https://www.golem.de/news/geforce-rtx-3090-ti-im-test-nvidias-ampere-brechstange-2203-164068-2.html

Cyberpunk 2077:
3090Ti delivers 230% of the 6900XT's performance @2160p (well over twice as fast)

Metro Exodus Enhanced:
3090Ti is 85% faster @2160p than 6900XT

Lego Builder's Journey:
3090Ti is 90% faster @2160p than 6900XT

Riftbreaker:
3090Ti is 48% faster @2160p than 6900XT
 
Well… that's great. But it also costs 100% more than a 6900XT and draws as much power as a heat pump heating an entire home.

This is simply perverse.
 
Well… that's great. But it also costs 100% more than a 6900XT and draws as much power as a heat pump heating an entire home.
Shave 5% off the 3090Ti's performance and you basically get the 3090's results, with power consumption comparable to the 6900XT; better yet, shave 10% and you get the 3080Ti, at a comparable price.

Even after the slashing, the gap in ray tracing performance is still huge; it wouldn't matter if you slashed even 20%.
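Here is the quick arithmetic behind that "shave 5%/10%/20%" claim, using an example 70% lead taken as a representative figure from the lists above (not a specific measured result):

```cpp
// Scale the faster card's throughput down by a flat percentage and see
// what remains of its lead over the slower card.
#include <cstdio>

int main() {
    float lead = 0.70f;                       // example: 3090 Ti 70% faster
    const float cuts[] = {0.05f, 0.10f, 0.20f};
    for (float cut : cuts) {
        float remaining = (1.0f + lead) * (1.0f - cut) - 1.0f;
        // 5% cut -> ~61% faster, 10% cut -> ~53% faster, 20% cut -> ~36% faster
        std::printf("cut %.0f%% -> still %.0f%% faster\n", cut * 100, remaining * 100);
    }
}
```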

Any game that uses multiple RT effects pushes Ampere to a 50% performance advantage and beyond, and that's not taking into account path-traced games such as Minecraft RTX or Quake 2 RTX.

Also, the first comparison is against the 6900XT LC, which is priced at $1,600 to $1,800.
 
Is there a ray-tracing frames per watt measurement out there?

Not sure if there are edge cases, but power consumption should just be whatever the power limit is set at, assuming a GPU-limited scenario. Performance would be whatever number you want to use.

In terms of edge cases, would the Cyberpunk 2077 result (200%+ relative performance) possibly mean very heavy under-utilization (and therefore lower power consumption) on, say, the 6900XT? I haven't seen an example of this. Otherwise, the numbers I've seen just show both Ampere and RDNA2 GPUs hitting their power limits, no different than non-ray-traced workloads.
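For what it's worth, a tiny sketch of the frames-per-watt metric being asked about: average FPS in an RT workload divided by measured board power. The numbers below are placeholders, not values from any of the linked reviews.

```cpp
// Frames per watt = average FPS / board power (equivalently, frames per joule).
#include <cstdio>

struct Result { const char* gpu; float avgFps; float boardWatts; };

int main() {
    Result results[] = {
        { "RTX 3090 Ti", 60.0f, 450.0f },   // hypothetical values
        { "RX 6900 XT",  35.0f, 300.0f },   // hypothetical values
    };
    for (const Result& r : results) {
        // If both cards sit at their power limits while GPU-bound, perf/W
        // differences come almost entirely from the FPS side of the ratio.
        std::printf("%s: %.3f FPS per watt\n", r.gpu, r.avgFps / r.boardWatts);
    }
}
```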
 
6900 XT uses a bit more power when running RT, about 10-20 watts over rasterization. A 3090 uses quite a bit more power across a variety of games. They are not in the same ballpark.

 
6900 XT uses a bit more power when running RT, about 10-20 watts over rasterization. A 3090 uses quite a bit more power across a variety of games. They are not in the same ballpark.


I'm not sure that's the right interpretation. The RTX 3090 has a higher TDP (350 W vs. 300 W) and therefore uses about 50 W more power in GPU-limited gaming scenarios, regardless of the workload. Modern cards basically all clock up and then run at their limits. It's not like the pre-Kepler/GCN era, where you could see extremely wide-ranging power consumption, especially if you ran something like Furmark. The video in question basically shows this to be the case: all the tests essentially hit roughly the same ceiling without significant deviation.

Also, how reliable is that YouTube channel? Some of those channels, especially the ones that only show gameplay loops with stat overlays and zero other footage, are somewhat controversial/questionable in terms of veracity. The 275 W limit shown for the 6900XT in this sample seems to differ from publications that are more vetted -

https://www.tomshardware.com/reviews/amd-radeon-rx-6900-xt-review/4
https://www.techpowerup.com/review/amd-radeon-rx-6900-xt/31.html
https://www.igorslab.de/en/grasps-a...with-benchmarks-and-a-technology-analysis/13/ (igor's does have both raster and ray trace data)

All three of those measure power at the GPU and essentially show the cards roughly in line with their official TDP power limits. That makes sense, since the power governors are set with other engineering/safety considerations in mind, and it would cause issues if one type of workload consistently ran way above maximum spec.
 
Modern cards basically all clock up and then run at their limits.

Yeah, that's been my experience in GPU-limited scenarios. Current cards are really good at holding the power limit, so if they're reading lower it's due to a CPU limitation or an FPS cap. A 3090 power-limited to 300 W would probably still wipe the floor with the 6900XT.

We have enough data points now that it’s clear RDNA2 is at a significant disadvantage in DXR. Given the rapid adoption of RT it’s almost guaranteed that AMD will revisit their approach with RDNA3.
 