GPU Ray Tracing Performance Comparisons [2021-2022]

So with RTXDI, NVIDIA posted some interesting performance numbers for ray tracing vs raster in Unreal Engine 5.

RTXDI enables hundreds of ray traced lights per scene, all casting ray traced shadows, with area lights, point lights and emissive surfaces .. the shadows cast are ray traced soft shadows as well. RTXDI renders those lights in a single pass, significantly boosting performance .. traditional ray traced lights and shadows are much slower in comparison. Obviously raster has far fewer visual capabilities: shadow-casting lights use shadow maps, and the lights have to be culled and even streamed according to distance, because otherwise raster can't handle this many lights.
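To make it a bit more concrete why a single pass can cover hundreds of shadow-casting lights: RTXDI builds on ReSTIR-style resampled importance sampling, where a handful of candidate lights are weighed cheaply per pixel without tracing any rays, one survivor is kept in a reservoir, and only that survivor gets a shadow ray. The toy Python sketch below is just my own illustration of that idea (the Light class, the trace_shadow_ray stub and all numbers are made up; it is not NVIDIA's code, and it skips the spatial/temporal reuse RTXDI adds on top):

```python
import random
from dataclasses import dataclass

# Toy stand-ins: RTXDI itself runs on the GPU against real scene data;
# everything here exists only to make the sampling logic concrete.

@dataclass
class Light:
    position: tuple
    intensity: float

    def unshadowed_contribution(self, p):
        # Inverse-square falloff; BRDF and geometry terms omitted for brevity.
        d2 = sum((a - b) ** 2 for a, b in zip(self.position, p)) + 1e-6
        return self.intensity / d2

def trace_shadow_ray(p, light):
    # Placeholder visibility test; a real renderer would trace a ray here.
    return True

def naive_direct_lighting(p, lights):
    # "Traditional" many-light ray tracing: one shadow ray per light per
    # shading point, so cost scales linearly with the number of lights.
    return sum(l.unshadowed_contribution(p) for l in lights if trace_shadow_ray(p, l))

def resampled_direct_lighting(p, lights, num_candidates=32):
    # ReSTIR/RIS-style sampling (the idea RTXDI builds on): weigh a few candidate
    # lights cheaply with NO rays, keep one winner in a reservoir, then spend a
    # single shadow ray on it, regardless of how many lights the scene contains.
    candidate_pdf = 1.0 / len(lights)              # candidates picked uniformly
    chosen, w_sum = None, 0.0
    for _ in range(num_candidates):
        cand = random.choice(lights)
        w = cand.unshadowed_contribution(p) / candidate_pdf   # RIS weight
        w_sum += w
        if random.random() < w / w_sum:            # weighted reservoir update
            chosen = cand
    if chosen is None or not trace_shadow_ray(p, chosen):
        return 0.0
    # With the target function equal to the unshadowed contribution, the RIS
    # estimate simplifies to the average candidate weight.
    return w_sum / num_candidates

if __name__ == "__main__":
    random.seed(1)
    lights = [Light((random.uniform(-10, 10), random.uniform(-10, 10), 5.0),
                    random.uniform(0.5, 2.0)) for _ in range(500)]
    p = (0.0, 0.0, 0.0)
    reference = naive_direct_lighting(p, lights)                 # 500 shadow rays
    estimate = sum(resampled_direct_lighting(p, lights) for _ in range(64)) / 64
    print(f"naive: {reference:.3f}   resampled (1 ray each, avg of 64): {estimate:.3f}")
```

The point of the sketch is that the shadow-ray count per pixel stays constant as the light count grows, which is where the single-pass scaling comes from.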

Scene 1: hundreds of lights, comparing RTXDI vs traditional ray tracing.
RTXDI: 80 fps
Traditional ray tracing: 30 fps

RTXDI is 2.6X faster than traditional ray tracing.

Scene 2: one light source with RTXGI, ray traced shadows, ray traced AO and ray traced reflections, vs raster with no dynamic GI, soft shadows, or reflections.
RTXGI: 73 fps
Raster: 63 fps

Ray tracing is about 15% faster than raster, while delivering much higher image quality.

Scene 3: hundreds of lights, with RTXDI, RTXGI, ray traced shadows and ray traced AO, vs raster with no GI, AO, or soft shadows.
Ray tracing: 50 fps
Raster: 25 fps

Ray tracing is 2X faster than raster, while delivering much higher image quality.

 
Is there another demo/talk where they explain how RTXDI can do all the shadow work in a single pass, versus multiple passes for current ray tracing tech/engines?
 
Some preliminary numbers comparing 4090 vs 7900XTX based on published AMD numbers.

Cyberpunk 2077, native 4K:

4090: 40 fps
3090Ti: 23 fps
6900XT LC: 11 fps
7900XTX: 17 fps (50% faster than 6900XT LC)

The 4090 is 2.3X faster than 7900XTX, the 3090Ti is 35% faster.

Metro Exodus EE, native 4K:

4090: 87 fps
3090Ti: 48 fps
6900XT LC: 25 fps
7900XTX: 37.5 fps (50% faster than 6900XT LC)

The 4090 is 2.3X faster than 7900XTX, the 3090Ti is 28% faster.

Dying Light 2 native 4K:

4090: 44 fps
3090Ti: 24 fps
6900XT LC: 11 fps
7900XTX: 20 fps (56% faster than 6900XT LC)

The 4090 is 2.2X faster than 7900XTX, the 3090Ti is 20% faster.

Hitman 3 native 4K:

4090: 43 fps
3090Ti: 23 fps
6900XT LC: 16 fps
7900XTX: 26 fps (85% faster than 6900XT LC)

The 4090 is 65% faster than 7900XTX.
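As a quick sanity check on the ratios quoted above, here is a small Python snippet that recomputes the relative speedups straight from the listed frame rates (the fps values are simply copied from the lists above; the 7900XTX figures are estimates extrapolated from AMD's published uplift claims, not measured results):

```python
# FPS figures copied from the native-4K comparisons above. The 7900XTX values
# are estimates derived from AMD's published uplift claims, not reviews.
results = {
    "Cyberpunk 2077":  {"4090": 40, "3090Ti": 23, "6900XT LC": 11, "7900XTX": 17},
    "Metro Exodus EE": {"4090": 87, "3090Ti": 48, "6900XT LC": 25, "7900XTX": 37.5},
    "Dying Light 2":   {"4090": 44, "3090Ti": 24, "6900XT LC": 11, "7900XTX": 20},
    "Hitman 3":        {"4090": 43, "3090Ti": 23, "6900XT LC": 16, "7900XTX": 26},
}

for game, fps in results.items():
    vs_4090 = fps["4090"] / fps["7900XTX"]        # how much faster the 4090 is
    vs_3090ti = fps["3090Ti"] / fps["7900XTX"]    # >1 means the 3090Ti is faster
    uplift = fps["7900XTX"] / fps["6900XT LC"]    # estimated RDNA2 -> RDNA3 RT uplift
    print(f"{game:15}  4090 {vs_4090:.2f}x  3090Ti {vs_3090ti:.2f}x  "
          f"7900XTX vs 6900XT LC {uplift:.2f}x")
```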



[Image: AMD RDNA 3 Tech Day press deck, slide 43]


 
And that is without SER (Shader Execution Reordering). The gap between Navi31 and AD102/AD103 will be bigger in future games than it is today. I think that is a negative surprise. AMD should have been able to improve more than Nvidia has done with Lovelace.
 
With that RT arch gain, the same CU count, and likely similar clock speeds for N33 as N23, it'll be similar to or slower than the A770, maybe the A750, in heavy RT titles like Dying Light 2, Cyberpunk, Metro, Control, etc., right? Seeing as they're around >=1.5x the 6600XT in those titles.
 
With that RT arch gain, the same CU count, and likely similar clock speeds for N33 as N23, it'll be similar to or slower than the A770, maybe the A750, in heavy RT titles like Dying Light 2, Cyberpunk, Metro, Control, etc., right? Seeing as they're around >=1.5x the 6600XT in those titles.
That will be embarrassing.. Nvidia and Intel are on level 2 and level 3 ray tracing while AMD is on level 1.
 
That will be embarrassing.. Nvidia and Intel are on level 2 and level 3 ray tracing while AMD is on level 1.
Actually there is nothing embarrassing about that; it's just a choice AMD made. I guarantee that if they were to catch up to Nvidia in RT, the architecture would've been way more expensive than it is now.

AMD is just focusing on the price-to-raw-performance ratio again; that's what they always do. But this time it's not just a couple of bucks like it was with RDNA1 and Turing (and RDNA1 had a much inferior feature set); it's a price difference of $1699 versus $999, with performance that appears to be close enough to the 4090. You also get 24 GB of VRAM, which was previously unheard of in this price range.

Only when titles release that require RT to run will it shift in favor of Nvidia in terms of price to performance. But that might take a while, sadly (I was hoping it'd happen sooner).
 
Actually there is nothing embarrassing about that; it's just a choice AMD made. I guarantee that if they were to catch up to Nvidia in RT, the architecture would've been way more expensive than it is now.

AMD is just focusing on the price-to-raw-performance ratio again; that's what they always do. But this time it's not just a couple of bucks like it was with RDNA1 and Turing (and RDNA1 had a much inferior feature set); it's a price difference of $1699 versus $999, with performance that appears to be close enough to the 4090. You also get 24 GB of VRAM, which was previously unheard of in this price range.

Only when titles release that require RT to run will it shift in favor of Nvidia in terms of price to performance. But that might take a while, sadly (I was hoping it'd happen sooner).
When RT-heavy games with global illumination arrive, it will be destroyed by the 4080 and 4090 with shader execution reordering.. but at least it looks level with Ampere.. but if the 3090Ti beats the 7900 XTX it's 😐
 
When RT-heavy games with global illumination arrive, it will be destroyed by the 4080 and 4090 with shader execution reordering.. but at least it looks level with Ampere.. but if the 3090Ti beats the 7900 XTX it's 😐
Most games will be using UE5 with Lumen in the future, and it seems Lumen is light on ray tracing hardware.


UE5 is the reason why I believe RDNA3 will do very well in future games.

If we had more games like Avatar Frontiers of Pandora and Metro Exodus Enhanced Edition, then I would look at this matter from an entirely different perspective.
 
Most games will be using UE5 with Lumen in the future, and it seems Lumen is light on ray tracing hardware.

Really?

EA games won't, Ubisoft games won't, Sony first party games won't, most of Microsoft first party games won't, Rockstar games won't, COD games won't........

In fact, there will easily be more games that don't use UE5 and Lumen than games that do.
 
Really?

EA games won't, Ubisoft games won't, Sony first party games won't, most of Microsoft first party games won't, Rockstar games won't, COD games won't........

In fact, there will easily be more games that don't use UE5 and Lumen than games that do.
We don't know exactly what direction EA (we might see what they think soon enough) or Microsoft, and especially Take-Two Interactive, are going to take yet with regard to HW RT ...

The next several Ubisoft games are going to be sponsored by AMD, so they'll have a strong incentive to find alternatives or ideal optimizations for their partner. Activision dropped HW RT support in their latest iteration of COD because the console implementations weren't worthwhile enough for them to maintain the feature ...

Several of Sony's latest AAA game releases don't even use HW RT, so no vendor that is trailing behind the others in RT has anything to worry about there in the near future ...
 
We don't know exactly what direction EA (we might see what they think soon enough) or Microsoft, and especially Take-Two Interactive, are going to take yet with regard to HW RT ...

The next several Ubisoft games are going to be sponsored by AMD, so they'll have a strong incentive to find alternatives or ideal optimizations for their partner. Activision dropped HW RT support in their latest iteration of COD because the console implementations weren't worthwhile enough for them to maintain the feature ...

Several of Sony's latest AAA game releases don't even use HW RT, so no vendor that is trailing behind the others in RT has anything to worry about there in the near future ...

That has nothing to do with the point; the point was in regard to his claim that "Most games will be using UE5 with Lumen in the future".

Which, based on what has been announced so far, is incorrect, with no evidence to back it up.
 
Of course there is evidence for my claim.



These are all studios switching to UE5. https://mobidictum.biz/why-are-studios-switching-to-unreal-engine-5/
Sigh, no it's not.

Give me the list of confirmed UE5 games as of now, then a list of games not using UE5: which is there more of? Exactly.......

What you are doing is nothing but a reach, and I don't know why. Are you trying to downplay AMD's piss-poor RT performance by claiming everyone will be using software RT?
 
That has nothing to do with the point; the point was in regard to his claim that "Most games will be using UE5 with Lumen in the future".

Which, based on what has been announced so far, is incorrect, with no evidence to back it up.
Even if his point is incorrect, that doesn't mean the rest of his entire leading argument is null and void ...

For now, AMD doesn't have anything to worry about even with developers not using Lumen, since developers have an incentive to make their games both easy to maintain and performant on consoles. If that changes, it'll likely be in the extended future, around 2024, by which point they're just going to release new hardware again and reset the playing field once more ...
 
Even if his point is incorrect, that doesn't mean the rest of his entire leading argument is null and void ...

For now, AMD doesn't have anything to worry about even with developers not using Lumen, since developers have an incentive to make their games both easy to maintain and performant on consoles. If that changes, it'll likely be in the extended future, around 2024, by which point they're just going to release new hardware again and reset the playing field once more ...

I wasn't addressing his whole point, was I? Merely his UE5 claim.
 