GPU Ray Tracing Performance Comparisons [2021-2022]

It's looking like MSRP comparisons could shift from last gen, though. The 7800 will likely be cheaper than the 4070, whereas the 6800 was dearer than the 3070, at least going by the effectively fictional MSRPs. Lower pricing does appear to be AMD's answer to weak ray tracing this gen, which is disappointing. Hopefully next gen they can finally compete on all fronts but still keep Nvidia in check with more reasonable pricing.

Alternatively, pricing only appears lower because NV decided to jack up the price on the 4080. The top end 7900 XTX is launching at the same MSRP as the top end 6900 XT from last generation. If AMD were truly going after the price play as they did with the 4870 and 4850, these would be launching at an even lower price point, IMO.

It'll be interesting to see if AMD manages to have similar RT performance with the 7900 XTX as the 4080 but with much greater raster performance. Even if it's still a bit slower at RT it's also still a bit cheaper.

Regards,
SB
 
The 4080 has RT and raster performance about 20% faster than a 3090 Ti, if I'm remembering the Nvidia graph properly. The 7900 XTX will win easily in raster, but its RT performance is below that of a 3090 Ti.
 
We should hope that the 7900 XTX will not be slower than a 3080 12GB in ray tracing... When was the last time a new generation couldn't beat the old one? The 7900 XT will be even slower than a 3090. Production cost is much higher, too: a 300mm^2 5nm chip plus 185mm^2 of 6nm MCDs, and it performs worse than an 8nm, 628mm^2 chip.
 
20% faster how? In Spider-Man's heavy RT scenes at 4K resolution, the 3080 is 65% faster than the 6800 XT using max RT settings, and 45% faster using medium RT settings.


Even without heavy scenes, the 3080 Ti is 45% faster than the 6900 XT at both 1440p and 2160p using max RT settings.
Ok, fair enough.

These were the only benchmarks I could find:

 
Yeah, this is going to vary a lot because Spidey has no built-in benchmark. In some scenes with multiple reflective surfaces close by, Ampere/Turing will murder RDNA2. In other scenes where there is little in terms of RT reflections, there will barely be a difference.
 
We should hope that the 7900 XTX will not be slower than a 3080 12GB in ray tracing... When was the last time a new generation couldn't beat the old one? The 7900 XT will be even slower than a 3090. Production cost is much higher, too: a 300mm^2 5nm chip plus 185mm^2 of 6nm MCDs, and it performs worse than an 8nm, 628mm^2 chip.

RDNA2 vs RTX2000 series in RT 👀
 
Not really indicative of real-world performance though; in heavy RT scenes like Dying Light 2, a 2080 Ti is on par with a 6950 XT, and in other games it can beat it.
Games are rarely limited by triangle intersection speed in RT and are mostly limited by shading.
Also, Turing+ can run AABB intersections in parallel to shading, while I'm not sure about RDNA2.
And AMD may have issues reaching that peak number because of ray divergence negatively impacting their traversal evaluation.
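(Side note, for anyone wondering what "AABB intersections" means here: BVH traversal boils down to repeatedly testing a ray against axis-aligned bounding boxes with a slab test, roughly like the C++ sketch below. This is an illustrative toy, not any vendor's implementation, and the Vec3/rayBoxHit names are made up. The box test itself is hardware-accelerated on both RDNA2 and Turing+; the point of contention in this thread is whether the loop that issues these tests, i.e. the traversal, runs in fixed-function hardware or in shader code, which is where divergent rays hurt.)

```cpp
#include <algorithm>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Slab test: does a ray (origin + precomputed 1/direction) hit an
// axis-aligned bounding box within tMax? This is the per-node test that
// RT hardware performs millions of times per frame.
bool rayBoxHit(Vec3 orig, Vec3 invDir, Vec3 boxMin, Vec3 boxMax, float tMax) {
    float tx1 = (boxMin.x - orig.x) * invDir.x, tx2 = (boxMax.x - orig.x) * invDir.x;
    float ty1 = (boxMin.y - orig.y) * invDir.y, ty2 = (boxMax.y - orig.y) * invDir.y;
    float tz1 = (boxMin.z - orig.z) * invDir.z, tz2 = (boxMax.z - orig.z) * invDir.z;

    float tNear = std::max({std::min(tx1, tx2), std::min(ty1, ty2), std::min(tz1, tz2)});
    float tFar  = std::min({std::max(tx1, tx2), std::max(ty1, ty2), std::max(tz1, tz2)});

    // Hit if the ray enters the box before it exits, and within [0, tMax].
    return tNear <= tFar && tFar >= 0.0f && tNear <= tMax;
}

int main() {
    Vec3 orig{0, 0, 0};
    Vec3 invDir{1.0f, 1e8f, 1e8f};  // ray pointing almost exactly along +x
    bool hit = rayBoxHit(orig, invDir, {1, -1, -1}, {2, 1, 1}, 100.0f);
    printf("hit = %d\n", hit);      // prints "hit = 1"
}
```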
 
So is RDNA3 accelerating BVH traversal in hardware or not? The AMD slide did say improvements were made to BVH traversal.
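For context on what "BVH traversal in hardware" would actually replace: when traversal is not fixed-function, each ray runs a loop along the lines of the C++ sketch below on the shader cores. The node layout and names are invented for illustration and do not reflect AMD's real data structures. A wavefront has to keep iterating this loop until its slowest, most divergent ray is done, which is why divergence is so costly there.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

struct Vec3 { float x, y, z; };

// Toy BVH node: either an interior node with two children, or a leaf
// referencing a range of triangles.
struct Node {
    Vec3 boxMin, boxMax;
    int32_t left = -1, right = -1;    // child node indices, -1 means "leaf"
    int32_t firstTri = 0, triCount = 0;
};

// Same slab test as in the earlier sketch, condensed.
static bool hitBox(const Node& n, Vec3 o, Vec3 invD, float tMax) {
    float tx1 = (n.boxMin.x - o.x) * invD.x, tx2 = (n.boxMax.x - o.x) * invD.x;
    float ty1 = (n.boxMin.y - o.y) * invD.y, ty2 = (n.boxMax.y - o.y) * invD.y;
    float tz1 = (n.boxMin.z - o.z) * invD.z, tz2 = (n.boxMax.z - o.z) * invD.z;
    float tNear = std::max({std::min(tx1, tx2), std::min(ty1, ty2), std::min(tz1, tz2)});
    float tFar  = std::min({std::max(tx1, tx2), std::max(ty1, ty2), std::max(tz1, tz2)});
    return tNear <= tFar && tFar >= 0.0f && tNear <= tMax;
}

// Walks the BVH and collects triangles the ray might hit. In a real tracer
// each candidate would then go through a ray/triangle intersection test.
std::vector<int32_t> traverse(const std::vector<Node>& bvh, Vec3 o, Vec3 invD, float tMax) {
    std::vector<int32_t> candidates;
    int32_t stack[64];
    int sp = 0;
    stack[sp++] = 0;                                 // start at the root node
    while (sp > 0) {                                 // this per-ray loop is the "traversal"
        const Node& n = bvh[static_cast<size_t>(stack[--sp])];
        if (!hitBox(n, o, invD, tMax)) continue;     // missed the box: prune the whole subtree
        if (n.left < 0) {                            // leaf: record its triangle range
            for (int32_t i = 0; i < n.triCount; ++i)
                candidates.push_back(n.firstTri + i);
        } else {                                     // interior: descend into both children
            stack[sp++] = n.left;
            stack[sp++] = n.right;
        }
    }
    return candidates;
}
```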
 
Games are rarely limited by triangle intersection speed in RT and are mostly limited by shading.
Also, Turing+ can run AABB intersections in parallel to shading, while I'm not sure about RDNA2.
And AMD may have issues reaching that peak number because of ray divergence negatively impacting their traversal evaluation.
Sounds like they need a ray sorting unit or shader reordering.
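Roughly, "ray sorting" / "shader reordering" means regrouping work between tracing and shading so that adjacent lanes run the same hit shader instead of diverging. Below is a minimal CPU-side illustration of the idea in C++; the Hit struct and function name are mine, and this is not how NVIDIA's SER or any hardware sorting unit is actually implemented.

```cpp
#include <algorithm>
#include <cstdint>
#include <cstdio>
#include <vector>

// Hypothetical hit record produced by tracing a batch of rays.
struct Hit {
    uint32_t rayIndex;    // which ray produced this hit
    uint32_t materialId;  // which hit shader / material it needs
};

// Reorder hits so that hits needing the same material are contiguous.
// On a GPU, this is what lets a whole wave execute one hit shader together
// instead of serially executing every divergent branch.
void sortHitsForCoherentShading(std::vector<Hit>& hits) {
    std::stable_sort(hits.begin(), hits.end(),
                     [](const Hit& a, const Hit& b) { return a.materialId < b.materialId; });
}

int main() {
    std::vector<Hit> hits = {{0, 7}, {1, 2}, {2, 7}, {3, 2}, {4, 5}};
    sortHitsForCoherentShading(hits);
    for (const Hit& h : hits)
        printf("ray %u -> material %u\n", h.rayIndex, h.materialId);  // now grouped by material
}
```

On real hardware the grouping would happen per wave, and the sort key could also include ray direction or hit position, but the principle is the same: coherent batches shade efficiently, divergent ones don't.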
 
And the aggressive, anti-RT position you are showing is indicative of the opposite, too. Funny how these arguments only show up when AMD is beaten time and again in RT, which again indicates that these fake arguments stem from a defensive position around AMD's weak spot in RT.

Yep, just stick around this thread, and enjoy the RT ride.
Either way only time will tell ...
PhysX was a precursor for GPU particles. Just like TXAA was a precursor for TAA.
That must be a very polite way of describing it as a dead end ...

Mantle was a precursor to more explicit APIs but would you award it a similar amount of credit by your own logic ?
NVIDIA forced display makers to make quality displays. The point of the G-Sync module was never about making stuff exclusive to NVIDIA; it was to force displays to have some sort of quality design to deliver an optimal variable refresh rate experience. FreeSync encouraged the spread of trash displays, and later had to adopt several tiers to distinguish good ones from bad ones. With G-Sync you knew your display was good. That's why display makers rushed to get the G-Sync badges, because having one meant the display was good. So you see, you are contradicting yourself again: what won here is the standard that promoted quality, not the standard that made a mess out of quality and was made in a rush to steal headlines with no regard for quality.

If you care about proper implementations and specifications, then you know G-Sync is the one that ended up winning.
And look where that got Nvidia ? Everyone else is implementing alternative technology that's not even tied to the original brand! G-Sync technology became both redundant and obsolete overnight once Nvidia decided to embrace the industry-backed alternative that didn't originate from them. No one else but Nvidia "implements" the G-sync "standard" in the industry ...

Can you make any real guarantees that the original G-sync displays will still somehow work on Nvidia HW in the next 5 years ? How about in 10 years ?
Who cares? Unity has RT, that's what matters. Also you are wrong, ARM GPUs are now RT capable. Only Apple is left behind.
Well, consumers and technical outlets certainly care, because a port of an iOS game isn't compelling content when the design limitations are still carried over from that platform. Adding ray tracing to a port of a mobile game doesn't change the fact that people are still playing a game with low-budget production values ...

It's not just Apple that has yet to divulge any details; Qualcomm doesn't have any public plans for HW RT so far either. Together, Apple and Qualcomm dominate the high-end mobile device market, and Unity Technologies, with their mobile-centric engine, can't ignore their voices, since the demographics they appeal to (America, high-income European countries) are more likely to pay for more lucrative services than the average ARM graphics user ...
Two useless demos still; the real demo is the Matrix demo, you know ... the one where you walk, fly and drive around like an actual game. The one that actually supports HW-RT in a spectacular way. When Epic wanted to make a next-gen demo, they used this demo, and they used HW-RT to pack the punch. And that's what the rest of the industry is doing.
How were the other demos any less "real" when they had interactivity as well ?
The UE 5.1 RTX branch fully supports Nanite and foliage with very good performance. UE5 is a constantly changing landscape ... you are naive if you think Epic will risk losing NVIDIA's long-standing support. More RT features and enhancements will come, as Epic stated. So stick around and watch them do it.
Can the latest RTX branch actually run those "two useless demos" with HW RT yet ?

Notice how you mentioned that Nvidia's "support" comes in the form of their own proprietary fork (the RTX branch) as opposed to actually contributing back to the upstream/master branch ? Epic Games doesn't really care about the RTX branch, since it's mostly full of Nvidia's libraries for RT effects which Epic Games themselves don't maintain at all. The RTX branch could burn in hell and Epic Games still wouldn't care ...

Do you seriously think that developers will actually prefer the RTX branch over Epic Games' own upstream/master branch, where it's far more likely that Epic Games will provide superior technical support for their own product ?
 
Mantle was a precursor to more explicit APIs but would you award it a similar amount of credit by your own logic ?
Yes, I would, of course. The only teething pain here is that DX12 is causing equal trouble for both AMD and NVIDIA for no good gains.
Can you make any real guarantees that the original G-sync displays will still somehow work on Nvidia HW in the next 5 years ? How about in 10 years ?
NVIDIA supports their stuff longer than anyone else.

No one else but Nvidia "implements" the G-sync "standard" in the industry ...
Once display makers started making quality displays, the need for the module became irrelevant, but the quality standards remain: almost all premium displays follow the G-Sync standards now. The G-Sync implementation is about the standard, not the module, and the majority of new displays now follow the G-Sync standards, which is why they get to carry the badge.

It's not just Apple that has yet to divulge any details; Qualcomm doesn't have any public plans for HW RT so far either.
Only a matter of time now.
How were the other demos any less "real" when they had interactivity as well ?
They were scripted, and had no gameplay. They were also far less graphically impressive.

Can the latest RTX branch actually run those "two useless demos" with HW RT yet ?
Who cares? They are not real demos. The Matrix demo is the only one that matters.

Do you seriously think that developers will actually prefer the RTX branch
You simply don't know. At any rate, don't change the subject: UE5 is constantly adding new HW-RT features, and you don't get to make false sweeping statements about an imaginary animosity towards HW-RT.
 
If history shows anything about UE development, the main branch tends to incorporate features from IHV branches once these are fully fleshed out.
It has already happened with UE5, with Lumen getting RT h/w support options.
 
NVIDIA supports their stuff longer than anyone else.
Long enough that they will somehow never obsolete the technology ? I find that to be doubtful when no recent displays have implemented the technology ...

Technology needs inertia in order to remain viable, so how will the original G-Sync standard do that with virtually no one else implementing it ?
Once display makers started making quality displays, the need for the module became irrelevant, but the quality standards remain: almost all premium displays follow the G-Sync standards now. The G-Sync implementation is about the standard, not the module, and the majority of new displays now follow the G-Sync standards, which is why they get to carry the badge.
You're conflating the concept of quality with standards ...

Quality =/= Standards
Only a matter of time now.
I wouldn't be so sure, and how do you know their HW RT implementations won't end up as bigger duds than AMD's ?
They were scripted, and had no gameplay. They were also far less graphically impressive.


Who cares? They are not real demos. The Matrix demo is the only one that matters.
There actually was gameplay in Valley of the Ancients, where you can engage in a small combat sequence against an Ancient and shoot objects as well. Lumen in the Land of Nanite also showed gameplay, in the form of the context-sensitive platforming mechanics popularized by the Assassin's Creed franchise, so how exactly are they any less real from a gameplay perspective ?
You simply don't know. At any rate, don't change the subject: UE5 is constantly adding new HW-RT features, and you don't get to make false sweeping statements about an imaginary animosity towards HW-RT.
I wonder if Epic Games will continue developing/maintaining HW RT especially if their biggest customers (AAA game developers) don't ship the feature on consoles ...
 