GPU Ray Tracing Performance Comparisons [2021-2022]

They are not bad in comparison with AMD.
The 99th-percentile frame times are a bucket of frames that is often affected by either shader compilation or resource management. Looking at how gracefully these frame times decline with the amount of video memory, these are likely resource-allocation and data-movement limited frames.
Please do post these graphs showing the graceful frame times because every single test posted so far says different.
 

CPU performance: not too hot. There is not much performance scaling. Looks like a DICE fumble, unfortunately. He didn't test with an AMD GPU, so it's possible Nvidia's higher overhead is a factor.
That's unfortunate; curious to know whether they moved to GPU-based dispatch or not as a factor in the CPU load.
 
They were posted on the previous page - GPUs with more VRAM have better frame times (the 1% low FPS metric)
I must have misunderstood your earlier post; I thought there were actual graphs showing the frame-time spread, which might give a clue whether they're what you were suggesting, and not just that minimum frame times follow memory size.
 
I've gotta say that 4K is essentially an obsolete/pointless resolution now in the days of FSR/NIS/DLSS/TAAU. Thanks to NIS, literally every game has access to one or more of these technologies (at least if you have an NV card) and the delta from 4K is barely perceptible for an enormous performance uplift.
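As a rough illustration of where that uplift comes from, here is some back-of-the-envelope pixel-count math. The per-axis scale factors are the commonly cited ones for DLSS/FSR quality modes and are illustrative assumptions, not measured numbers; shading cost also doesn't scale perfectly linearly with pixel count, so treat this as a sketch.

```python
# Rough pixel-count math for upscaling to 4K (3840x2160).
# Scale factors are commonly cited per-axis render scales for
# DLSS/FSR-style quality modes; treat them as illustrative assumptions.
NATIVE_W, NATIVE_H = 3840, 2160

modes = {
    "Native 4K": 1.00,
    "Quality (~67%)": 0.67,
    "Balanced (~59%)": 0.59,
    "Performance (50%)": 0.50,
}

native_pixels = NATIVE_W * NATIVE_H
for name, scale in modes.items():
    w, h = int(NATIVE_W * scale), int(NATIVE_H * scale)
    share = (w * h) / native_pixels
    print(f"{name:<18s} {w}x{h}  {share:4.0%} of the native pixel load")
```

Even the "Quality" mode shades well under half the pixels of native 4K, which is why the performance uplift from these upscalers is so large.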
There are several user reports of games which don't work with NIS. And if it's the same implementation in the drivers as in the SDK, it's DX11/12 only, which is a far cry from "literally every game". Also, at least some of the user reports of games where NIS doesn't work are DX11 and/or DX12, where it actually should work.
Only FSR is actually universal, and even it requires a 3rd-party injection tool.
 
There are several user reports of games which don't work with NIS. And if it's the same implementation in the drivers as in the SDK, it's DX11/12 only, which is a far cry from "literally every game". Also, at least some of the user reports of games where NIS doesn't work are DX11 and/or DX12, where it actually should work.
Not surprising, since there are some open issues related to NIS being worked on. The bright side is that once the issues are resolved, almost all games optimized by GFE will be able to use NIS.
 
What "every single test"? GameGPU and ComputerBase tests show RTX 3000 GPUs having excellent minimum fps at 4K with RT.

The 3080 Ti is 30% better than the 6900 XT
https://gamegpu.com/action-/-fps-/-tps/battlefield-2042-test-gpu-cpu

The 3080 is 25% better than the 6800 XT
https://www.computerbase.de/2021-11...iagramm-battlefield-2042-3840-2160-raytracing
Those were posted earlier in the thread. Like I said, I misunderstood his post to mean there would be graphs with the frame-time spread, showing his suggestion that high frame times are a bucket of shader-compilation or resource-allocation affected frames.

It's curious how things change. Not that long ago, averages were suggested to be practically irrelevant compared to frame spikes, the cause of stutters, not to mention minimum frame times. Now suddenly just the average is supposed to matter again. I wonder what changed in between :rolleyes:
 
The graph is 1440p, not 4K. Besides, the majority of the 3080's frame times are measurably lower than the 6800 XT's.
That 1440p graph suggests a worse experience on the 3080, TBF. 4K could very well be different, but highly erratic and variable frame times produce a worse experience even when the average fps is higher. I suspect this is mostly down to DX12 being the only API; it has always been poor in Frostbite, particularly on Nvidia GPUs.
 
Like I said, I misunderstood his post to mean there would be graphs with the frame-time spread, showing his suggestion that high frame times are a bucket of shader-compilation or resource-allocation affected frames.
The sole purpose of the 99th-percentile frame times, aka the 1% low "FPS" metric, is to highlight outliers; you don't need graphs to figure that out unless this is your very first time dealing with frame times.
Shader compilation and resource allocation are the two most common causes of stutters, so guess which frames will fall into the 99th-percentile frame-time bucket?
Besides, you probably didn't notice, but look at the results for the 3070 Ti here; drops like that usually happen due to memory-allocation problems - memory leaks, etc.
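For reference, here is a minimal sketch of how a 1% low figure is typically derived from a frame-time log. It assumes the common "average of the slowest 1% of frames" approach; exact methodology varies between reviewers.

```python
# Minimal sketch: derive average fps and "1% low" fps from a frame-time log.
# Assumes the common "average of the slowest 1% of frames" approach;
# reviewers differ in exact methodology (strict percentile vs. averaged tail).
def fps_metrics(frame_times_ms):
    if not frame_times_ms:
        raise ValueError("need at least one frame time")
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

    # The slowest 1% of frames is the outlier bucket (stutters, hitches).
    worst = sorted(frame_times_ms, reverse=True)
    tail = worst[:max(1, len(worst) // 100)]
    low_1pct_fps = 1000.0 / (sum(tail) / len(tail))
    return avg_fps, low_1pct_fps

# Example: a mostly smooth run with a few long frames
# (e.g. shader compilation or an allocation hitch).
times = [16.7] * 500 + [50.0, 80.0, 120.0]
print(fps_metrics(times))  # average stays high, the 1% low collapses
```

The point of the example is that a handful of long frames barely moves the average but dominates the 1% low, which is exactly why the metric is used to surface stutter.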

I wonder what changed in between
People have changed. I miss the professional reviews from techreport.com, but the other guys simply don't care what they're being fed.
 
The sole purpose of the 99th-percentile frame times, aka the 1% low "FPS" metric, is to highlight outliers; you don't need graphs to figure that out unless this is your very first time dealing with frame times.
Shader compilation and resource allocation are the two most common causes of stutters, so guess which frames will fall into the 99th-percentile frame-time bucket?
Besides, you probably didn't notice, but look at the results for the 3070 Ti here; drops like that usually happen due to memory-allocation problems - memory leaks, etc.
I doubt that it is VRAM related. Look at the 6600 XT's 1% low result on the 4K graph in comparison to any Nvidia 8 GB card - or even the 12 GB 3060, for that matter.
It's interesting, though, that out of all the Nvidia cards the 2080 Ti seems not to be affected. May be a measurement glitch.
 
People have changed. I miss the professional reviews from techreport.com, but the other guys simply don't care what they're being fed.

Wow, after having refrained from checking this thread for a few weeks, apparently you are still feeling hurt by someone telling you that an amateur can compare the same scenes with different settings, present the performance figures, and say which one they prefer.

If you want professional views, from now on you ought to take part in GPU conferences, read papers, or follow something else that is geared towards experts in the field.

It is also interesting how you still use the word professional. Are you requiring proof of a formal education or working experience in the field to meet that criterion, and if so, which sites fit it?
 
I've gotta say that 4K is essentially an obsolete/pointless resolution now in the days of FSR/NIS/DLSS/TAAU
I love DLSS and TSR, but FSR and NIS are soooooo lossy that I would never recommend using them unless there are no better options (turning down some graphics settings would be a better option in this case imo).
Now everybody can see how lossy the spatial upscaling is by just enabling NIS on the Windows desktop, where you actually need to distinguish text details just to read something - https://imgsli.com/ODI5MjE
 
If you want professional views, from now on you ought to take part in GPU conferences, read papers, or follow something else that is geared towards experts in the field.
That's what I actually do.

Are you requiring proof of a formal education or working experience in the field to meet that criterion, and if so, which sites fit it?
Experience would suffice; there are quite a few people here who read papers, watch GPU conferences, and know the math and algorithms behind effects, so they can make a better judgment about what's needed, what's important and what's not.
I don't get why anybody should care about the opinion of people who likely don't even know how the effects they review work or how they should look, let alone the math and algorithms.
 
That's what I actually do.

Experience would suffice; there are quite a few people here who read papers, watch GPU conferences, and know the math and algorithms behind effects, so they can make a better judgment about what's needed, what's important and what's not.
I don't get why anybody should care about the opinion of people who likely don't even know how the effects they review work or how they should look, let alone the math and algorithms.

Sure, let's not care about opinions anymore; sounds like a good idea. The majority of GPU reviewers are targeting PC gamers and showing them how different GPUs stack up; your expectations of them are way off, and a lot of sites will fail to meet your criteria.
Your position in this debate just sounds weird. If game X gets 30 fps on high-end GPUs at high settings with RT enabled at 1440p, and we see screenshots or even a video of the benchmarked scene, am I interpreting it right that you really think these results are worthless unless the reviewer is able to show that he understands the math? And if they can show proof that they do have the technical expertise, then it's all fine and valid?

PC gamers are convinced by the results they get; if gamers do not think that tessellation or ray tracing or any other upcoming feature is worth the performance loss, it will get a bad reputation until the swan-song game comes along that convinces everyone of its worth.
GPU reviews and game benchmarks are for the most part quite straightforward: they try to find a scene which is easily repeatable, where you can easily compare the settings, and then they present the results. After that, it is a fully subjective opinion of the reviewer and their individual readers whether enabling or disabling individual features is worth it or not.

Go back to the release of DX11, or DX10, and their respective launch benchmarks, and the same story of accusing them of being crap occurred, despite all the explanatory texts on multiple sites about why they were better. Same deal with the now six-year-old DX12: elite programmers complained about DX11 being too limited and demanded lower-level access, then the first benchmarks were released with worse performance than DX11, and PC gamers said it was crap. Or an opposite example: Vulkan for Doom 2016 getting a lot of praise for the massive performance boost.
 
PowerVR has outlined 6 levels of RT acceleration:
  • Level 0: Legacy solutions/CPUs
  • Level 1: Software on traditional GPUs
  • Level 2: Ray/box and ray/tri-testers in hardware (AMD GPUs/consoles)
  • Level 3: Bounding Volume Hierarchy (BVH) processing in hardware (NVIDIA RTX GPUs)
  • Level 4: BVH processing and coherency sorting in hardware (PowerVR IMG CXT GPU)
  • Level 5: Coherent BVH processing with Scene Hierarchy Generation (SHG) in hardware

https://www.imaginationtech.com/whitepapers/ray-tracing-levels-system/
https://www.imaginationtech.com/ray-tracing/
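The taxonomy maps naturally onto a simple enumeration for discussion purposes; the sketch below just restates the list above with the same example hardware noted in comments (a convenience, not an official Imagination API):

```python
from enum import IntEnum

# Imagination's ray-tracing acceleration levels, restated as an enum.
# Hardware examples in comments mirror the list above; this is only a
# convenience for discussion, not an official Imagination API.
class RTLevel(IntEnum):
    LEGACY = 0              # Level 0: legacy solutions / CPUs
    SOFTWARE_ON_GPU = 1     # Level 1: software on traditional GPUs
    BOX_TRI_TESTERS = 2     # Level 2: ray/box and ray/tri testers in HW (AMD GPUs/consoles)
    BVH_IN_HW = 3           # Level 3: BVH processing in HW (NVIDIA RTX GPUs)
    BVH_PLUS_COHERENCY = 4  # Level 4: BVH processing + coherency sorting in HW (PowerVR IMG CXT)
    BVH_PLUS_SHG = 5        # Level 5: coherent BVH processing with SHG in HW

print(RTLevel.BVH_IN_HW, int(RTLevel.BVH_IN_HW))  # RTLevel.BVH_IN_HW 3
```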
 