GPU Ray Tracing Performance Comparisons [2021-2022]

Without any specific citation, the best HDR displays, according to your own source, don't even implement any G-Sync technology, as evidenced by the fact that they're fully functionally compatible with other vendors' hardware despite being branded G-Sync "Ultimate".
The very first and best display is G-Sync Ultimate; it has the module, so it delivers the best HDR experience by far.

The best HDR displays are the best because of their quality
Quality achieved by the module.

G-Sync is an open brand with closed quality-control certification and a dead-end technology ...
As shown above, your general statements are incorrect and far from the truth; the best HDR displays remain G-Sync Ultimate, and have been for several years.
Hardware tessellation is deprecated in Unreal Engine
In UE5, of course; Nanite makes tessellation redundant.

so maybe there's hope after all that Epic Games will kill off HW RT too
Hope for AMD and the supporters of their dud hardware RT, perhaps, but for the rest of the industry, that's not happening. Keep grasping at straws.

Spend 3+ years developing an in-house engine before they're able to officially start the project
Almost all in-house engines are DXR capable now, do your research.

Valley of the Ancient is a much more technically demanding demo
Yeah, LOL .. now you are just imagining things: an empty desert with a monster is somehow more technically demanding than an entire city with AI where you can fly, drive, and shoot things. Keep grasping at straws.
 
Says enough about your discussions. Comments like that don't go anywhere.
As I keep telling people time and again, the animosity towards HW-RT stems mostly from fans of AMD. It has nothing to do with the supposed "RT brings fps down" or "RT doesn't add much improvement" arguments, etc.; 90% of the arguments are from angry AMD fans who don't mind becoming anti-progress if it means damage-controlling AMD's lackluster RT performance.
 
Well, at least with UE5, Lurkmass has a point. The difference between HW and SW Lumen will be a lot less obvious than in current games, where the comparisons are made against screen-space reflections and SSGI; SW Lumen is a lot more competent than those techniques, and for outdoor scenes most people are not going to notice any difference. On the plus side, there isn't a noticeable performance impact either when comparing SW/HW Lumen in a GPU-limited scenario, provided the game is made with HW-RT limitations in mind.

But I suspect some other engines like Snowdrop will take a different approach.
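The SW/HW Lumen comparison above is easy to try in UE5 itself, since the hardware path is a single toggle. A minimal config sketch (the cvar names below are the stock UE5 ones; availability and defaults can differ between engine versions):

```ini
; DefaultEngine.ini (sketch) -- toggling the Lumen paths discussed above
[SystemSettings]
r.DynamicGlobalIlluminationMethod=1  ; 1 = Lumen GI
r.Lumen.HardwareRayTracing=1         ; 1 = hardware ray-traced Lumen, 0 = software (distance-field) Lumen
```

Flipping `r.Lumen.HardwareRayTracing` at runtime in a GPU-limited scene is the quickest way to check whether the difference is visible in practice.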
 
As I keep telling people time and again, the animosity towards HW-RT stems mostly from fans of AMD. It has nothing to do with the supposed "RT brings fps down" or "RT doesn't add much improvement" arguments, etc.; 90% of the arguments are from angry AMD fans who don't mind becoming anti-progress if it means damage-controlling AMD's lackluster RT performance.

I don't mind a good (even if heated) discussion; however, things like 'I hope they will kill off tech X' aren't really trying to forward a healthy technical discussion. A line like that shows one's personal, angry views. Today's Digital Foundry Direct covers this and rightfully puts AMD in its place for lacking meaningful competition in the ray tracing space, just as DF rightfully puts corporations in their place for the horrible PSO stutter problems in some engines.
We all want AMD to stay competitive, right? Intel is now investing heavily in the gaming GPU market, and they actually have competitive hardware on their first try. For RDNA3 there were rumors of RT and ML acceleration akin to NVIDIA's and Intel's solutions, and that would have been great. However, I think AMD wasn't really ready, and I guess we will be seeing competition (and better prices) when RDNA4 is around.
It's not just affecting the PC gaming space, but also the consoles. It's not long before the design of the next consoles has to be considered, and then we can wonder whether AMD is that good an option to sign contracts with. Possibly Intel might be a good idea if the BC (backwards compatibility) issues can be worked out somehow?
 
The very first and best display is G-Sync Ultimate; it has the module, so it delivers the best HDR experience by far.

Quality achieved by the module.
Well, now your statement goes against the source in your last post ...

Credit mostly goes to the manufacturers who put in the work to make quality displays. None of it deservedly goes to Nvidia since they didn't help them at all on a technical basis ...
As shown above, your general statements are incorrect and far from the truth; the best HDR displays remain G-Sync Ultimate, and have been for several years.
So basically outdated displays ...
In UE5, of course; Nanite makes tessellation redundant.
Epic Games dropped HW tessellation in the recent upstream branches of UE4 as well ...
Hope for AMD and the supporters of their dud hardware RT, perhaps, but for the rest of the industry, that's not happening. Keep grasping at straws.
I think Nanite has a good chance of killing off HW RT by itself. Animated Nanite foliage and terrain + HW RT is looking to be a really deadly combination on any hardware ...
Almost all in-house engines are DXR capable now, do your research.
Quite a few are abandoning their own DXR capable in-house engines like Crystal Dynamics or CD Projekt ...
Yeah, LOL .. now you are just imagining things: an empty desert with a monster is somehow more technically demanding than an entire city with AI where you can fly, drive, and shoot things. Keep grasping at straws.
It technically is since it's running at lower performance in comparison to the Matrix City sample ...
 
Credit mostly goes to the manufacturers who put in the work to make quality displays. None of it deservedly goes to Nvidia since they didn't help them at all on a technical basis ...
Those displays were made according to the standards of G-Sync Ultimate. You don't even see them under the supposedly more famous "FreeSync".

So basically outdated displays ...
State of the art; they are ahead of the curve compared to the rest. Maybe do some research first?
Epic Games dropped HW tessellation in the recent upstream branches of UE4 as well ...
Source?
Animated Nanite foliage and terrain + HW RT is looking to be a really deadly combination on any hardware ...
Wrong; in the RTX UE5 branch, they work really well with RTXGI and RTXDI. See the latest NVIDIA updates on the subject.
It technically is since it's running at lower performance in comparison to the Matrix City sample ...
Ever considered a lack of optimization? Since when does lower performance mean technically superior? Especially when you are rendering considerably less?
Quite a few are abandoning their own DXR capable in-house engines like Crystal Dynamics or CD Projekt ...
Many more remain. Meanwhile UE4 is accelerating the inclusion of RT in dozens of games every year, so there is that too.
 
Those displays were made according to the standards of G-Sync Ultimate. You don't even see them under the supposedly more famous "FreeSync".
That's false, since there are FreeSync displays that have the same quality as G-Sync Ultimate displays. Look no further than G-Sync Ultimate displays, which are FreeSync displays under the hood!
State of the art; they are ahead of the curve compared to the rest. Maybe do some research first?
Maybe you should post your citations first?
It's been deprecated since Unreal Engine version 4.26 ...
Wrong; in the RTX UE5 branch, they work really well with RTXGI and RTXDI. See the latest NVIDIA updates on the subject.
I haven't seen any RTX branches of UE 5.1 yet. And how does the RTX branch of UE5 perform on the Valley of the Ancient sample with HW RT?
Ever considered a lack of optimizations? Since when lower performance means technically superior? Especially when you are rendering considerably less?
Based on the Nanite visualization, the Valley of the Ancient terrain is more geometrically dense than the Matrix City sample, with no performance left to spare for HW RT. Adding HW RT to Valley of the Ancient would make it unplayable on most systems ...
 
That's false, since there are FreeSync displays that have the same quality as G-Sync Ultimate displays.
Nope, there is none.

Look no further than G-Sync Ultimate displays which are FreeSync displays under the hood!
Incorrect again; G-Sync Ultimate demands certain criteria not met by most FreeSync ("G-Sync Compatible") displays.
It's been deprecated since Unreal Engine version 4.26
Nope, never happened to UE4.

The release notes weren’t too clear on this matter. We’ve since checked with the dev team, and tessellation still works for the time being. However, it will be deprecated and replaced with Nanite in UE5.

It's only deprecated in UE5, because of Nanite.

Based on the Nanite visualization, Valley of the Ancient terrain is more geometrically dense than the Matrix City
Once more, the Matrix City is the demo that most resembles an actual open-world game.
 
Except it's way worse, since the consoles were able to render at a higher internal resolution with HW RT on in the city sample, as opposed to a lower resolution with no HW RT enabled in the valley demo ...

If I set The Matrix demo to 1440p at Ultra settings, my 3060 Ti drops to 13 fps with DLSS on and 6 fps with no upscaling at all ... 6 fps!!

Just because it scales low enough to run on console doesn't mean it's not extremely demanding at the top end.
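For scale, here is the rough arithmetic behind those 1440p numbers (assuming DLSS was in Quality mode at a ~0.667 per-axis render scale; the post doesn't say which mode was actually used):

```python
# Back-of-envelope comparison of shaded pixel counts vs the observed speedup.
# The 0.667 render scale is an assumption (DLSS "Quality" mode), not stated above.
native = 2560 * 1440                           # native 1440p pixel count
dlss = int(2560 * 0.667) * int(1440 * 0.667)   # assumed internal render resolution
pixel_ratio = native / dlss                    # ~2.25x fewer pixels shaded
observed = 13 / 6                              # ~2.17x observed fps speedup
print(f"pixel ratio {pixel_ratio:.2f}x vs observed speedup {observed:.2f}x")
```

The two ratios lining up this closely suggests the demo is almost entirely bound by per-pixel shading cost at those settings.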
 
Here you go.



The Ancient Valley demo was made to showcase the capabilities of Nanite in creating extremely dense geometry; it wasn't made so that games create geometry this dense. The one created to showcase games is the Matrix demo.

It has nothing to do with that. When rendering terrain, Nanite overlaps geometry, and this is not compatible with ray tracing. Any UE5 game with terrain rendering and foliage will not have HW-RT. If you read all the Nanite presentations they've done, you would understand this.

For example, the UE5 Witcher game won't use hardware RT because of terrain rendering.

EDIT: Long term, they are thinking about changing the Nanite terrain rendering technology.
 
Here you go.

They didn't specify that they were using Nanite for foliage in the video ...
The Ancient Valley demo was made to showcase the capabilities of Nanite in creating extremely dense geometry; it wasn't made so that games create geometry this dense. The one created to showcase games is the Matrix demo.
So then why did Epic Games add the world position offset feature in UE 5.1 to enable animated Nanite foliage? With this change, Epic Games are just making it easier to create content that is incompatible with HW RT ...

Developers have even more reasons to turn off HW RT now if they're using Nanite to render animated foliage, objects/characters/actors, and terrain all at the same time ...
 
There are two UE5 games, and at least one of them, Silent Hill 2, uses Nanite and Lumen without HW-RT and runs at 60 fps. If Tekken 8 uses Nanite and Lumen at 60 fps, that means it is not using HW-RT on PS5 and Xbox Series X either. The real-time trailer running on PS5 is 60 fps.

Outside of a city without tons of foliage like the Matrix demo, I suspect many UE5 games won't use HW-RT; the same goes if they want to run at 60 fps on current-gen consoles.
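The 60 fps argument above is easiest to see as a frame-time budget. A quick sketch; the per-feature Lumen costs below are illustrative round numbers, not measurements from these games:

```python
# Frame-time budgets behind the 60 fps console argument above.
# The Lumen GI costs are assumed figures for illustration only.
budget_60 = 1000 / 60     # ~16.7 ms total per frame at 60 fps
budget_30 = 1000 / 30     # ~33.3 ms total per frame at 30 fps
sw_lumen_ms = 4.0         # assumed cost of software (distance-field) Lumen GI
hw_lumen_ms = 8.0         # assumed cost of hardware-RT Lumen GI

for label, cost in [("SW Lumen", sw_lumen_ms), ("HW Lumen", hw_lumen_ms)]:
    print(f"{label}: {budget_60 - cost:.1f} ms left at 60 fps, "
          f"{budget_30 - cost:.1f} ms left at 30 fps")
```

With numbers in this ballpark, a heavier GI path eats roughly half the 60 fps budget before geometry, animation, and post-processing are even counted, which is why 60 fps titles tend to drop the hardware path.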
 
Nanite doesn't overlap anything; it's completely up to designers how they compose scenes, whether by kitbashing like in the Valley demo or more cleverly, as it's done in basically any game.

If it were so easy, I suppose they would not want to change the Nanite terrain technology; kitbashing works well with Nanite, it's just not compatible with hardware RT. Land of Nanite was made with more care on this side, but it's probably not compatible with RT either.
 