Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]


Looking at the discussion here, most people really seem to think Raytracing is not worth it. Kind of a shame, really.

I can't wait until RT is finally a standard, always enabled regardless of the settings, and more efficient as well. Then that mindset will hopefully change.

If people thought AMD’s products provided greater value AMD would enjoy more than 8% market share.
 
This is not reflected in market share numbers between IHVs, nor reflected in the number of games/engines adopting ray tracing.
The question is whether that is because of Raytracing or due to a variety of other factors. Nvidia had a much higher market share than AMD before RTR was a thing, so I don't think this is a solid argument that proves RT is important to many gamers.

Regarding games/engines, I would go along with you if UE5 did not exist.


Many developers are just going to use Software Lumen because it has fewer compromises in terms of performance. It still looks very competent.
 

Looking at the discussion here, most people really seem to think Raytracing is not worth it. Kind of a shame, really.

I can't wait until RT is finally a standard, always enabled regardless of the settings, and more efficient as well. Then that mindset will hopefully change.
I would not say that is in any way representative - those who care enough to post are those who are already interested enough in the topic to make posts about it. It is not a general question put to random participants or anything.

If you go online and directly ask, you get selection bias immediately.
This is not reflected in market share numbers between IHVs, nor reflected in the number of games/engines adopting ray tracing.
That too.

In general it is easy to see that the voices against ray tracing in every online forum, poll, Reddit, subreddit, whatever are not at all representative of the IHV split or the way game devs are doing things (which are *actually* representative of consumer pressures).
 
I would not say that is in any way representative - those who care enough to post are those who are already interested enough in the topic to make posts about it. It is not a general question put to random participants or anything.

If you go online and directly ask, you get selection bias immediately.

That too.

In general it is easy to see that the voices against ray tracing in every online forum, poll, Reddit, subreddit, whatever are not at all representative of the IHV split or the way game devs are doing things (which are *actually* representative of consumer pressures).
That is true, it's not very representative. But that is just the general vibe I get from other social media networks too.

I guess the only way to get representative data would be if games tracked, per hardware configuration, whether RT is enabled or not and phoned home, so to speak. I guess CDPR has implemented something like that in CP2077, called game analysis.
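For illustration only, here is a minimal sketch of what such a phone-home report could look like. The endpoint, field names, and use of Python's standard library are all hypothetical and are not based on CDPR's actual implementation:

```python
# Hypothetical RT-usage telemetry sketch; the field names and endpoint are
# made up to illustrate the idea of tracking, per hardware, whether RT is
# enabled and "phoning home". This is not any game's real telemetry.
import json
import platform
from urllib import request


def build_report(gpu_name: str, rt_enabled: bool, quality_preset: str) -> bytes:
    """Collect the minimal data needed to answer 'who actually turns RT on?'."""
    report = {
        "gpu": gpu_name,                   # e.g. "GeForce RTX 3060"
        "os": platform.system(),           # coarse hardware/OS context
        "rt_enabled": rt_enabled,          # the one field this debate needs
        "quality_preset": quality_preset,
    }
    return json.dumps(report).encode("utf-8")


def phone_home(payload: bytes, url: str = "https://telemetry.example.invalid/rt-usage") -> int:
    """POST the report to a (made-up) analytics endpoint and return the HTTP status."""
    req = request.Request(url, data=payload,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req, timeout=5) as resp:
        return resp.status


# Usage (would require a real endpoint):
# phone_home(build_report("GeForce RTX 3060", rt_enabled=True, quality_preset="High"))
```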

But as I have mentioned earlier, market share is not representative data on that matter either.
 
The question is whether that is because of Raytracing or due to a variety of other factors. Nvidia had a much higher market share than AMD before RTR was a thing, so I don't think this is a solid argument that proves RT is important to many gamers.
We've had crypto bubbles before, so a crypto hangover is nothing new. NVIDIA prices were consistently higher than ever while AMD prices were consistently way lower than NVIDIA's, AMD was more or less on par with NVIDIA in rasterization (for the first time in generations), and AMD cards also had higher VRAM capacity than NVIDIA's counterparts and lower power consumption too, so all factors were pro AMD this time and against NVIDIA.

Yet, during 2022, NVIDIA captured its highest market share in history, much higher than before. The 2060 and 3060 are the most popular ray tracing cards and the most popular rasterization cards at the same time, and all of NVIDIA's high-end RT cards are significantly outselling their competitors. All of that has to count for something.

In my opinion, the perception of NVIDIA's GPUs is that they are technologically superior: if you want performance there is DLSS, if you want quality there is much higher ray tracing performance and game support, and if your interest is professional work there is CUDA, OptiX and several ML applications, etc. That perception led to said results.
 
The question is whether that is because of Raytracing or due to a variety of other factors. Nvidia had a much higher market share than AMD before RTR was a thing, so I don't think this is a solid argument that proves RT is important to many gamers.

The desire or ability to enable RT in games is probably not the most important criterion. However, the narrative around RT performance feeds the perception of Nvidia as market leader and innovator. That perception in turn bolsters trust in the brand.

There are hordes of people buying Nvidia cards who aren't playing RT games or turning it on in games that support it. UE5 will mitigate that somewhat, but the RT train has left the station and it's not turning back.
 

Looking at the discussion here, most people really seem to think Raytracing is not worth it. Kind of a shame, really.

I can't wait until RT is finally a standard, always enabled regardless of the settings, and more efficient as well. Then that mindset will hopefully change.
It's still a luxury feature and the majority of people can't run it without serious compromises. I'd say RT is only really viable on 3070-class cards and above, and even on my 2080 Ti, which is equivalent, I avoided it most of the time.

Still, I wouldn't rely on a Reddit thread to draw any conclusions. Ampere cards still outsold RDNA2 cards to a hilarious degree.
 
I would not say that is in any way representative - those who care enough to post are those who are already interested enough in the topic to make posts about it. It is not a general question put to random participants or anything.
Textbook confirmation bias lol.
 
In general it is easy to see that the voices against ray tracing in every online forum, poll, Reddit, subreddit, whatever are not at all representative of the IHV split or the way game devs are doing things (which are *actually* representative of consumer pressures).

Obviously those that are against RT aren't even PC gamers, as most PC gamers are either looking at NV GPUs or already have them.
 
If people thought AMD’s products provided greater value AMD would enjoy more than 8% market share.
Isn't the point that people's perception isn't in line with the best offerings? Marketing clearly plays a role in market share, and more than once throughout history the better option/solution hasn't commanded the strongest position because of other factors. An argument here is that many nVidia buyers would be better off choosing AMD, similar to arguments like many PS buyers being better off with XB, etc. But market leaders tend to gain momentum and dominate mindshare, with people not making purely objective choices.
 
I've bought Miles for my PC.

I use the recommended settings by Alex but with the following RT settings:

- RT shadows on medium
- RT reflection resolution on high
- RT reflection geometry detail on very high
- Object range is 8

I've only done the first section of the game and some free swinging around Times Square, and have made the following observations:

- Using any form of upscaling destroys reflection quality
- My 3060ti can handle native 1440p at 60fps with room to spare!
- My i3 12100f at the above RT settings is at 60fps for 98% of the time
- The heaviest section of the game is Times Square, and the lowest frame rate I've seen here is 55fps due to my CPU
- Like the other Spider-Man game, very high reflection resolution has a lot of artefacts, as if it's using checkerboarding or something to achieve the higher resolution rather than just a pure resolution bump
- My CPU is generally at 80% on all 8 threads (it's a quad-core + HT) and it only costs £113!!
 
And it'll still use RT hardware if it's available and run even faster.

As for PC users, we don't care about what consoles get. I am almost 100% confident the PC versions of these UE5 console ports will ship with hardware RT, for many reasons:

- Better GI quality and precision in open-world games, as hardware Lumen can cover a 1000m distance, and has better resolution and fewer light leaks.

- Better dynamism, as hardware Lumen (global illumination and reflections) supports skinned and dynamic objects (people, vehicles, animals, etc.); software Lumen can't.

- Better interior scenes, as hardware Lumen supports a much higher resolution (rays per pixel), which vastly improves light bounce and reflections in interiors, and renders vastly more detail.

- Better shadows for horror games that don't rely on Nanite (that use interior scenes), as hardware ray traced shadows allow for a vast number of shadow-casting lights, a much higher number of shadows, and vastly better contact hardening; Virtual Shadow Maps can't offer any of that.

- Exotic use cases: translucency, translucent shadows, caustics, etc.
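For context, here is a minimal sketch of how hardware Lumen is typically switched on in a UE5 project's DefaultEngine.ini. These cvars exist in current UE5 releases, but defaults and availability can shift between engine versions, so treat this as illustrative rather than authoritative:

```ini
; DefaultEngine.ini (illustrative; verify these cvars against your engine version)
[/Script/Engine.RendererSettings]
; Project-wide hardware ray tracing support (needed for hardware Lumen)
r.RayTracing=1
; GPU skin cache, needed so skinned meshes can be ray traced
r.SkinCache.CompileShaders=1
; 1 = Lumen for dynamic GI and for reflections
r.DynamicGlobalIlluminationMethod=1
r.ReflectionMethod=1
; Use hardware ray tracing for Lumen when the GPU supports it
r.Lumen.HardwareRayTracing=1
```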
 
Isn't the point that people's perception isn't in line with best offerings? Marketing clearly plays a role in market share and more than once throughout history, the better option/solution hasn't commanded the strongest position because of other factors. An argument here is that many nVidia buyers would be better off chosing AMD, similar to arguments like many PS buyers being better off with XB etc. But market leaders tend to gain momentum and dominate mindshare with people not making purely objective choices.
Of course, but we don't actually have an objective measure of value that we can use. Rasterized FPS/$ certainly doesn't capture all of the value to the consumer. E.g. my LG C9 OLED TV only supports VRR on Nvidia cards.
 
Is that how it works? Lumen performance can be accelerated with the on-GPU RT hardware? Seems like something the consoles could take advantage of to make Lumen more performant in games, no?

It depends on the framerate they target on current-gen consoles: if it is 30 fps, HW-RT is an option; if it is 60 fps, the only choice is software Lumen. But I am curious to see AMD's GI 1.0 using HW-RT, which seems to be compatible with real-time use on current-gen consoles and, maybe with optimization, able to reach 60 fps in the future. AMD created a plugin for UE5.
 
And it'll still use RT hardware if it's available and run even faster.
No, it's not faster. You're mistaken. Their software mode is not using BVHs; it is using signed distance fields. In some instances, SW-Lumen can even be faster than HW-Lumen. These are different methods; it's not like running a DXR game like Control on Pascal!

Is that how it works? Lumen performance can be accelerated with the on-GPU RT hardware? Seems like something the consoles could take advantage of to make Lumen more performant in games, no?
Sadly not. SW-Lumen uses signed distance fields, not triangle-based ray tracing. SDFs are faster than HW-Lumen in certain scenarios.
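To make the distinction concrete, here is a toy sphere-tracing sketch against an analytic signed distance field. It is a generic illustration of the SDF ray-marching idea, not Lumen's actual implementation (which traces mesh distance fields and a global distance field on the GPU):

```python
import math


def sphere_sdf(p, center, radius):
    """Signed distance from point p to a sphere: negative inside, positive outside."""
    return math.dist(p, center) - radius


def sphere_trace(origin, direction, sdf, max_steps=64, hit_eps=1e-3, max_dist=100.0):
    """March along the ray, stepping by the distance-field value each step.

    No triangles or BVH are involved; the distance field itself bounds how far
    we can safely advance, which is why this style of tracing can be cheap for
    coarse/diffuse queries."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sdf(p)
        if d < hit_eps:       # close enough to the surface: report a hit
            return t
        t += d                # safe step: no surface can be closer than d
        if t > max_dist:      # ray escaped the scene
            break
    return None


# Example: a ray from the origin toward a unit sphere centred at (0, 0, 5).
hit_t = sphere_trace((0.0, 0.0, 0.0), (0.0, 0.0, 1.0),
                     lambda p: sphere_sdf(p, (0.0, 0.0, 5.0), 1.0))
print(hit_t)  # ~4.0, the distance to the front of the sphere
```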
 