Current Generation Games Analysis Technical Discussion [2024] [XBSX|S, PS5, PC]

Image comparisons have also been made with a more accurate path-traced reference rendered with Blender Cycles. Performance was measured on a laptop with an NVIDIA RTX 2060 GPU (6 GB GDDR6) and an Intel Core i7-8750H CPU.
Figure 15
The first row of images (Figure 15) displays little visible indirect lighting and focuses mostly on comparing the indirect diffuse bleed from the curtains onto the arches. Here it can be observed that VCT produces more blurred indirect color bleeding compared to the sharper lighting produced by Cycles. The light also bleeds back behind the curtains with VCT, while it stops sharply in the Cycles render. This is one of the key problems with the VCT implementation: the individual voxels stored in the texture may end up thicker than thin geometry, creating a conflict between the occluded and directly illuminated sides.

These two sides then end up averaged into the same voxel texture slot, bleeding some light over to the occluded side (see Figure 16 for a more detailed example). Note that the flowers are also a bit darker in the Cycles render, which is primarily due to unresolved issues with texture transparency displaying as black.
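To make that failure mode concrete, here is a minimal sketch (not the game's actual code; it assumes a simple running-average radiance injection into a dense voxel grid, as commonly described for VCT) of how the lit front face and the shadowed back face of the same thin curtain end up blended into one voxel:

```cpp
// Minimal sketch of thin-geometry light leaking in VCT radiance injection:
// fragments from the sun-lit front side and the occluded back side of a thin
// surface map to the same voxel, and a running average blends them together.
#include <cstdio>

struct Voxel { float r = 0, g = 0, b = 0, count = 0; };

// Running-average injection into a voxel, as commonly used for VCT radiance.
void injectRadiance(Voxel& v, float r, float g, float b) {
    v.count += 1.0f;
    float k = 1.0f / v.count;
    v.r += (r - v.r) * k;
    v.g += (g - v.g) * k;
    v.b += (b - v.b) * k;
}

int main() {
    Voxel curtainVoxel;                              // one voxel, thicker than the curtain
    injectRadiance(curtainVoxel, 0.9f, 0.3f, 0.2f);  // lit front side
    injectRadiance(curtainVoxel, 0.0f, 0.0f, 0.0f);  // occluded back side
    // Cones traced from behind the curtain now sample half of the lit colour,
    // so light appears to bleed through to the occluded side.
    std::printf("voxel radiance: %.2f %.2f %.2f\n",
                curtainVoxel.r, curtainVoxel.g, curtainVoxel.b);
}
```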

The second row of images (Figure 15) focuses more on the indirect lighting cast by the light silhouettes falling through the open windows from a distant light source. Here the differences become more apparent. Firstly, the floor and top-left arches are too bright, which is likely caused by the previously mentioned light leaking; in this case, though, the leak only begins in the higher mip levels of the texture around the window edges. Another problem can be observed in the color bleeding from the curtains onto the pillars, which does not quite match: in the Cycles render the indirect light climbs a bit further up the pillar. This likely has to do with the cone configuration. Here the cone offsets, coupled with a 30° blind spot perpendicular to the normal, could cause the curtain to be partially missed. Using more cones with a smaller aperture tends to improve detail but comes at a sometimes significant performance cost.
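For reference, a small sketch of the kind of fixed cone layout meant here; the exact axes and apertures are made up for illustration, not taken from the engine. It shows how directions close to the surface tangent can fall outside every cone and therefore miss nearby geometry such as the curtain:

```cpp
// Sketch of a fixed diffuse-cone layout (hypothetical axes/apertures) showing
// the blind spot near the surface tangent: directions almost perpendicular to
// the normal fall outside every cone, so geometry there can be missed.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

bool insideAnyCone(Vec3 dir, const Vec3* axes, int n, float halfApertureRad) {
    for (int i = 0; i < n; ++i)
        if (std::acos(dot(dir, axes[i])) <= halfApertureRad)
            return true;
    return false;
}

int main() {
    const float DEG = 3.14159265f / 180.0f;
    // One cone along the normal (+Z) and four tilted 30 deg off it, each with
    // a 30 deg half-aperture: coverage stops roughly 60 deg from the normal,
    // leaving about a 30 deg blind band towards the tangent plane.
    const float s = std::sin(30.0f * DEG), c = std::cos(30.0f * DEG);
    Vec3 axes[5] = { {0, 0, 1}, {s, 0, c}, {-s, 0, c}, {0, s, c}, {0, -s, c} };
    const float halfAperture = 30.0f * DEG;

    Vec3 alongNormal = {0, 0, 1};
    Vec3 nearTangent = {0.9988f, 0, 0.0499f}; // ~87 deg from the normal, unit length

    std::printf("along normal sampled by a cone: %d\n",
                insideAnyCone(alongNormal, axes, 5, halfAperture));
    std::printf("near tangent sampled by a cone: %d\n",
                insideAnyCone(nearTangent, axes, 5, halfAperture));
}
```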
Figure 16
In Figure 16 the light leak still looks somewhat natural, since light would likely leak through the thin curtains in a real setting. However, in this case it is caused by an error with the potential to produce far less desirable results. Note that the issue also cascades into the higher mip levels, further intensifying the leaks in some cases.
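A rough illustration of that cascade, assuming a plain isotropic box-filter mip build over voxel opacity (a simplification of what real implementations do): a one-voxel-thick occluder becomes progressively more transparent at each mip level, so the wide outer cone steps see more and more light through it.

```cpp
// Sketch of how a thin occluder loses opacity in the voxel mip chain:
// box-filtering a one-voxel-thick wall against empty neighbours halves its
// alpha at every level, so wide cones sampling high mips leak light past it.
#include <cstdio>
#include <vector>

std::vector<float> downsampleAlpha(const std::vector<float>& a) {
    std::vector<float> out(a.size() / 2);
    for (size_t i = 0; i < out.size(); ++i)
        out[i] = 0.5f * (a[2 * i] + a[2 * i + 1]); // plain box filter
    return out;
}

int main() {
    // Base level: a one-voxel-thick wall (alpha 1) surrounded by empty space.
    std::vector<float> alpha = {0, 0, 0, 1.0f, 0, 0, 0, 0};
    for (int mip = 0; alpha.size() > 1; ++mip, alpha = downsampleAlpha(alpha)) {
        // The most opaque voxel at this mip is all a wide cone step will see.
        float maxAlpha = 0;
        for (float a : alpha) if (a > maxAlpha) maxAlpha = a;
        std::printf("mip %d: max opacity %.3f -> %.0f%% of light leaks past\n",
                    mip, maxAlpha, (1.0f - maxAlpha) * 100.0f);
    }
}
```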
 
That's one of the techs that was considered in the pre-PS4 discussions as a possible alternative to HWRT, if compute proved adequate. I guess this was chosen for performance on consoles without adequate RT power? Nice to see someone using an alternative to the current defaults to compare results.

HWRT isn't available on the PC version for comparison?
No, this game is a non-RT game even on PC.
A shame, as I am sure the image quality would be a notch up.
 
That's one of the techs that was considered in the pre-PS4 discussions as a possible alternative to HWRT, if compute proved adequate. I guess this was chosen for performance on consoles without adequate RT power? Nice to see someone using an alternative to the current defaults to compare results.

HWRT isn't available on the PC version for comparison?
I'm surprised at this as well. Maybe in a future patch? I can't imagine it will look competitive with the RT games that have come out this year.
 
My thought exactly. As a developer, if you think the setting is unnoticeable then why the hell are you including it? Either do something worthwhile or don't artificially lower performance.
People want to enable ultra settings on mid-tier hardware and get to 60 fps without drops. If they don't get that, negative Steam reviews, accusations of shit optimization and "lazy developers" begin to pollute the conversations around the game.

If I was a developer, I wouldn't make real ultra settings either.
 
People want to enable ultra settings on mid-tier hardware and get to 60 fps without drops. If they don't get that, negative Steam reviews, accusations of shit optimization and "lazy developers" begin to pollute the conversations around the game.

If I was a developer, I wouldn't make real ultra settings either.

Ultra settings that tank performance for no visible gain won’t help with that.
 
People want to enable ultra settings on mid-tier hardware and get to 60 fps without drops. If they don't get that, negative Steam reviews, accusations of shit optimization and "lazy developers" begin to pollute the conversations around the game.

If I was a developer, I wouldn't make real ultra settings either.

10-20 years ago, when PCs still had rapid scaling and we moved into the multiplatform era, people could run max settings on mid-tier hardware. The prevailing complaint back then was instead that consoles were stagnating graphics on the PC and developers weren't catering to PC users.

What it really comes down to is that people (well, a vocal subset) want all games to be optimized specifically for their hardware and performance criteria. Anything else means the developers are "lazy." This is veering slightly towards social commentary (really internet commentary), but at bottom the internet has simply given voice to everyone's sense of self-importance.
 
What it really comes down to is that people (well, a vocal subset) want all games to be optimized specifically for their hardware and performance criteria
Worse yet, they want 120fps at 4K on high end GPUs now. In the past we were lucky to get 60fps at the highest resolution in demanding games ... but now 120fps, 144fps or even 240fps are "required" ... strange times indeed!
 
Just a guess, but I would bet many people commenting on the state of 4K gaming don't actually own 4K monitors, much less 4K monitors over 60 Hz.
 
I’m honestly considering going back to 1080p gaming on one of those dual-mode displays. I’m almost always using DLSS at 1440p anyway. Dual mode would be better for productivity and video.

Chasing 4K has been a huge mistake for the gaming industry.
 
I’m honestly considering going back to 1080p gaming on one of those dual-mode displays. I’m almost always using DLSS at 1440p anyway. Dual mode would be better for productivity and video.

Chasing 4K has been a huge mistake for the gaming industry.
Fixed-resolution displays have really impacted games more than I could have imagined. I remember back in the day having to drop down to 640×480 to play a modern game, only to be OK with it after a few minutes, though CRT monitors handled that much better. Unreal 1 was one of those games: I had been playing Quake 2 at higher resolutions, but Unreal was much heavier, and it looked so pretty.

I also think there's a market for a 10-12 inch 720p high-refresh-rate laptop with ~3050 Ti-level performance. Having that level of performance and feature set at that low resolution and screen size would make for a great portable gaming experience. I know Razer makes a laptop in that size, but IIRC the modern ones come with a 4K screen, which seems silly for a screen that size.
 
People want to enable ultra settings on mid-tier hardware and get to 60 fps without drops. If they don't get that, negative Steam reviews, accusations of shit optimization and "lazy developers" begin to pollute the conversations around the game.

If I was a developer, I wouldn't make real ultra settings either.

We have "real ultra settings" in the likes of CP2077, Alan Wake, Indy and Wu Kong but I don't see any serious narrative along the above lines in any of those games.

Granted, if the Ultra setting in this game has a virtually unnoticeable performance impact for its "virtually unnoticeable" visual return, then I don't have a major issue with its inclusion, though it does raise the question of what the point is. And if it has a big performance impact, then I'd argue it's actively bad for PC gaming.
 
Granted, if the Ultra setting in this game has a virtually unnoticeable performance impact for its "virtually unnoticeable" visual return, then I don't have a major issue with its inclusion, though it does raise the question of what the point is. And if it has a big performance impact, then I'd argue it's actively bad for PC gaming.
When Crysis was released, people said they couldn't tell the difference between the highest settings and one step down (I don't remember the exact labels used in game), and that may have been true. But now, with today's higher monitor resolutions and our more trained eyes, I bet many of us could tell the difference. Sometimes those settings aren't for today's hardware.

Another example was STALKER. It had an option labeled something like "full realtime lighting". Enabling it tanked performance and appeared to have little image-quality benefit. But, as the name suggests, the lighting really was done in real time, with all the benefits that brings. Those benefits weren't really appreciated back in the day, especially when you were comparing screenshots rather than the game in motion over a period of time. And to be fair, enabling this option pretty much turned my computer into a screenshot generator; single-digit FPS if I remember correctly. In some ways, I see an analogy between that setting and current titles that take a huge performance hit with RT enabled. Maybe it's for the next generation of hardware.
 
You can always use integer scaling to simulate the native 1080p look on a 4K display without losing contrast to filtering.


This is what I'm thinking of, though pricing will probably keep me from jumping in. You have 4K 240 Hz for indie games, productivity and video, and then 1080p 480 Hz that I'd use for esports-type games or ray-tracing games where I'd probably be playing at a low resolution with frame gen anyway.

Edit: Also not sure I want to move to a 32", especially with 1080p as the second mode, but 4K at 27" could be too small for regular desktop use.
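On the integer-scaling point quoted above: the reason it keeps the native 1080p look is that each source pixel is replicated into an exact 2×2 block with no blending between neighbours. A tiny sketch of that mapping (hypothetical buffers, not any driver's actual code):

```cpp
// Nearest-neighbour integer upscale: every 1080p pixel becomes an exact 2x2
// block of identical 4K pixels, so edges stay as hard as on a native 1080p
// panel instead of being softened by bilinear filtering.
#include <cstdint>
#include <vector>

std::vector<uint32_t> integerUpscale(const std::vector<uint32_t>& src,
                                     int srcW, int srcH, int factor) {
    std::vector<uint32_t> dst(static_cast<size_t>(srcW) * factor * srcH * factor);
    for (int y = 0; y < srcH * factor; ++y)
        for (int x = 0; x < srcW * factor; ++x)
            dst[static_cast<size_t>(y) * srcW * factor + x] =
                src[static_cast<size_t>(y / factor) * srcW + x / factor];
    return dst;
}

int main() {
    // e.g. a 1920x1080 frame scaled by 2 fills a 3840x2160 display exactly.
    std::vector<uint32_t> frame(1920 * 1080, 0xFF808080u); // flat grey test frame
    auto upscaled = integerUpscale(frame, 1920, 1080, 2);
    return upscaled.size() == 3840u * 2160u ? 0 : 1;
}
```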
 