Current Generation Games Analysis Technical Discussion [2024] [XBSX|S, PS5, PC]

Image comparisons have also been made against a more accurate path-traced reference rendered with Blender Cycles. Performance was measured on a laptop system with an NVIDIA RTX 2060 GPU (6 GB of GDDR6 memory) and an Intel Core i7-8750H CPU.
Figure 15
The first row of images (Figure 15) displays little visible indirect lighting and focuses mostly on comparing the indirect diffuse bleed from the curtains onto the arches. Here it can be observed that the indirect diffuse light from VCT produces more blurred indirect color bleeding compared to the sharper lighting produced by Cycles. The light also bleeds behind the curtains with VCT, while it stops sharply in the Cycles render. This is one of the key problems with the VCT implementation: the individual voxels stored in the texture may end up thicker than thin geometry, creating a conflict between the occluded and directly illuminated sides.

These two sides then end up averaged into the same voxel texture slot, bleeding some light over to the occluded side (see Figure 16 for a more detailed example). Note that the flowers are also a bit darker in the Cycles render, which is primarily due to unresolved issues with texture transparency displaying as black.
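To make the averaging problem more concrete, below is a minimal CPU-side sketch of the idea (hypothetical code with assumed names, grid size and voxel size, not the actual implementation): a lit front fragment and a shadowed back fragment of a curtain thinner than one voxel land in the same cell, and the running average hands half of the direct light to the occluded side.

```cpp
// Minimal sketch (assumed, not the game's code) of radiance injection into a
// voxel grid leaking light across thin geometry: both sides of a curtain
// thinner than one voxel map to the same cell and get averaged together.
#include <array>
#include <cstdio>

struct Voxel {
    float radiance = 0.0f;  // accumulated injected radiance
    int   samples  = 0;     // number of surface fragments injected so far
};

// One flat array standing in for a 3D radiance texture (assumed layout).
constexpr int kGridSize = 64;
std::array<Voxel, kGridSize * kGridSize * kGridSize> g_grid;

int VoxelIndex(float x, float y, float z, float voxelSize) {
    const int ix = static_cast<int>(x / voxelSize);
    const int iy = static_cast<int>(y / voxelSize);
    const int iz = static_cast<int>(z / voxelSize);
    return (iz * kGridSize + iy) * kGridSize + ix;
}

// Running average per voxel; on the GPU this would be an atomic moving average.
void InjectRadiance(float x, float y, float z, float radiance, float voxelSize) {
    Voxel& v = g_grid[VoxelIndex(x, y, z, voxelSize)];
    v.samples += 1;
    v.radiance += (radiance - v.radiance) / static_cast<float>(v.samples);
}

int main() {
    const float voxelSize = 0.25f;  // curtain is thinner than one voxel
    // Lit front face and occluded back face of the curtain, 5 cm apart:
    InjectRadiance(1.00f, 1.0f, 1.0f, 1.0f, voxelSize);  // directly lit side
    InjectRadiance(1.05f, 1.0f, 1.0f, 0.0f, voxelSize);  // shadowed side
    // Both fragments fell into the same cell, so the shadowed side now
    // "sees" half of the direct light -- the leak described above.
    std::printf("voxel radiance = %.2f\n",
                g_grid[VoxelIndex(1.0f, 1.0f, 1.0f, voxelSize)].radiance);
}
```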

The second row of images (Figure 15) focuses more on displaying indirect lighting, emitted from the light silhouettes cast through the open windows by a distant light source. Here the differences become more apparent. Firstly, the floor and top-left arches are too bright, which is likely caused by the previously mentioned light leaking; in this case the leak begins only in higher mip levels of the texture around the window edges. Another problem can be observed in the color bleeding from the curtains onto the pillars, which does not quite match: in the Cycles render the indirect light climbs a bit further up the pillar. This likely has to do with the cone configuration. Here the cone offsets, coupled with a roughly 30° blind spot perpendicular to the normal, could cause the curtain to be partially missed. Utilizing more cones with a smaller aperture tends to improve detail but comes with a sometimes significant performance cost (see the sketch below).
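For illustration, here is what a typical diffuse cone layout might look like (an assumed 6-cone configuration with made-up weights and angles, not the game's exact values). It shows why directions near the tangent plane are only sparsely covered, and why adding more, narrower cones improves coverage at a cost that grows roughly linearly with the cone count.

```cpp
// Hypothetical diffuse cone set for VCT (assumed layout and values).
// Directions are in tangent space, with +Z being the surface normal.
#include <cmath>
#include <cstdio>
#include <vector>

struct DiffuseCone {
    float dir[3];   // tangent-space direction of the cone axis
    float weight;   // cosine-lobe weight applied when summing cone results
    float aperture; // half angle in radians; wider = cheaper but blurrier
};

std::vector<DiffuseCone> MakeDiffuseCones() {
    const float kPi = 3.14159265f;
    const float tilt = kPi / 3.0f;      // cone axes 60 deg from the normal, so
                                        // coverage near the tangent plane (the
                                        // ~30 deg blind spot) stays sparse
    const float aperture = kPi / 6.0f;  // ~30 deg half angle per cone
    std::vector<DiffuseCone> cones;
    cones.push_back({{0.0f, 0.0f, 1.0f}, 0.25f, aperture});  // along the normal
    for (int i = 0; i < 5; ++i) {
        const float phi = 2.0f * kPi * static_cast<float>(i) / 5.0f;
        cones.push_back({{std::sin(tilt) * std::cos(phi),
                          std::sin(tilt) * std::sin(phi),
                          std::cos(tilt)},
                         0.15f, aperture});                   // tilted ring
    }
    // More cones with a smaller aperture cover the hemisphere more densely
    // (less chance of missing the curtain), but every extra cone is another
    // full march through the voxel mips, so cost scales with the cone count.
    return cones;
}

int main() {
    const auto cones = MakeDiffuseCones();
    std::printf("tracing %zu cones per pixel\n", cones.size());
}
```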
Figure 16
In Figure 16 the light leak still looks somewhat natural, since light would likely leak through the thin curtains in a real setting. However, it is in this case caused by an error with the potential to produce more undesirable results. Note that the issue also cascades into the higher mip levels, further intensifying the leaks in some cases.
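A small sketch of that cascade (assuming the usual 2×2×2 box filter for the voxel mip chain; not the actual downsampling code): a single leaked bright voxel is averaged into a parent voxel that covers eight times the volume, so wide cone samples taken from the higher mips pick up the leaked energy even over regions that were fully occluded at the finest level.

```cpp
// Sketch of voxel mip generation spreading a leak to coarser levels
// (assumed 2x2x2 box filter, hypothetical values).
#include <cstdio>

// Average eight child voxels into one parent voxel.
float DownsampleVoxel(const float child[8]) {
    float sum = 0.0f;
    for (int i = 0; i < 8; ++i) sum += child[i];
    return sum / 8.0f;
}

int main() {
    // Seven occluded children and one child that received leaked light:
    const float children[8] = {0.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.5f};
    const float parent = DownsampleVoxel(children);
    // Wide cones read these higher mips far from the surface, so the leaked
    // energy now covers the whole parent cell instead of staying local.
    std::printf("parent radiance = %.3f\n", parent);  // 0.063
}
```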
 
That's one of the techs considered to replace HWRT in the pre-PS4 discussion, if compute would be adequate. I guess this was chosen for performance on consoles without adequate RT power? Nice to see someone using an alternative to the current defaults to compare results.

HWRT isn't available on the PC version for comparison?
No, this game is a non-RT game even on PC.
A shame as I am sure the image quality would be a notch up.
 
That's one of the techs considered to replace HWRT in the pre-PS4 discussion, if compute would be adequate. I guess this was chosen for performance on consoles without adequate RT power? Nice to see someone using an alternative to the current defaults to compare results.

HWRT isn't available on the PC version for comparison?
I'm surprised at this as well. Maybe in a future patch? I can't imagine it will look competitive with the RT games that have come out this year.
 
My thought exactly. As a developer, if you think the setting is unnoticeable then why the hell are you including it? Either do something worthwhile or don't artificially lower performance.
People want to enable ultra settings on mid-tier hardware and get 60 fps without drops. If they don't get that, negative Steam reviews and accusations of shit optimization and "lazy developers" begin to pollute the conversations around the game.

If I was a developer, I wouldn't make real ultra settings either.
 
People want to enable ultra settings on mid-tier hardware and get 60 fps without drops. If they don't get that, negative Steam reviews and accusations of shit optimization and "lazy developers" begin to pollute the conversations around the game.

If I was a developer, I wouldn't make real ultra settings either.

Ultra settings that tank performance for no visible gain won’t help with that.
 
People want to enable ultra settings on mid-tier hardware and get 60 fps without drops. If they don't get that, negative Steam reviews and accusations of shit optimization and "lazy developers" begin to pollute the conversations around the game.

If I was a developer, I wouldn't make real ultra settings either.

10-20 years ago, when PCs still scaled rapidly and we moved into the multiplatform era, people could run max settings on mid-tier hardware. The prevailing complaint back then was instead that consoles were stagnating graphics on PC and that developers weren't catering to PC users.

What it really comes down to is that people (well, a vocal subset) want all games to be optimized specifically for their hardware and performance criteria. Anything else means the developers are "lazy." This is veering slightly towards social commentary (internet commentary, really), but all it is is that the internet has given voice to everyone's sense of self-importance.
 
What it really comes down to is that people (well, a vocal subset) want all games to be optimized specifically for their hardware and performance criteria
Worse yet, they want 120fps at 4K on high end GPUs now. In the past we were lucky to get 60fps at the highest resolution in demanding games ... but now 120fps, 144fps or even 240fps are "required" ... strange times indeed!
 
I'm honestly considering going back to 1080p gaming on one of those dual-mode displays. I'm almost always using DLSS at 1440p anyway, and dual mode would be better for productivity and video.

Chasing 4K has been a huge mistake for the gaming industry.
 
I'm honestly considering going back to 1080p gaming on one of those dual-mode displays. I'm almost always using DLSS at 1440p anyway, and dual mode would be better for productivity and video.

Chasing 4K has been a huge mistake for the gaming industry.
Fixed-resolution displays have really impacted games more than I could have imagined. I remember back in the day having to drop to 640×480 to play a modern game, only to be OK with it after a few minutes. But CRT monitors handled that a bunch better. Unreal 1 was one of those games: I had been playing Quake 2 at higher resolutions, and Unreal was much heavier, but it also looked so pretty.

I also think there's a market for a 10-12 inch 720p high-refresh-rate laptop with ~3050 Ti-level performance. Having that level of performance and feature set at that low resolution and screen size would make for a great portable gaming experience. I know Razer makes a laptop in that size, but IIRC the modern ones come with a 4K screen, which seems silly for a screen that size.
 