Current Generation Games Analysis Technical Discussion [2024] [XBSX|S, PS5, PC]

There's more to graphics technology than just ray tracing. Modern Sony games have very creative ideas about graphics programming, GPU driven rendering, shading pipelines, unique BRDF models, ad hoc approaches to indirect lighting and some of those problem spaces that can't easily be brute forced with more powerful hardware or 'solved' with ray tracing ...
In another post you mentioned Ghost of Tsushima using a custom BRDF, among some other things, to implement a form of indirect lighting. I don’t recall the lighting differentiating itself from other open world games of the time. Is there any real world benefit to their unique approach?
 
In another post you mentioned Ghost of Tsushima using a custom BRDF, among some other things, to implement a form of indirect lighting. I don’t recall the lighting differentiating itself from other open world games of the time. Is there any real world benefit to their unique approach?
They have a custom BRDF in their forward shading pipeline to simulate the appearance of anisotropic fuzz. For their indirect lighting system, they first pick one of their dynamic reflection probes (128 in total) to relight/shadow, and the results get baked into a compressed BC6H format. The entire process, from the relighting/shadowing to the texture compression, happens once per frame inside an async compute queue. They're basically balancing the dynamic indirect specular lighting update rate and using HDR texture compression to minimize the memory consumption of their reflection probes ...
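Roughly, the scheduling half of that could look like the CPU-side sketch below. To be clear, this is just a minimal model of the "one probe per frame" idea with a staleness/priority heuristic I made up; in the real engine the relight, shadowing, and BC6H compression are compute dispatches on the async queue, not CPU code, and none of these names are Sucker Punch's.

#include <cstdint>
#include <vector>

struct ReflectionProbe {
    uint32_t lastRelitFrame = 0;   // frame index of the last relight
    float    priority       = 0.f; // e.g. on-screen weight or camera distance
};

// Pick exactly one of the 128 probes to update this frame:
// the stalest one, biased by priority.
ReflectionProbe* SelectProbeToRelight(std::vector<ReflectionProbe>& probes,
                                      uint32_t frame) {
    ReflectionProbe* best = nullptr;
    float bestScore = -1.f;
    for (ReflectionProbe& p : probes) {
        float staleness = float(frame - p.lastRelitFrame);
        float score = staleness * (1.f + p.priority);
        if (score > bestScore) { bestScore = score; best = &p; }
    }
    return best;
}

void UpdateProbesOncePerFrame(std::vector<ReflectionProbe>& probes,
                              uint32_t frame) {
    if (ReflectionProbe* probe = SelectProbeToRelight(probes, frame)) {
        // On the GPU this is the async compute work described above:
        // relight the probe's cubemap, apply shadowing, then compress the
        // HDR result to BC6H so 128 probes stay cheap in memory.
        probe->lastRelitFrame = frame;
    }
}

The point of a heuristic like this is simply that every probe eventually gets refreshed while visually important ones come around more often.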
 
This part in HB2 is seriously impressive with how well they animate this huge body of water draining. This image alone obviously can't do justice to the sequence, but holy, my jaw was on the floor. Games typically do not handle water draining in such a convincing way.

[Screenshot: Senua's Saga: Hellblade II, the water-draining sequence]
 
There's more to graphics technology than just ray tracing. Modern Sony games have very creative ideas about graphics programming, GPU driven rendering, shading pipelines, unique BRDF models, ad hoc approaches to indirect lighting and some of those problem spaces that can't easily be brute forced with more powerful hardware or 'solved' with ray tracing ...
And the result is shadows like that:
[Screenshot: in-game shadow quality]


This game lacks any kind of modern rendering features and yet needs the same performance as games with ray tracing. So no, there isn't "more to graphics technology than just ray tracing". There may be alternatives, but they should be at least the same quality or much faster.
 
And the result is shadows like that:
[Screenshot: in-game shadow quality]


This game lacks any kind of modern rendering features and yet needs the same performance as games with ray tracing. So no, there isn't "more to graphics technology than just ray tracing". There may be alternatives, but they should be at least the same quality or much faster.
If you're going to fixate on just a single methodology (ray tracing), you're going to miss out on the beauty offered by other subjects of real-time graphics, like their GPU driven procedural grass system ...

There are topics that ray tracing alone can't be used to comprehend, like virtual geometry, deferred texturing, other data structures, bindless, and even neural rendering such as DLSS!
 
And the result is shadows like that:
[Screenshot: in-game shadow quality]


This game lacks any kind of modern rendering features and yet needs the same performance as games with ray tracing. So no, there isn't "more to graphics technology than just ray tracing". There may be alternatives, but they should be at least the same quality or much faster.
You picked pretty much the worst-case scenario for the game; 99% of the time it looks much better than that.
 
There are topics that ray tracing alone can't be used to comprehend, like virtual geometry, deferred texturing, other data structures, bindless, and even neural rendering such as DLSS!
Or maybe some people are fixated on attributing unique properties to rasterization because the current consoles' hardware is lackluster in RT. However, things tend to change. The PS5 Pro alone promises more changes and performance gains in RT than anywhere else. It's only logical to expect RT to gain disproportionately more attention compared to rasterization in future generations of consoles.

What do you mean by "can't be used to comprehend virtual geometry, deferred texturing, other data structures, bindless, and even neural rendering"?
If you are saying that they are not compatible, that's pure fiction. Besides, rendering trillions of instanced grass blades is the field where RT, with a proper approach, can wipe the floor with rasterization.
 
If you're going to fixate on just a single methodology (ray tracing), you're going to miss out on the beauty offered by other subjects of real-time graphics, like their GPU driven procedural grass system ...

There are topics that ray tracing alone can't be used to comprehend, like virtual geometry, deferred texturing, other data structures, bindless, and even neural rendering such as DLSS!
Final Fantasy 15 used tessellation for its "GPU driven procedural grass system". The problem with these ports is that they use the most inefficient way to solve a problem because the hardware platform itself used outdated GPU IP. Fixation on one platform will always result in worse quality.
 
The game had a higher quality shadow setting on PS4 for photo mode; does the PC version use this in-game?

Yes, essentially

I think Ghost of Tsushima does show pretty well, like Horizon, how there are limits to lighting realism without any sort of tracing. Hardware or otherwise. Some aspects do not look so hot in 2024 next to other titles as a result of just using things like probe lighting or shadow maps.
 
Or maybe some people are fixated on attributing unique properties to rasterization because the current consoles' hardware is lackluster in RT.
But that's not true, as these are subjects that can be applied in many contexts beyond rasterization! By no means is real-time graphics a completely solved field with ray tracing ...
What do you mean by "can't be used to comprehend virtual geometry, deferred texturing, other data structures, bindless, and even neural rendering"?
If you are saying that they are not compatible, that's pure fiction. Besides, rendering trillions of instanced grass blades is the field where RT, with a proper approach, can wipe the floor with rasterization.
I didn't claim that they were "not compatible"; however, you won't gain a deeper understanding of those problem sets purely from knowing the theory of ray tracing, as they are orthogonal concepts ...
Final Fantasy 15 used tessellation for its "GPU driven procedural grass system". The problem with these ports is that they use the most inefficient way to solve a problem because the hardware platform itself used outdated GPU IP. Fixation on one platform will always result in worse quality.
I have a feeling that you might not know what the term "GPU driven" truly entails. It has nothing to do with tessellation or geometry. It's a very abstract concept about the "absence of a CPU-GPU feedback system" in rendering. With indirect rendering APIs, we evolved from elementary indirect APIs like draw/dispatch indirect (GPU generated commands for shader dispatches) to ExecuteIndirect (GPU generated commands to change shader input bindings) and finally to Work Graphs (nested parallelism/forward progress w/ persistent threads), especially w/ the mesh nodes extension (GPU generated commands for PSO swapping)!

it's specifically about having NO CPU intervention for generating rendering commands ...
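For anyone unfamiliar, here's what the bottom rung of that ladder looks like in D3D12. A hedged sketch: it assumes a valid device, command list, and GPU-written argument/count buffers already exist, and GrassDrawArgs is an invented layout; note that in D3D12 even elementary indirect drawing is expressed through ExecuteIndirect with a one-entry command signature.

// GPU-driven drawing with plain D3D12 ExecuteIndirect: a culling compute
// shader writes the draw arguments and a draw count on the GPU, and the
// CPU records a single call without ever reading anything back.
#include <windows.h>
#include <d3d12.h>

// Argument layout the compute shader writes (hypothetical).
struct GrassDrawArgs {
    D3D12_DRAW_ARGUMENTS draw; // vertex count, instance count, offsets
};

ID3D12CommandSignature* CreateDrawSignature(ID3D12Device* device) {
    D3D12_INDIRECT_ARGUMENT_DESC arg = {};
    arg.Type = D3D12_INDIRECT_ARGUMENT_TYPE_DRAW;

    D3D12_COMMAND_SIGNATURE_DESC desc = {};
    desc.ByteStride       = sizeof(GrassDrawArgs);
    desc.NumArgumentDescs = 1;
    desc.pArgumentDescs   = &arg;

    // No root signature needed when the signature only contains a draw.
    ID3D12CommandSignature* signature = nullptr;
    device->CreateCommandSignature(&desc, nullptr, IID_PPV_ARGS(&signature));
    return signature;
}

void RecordGrassDraws(ID3D12GraphicsCommandList* cmdList,
                      ID3D12CommandSignature* signature,
                      ID3D12Resource* argBuffer,   // written by a compute pass
                      ID3D12Resource* countBuffer, // ditto
                      UINT maxDraws) {
    cmdList->ExecuteIndirect(signature, maxDraws, argBuffer, 0, countBuffer, 0);
}

Richer command signatures extend this same idea to changing vertex buffers and root arguments from the GPU, and Work Graphs go further still.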
 
Yes, essentially

I think Ghost of Tsushima does show pretty well, like Horizon, how there are limits to lighting realism without any sort of tracing. Hardware or otherwise. Some aspects do not look so hot in 2024 next to other titles as a result of just using things like probe lighting or shadow maps.

Yeah we’re hopefully getting to the point where the light transport problem converges on tracing across the industry. The hacky alternatives are getting too expensive for inadequate quality. Devs can then focus on other aspects of rendering. UE5 will help and the next consoles will hopefully make this a moot point.
 
By no means is real-time graphics a completely solved field with ray tracing ...
Never said graphics was solved with RT. RT is simply more flexible. There are more problems that can be solved with it, and the rest are current API/SW limitations, which can be solved.

I didn't claim that they were "not compatible"; however, you won't gain a deeper understanding of those problem sets purely from knowing the theory of ray tracing, as they are orthogonal concepts ...
Yes, they are orthogonal, that's why I wouldn't attribute any advantages of deferred texturing to rasterization to begin with, because they are present due to limitations in the graphics pipeline in the first place. You don't suffer from quad overshading in RT, so you don't need visibility buffers and other redundant abstractions with it.
 
Yes, they are orthogonal, that's why I wouldn't attribute any advantages of deferred texturing to rasterization to begin with, because they are present due to limitations in the graphics pipeline in the first place. You don't suffer from quad overshading in RT, so you don't need visibility buffers and other redundant abstractions with it.
Deferred texturing is useful for more things than just reducing quad overshading. It can be helpful for sorting pixels into complete tiles to run full waves of the same batch (no divergent textures/constants either!), as seen in Horizon Forbidden West ...
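A toy CPU-side model of that binning step, just to make it concrete; the structs and wave size are illustrative assumptions, not Guerrilla's actual scheme, which runs in compute.

// Bucket visibility-buffer samples by material ID so each bucket can be
// shaded later as full waves with uniform textures/constants per wave.
#include <algorithm>
#include <cstdint>
#include <unordered_map>
#include <vector>

struct VisSample { uint32_t pixelIndex; uint32_t materialId; };

using MaterialBins = std::unordered_map<uint32_t, std::vector<uint32_t>>;

MaterialBins BinByMaterial(const std::vector<VisSample>& visBuffer) {
    MaterialBins bins;
    for (const VisSample& s : visBuffer)
        bins[s.materialId].push_back(s.pixelIndex);
    return bins;
}

// Shading walks each bin in wave-sized chunks (say 64 lanes), so every lane
// in a wave fetches the same material's textures and constants: no
// divergence, and no partially filled waves except the tail of each bin.
void ShadeBins(const MaterialBins& bins, size_t waveSize = 64) {
    for (const auto& [materialId, pixels] : bins) {
        for (size_t base = 0; base < pixels.size(); base += waveSize) {
            size_t laneCount = std::min(waveSize, pixels.size() - base);
            // shade pixels[base .. base + laneCount) with materialId's data
            (void)materialId; (void)laneCount;
        }
    }
}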
 
With indirect rendering APIs
With indirect rendering APIs, certain aspects can be challenging, such as debugging and profiling, as has been noted in many presentations, optimization guides, and other resources. On the PS4, command buffers might have been patched directly from compute shaders, bypassing the command processor entirely, given that the prehistoric PS4 Pro GPU managed a stable 30 FPS at 2560x1440. In contrast, the much more capable and modern RTX 2070 Super, which should be closer to the PS5, can't maintain a stable 30 FPS at max settings at the same resolution, which is ridiculous.

It may be that their ExecuteIndirect implementation is lackluster if they're hitting a command processor bottleneck here. They should have tried other approaches, or at least optimized it better, because hitting bottlenecks that limit the experience on 80% of your audience's PCs is bad for everyone.

In my opinion, the problem with these low level constructs is that they typically offer no visual improvement over more conservative approaches, while performance can suffer greatly and optimization becomes a nightmare, making games appear less and less optimized to an audience that doesn't care about the low level details.
 
It can be helpful for sorting pixels into complete tiles to run full waves of the same batch (no divergent textures/constants either!), as seen in Horizon Forbidden West ...
Guess what? You can do the same with RT without deferred texturing.
 
With indirect rendering APIs, certain aspects can be challenging, such as debugging and profiling, as has been noted in many presentations, optimization guides, and other resources. On the PS4, command buffers might have been patched directly from compute shaders, bypassing the command processor entirely, given that the prehistoric PS4 Pro GPU managed a stable 30 FPS at 2560x1440. In contrast, the much more capable and modern RTX 2070 Super, which should be closer to the PS5, can't maintain a stable 30 FPS at max settings at the same resolution, which is ridiculous.

It may be that their ExecuteIndirect implementation is lackluster if they're hitting a command processor bottleneck here. They should have tried other approaches, or at least optimized it better, because hitting bottlenecks that limit the experience on 80% of your audience's PCs is bad for everyone.

In my opinion, the problem with these low level constructs is that they typically offer no visual improvement over more conservative approaches, while performance can suffer greatly and optimization becomes a nightmare, making games appear less and less optimized to an audience that doesn't care about the low level details.
Except I can't see a much better abstraction for translating Ghost of Tsushima's GPU driven procedural grass system than doing it with Work Graphs ...

We'd have the 1st compute node feeding the 2nd compute node with its blade count output record. The 2nd compute node would generate a graphics PSO with all the necessary state and would then launch a mesh node to kick off the rendering process, using the 1st compute node's instance data output record ... (page 22)

All of that wrapped up neatly in a single DispatchGraph API call!
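Host-side, it really is one call. Here's a hedged sketch against the D3D12 Work Graphs API as I understand it from the spec; the record layout and function names are mine, and mesh nodes additionally require the graphics nodes extension on the state object.

// Seed the grass work graph with a single CPU record; both compute nodes
// and the mesh node then launch each other via GPU-written output records.
#include <windows.h>
#include <d3d12.h>

struct GrassRootRecord { UINT gridWidth, gridHeight; }; // hypothetical input

void DispatchGrassGraph(ID3D12GraphicsCommandList10* cmdList,
                        D3D12_PROGRAM_IDENTIFIER programId,
                        D3D12_GPU_VIRTUAL_ADDRESS_RANGE backingMemory) {
    // Bind the compiled work graph program and initialize its backing memory.
    D3D12_SET_PROGRAM_DESC program = {};
    program.Type = D3D12_PROGRAM_TYPE_WORK_GRAPH;
    program.WorkGraph.ProgramIdentifier = programId;
    program.WorkGraph.Flags = D3D12_SET_WORK_GRAPH_FLAG_INITIALIZE;
    program.WorkGraph.BackingMemory = backingMemory;
    cmdList->SetProgram(&program);

    // One root record enters the 1st compute node; everything downstream
    // (blade counts, PSO selection, the mesh node) is GPU-generated.
    GrassRootRecord root = { 64, 64 };
    D3D12_DISPATCH_GRAPH_DESC dispatch = {};
    dispatch.Mode = D3D12_DISPATCH_MODE_NODE_CPU_INPUT;
    dispatch.NodeCPUInput.EntrypointIndex     = 0;
    dispatch.NodeCPUInput.NumRecords          = 1;
    dispatch.NodeCPUInput.pRecords            = &root;
    dispatch.NodeCPUInput.RecordStrideInBytes = sizeof(GrassRootRecord);
    cmdList->DispatchGraph(&dispatch);
}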
 
you could provide me a list of native-only (not the multiplatform ones) PS3 and PS2 titles that launched on console and were ported to PC years later, I would gladly like to see that list and see whether those games were running super well
Well, technically speaking, both GTA 3 and GTA Vice City were released on PS2 alone first, then got ported to PC almost a year later, and then later to Xbox. The Devil May Cry games (like Devil May Cry 3) did the same. In fact, I think many games did that; I just can't remember them all.

I think there were also titles like Heavy Rain, Beyond Two Souls, Metal Gear Solid 3, Metal Gear Solid 2, Journey, and others.
 
Except I can't see a much better abstraction for translating Ghost of Tsushima's GPU driven procedural grass system than doing it with Work Graphs ...
I looked briefly through the presentation. The pipeline looks like a perfect match for mesh shaders with task shaders, which should have helped bypass the potential command processor bottleneck.
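For comparison, the host side of the task/mesh shader route is even simpler, assuming an amplification+mesh PSO is already bound (the one-group-per-tile mapping here is my assumption, not from the presentation):

// One DispatchMesh covers the grass field: the amplification (task) shader
// culls tiles and picks per-tile blade counts entirely on the GPU, then
// launches mesh shader groups -- no indirect command buffer and no command
// processor round trip for the per-tile decisions.
#include <windows.h>
#include <d3d12.h>

void DrawGrassWithMeshShaders(ID3D12GraphicsCommandList6* cmdList,
                              UINT tileCountX, UINT tileCountY) {
    // One amplification shader group per grass tile.
    cmdList->DispatchMesh(tileCountX, tileCountY, 1);
}

The obvious catch, as raised below, is that this path needs mesh shader capable hardware.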
 
I looked briefly through the presentation. The pipeline looks like a perfect match for mesh shaders with task shaders, which should have helped bypass the potential command processor bottleneck.
They probably want to address as large a market as possible. Are there any better solutions that would work on GPUs that don't support mesh shaders?
 