Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

Status
Not open for further replies.
The end result, though, ranged from pretty satisfactory to just about perfect. It's been so long that I can't remember if they ever managed to replicate a transparent reflection with this technique (like seeing your reflection in a window and, at the same time, what's behind it). I've been checking MGS2's tanker section where Snake fights Olga, and it's outstanding how well it mimics the result of RT while looking so clean at the same time.
But I guess RT is a much easier technique to apply in a heavily reflective environment than going through every puddle or reflective surface and manually duplicating certain sections, with transparencies and all the unpredictable action of particles and effects shown in the pseudo-reflection. But in principle it does much the same thing RT does.
It's much, much easier to implement old-school planar reflections, but your game will run at 2 fps with modern content/pipelines. The cost curve at a given scene/rendering complexity is not sustainable for anything more complex than the early X360 era.
 
What's the point of that comparison? Planar reflections, done by rendering a second viewport or doubling geometry, are an effective hack for particular situations, but they're limited in exactly that way.
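For anyone unfamiliar with the trick being discussed: a rough sketch of the "second viewport" approach, in Python for clarity. The function names and the point-plus-normal plane representation are mine; engines typically fold all of this into a 4x4 reflection matrix and an oblique clip plane.

```python
# Classic planar-reflection trick: mirror the camera across the reflection
# plane, render the scene from that mirrored camera into a texture, then
# sample the texture on the mirror surface.

def reflect_point(x, plane_point, n):
    """Reflect point x across the plane given by (point on plane, unit normal)."""
    d = sum((x[i] - plane_point[i]) * n[i] for i in range(3))  # signed distance
    return tuple(x[i] - 2.0 * d * n[i] for i in range(3))

def mirrored_camera(cam_pos, cam_forward, plane_point, n):
    """Mirror the camera's position and forward direction across the plane."""
    pos = reflect_point(cam_pos, plane_point, n)
    # Directions reflect without the translation term: v' = v - 2(v.n)n
    dvn = sum(cam_forward[i] * n[i] for i in range(3))
    fwd = tuple(cam_forward[i] - 2.0 * dvn * n[i] for i in range(3))
    return pos, fwd

# A camera 2 units above a floor mirror (the plane y = 0) looking straight
# down ends up 2 units below the floor looking straight up.
pos, fwd = mirrored_camera((0, 2, 0), (0, -1, 0), (0, 0, 0), (0, 1, 0))
print(pos, fwd)  # (0.0, -2.0, 0.0) (0.0, 1.0, 0.0)
```

This is why the cost scales the way it does: everything visible in the mirror gets drawn a second time from the mirrored viewpoint.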
UE5 has the exact same limitation. The engine can't reflect even approximately what the real environment looks like. This here is the background of the scene:


UE5's (software) Lumen reflections are more like a drawing, as if somebody asked me to draw a picture of what I see...
 
It's much, much easier to implement old-school planar reflections, but your game will run at 2 fps with modern content/pipelines. The cost curve at a given scene/rendering complexity is not sustainable for anything more complex than the early X360 era.
I edited my previous post. But yeah I get what you are saying.
 
UE5 has the exact same limitation.
It can reflect non-planar surfaces, like curved mirrors, reflections on bottles, and waves.

The engine can't reflect even approximately what the real environment looks like.
Would it be able to render that viewport as a planar reflection without tanking the framerate? It'd involve twice the rendering, so on the order of half the framerate.
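The "twice the rendering" arithmetic, made explicit under the simplifying assumption (mine) that the frame is fully GPU-bound and the mirrored view costs some fixed fraction of the main view:

```python
def fps_with_planar_reflection(base_fps, reflection_cost_ratio=1.0):
    """reflection_cost_ratio: GPU cost of the mirrored view relative to the
    main view. 1.0 models a full second viewport; engines often cut this to
    roughly 0.3-0.5 with lower resolution and aggressive culling."""
    base_frame_time = 1.0 / base_fps
    return 1.0 / (base_frame_time * (1.0 + reflection_cost_ratio))

print(fps_with_planar_reflection(60.0))       # ~30: full-cost second view
print(fps_with_planar_reflection(60.0, 0.5))  # ~40: half-cost mirrored view
```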
 
Primitive vs. mesh shaders, explained again by Sebastian Aaltonen. There are further implications in the replies.


<blockquote class="twitter-tweet"><p lang="en" dir="ltr">Primitive shaders are mesh shaders. It&#39;s AMDs internal name for the same feature. The first implementation didn&#39;t match PC DX12 perfectly and was not exposed. Sony still calls this feature Primitive Shaders.<br><br>There&#39;s minor HW differences, which can be seen in Vulkan discussions.</p>&mdash; Sebastian Aaltonen (@SebAaltonen) <a href=" ">November 1, 2023</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
 
Primitive vs. mesh shaders, explained again by Sebastian Aaltonen. There are further implications in the replies.

Primitive Shaders (NGG) were first implemented in GCN5 and partially rewritten again with RDNA1. However, RDNA1 lacks per-primitive output, so Mesh Shaders are not supported in either DX12 or Vulkan. They require >= RDNA2 (GFX10.3).

The replies discuss whether the PS5 is aligned more with RDNA1 or with RDNA2, and whether the lack of per-primitive output has any major implications. There are also discussions about compute-shader fallbacks and what their limitations would be. One could also mention that still not all Mesh Shader features map well to RDNA2 hardware, which lacks native support for outputting more than one vertex/primitive per thread, one of the improvements in RDNA3 hardware.



 
<blockquote class="twitter-tweet"><p lang="en" dir="ltr">Per-primitive output is very handy. I am glad that Microsoft didn&#39;t cave-in. One of the bottlenecks of the old pipelines is lack of per-primitive data, and the hacks around that are not pretty.</p>&mdash; Sebastian Aaltonen (@SebAaltonen) <a href=" ">November 1, 2023</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
 
I wonder if mesh shaders will help the Series X with its apparent issue of being very wide relative to its front and back end.

Mesh shaders will not only eat up more compute, but they could (should?) also mean the rasteriser is discarding fewer pixels and so is able to pass more work on to the pixel shader stage.
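To make the "discarding work earlier" point concrete: a common mesh shader win is per-meshlet backface cone culling, where a whole cluster of triangles is rejected before rasterisation ever sees it. This sketch loosely follows the convention used by tools like meshoptimizer; the names and numbers are illustrative, not any shipping implementation.

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def cone_cull(meshlet_center, cone_axis, cone_cutoff, camera_pos):
    """True if the whole meshlet is backfacing and can be skipped.
    cone_cutoff is precomputed offline from the spread of the cluster's
    triangle normals (roughly cos(cone half-angle + 90 degrees))."""
    view = normalize(tuple(meshlet_center[i] - camera_pos[i] for i in range(3)))
    return sum(view[i] * cone_axis[i] for i in range(3)) >= cone_cutoff

axis = (0.0, 0.0, 1.0)                 # cluster's normals point roughly +Z
cutoff = math.cos(math.radians(100))   # ~10 degree spread within the cluster

# Camera behind the cluster (normals face away from it): cullable.
print(cone_cull((0.0, 0.0, 5.0), axis, cutoff, (0.0, 0.0, 0.0)))   # True
# Camera in front of the cluster (normals face toward it): keep it.
print(cone_cull((0.0, 0.0, 5.0), axis, cutoff, (0.0, 0.0, 10.0)))  # False
```

One cheap dot product per cluster instead of per-triangle backface tests in the rasteriser is exactly the kind of early rejection the geometry stages couldn't express before.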

I've speculated since the start of this gen that it will be compute- and RT-heavy* 'real' next-gen games that get the most out of the Series X. AW2 is a sample size of one, of course, but it's looking like, maybe, we're getting to that point...?

*"RT-heavy" being relative; this is RDNA 2 we're talking about.
 
Path tracing in AAA games is not going to become some critical thing anytime soon. It'll be an optional setting for those with ultra-high-end rigs. It's largely unusable, or at least certainly not worth the performance cost, for the large majority of RTX users as well.

Hypothetically, "path tracing" in the Nvidia sense here (just a few light bounces) could be done by the end of this generation on consoles. Heavily spreading samples out over space/time is really usable for diffuse; Capcom has an updated RT model that relies heavily on this plus denoising, and it can run single-bounce diffuse RT at 60 fps on PS5/Series X.
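A minimal illustration of the "spread samples over time" idea: blend one noisy sample per frame into a history value so the effective sample count grows across frames. Real denoisers (including, presumably, Capcom's) add reprojection and variance clamping; this is just the accumulation core, with made-up numbers.

```python
import random

def accumulate(history, new_sample, alpha=0.1):
    """Exponential moving average: blend this frame's sample into history."""
    return history + alpha * (new_sample - history)

random.seed(1)
true_radiance = 0.5
estimate = 0.0
for frame in range(200):
    noisy = true_radiance + random.uniform(-0.3, 0.3)  # 1 noisy sample/frame
    estimate = accumulate(estimate, noisy)
print(round(estimate, 2))  # settles near 0.5 despite heavy per-frame noise
```

Diffuse lighting tolerates this well because it changes slowly; sharp reflections, as the next paragraph notes, invalidate the history much faster.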

Reflections, well, specifically mirror or near-mirror reflections, are a bit harder. Movement of objects and the camera can quickly invalidate most samples, and you need a lot of samples. Obviously it can be done; Spider-Man and Ratchet both run on PS5 with RT reflections, but it's still harder. Heavy reliance on hybrid RT might be a good optimization here: the idea is to use sparse SDFs (you can get decent enough resolution, better than UE5's software path right now) and make rays faster by building a close-fit BVH around the SDF. SDF tests are much faster than triangle tests, even on Nvidia's latest cards, but ray/box testing in a relatively sparse BVH is still faster than running an SDF trace the whole way.
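A toy version of that hybrid scheme, assuming a single bounding box standing in for a sparse BVH: do the cheap ray/box slab test first, and only sphere-trace the SDF for rays that actually enter the box. Everything here (scene, names, thresholds) is illustrative.

```python
import math

def ray_aabb(origin, direction, box_min, box_max):
    """Slab test: return (t_enter, t_exit), or None if the ray misses the box."""
    t0, t1 = 0.0, float("inf")
    for i in range(3):
        if abs(direction[i]) < 1e-12:
            if not (box_min[i] <= origin[i] <= box_max[i]):
                return None
            continue
        a = (box_min[i] - origin[i]) / direction[i]
        b = (box_max[i] - origin[i]) / direction[i]
        t0, t1 = max(t0, min(a, b)), min(t1, max(a, b))
    return (t0, t1) if t0 <= t1 else None

def sphere_sdf(p):
    """Toy SDF: a unit sphere centred at (0, 0, 5)."""
    return math.sqrt(p[0]**2 + p[1]**2 + (p[2] - 5.0)**2) - 1.0

def sphere_trace(origin, direction, t_start, t_end, max_steps=64):
    """March along the ray; each step advances by the SDF's distance bound."""
    t = t_start
    for _ in range(max_steps):
        p = tuple(origin[i] + t * direction[i] for i in range(3))
        d = sphere_sdf(p)
        if d < 1e-4:
            return t          # hit
        t += d
        if t > t_end:
            break
    return None               # miss within this box

# Only rays that pass the cheap box test pay for the SDF march.
origin, direction = (0.0, 0.0, 0.0), (0.0, 0.0, 1.0)
span = ray_aabb(origin, direction, (-1.0, -1.0, 4.0), (1.0, 1.0, 6.0))
hit_t = sphere_trace(origin, direction, span[0], span[1]) if span else None
print(hit_t)  # 4.0: the ray enters the box at z=4, right at the sphere surface
```

The close-fit box also shrinks the march interval, so even rays that do enter take fewer SDF steps than a trace from the camera outward.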

Now that AMD has finally gotten rid of the old RTG head, and his repeatedly mediocre results, we might see better execution from whoever his replacement is. If I were AMD, I'd go see who I could hire from a pair of papers at SIGGRAPH Asia, one describing a "neural net does it for you" variable rate shading with better performance for visual quality than upscaling: https://drive.google.com/file/d/1wSPdfpwOkOIznQUqUZdBMmdQ3WAWlhms/view

The second, by some of the same researchers, is a neural-net-based next-frame prediction: basically FSR/DLSS 3, but without even waiting for the next frame's motion vectors; the network just predicts everything and off you go. The paper for that isn't up yet, though.
 
Primitive Shaders (NGG) were first implemented in GCN5 and partially rewritten again with RDNA1. However, RDNA1 lacks per-primitive output, so Mesh Shaders are not supported in either DX12 or Vulkan. They require >= RDNA2 (GFX10.3).

The replies discuss whether the PS5 is aligned more with RDNA1 or with RDNA2, and whether the lack of per-primitive output has any major implications. There are also discussions about compute-shader fallbacks and what their limitations would be. One could also mention that still not all Mesh Shader features map well to RDNA2 hardware, which lacks native support for outputting more than one vertex/primitive per thread, one of the improvements in RDNA3 hardware.



It's funny that no developers (or even leakers) want to disclose whether the PS5 has RDNA1 or RDNA2 primitive shaders. The NDAs must be strong here. But the fact that Alan Wake 2's mesh shaders have not been implemented on RDNA1 GPUs is really telling. If their implementation worked perfectly without per-primitive-output hardware (as it would have to if the PS5 supposedly uses RDNA1 primitive shaders), why not make it work on those GPUs?

Finally, it's good to call a cat a cat. From the beginning, Mesh Shaders (the hardware part) have been another name for Primitive Shaders, nothing more, and certainly not some new hardware that was going to bring incredible improvements. I'm glad the narrative created by some enthusiastic gamers (one that lasted three years) has finally ended.
 
On the #stutterstruggle front, some decent news: Ghostrunner 2 has apparently incorporated shader precompilation now, and the latest update has improved it further.

However...


Sigh. Although they do at least seem aware of it:


Still, there are plenty of comments along the lines of "Yeah, it did that for me at the start, but eventually it runs smoothly". Then there are the extreme examples of reality denial:

 
It's funny that no developers (or even leakers) want to disclose whether the PS5 has RDNA1 or RDNA2 primitive shaders. The NDAs must be strong here. But the fact that Alan Wake 2's mesh shaders have not been implemented on RDNA1 GPUs is really telling. If their implementation worked perfectly without per-primitive-output hardware (as it would have to if the PS5 supposedly uses RDNA1 primitive shaders), why not make it work on those GPUs?

Primitive Shaders aren't exposed on PC RDNA1. Remedy have no way of doing what they did on PS5, even if the underlying hardware has the same functionality.

And even if it were possible (it's not), they'd have to maintain a custom path outside of DX12 just for those old, limited-userbase PC RDNA1 parts that could do it.

Finally, it's good to call a cat a cat. From the beginning, Mesh Shaders (the hardware part) have been another name for Primitive Shaders

Primitive Shaders have historically been able to run on hardware that lacks the functionality to fully support (DX12U) Mesh Shaders. NGG has evolved across generations.
 
Finally, it's good to call a cat a cat. From the beginning, Mesh Shaders (the hardware part) have been another name for Primitive Shaders, nothing more, and certainly not some new hardware that was going to bring incredible improvements. I'm glad the narrative created by some enthusiastic gamers (one that lasted three years) has finally ended.
Primitive shaders do everything a Mesh Shader can do, except per-primitive output. That's been around this forum several times; the impact of that is unknown. But it's safe to say that the solution for the PS5 is going to be different from the one for DirectX, even more so when Amplification Shaders are brought into play.

That isn't concern trolling for the PS5, but we need to call a spade a spade: a primitive shader is not a mesh shader. And the lack of amplification shaders in titles suggests that the PS5 likely doesn't have them, which is why everyone is either skipping mesh shaders or rolling their own compute functions ahead of mesh shaders.

The latter is coming across as rare.
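For context on what losing per-primitive output costs: the classic workaround on pipelines that only have per-vertex outputs is to duplicate vertices so that no triangle shares one, letting each copy carry its face's data (a flat normal, a material ID, and so on). A sketch of that expansion, with my own naming:

```python
import math

def face_normal(a, b, c):
    """Unit normal of triangle (a, b, c) via the cross product."""
    u = tuple(b[i] - a[i] for i in range(3))
    v = tuple(c[i] - a[i] for i in range(3))
    n = (u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0])
    length = math.sqrt(sum(x * x for x in n)) or 1.0
    return tuple(x / length for x in n)

def unshare_with_face_data(positions, indices):
    """Expand an indexed mesh so every triangle owns 3 unique vertices,
    each tagged with its face's normal: the per-primitive emulation."""
    out = []
    for i0, i1, i2 in indices:
        tri = (positions[i0], positions[i1], positions[i2])
        n = face_normal(*tri)
        out.extend((p, n) for p in tri)  # vertex data duplicated per face
    return out

quad = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
tris = [(0, 1, 2), (0, 2, 3)]
expanded = unshare_with_face_data(quad, tris)
print(len(expanded))  # 6 vertices for 2 triangles (was 4 shared)
```

The cost is obvious: vertex reuse is destroyed and memory/bandwidth balloons, which is why native per-primitive output in the mesh shader stage is considered the clean fix.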
 
On the #stutterstruggle front, some decent news: Ghostrunner 2 has apparently incorporated shader precompilation now, and the latest update has improved it further.

However...


Sigh. Although they do at least seem aware of it:


Still, there are plenty of comments along the lines of "Yeah, it did that for me at the start, but eventually it runs smoothly". Then there are the extreme examples of reality denial:

Surely it's gotten through to most developers who use Unreal Engine by now that they have to do this shit.. no matter how much of a pain in the ass it is. If you're using Unreal for any game currently in development and don't have this hammered into your head by now.. then obviously we haven't made enough of an issue about it in the news/forums/socials and with our wallets. Because I feel like we're at the point now where it should just be understood. The games that miss this crucial step are just doing themselves such a massive disservice.. it's insane.

I sure hope UE5.4's improved multi-threading will help with traversal stutters, because that's the next major issue we have to tackle. I don't give a damn if it's a Microsoft OS issue, an engine issue, a dev optimization issue, or a bug... it's got to stop.. this shit REALLY ruins all the effort and time that goes into making these beautiful environments and incredible cutscenes when it's full of stuttering as you move through the damn thing. Engines like Unreal have been focusing too much on achieving photo realism with the ability to create stupidly complex things relatively easily. Obviously they've done this because they wanted to diversify their engine to serve multiple industries.. which is fine/great even, but IMO it's undoubtedly come at a cost to optimization and performance for games.

Anyway.. glad that Ghostrunner 2 devs fixed their game... should have happened before launch.. but what's done is done. Might buy it eventually.
 
Primitive shaders do everything a Mesh Shader can do, except per-primitive output. That's been around this forum several times; the impact of that is unknown. But it's safe to say that the solution for the PS5 is going to be different from the one for DirectX, even more so when Amplification Shaders are brought into play.

That isn't concern trolling for the PS5, but we need to call a spade a spade: a primitive shader is not a mesh shader. And the lack of amplification shaders in titles suggests that the PS5 likely doesn't have them, which is why everyone is either skipping mesh shaders or rolling their own compute functions ahead of mesh shaders.

The latter is coming across as rare.

This convo is getting confusing, as there's no single thing called a "primitive shader": the implementation has evolved from GCN up through RDNA 3.
 
Primitive shaders do everything a Mesh Shader can do, except per-primitive output. That's been around this forum several times; the impact of that is unknown. But it's safe to say that the solution for the PS5 is going to be different from the one for DirectX, even more so when Amplification Shaders are brought into play.

That isn't concern trolling for the PS5, but we need to call a spade a spade: a primitive shader is not a mesh shader. And the lack of amplification shaders in titles suggests that the PS5 likely doesn't have them, which is why everyone is either skipping mesh shaders or rolling their own compute functions ahead of mesh shaders.

The latter is coming across as rare.
I can see developers skipping the amplification shader stage, as UE5 does, so it's important to note that D3D12 doesn't force graphics programmers to use it in order to get the mesh shader stage. Developers can stick to the plain mesh shader stage, and they likely won't face a whole lot of contention in multiplatform development, even on PS5 ...
 

Star Citizen has shown off their ray-traced GI solution. They claim it is a copy of AMD's GI-1.0. They also have a software mode (using the same screen-space probe setup, but not including dynamic objects).
I don't know what kind of software tracing they are performing (SDF? Voxel? Their word choice is odd: "relit environment probes").

Lumen really started the screen-space probes trend, and we are also seeing more ray-traced GI solutions now. I think The Finals also includes a DDGI option.
 