Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

DX12U does not seem to have much relevance for UE5. We'll have to see whether Mesh Shading for Nanite makes a big difference (if the r.nanite.meshshadingraster cvar in UE5 is indeed working correctly, there's close to zero difference in performance).

Sampler Feedback and VRS are not supported either.

The claim from Epic that DX12U is the gold standard of graphics was a blatant lie. They did not announce anything regarding DX12U and they keep ignoring it.
Sampler Feedback requires invasive code changes, and a common example is having to use pixel shaders to compute the lighting pass, because you need access to derivatives, which have traditionally only been offered in pixel shaders ... (detailed post)

HW VRS can only be used with pixel shaders and there's virtually no benefit to be extracted on deferred renderers with geometrically dense content ...

Mesh shading could be useful when rendering bigger triangles in Nanite, but if no content meets that criterion, it's possible the mesh shading path never gets triggered in Nanite ...
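
For reference, these are exactly the caps a PC renderer can query through CheckFeatureSupport. A minimal sketch (assuming an already-created ID3D12Device and a recent Windows SDK; device creation and error handling omitted):

```cpp
// Minimal sketch: querying the DX12U caps discussed above on PC via
// ID3D12Device::CheckFeatureSupport. Assumes an already-created device.
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

void PrintDx12UltimateCaps(ID3D12Device* device)
{
    // Variable Rate Shading: tier plus the tile size of the screen-space
    // shading-rate image (8x8 or 16x16 on current hardware).
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS6, &opts6, sizeof(opts6))))
    {
        std::printf("VRS tier: %d, shading-rate image tile size: %u px\n",
                    static_cast<int>(opts6.VariableShadingRateTier),
                    opts6.ShadingRateImageTileSize);
    }

    // Mesh Shaders and Sampler Feedback tiers.
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS7, &opts7, sizeof(opts7))))
    {
        std::printf("Mesh shader tier: %d, sampler feedback tier: %d\n",
                    static_cast<int>(opts7.MeshShaderTier),
                    static_cast<int>(opts7.SamplerFeedbackTier));
    }
}
```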
 
Has that actually ever been confirmed though? The PS5 is, after all, the only next-gen console with ML in an actual released game (Spider-Man), with ML inference run on the GPU.
The number of things confirmed about the PS5 GPU is basically zero. Like, was it ever confirmed that Spider-Man is doing inference in real time and that it wasn't just pre-baked using ML on a server? I tried looking around but I could never find anything concrete about that. 'Using ML' is a very murky concept, and training on a server is quite different from just interpreting the results on a client system.

That said, I'm far from concerned about small architectural differences between the systems. The PS5 might not be SFS-compliant according to the DX12 spec, but that doesn't mean Sony doesn't have a similar system in place under its own spec. Same goes for mesh shaders etc.
 
Sony is as bad as Nvidia WRT divulging information about how their hardware works. Microsoft and AMD are so much better.
 
The number of things confirmed about the PS5 GPU is basically zero. Like, was it ever confirmed that Spider-Man is doing inference in real time and that it wasn't just pre-baked using ML on a server? I tried looking around but I could never find anything concrete about that. 'Using ML' is a very murky concept, and training on a server is quite different from just interpreting the results on a client system.

GOW Ragnarök supposedly uses an ML algorithm as part of its DRS on PS5 in performance mode. There's no direct quote from a dev to confirm it yet, so take it with a grain of salt.
 
Has that actually ever been confirmed though? The PS5 is, after all, the only next-gen console with ML in an actual released game (Spider-Man), with ML inference run on the GPU.

You don't need INT4/INT8 to do ML; you can use FP16. Some ML algorithms need better precision than INT4/INT8 anyway. If the PS5 could not run ML at all, it would be a huge problem.
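
To illustrate that point, here's a toy CPU sketch (plain C++ with made-up values, not console or GPU code): the multiply-accumulate at the core of inference works at any precision, and packed INT8 is about throughput rather than capability, since it lets hardware do four of those multiply-accumulates per 32-bit operand (the DP4a idea).

```cpp
// Toy sketch (plain C++, hypothetical values): a dot product at float
// precision vs. the same math on packed INT8 data, i.e. the work a
// DP4a-style instruction performs in one operation.
#include <cstdint>
#include <cstdio>

// Reference dot product in float (FP16 shader math would do the same thing).
float DotFloat(const float* a, const float* b, int n)
{
    float acc = 0.0f;
    for (int i = 0; i < n; ++i)
        acc += a[i] * b[i];
    return acc;
}

// Pack four signed 8-bit values into one 32-bit word.
uint32_t Pack4(int8_t x, int8_t y, int8_t z, int8_t w)
{
    return  static_cast<uint32_t>(static_cast<uint8_t>(x))        |
           (static_cast<uint32_t>(static_cast<uint8_t>(y)) << 8)  |
           (static_cast<uint32_t>(static_cast<uint8_t>(z)) << 16) |
           (static_cast<uint32_t>(static_cast<uint8_t>(w)) << 24);
}

// Four INT8 multiply-accumulates into an INT32 accumulator -- what a
// DP4a-style instruction does per pair of packed operands.
int32_t Dot4PackedInt8(uint32_t packedA, uint32_t packedB)
{
    int32_t acc = 0;
    for (int lane = 0; lane < 4; ++lane)
    {
        const int8_t a = static_cast<int8_t>((packedA >> (8 * lane)) & 0xFF);
        const int8_t b = static_cast<int8_t>((packedB >> (8 * lane)) & 0xFF);
        acc += a * b;
    }
    return acc;
}

int main()
{
    const float a[4] = {1.0f, -2.0f, 3.0f, 4.0f};
    const float b[4] = {5.0f,  6.0f, -7.0f, 8.0f};
    // Both paths compute 1*5 + (-2)*6 + 3*(-7) + 4*8 = 4.
    std::printf("float: %.1f  packed int8: %d\n",
                DotFloat(a, b, 4),
                Dot4PackedInt8(Pack4(1, -2, 3, 4), Pack4(5, 6, -7, 8)));
    return 0;
}
```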

EDIT: Like @Lurkmass said, Sampler Feedback is not compatible with a visibility buffer like UE5's, with the deferred texturing used for vegetation in Horizon Forbidden West, or with any deferred renderer using a compute-based light prepass. And it is known to be of little interest for deferred renderers like Unreal Engine, Decima Engine, Naughty Dog's engine...

And HW VRS granularity is limited to 8x8-pixel tiles at best; sometimes software VRS is more performant because you can go finer, like COD's software VRS, and that's with a forward+ renderer.

Outside of INT4/INT8, I think the compromises Sony made on the GPU side to keep it narrow are OK.

Flexible Scale Rasterization does variable shading and variable resolution too; that goes a bit further than VRS.
 
Any modern-ish hw should be able to do 'ML'. Even last generation.

Yes, like the PS5 GPU. Any GPU capable of compute shaders and good at asynchronous compute is good for ML. Compute is probably the biggest thing introduced on the GPU side in the last decade: without compute there's no Nanite, no Dreams, no compute-based ML, and none of the less spectacular but very useful uses like compute-based light prepasses, compute-based particles, compute cloth simulation and so on... Give devs flexibility and they will find a way to do something great.
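
As a toy illustration of why that is (a plain C++ sketch with a hypothetical layer, standing in for what each compute-shader thread would do): a fully-connected layer, the basic building block of this kind of inference, is nothing but multiply-adds over arrays, which is exactly the kind of work compute shaders are built for.

```cpp
// Toy sketch (plain C++, hypothetical layer sizes and weights): one
// fully-connected layer with a ReLU. On a GPU, each output element would
// typically map to one compute-shader thread doing these multiply-adds.
#include <vector>
#include <algorithm>
#include <cstddef>

std::vector<float> DenseRelu(const std::vector<float>& input,    // [in]
                             const std::vector<float>& weights,  // [out x in], row-major
                             const std::vector<float>& bias)     // [out]
{
    const std::size_t outSize = bias.size();
    const std::size_t inSize  = input.size();
    std::vector<float> output(outSize);
    for (std::size_t o = 0; o < outSize; ++o)          // one "thread" per output
    {
        float acc = bias[o];
        for (std::size_t i = 0; i < inSize; ++i)
            acc += weights[o * inSize + i] * input[i]; // multiply-accumulate
        output[o] = std::max(acc, 0.0f);               // ReLU activation
    }
    return output;
}
```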

EDIT: I forgot the great Frostbite hair technology using compute-based analytical AA and software rasterization.
 
Has that actually ever been confirmed though? The PS5 is, after all, the only next-gen console with ML in an actual released game (Spider-Man), with ML inference run on the GPU.
I don't think it has ever been confirmed, which is what my "every feature" comment was about. But there aren't any real special requirements for machine learning in modern hardware. Just because Miles Morales uses machine learning doesn't prove feature parity with Series GPU.
 
Just because Miles Morales uses machine learning doesn't prove feature parity with Series GPU.

I never said it did; I just still find it strange that the consoles with confirmed INT4/INT8 ML capabilities still don't have a single game that has used them.

But the console that supposedly doesn't have INT4/INT8 hardware has.

I was honestly expecting Microsoft to have a working ML-based upscaler like DLSS/XeSS for the Series consoles 12-18 months after release.
 
Me too, honestly. Although, now that I've seen XeSS's DP4a performance, I'm not sure an ML upscaler is going to be performant enough in every case, especially when you compare it to a "dumb" upscaler like FSR 2.
 
Yes, like the PS5 GPU. Any GPU capable of compute shaders and good at asynchronous compute is good for ML. Compute is probably the biggest thing introduced on the GPU side in the last decade: without compute there's no Nanite, no Dreams, no compute-based ML, and none of the less spectacular but very useful uses like compute-based light prepasses, compute-based particles, compute cloth simulation and so on... Give devs flexibility and they will find a way to do something great.

EDIT: I forgot the great Frostbite hair technology using compute-based analytical AA and software rasterization.

The PS4 should be perfectly fine doing it too. 2011 GPUs should as well (Fermi?). It's a question of performance, and that's where hardware acceleration helps. If muscle deformation/DRS is enough then yeah, though I think AI/ML is able to offer much more than that.

We go back to the days when data scientists were doing ML with pixel shaders on Kepler; that was the beginning of CUDA.

Right, I thought it was Fermi. Still, that's over a decade ago too.
 
Probably Fermi. CUDA was released with Kepler.
 