Support for Machine Learning (ML) on PS5 and Series X?

RTX GPUs even have dedicated ML hardware that accelerates these INT8/INT4 instructions. The throughput of the Tensor Cores for these kinds of tasks is not only much higher than that of the shader cores, they also don't impact gaming performance as much as on, say, RDNA2, which has no dedicated ML hardware. That is why DLSS upscaling is so extremely fast (0.8ms on a 2060 at 1080p); without TCs, running on the shader cores, it would be much slower.
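For a rough idea of what "dedicated ML hardware" means here, this is a minimal CUDA sketch of my own (not anything from DLSS itself) using the WMMA API, which is the standard way to feed the Tensor Cores directly: one warp issues a 16x16x16 FP16 matrix multiply-accumulate as a single Tensor Core operation instead of thousands of scalar FMAs on the shader ALUs. The INT8/INT4 paths work the same way on newer parts.

```cuda
#include <mma.h>
#include <cuda_fp16.h>
using namespace nvcuda;

// One warp computes a 16x16x16 FP16 matrix multiply-accumulate entirely on
// the Tensor Cores. Done on the shader cores, the same tile would take
// roughly 4096 scalar FMAs.
__global__ void wmma_tile(const half *a, const half *b, float *c) {
    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> a_frag;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::col_major> b_frag;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float> c_frag;

    wmma::fill_fragment(c_frag, 0.0f);
    wmma::load_matrix_sync(a_frag, a, 16);            // leading dimension 16
    wmma::load_matrix_sync(b_frag, b, 16);
    wmma::mma_sync(c_frag, a_frag, b_frag, c_frag);   // the Tensor Core op
    wmma::store_matrix_sync(c, c_frag, 16, wmma::mem_row_major);
}
```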
While this is 100% true, wasn't there a version of DLSS included with one of the Control updates that ran exclusively on the shader cores? I seem to remember one of the DF videos talking about it, and it was basically the first implementation of DLSS that wasn't the smeary mess it was at launch. And it was still much faster than native rendering.
 
While this is 100% true, wasn't there a version of DLSS included with one of the Control updates that ran exclusively on the shader cores? I seem to remember one of the DF videos talking about it, and it was basically the first implementation of DLSS that wasn't the smeary mess it was at launch. And it was still much faster than native rendering.
There was.
But the quality was not quite there. It can definitely run that way, though.

NVIDIA cards, however, have all supported DP4a (or an equivalent) for a long time now, and not just on the Tensor Cores.
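To make the DP4a point concrete, here is a minimal CUDA sketch (my own illustration, not tied to any particular upscaler): __dp4a has been exposed since Pascal (sm_61) and runs on the regular CUDA cores, doing a 4-way INT8 dot product with 32-bit accumulation in a single instruction, which is the kind of instruction a non-Tensor-Core ML path like a DP4a fallback would lean on.

```cuda
// Each 32-bit int packs four signed 8-bit values; __dp4a multiplies them
// pairwise and accumulates into a 32-bit integer in one instruction.
// Available on the regular shader cores since sm_61 (Pascal).
__global__ void int8_dot(const int *a, const int *b, int n, int *out) {
    int acc = 0;
    for (int i = threadIdx.x; i < n; i += blockDim.x)
        acc = __dp4a(a[i], b[i], acc);   // 4x INT8 multiply-accumulate per call
    atomicAdd(out, acc);                 // reduce across the block
}
```

Here n is the number of packed ints (i.e. total INT8 elements divided by four), so a launch would look something like int8_dot<<<1, 256>>>(dev_a, dev_b, count / 4, dev_out).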
 
While this is 100% true, wasn't there a version of DLSS included with one of the Control updates that ran exclusively on the shader cores? I seem to remember one of the DF videos talking about it, and it was basically the first implementation of DLSS that wasn't the smeary mess it was at launch. And it was still much faster than native rendering.

Yes, the quality was a lot worse than DLSS 2.0.

Buuuuuuuut I am not so sure DLSS 1.9 really was running exclusively on the shader cores, as many people, including DF, say.

I remember profiling it in Nsight, and DLSS 1.9 actually showed Tensor Core activity.
 
Intel Xe Super Sampling (XeSS) to feature five quality modes, including Ultra-Quality - VideoCardz.com
I know there is some debate over whether both consoles support DP4a, but since we know the Series consoles do support it, it would behoove Intel to get XeSS into the Xbox GDK as soon as possible. Considering that all XGS games are released on PC, and that nearly all third-party games released on Xbox also come to PC, that would go a long way toward XeSS being adopted by developers who use the GDK to ship games on both platforms.
 
Just finished watching this Two Minute Papers video on AI raster-to-vector conversion, and I wondered: would it be possible to convert the bitmaps in the frame buffer into vector images in real time, possibly hardware accelerated? Since vectors are resolution-independent, this would mean the end of aliasing.
 