They didn't announce their GPU ALUs can do FP32 MADD operations either. They do have half a dozen published patents on foveated rendering since 2016, though.
However they're doing foveated rendering, it's not the same way as DX12 Ultimate's VRS. You have no idea how foveated rendering is implemented in Sony's GPU, other than that it's different from Microsoft's.
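For what it's worth, the core of any foveated renderer, whatever the hardware path, is a per-tile map of how coarsely each screen region can be shaded, derived from the gaze point. Here's a toy sketch of that idea; every name and threshold in it is made up, and how the map gets consumed (a DX12 shading-rate image, multi-resolution targets, or whatever Sony's hardware does) is exactly the part nobody outside Sony knows.

```cpp
// Illustrative sketch only: build a per-tile "how coarse can we shade here"
// map from a gaze point. All names and thresholds are invented for the example.
#include <cmath>
#include <cstdint>
#include <vector>

enum class ShadeRate : uint8_t { Full = 0, Half = 1, Quarter = 2 };

std::vector<ShadeRate> BuildFoveationMap(int tilesX, int tilesY,
                                         float gazeX, float gazeY) // gaze in tile units
{
    std::vector<ShadeRate> map(static_cast<size_t>(tilesX) * tilesY);
    for (int y = 0; y < tilesY; ++y) {
        for (int x = 0; x < tilesX; ++x) {
            const float dx = (x + 0.5f) - gazeX;
            const float dy = (y + 0.5f) - gazeY;
            const float dist = std::sqrt(dx * dx + dy * dy);
            // Full detail near the gaze point, progressively coarser toward
            // the periphery. Thresholds are arbitrary for the sketch.
            ShadeRate rate = ShadeRate::Full;
            if (dist > 12.0f)      rate = ShadeRate::Quarter;
            else if (dist > 6.0f)  rate = ShadeRate::Half;
            map[static_cast<size_t>(y) * tilesX + x] = rate;
        }
    }
    return map;
}
```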
Maybe one in every 1,000 patents actually ends up being put into practice.
The foveated rendering patent relates to VR, not the PS5.
There was a Sony patent showing the console with a unified L3 cache on the CPU, which sent all the fanboys into a rage claiming the PS5 CPU was Zen 3 with unified cache; in reality, the die shots show it doesn't have unified cache at all.
Here is a patent for a time travel machine, so I guess it must be legit.
https://patents.google.com/patent/US20090234788A1/en
This constant "push for victory" on VRS is puzzling, except for Microsoft's marketing material being successful at convincing people that a) Sony isn't capable of doing foveated rendering their own way despite all the patents they released on the subject and b) it's a super important tech / secret sauce whose performance gains are yet to be seen in any relevant means.
The constant "push for victory" as you put it is to correct misinformation that was put out there. People ran with the PS5 having VRS because it was apparently a core function of Rdna 2, and so by default the PS5 must have it.
Well, as we have found out, neither the PS5 nor the XSX is full RDNA 2; both are made up of a mix of RDNA 1 and RDNA 2 parts, and the PS5 uses more RDNA 1 parts than the XSX does. It was a flawed conclusion.
It is obvious to anyone with an ounce of logic that the PS5 does not have VRS. From the fact that Sony didn't list it as a feature, to the fact that the PS5 uses the older RDNA 1 ROPs, which lack the alterations needed to perform VRS, while the XSX has the newer RDNA 2 ROPs with the VRS hardware. You then have devs who have spoken about using a software solution for VRS on the PS5, similar to what they used on the PS4, while using the hardware VRS on the XSX. Then we have games released that use VRS on the XSX but not on the PS5. AMD has also tied VRS to DirectX 12 Ultimate and no other API.
You really have to be willfully ignorant at this point to continue thinking that the PS5 has VRS.
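For context on why hardware VRS gets tied to DX12 Ultimate in the first place, here's a minimal sketch of the PC API side: the feature sits behind a hardware tier query, and only then can you ask the GPU to coarsen shading. This is not code from any console SDK; the function name is mine, it assumes an already-created device and an ID3D12GraphicsCommandList5, and it skips error handling.

```cpp
#include <d3d12.h>

bool TryEnableCoarseShading(ID3D12Device* device,
                            ID3D12GraphicsCommandList5* cmdList)
{
    // Ask the driver which VRS tier (if any) the hardware supports.
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &opts6, sizeof(opts6))))
        return false;

    // Tier 1 = per-draw shading rate; Tier 2 adds per-primitive rates and a
    // screen-space shading-rate image (the piece a foveated renderer would use).
    if (opts6.VariableShadingRateTier == D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED)
        return false;

    // Drop to 2x2 coarse shading for subsequent draws on this command list.
    D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,   // per-primitive combiner
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH    // screen-space image combiner
    };
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, combiners);
    return true;
}
```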
And then it also seems that Sony staying silent on the lower-level specs of their GPU must mean it's an inferior GPU lacking all RDNA 2 features. Except for ray tracing, which Cerny had to confirm three times in several interviews (otherwise some would still believe the PS5 was doing RT on the CPU or something).
Sony has spoken about their main GPU features, including the cache scrubbers, the Geometry Engine (GE) and the primitive shaders. Until they say they have other features, it's most likely that they don't.
Using your logic, Microsoft hasn't said the XSX doesn't have cache scrubbers, so it must have them until they say otherwise.
The reality is that the PS5 does lack a lot of RDNA 2 features. It lacks Mesh Shaders, VRS, SFS and Infinity Cache for starters. The XSX also lacks some RDNA 2 features, such as Infinity Cache.
The fact is that Microsoft and AMD did a lot of joint work on RDNA 2 and DirectX 12 Ultimate. Go on AMD's site and see how much DX12 Ultimate is on there. The only API that AMD supports for Mesh Shading, VRS and Sampler Feedback on RDNA 2 is DX12.
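For reference, on PC those other DX12 Ultimate features surface through the same feature-tier query mechanism. A short sketch below; the function name is made up, error handling is skipped, and this says nothing about what any console SDK exposes.

```cpp
#include <d3d12.h>

void ReportDX12UltimateTiers(ID3D12Device* device)
{
    // OPTIONS7 carries the mesh shader and sampler feedback tiers.
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                              &opts7, sizeof(opts7))))
    {
        const bool meshShaders =
            opts7.MeshShaderTier != D3D12_MESH_SHADER_TIER_NOT_SUPPORTED;
        const bool samplerFeedback =
            opts7.SamplerFeedbackTier != D3D12_SAMPLER_FEEDBACK_TIER_NOT_SUPPORTED;
        // The post above argues that on RDNA 2 AMD exposes these only through
        // DX12, i.e. via exactly this kind of query.
        (void)meshShaders;
        (void)samplerFeedback;
    }
}
```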
That may be why the PS5 doesn't have them, or it could be that, as MS said, they had to wait longer to get those functions and Sony didn't want to wait. Or maybe Sony just didn't see the value in them. Who knows.
I'm not sure why you have taken the whole "fanboys were saying the PS5 RT wasn't hardware based" thing so personally. That's just the ramblings of 12-year-old console warriors. On the other side you have Sony nuts saying the XSX's 12 TFLOPs were GCN flops and not RDNA flops. It's all just silliness.
Care to point to any official statement corroborating this?
I'm not sure Nvidia goes around putting out official statements on everything they do, but there are plenty of articles on them using INT8 and INT4 and the benefits they get over higher precision.
https://developer.nvidia.com/blog/int4-for-ai-inference/
Microsoft didn't change the GPU to get "lower-precision abilities". Higher-throughput rapid packed math (RPM) for INT8 and INT4 has been in all RDNA GPUs except the first Navi 10.
It's on the RDNA1 whitepaper.
It would actually take Sony making changes to the GPU to not have 4x INT8 / 8x INT4 throughput. Whether they did that or not is up for confirmation (or faith, apparently).
It'd be best to stop conflating Microsoft's declared "ML enhancements" with a feature that has been present in all RDNA GPUs released since October 2019.
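To make the "higher throughput at lower precision" point concrete, here's a rough CPU-side sketch of what a packed INT8 dot-product-with-accumulate does: four 8-bit multiplies summed into a 32-bit accumulator, the work a dp4a-style / packed-math GPU instruction performs in a single operation per lane. The packing layout and function name are illustrative, not any vendor's intrinsic.

```cpp
#include <cstdint>

int32_t DotProduct4xInt8(uint32_t packedA, uint32_t packedB, int32_t acc)
{
    for (int i = 0; i < 4; ++i) {
        // Extract each signed 8-bit lane from the packed 32-bit registers.
        const int8_t a = static_cast<int8_t>((packedA >> (8 * i)) & 0xFF);
        const int8_t b = static_cast<int8_t>((packedB >> (8 * i)) & 0xFF);
        acc += static_cast<int32_t>(a) * static_cast<int32_t>(b);
    }
    return acc; // One "instruction" worth of work on hardware with packed INT8 math.
}
```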
The RDNA white paper refers to INT8 texture sampling, which has nothing to do with INT8 for compute.
Also, not all RDNA cards had INT8 and INT4 capabilities; it was only on some cards. It was not a stock feature found in all of them, and it's the same situation with RDNA 2. Microsoft themselves said they added that customization to their GPU, which suggests it wasn't baseline across all cards.
There is no word that Sony also added it.
How about you send me some confirmation from Sony that the PS5 GPU has INT8 and INT4 precision, then?
Silence from Sony tends to indicate they don't have something. They mentioned FP16 on the PS4 Pro, and I would have expected them to mention this if the PS5 had it.
But like I said, I look forward to you providing me with the official confirmation that the PS5 has it.
The only confirmation so far is from one of its principal engineers, saying it doesn't have Machine Learning.