> Did Microsoft say anything about these custom instructions?
AFAIK there's only this slide, which was mentioned on the previous page:
> AFAIK there's only this slide, which was mentioned on the previous page:
Very general statement, that first point. How do we know it's something different from the INT8/INT4 capabilities?
> I assumed them (Microsoft) saying "we added special hardware support" would mean they actually added special hardware over the generic RDNA1 capability of running mixed precision dot4 INT8 / dot8 INT4.
Nah, it's just PR. They are very openly celebrating all the XSX/RDNA2 features, so if they had some other special sauce for ML they would have been talking about it for months now.
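To make the jargon concrete, here is a rough C sketch of what a mixed precision dot4 INT8 / dot8 INT4 operation computes: four int8 (or eight int4) values packed into each 32-bit operand, multiplied lane by lane and summed into a 32-bit accumulator in a single operation. The function names here are invented for illustration, and details such as signedness and saturation differ between GPU architectures, so treat this as a sketch rather than any vendor's exact instruction definition.

```c
#include <stdint.h>

/* Illustrative reference semantics of a packed INT8 "dot4" accumulate:
 * four signed 8-bit values packed into each 32-bit operand, multiplied
 * pairwise, summed, and added to a 32-bit accumulator in one operation. */
static int32_t dot4_i8_acc(uint32_t a, uint32_t b, int32_t acc)
{
    for (int lane = 0; lane < 4; ++lane) {
        int32_t av = (int32_t)((a >> (8 * lane)) & 0xFF);
        int32_t bv = (int32_t)((b >> (8 * lane)) & 0xFF);
        if (av & 0x80) av -= 256;   /* sign-extend the 8-bit lane */
        if (bv & 0x80) bv -= 256;
        acc += av * bv;
    }
    return acc;
}

/* Same idea with eight signed 4-bit values per 32-bit operand ("dot8"). */
static int32_t dot8_i4_acc(uint32_t a, uint32_t b, int32_t acc)
{
    for (int lane = 0; lane < 8; ++lane) {
        int32_t av = (int32_t)((a >> (4 * lane)) & 0xF);
        int32_t bv = (int32_t)((b >> (4 * lane)) & 0xF);
        if (av & 0x8) av -= 16;     /* sign-extend the 4-bit lane */
        if (bv & 0x8) bv -= 16;
        acc += av * bv;
    }
    return acc;
}
```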
> Ok so MS is lying?
They have INT8/INT4 support, so "they added it" just means they used an RDNA2 chip with all its features. I wouldn't call that lying.
> Ok so MS is lying?
Did they lie when they said they would be using the power of the cloud special sauce to make the XBOne's CPU 4x as powerful?
> I am not surprised this comment comes from you. Always very quick to dismiss everything XSX/MS related in defence of the PS5. TF, VRS, now ML. And yet you always quote Cerny as if he is the only source of truth.
When did I quote Cerny, lmao? It's your opinion that I'm dismissive towards Microsoft. Here I'm more dismissive towards the PS5's INT8/INT4 capabilities, simply because I don't think Sony added them to the console.
Show me where on the patent it says PS5.
> There's no Sony patent showing unified L3 cache in the CPU.
https://patents.google.com/patent/US9892024B2/en
> You mean you couldn't find a single official statement from Nvidia proving your "We know DLSS uses INT4 and INT8" statement, which is why you linked to a page with zero mention of DLSS, upscaling or supersampling.
You are just being ridiculous at this point.
> Yeah sure, just for "texture sampling" and "nothing to do with compute".
When not every GPU in a series contains a particular feature, it isn't a base feature. Not all RDNA1 cards had it. Not sure what part of that you don't understand.
First paragraph of page 14 in the RDNA1 whitepaper:
> Changing goalposts from "Microsoft changed the GPU to get INT8/4 on RDNA2" to "it was on RDNA1 but not all of them".
Here's an official statement from Microsoft.
> Here it is, the trolling and flamebaiting.
Your input to this thread was useless. You challenged something and were shown to be wrong. Rather than say that was the case, you just added more and more silly things to confound the issue.
Thank you for confirming my suspicion that you didn't create this thread to be clarified, but to post concern trolling on the PS5.
Bye then. This thread is useless.
> I am very curious about this. DLSS requires machine learning. FidelityFX is the equivalent of DLSS and supposedly coming to both platforms. But we know the PS5 doesn't have ML. So how is it supposed to have it?
Maybe? Isn't there a version of Control that had a DLSS that was CUDA-based, in between the DLSS 1.0 and 2.0 versions? It looked better than 1.0 and worse than 2.0. But nVidia said they have no plans to release it for non-RTX cards; they've moved on to 2.0 and above.
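For context on why this argument keeps circling back to INT8/INT4 dot products: most of the work in a DLSS-style network is convolutions over quantized tensors, which reduce to long runs of small-integer multiply-accumulates. Below is a minimal, purely illustrative C sketch of that inner loop (not actual DLSS or FidelityFX code; the function and its parameters are hypothetical).

```c
#include <stdint.h>
#include <stddef.h>

/* Toy INT8 convolution tap, the kind of inner loop a neural upscaler
 * spends most of its time in: quantized weights and activations
 * multiplied pairwise and summed into a 32-bit accumulator. */
int32_t conv_tap_i8(const int8_t *activations, const int8_t *weights,
                    size_t n, int32_t bias)
{
    int32_t acc = bias;
    for (size_t i = 0; i < n; ++i)
        acc += (int32_t)activations[i] * (int32_t)weights[i];
    return acc;   /* later rescaled/requantized back to int8 */
}
```

A packed dot4/dot8 instruction simply retires several of these multiply-accumulates per operation, which is why INT8/INT4 throughput keeps coming up whenever ML on these GPUs is discussed.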
> I assumed them (Microsoft) saying "we added special hardware support" would mean they actually added special hardware over the generic RDNA1 capability of running mixed precision dot4 INT8 / dot8 INT4.
Or maybe they indeed "added" dot4 INT8 / dot8 INT4 support over what was present in the One and One X. That way you could call it "adding".
> So Microsoft ADDED the hardware for Int8 and Int4. That would rule out it being a base RDNA 2 feature, or they wouldn't have said that. They never said "we added compute units to the GPU" or "we added ray tracing hardware", did they?
Do we know how granular the availability of different blocks is for a semi-custom IC?