That RDNA 1.8 Consoles Rumor *spawn*

They way overhyped it, and the host introductions and interviews were also super corny, stilted and forced.

It had that fucking awful corporate scripted feel Inside Xbox always has, which makes it really feel like a remnant of the Mattrick era.

I mean, if that's how you feel about it. I found it to be fine, especially with how early it was in the lockdown.
 
It was bad, but it's no overhyped Geoff Keighley production about a controller as far as the horribleness goes.

Let's hope all sides learn from their mistakes and give us better.

Anyways, how horrible events are is way off-topic.
 
Still don't get what was so bad about the May event. It was showing off some third-party games that will be available early in the console's life cycle. Some of them looked great and interesting. At that point we already knew MS was going to be doing different monthly things, with a big event planned for their first-party titles.
The event itself wasn't so much the problem (apart from the bitrate etc.).
What it was presented as, and hyped up to be, was the bigger problem.
Let's be fair: the fact is, it was considered a letdown by most people.

Hence there's more pressure on them to do better this week, which I'm personally very hopeful about.

Anyway, my main point was that the May event had wider exposure than this will.
Sony isn't presenting PS5 as the most powerful anyway (for obvious reasons), so it's not what people will focus on.
Anyways, how horrible events are is way off-topic
Yeah, sorry, that got away from my actual point.

Which was that I think this will have limited exposure and impact.
It would be nice if Sony came out and clarified, but they really don't need to. It's not like it's gonna be talked about much, if at all, on IGN or something.
 
[attached screenshot]


The Sony engineer confirms there is only one RDNA2 feature not used by Sony, and there are other features not inside RDNA1 or 2 at all.

Just thinking generally about the guy's latest comments ... I mean, none of that actually goes against what he said in the tweet that should have stayed private. He said it's "somewhere in between 1 and 2" (plus customisations), so I'm sure it is. And that must mean having things that are present in "RDNA1" and no longer present in "RDNA2". I don't think there's any way to get around that, and I still like the backwards compatibility hypothesis.

If Sony want to outwardly call it RDNA2 then that's up to them - and it's probably a lot more accurate than calling it '1' and a lot better externally in terms of message. And I mean it does have what appears to be RDNA2's biggest selling point in ray tracing support.

An important thing to bear in mind about "one less" feature, though, is that features are not the same thing as underlying hardware. You can have the same features, but different hardware with different characteristics.

Sampler feedback maybe...

Hmm ... I'll take a punt on maybe VRS. That seems to be deeply tied to the Compute Unit's inner guts, and I think .... maybe ... those are what aren't quite the same as "PC RDNA 2". This is based on the whole "RDNA 1 still has some GCN elements in the CUs" idea, which might be important in terms of having a base to build backwards compatibility support from.

But that's a big stretch and I don't really know what I'm talking about. I need to grab a coffee and do some Googlin'!
 
FP16 is more than sufficient for running ML models. Int8 and Int4 will provide better optimization, but FP16 support is the baseline. I wouldn't worry about ML. If they want _more_ of it, Int8 and Int4 support would have made it better, but their absence doesn't take ML off the table.
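
To put a rough shape on that (a minimal sketch, not anything either console actually does): FP16 halves the storage and bandwidth of FP32 and is usually accurate enough for inference, while Int8 quantization trades a little precision for another 2x saving, which is where the extra throughput comes from.

```python
# Illustrative only: simple symmetric post-training quantization of a weight
# tensor, to show why Int8/Int4 mainly buy footprint and throughput while
# FP16 remains a perfectly workable baseline for inference.
import numpy as np

rng = np.random.default_rng(0)
w_fp32 = rng.standard_normal((256, 256)).astype(np.float32)

# FP16 baseline: half the storage of FP32, usually enough precision for inference.
w_fp16 = w_fp32.astype(np.float16)

# Int8: store one scale (per tensor or per channel) plus 8-bit integers.
scale = np.abs(w_fp32).max() / 127.0
w_int8 = np.clip(np.round(w_fp32 / scale), -127, 127).astype(np.int8)
w_dequant = w_int8.astype(np.float32) * scale

print("fp16 bytes:", w_fp16.nbytes, " int8 bytes:", w_int8.nbytes)
print("mean abs quantization error:", np.abs(w_fp32 - w_dequant).mean())
```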
Full implementation of ML is IMHO of little or no interest for consoles.... hope Sony explains officially now what is there and what is not (as MS did)....
 
My overall intuition is that they have the same functionality, but it's not based on, and doesn't behave the same way as, what's specified in the DirectX spec.

So they wanted the same things, because developers told them what they needed, and then went about doing it their own way, but I wouldn't call it AMD tech.
 
Full implementation of ML is IMHO of little or no interest for consoles.... hope Sony explains officially now what is there and what is not (as MS did)....

Wasn't machine learning used to give Halo 5 & Fuzion Frenzy HDR support? I wouldn't say ML is of no interest for consoles.

Tommy McClain
 
Wasn't machine learning used to give Halo 5 & Fuzion Frenzy HDR support? I wouldn't say ML is of no interest for consoles.

Tommy McClain
Also, DLSS uses the tensor (machine learning) cores.

So AMD should be able to do something similar with their ML hardware. Like I said, if one company has a DLSS-like tech and the other doesn't, it could be a large advantage. Getting 1080p performance with 4K visuals is a big win.

Of course these are rumors, so who knows what is actually happening and what both companies have.

These consoles were designed in what, 2017/18? Perhaps Sony didn't see the point of ML then, or maybe they thought the transistors could be used elsewhere?
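
On the DLSS-like upscaling point above, a very rough sketch of the general idea (this is not DLSS, which is proprietary and also uses motion vectors and temporal history; it's just a toy ESPCN-style network to show that an ML upscaler is, at its core, a small convolutional net):

```python
# Toy ML upscaler: predict scale^2 * 3 channels per low-res pixel, then
# rearrange them into a larger image with sub-pixel convolution (PixelShuffle).
import torch
import torch.nn as nn

class TinyUpscaler(nn.Module):
    def __init__(self, scale=2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 5, padding=2), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),  # rearrange channels into a 2x larger image
        )

    def forward(self, x):
        return self.body(x)

# Quarter-size dummy frame to keep the example light; in practice you'd feed
# the full 1080p frame and get a ~4K output.
frame = torch.rand(1, 3, 270, 480)
with torch.no_grad():
    out = TinyUpscaler(scale=2)(frame)
print(out.shape)  # torch.Size([1, 3, 540, 960])
```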
 
Also, DLSS uses the tensor (machine learning) cores.

So AMD should be able to do something similar with their ML hardware. Like I said, if one company has a DLSS-like tech and the other doesn't, it could be a large advantage. Getting 1080p performance with 4K visuals is a big win.

Of course these are rumors, so who knows what is actually happening and what both companies have.

These consoles were designed in what, 2017/18? Perhaps Sony didn't see the point of ML then, or maybe they thought the transistors could be used elsewhere?

You don't need ML-specific hardware to implement ML algos. The jury is still out on the usefulness of current ML GPU hardware for games.
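
That's the key point: under the hood an ML layer is just a pile of multiply-accumulates, which any GPU (or CPU) with FP32/FP16 math can execute; dedicated ML units only make the same arithmetic faster. A trivial sketch:

```python
# A fully connected layer + ReLU expressed as plain multiply-adds; nothing
# here requires tensor cores or other ML-specific hardware.
import numpy as np

x = np.random.rand(1, 64).astype(np.float32)     # input activations
w = np.random.rand(64, 128).astype(np.float32)   # layer weights
b = np.zeros(128, dtype=np.float32)              # biases

y = np.maximum(x @ w + b, 0.0)                   # matmul + bias + ReLU
print(y.shape)
```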
 
You don't need ML-specific hardware to implement ML algos. The jury is still out on the usefulness of current ML GPU hardware for games.

I wouldn't worry about it at all, seeing as DLSS isn't going to be standardized. Even with AMD's new reduced-precision dot product instructions, it still only has half as much throughput as Turing's tensor cores, and then there's the fact that its implementation and future direction are dictated by a single vendor, so when DLSS 3.0 comes around it'll be built around their new hardware, rendering it incompatible with other hardware implementations ...

Touting ML-specific instructions as an advantage based on their applications from another hardware vendor rings hollow, since they can't guarantee that the vendor they're talking about will be compatible with the other vendor's technology, or even that it will be performance portable ...
 
Touting ML-specific instructions as an advantage based on their applications from another hardware vendor rings hollow, since they can't guarantee that the vendor they're talking about will be compatible with the other vendor's technology, or even that it will be performance portable ...
I doubt anyone is talking about DLSS on AMD GPUs.
It is just used as an example of ML upscaling and what it can achieve.
Could also use the MS presentation on DX12 ML where they demo'd ML upscaling.
 
If I had to make a bet, Ariel and Oberon are Navi 10 with HW RT from RDNA 2 added. All other additions are unique to PS5.

Interesting idea. As AMD patents talk about using texture units for part of traversing the RT acceleration structure (if I'm remembering that correctly), I wonder if that could indicate Sampler Feedback too, if such a bet turned out to be a winner? I think SF will turn out to be quite valuable down the line, and for more than just texture streaming.

I wouldn't worry about it at all, seeing as DLSS isn't going to be standardized. Even with AMD's new reduced-precision dot product instructions, it still only has half as much throughput as Turing's tensor cores, and then there's the fact that its implementation and future direction are dictated by a single vendor, so when DLSS 3.0 comes around it'll be built around their new hardware, rendering it incompatible with other hardware implementations ...

Touting ML-specific instructions as an advantage based on their applications from another hardware vendor rings hollow, since they can't guarantee that the vendor they're talking about will be compatible with the other vendor's technology, or even that it will be performance portable ...

I think the idea is for something like the DirectML API to become a vendor-agnostic way to use a range of different ML hardware, the same way DX works for graphics and compute. The future is developers and middleware vendors like Epic making their own ML tools / features for vendor-agnostic APIs, rather than Nvidia's DLSS somehow coming to other IHVs' products.

And it's not just DLSS-style frame buffer up-resing that could benefit. MS are already playing with real-time texture up-resing, which would be awesome for distribution, storage and streaming. And maybe we can finally get some radical improvements in AI!
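
For what it's worth, a hedged sketch of what "vendor-agnostic ML" can already look like from the application side: an ONNX model run through onnxruntime's DirectML execution provider, which targets whatever D3D12-capable GPU is present. The model file name and tensor name here are made up for illustration.

```python
# Run a hypothetical upscaling model through DirectML via onnxruntime
# (requires the onnxruntime-directml package). "upscaler.onnx" and the
# "lowres" input name are placeholders, not a real shipped model.
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("upscaler.onnx",
                            providers=["DmlExecutionProvider"])

low_res = np.random.rand(1, 3, 540, 960).astype(np.float32)  # placeholder frame
high_res = sess.run(None, {"lowres": low_res})[0]
print(high_res.shape)
```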
 
I doubt anyone is talking about DLSS on AMD GPUs.
It is just used as an example of ML upscaling and what it can achieve.
Could also use the MS presentation on DX12 ML where they demo'd ML upscaling.
Indeed, but they borrowed Nvidia's model for that presentation. Sort of goes to show there aren't many players in the field who want to do the heavy lifting to make the models. Unless there is a financial motive for Nvidia to sell or give them out, I still see DLSS as an Nvidia-only thing.

There is only one GPU family I know of that won't work with ML... and that's Pascal cards lol.
 
I doubt anyone is talking about DLSS on AMD GPUs.
It is just used as an example of ML upscaling and what it can achieve.
Could also use the MS presentation on DX12 ML where they demo'd ML upscaling.

What 'can' be possible doesn't mean we should assume that similar results and efficiency will be obtained from the alternative ...

Even with DirectML, could we actually expect a baseline-quality implementation of ML upscaling comparable to at least DLSS 2.0? If we can't, then what compelling case is there to tout a hardware feature that can't give us similar quality-of-life benefits compared to the bespoke competing technology it's being demonstrated against? It was hard enough to justify the merits of ML upscaling prior to DLSS 2.0, so it will be another challenge to prove that a multi-vendor implementation can be done well enough to make it a standard ...

And it's not just DLSS-style frame buffer up-resing that could benefit. MS are already playing with real-time texture up-resing, which would be awesome for distribution, storage and streaming. And maybe we can finally get some radical improvements in AI!

Texture upscaling doesn't need to be done in real time. That can be done offline, and with likely better performance to boot, since you wouldn't need to run any inferencing shaders at all, and GPUs are optimized to do texture sampling very fast ...
 
Texture upscaling doesn't need to be done in real time. That can be done offline, and with likely better performance to boot, since you wouldn't need to run any inferencing shaders at all, and GPUs are optimized to do texture sampling very fast ...

It needs to be done in real time if you do it after loading it into memory, after streaming it from the SSD. This could potentially allow you to get your mip 0 (the highest-res one) using 1/4 of the data from the SSD, or with no transfer from the SSD at all if you up-res a mip level already in memory.

So in the same way you'd trigger streaming from the SSD, you trigger an up-res instead. Smaller downloads, smaller installs, less data needed to be streamed from the SSD. I can see why some developers are looking into it.
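
Rough numbers on that "1/4 of the data" point, assuming a single 4K BC7 texture (just illustrative arithmetic, not anything from either platform holder):

```python
# Back-of-the-envelope savings for "stream a lower mip, up-res to mip 0 on GPU":
# with 2x upscaling you only read about a quarter of the texel data.
def bc7_size_bytes(width, height):
    # BC7 stores 16 bytes per 4x4 block, i.e. 1 byte per texel
    return width * height

mip0 = bc7_size_bytes(4096, 4096)   # full-res mip streamed from the SSD
mip1 = bc7_size_bytes(2048, 2048)   # stream this instead, up-res in real time

print(f"mip0: {mip0 / 2**20:.1f} MiB, mip1: {mip1 / 2**20:.1f} MiB")
print(f"SSD/install data saved per texture: {1 - mip1 / mip0:.0%}")  # 75%
```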
 
Indeed, but they borrowed Nvidia's model for that presentation. Sort of goes to show there aren't many players in the field who want to do the heavy lifting to make the models. Unless there is a financial motive for Nvidia to sell or give them out, I still see DLSS as an Nvidia-only thing.

There is only one GPU family I know of that won't work with ML... and that's Pascal cards lol.
AMD already has FidelityFX, and I doubt they will sit on it and not make improvements. DLSS is just an example of what can be done. There is nothing stopping AMD from using machine learning to improve their own solution.
 