Digital Foundry Article Technical Discussion [2024]

Sure, but are we then saying that the ideas behind any of these other things are not valuable? Many techniques that are simple to implement are shared freely, including basically all of the foundation of computer graphics. DLSS is a case where NVIDIA very much could still today - many years later - be profiting primarily off their ML hardware since it is likely not very practical to run on GPUs without similar hardware. Intel could likely do it of course, but I don't think the distance between DLSS and XeSS is really large enough to be a primary driver of why one would buy one or the other.

That's my point: ideas should be free to be shared, but specific implementations, not necessarily.
Again using DLSS as an example: I agree that if NVIDIA actually published the weights, some people might be able to improve on it (e.g. fine-tuning against some edge cases for a specific game), but if everyone could use the weights, I'm not sure NVIDIA would have started such a project in the first place. I mean, apparently no one else did.
 
I'm watching this before the DF interview

He calls the PS5 gpu an RDNA2 gpu. Wasn't this one of the things that people questioned? Wasn't there some discussion about whether PS5 was RDNA1 or 2 on this forum?

Yeah, there was. PS5 is "full RDNA2". The only things it's missing are ... almost all the things that were added to RDNA1 to make it RDNA2. Except for RT, where it got v1 compared to RDNA2's v1.1. But there seems to be a very good reason for this.

PS5 was supposed to be almost aligned with PC RDNA1 and launch in 2019 like PC RDNA1. Navi 10 and PS5 were supposed to be almost twins in that regard. PS5 was delayed during development, probably to add boost clocks and/or RT, which it did not originally have. And so despite missing most of the features that differentiate PC / Xbox RDNA2 from RDNA 1, for marketing reasons Sony and AMD called it RDNA2.

In reality, it's clearly beyond PC RDNA1 in features, but clearly behind PC / Xbox RDNA2. It's got the most important addition though: RT. But good luck explaining that to gamers. The moment MS made a huge fuss about how they had RDNA2, Sony were going to say they had it too.

In Road to PS5, Cerny said something to the effect that if AMD released a PC product very similar to PS5 at a similar time, it was proof that their very close collaboration had been successful. That product was IMO clearly Navi 10. Add RT to Navi 10 and that's basically PS5. Same configuration, same ROPs, same Geometry Engine, same lack of mixed precision integer ops and all of that.

I'd probably go further. I don't think that PS5 is simply a custom RDNA part, I think PS5 might have helped define the starting point for RDNA itself: a common set of requirements that suited both PC and console and allowed for shared development.

The reality is probably far more interesting than just "is it 1 or 2". ¯\_(ツ)_/¯
 

Their round-table about the ps5 pro event/presentation.

I thought Oliver's interview with Mark Cerny was well done. I also appreciate Mark Cerny's talk explaining the tech. I even kind of like how he deferred a few answers in his interview with Oliver to Mike Fitzgerald to keep him included. Seems like he'd be a good person to work with.
 
I'm watching this before the DF interview

He calls the PS5 gpu an RDNA2 gpu. Wasn't this one of the things that people questioned? Wasn't there some discussion about whether PS5 was RDNA1 or 2 on this forum?
For PS5 they use GFXIP 1013, on PC RDNA2 it's GFX1030, and for the Xbox Series consoles it's GFX1020 ...

RDNA2 is just a "marketing brand" for a specific set of GPUs that share some technologies in common with each other. It's all somewhat relative ...
 
Bit surprised by the admission around removing the dual-issue floating point feature for launch. I guess I would ask why they didn't keep it in, but the recompile/binary costs were too high for them to consider it.

There doesn't seem to be emulation happening as much here as I once thought, so all the talk around the binaries is interesting.
 
Right, those answers are with respect to the GNU-style definition. But that has little relevance here, and nothing to do with the topic of sharing knowledge/research. The relevant question in terms of knowledge sharing is: can I learn from what Unreal does and build on it myself? For which I will quote the FAQ:



I don't mean to get on a big tangent here, but I highly disagree with the notion that Unreal is contributing at all to any lack of industry advancement and information-sharing. There's lots of things you can complain about Epic, but Unreal pretty much defines the high road as far as that stuff goes, especially compared to hardware and platform vendors.
How much can I copy from UE5 without paying before lawyers poke me?
 
Bit surprised by the admission around removing the dual-issue floating point feature for launch. I guess I would ask why they didn't keep it in, but the recompile/binary costs were too high for them to consider it.
Both GFX11 and GFX10 implementations have incompatible sets of instruction encodings, so it's not as if the upgraded SKU could move to a new ISA while keeping the guarantee of backwards compatibility ...
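A toy way to picture why one ISA transition preserves backwards compatibility and another doesn't is to model each ISA as a set of instruction encodings (the opcode names below are purely illustrative, not real GFX mnemonics):

```python
# Illustrative only: model an ISA as a set of instruction encodings.
gfx7_isa = {"op_a", "op_b", "op_c"}
gfx10_isa = gfx7_isa | {"op_d", "op_e"}          # strict superset: old binaries still decode
gfx11_isa = {"op_a2", "op_b2", "op_d", "op_e"}   # re-encoded ops: old binaries break

print(gfx7_isa <= gfx10_isa)  # True  -> BC holds
print(gfx7_isa <= gfx11_isa)  # False -> BC broken without recompilation
```

The point being that adding encodings on top of an existing set is cheap for BC, while re-encoding existing instructions forces a recompile.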

PS5 is BC with PS4 software because GFX7 (PS4 GPU ISA) is a strict subset of GFX10 (PS5 GPU ISA). Even though GFX9 introduced the lower-precision packed math feature, it's not necessarily "forward compatible" with GFX10's lower-precision packed math, since GFX10 is constrained by its design goal (BC) of being made for console customers ...
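For readers unfamiliar with packed math: the idea is that two half-precision operations ride in one 32-bit-wide operation. A minimal Python sketch of that behavior, where the function name and semantics are illustrative and not actual GFX ISA behavior:

```python
import struct

def pk_add_f16(a_pair, b_pair):
    """Toy model of a packed FP16 add: two half-precision adds carried
    in one 32-bit operation. Illustrative only, not real ISA semantics."""
    def to_f16(x):
        # Round-trip through FP16 storage, as the hardware lane would.
        return struct.unpack('<e', struct.pack('<e', x))[0]
    return tuple(to_f16(to_f16(a) + to_f16(b)) for a, b in zip(a_pair, b_pair))

print(pk_add_f16((1.5, 2.25), (0.5, 0.25)))  # (2.0, 2.5): two adds at once
```

The throughput win is exactly this pairing: twice the adds per issued instruction, at the cost of FP16 precision in each lane.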

It's not all that big of a surprise if they chose not to reimplement/backport the VOPD feature on another custom GFX10 implementation ...

Programmers can also potentially hard-code (magic numbers, anyone?) the wave sizes in their source code, so they'd have to rewrite some of that code if they were explicitly using wave32 to gain the benefits of VOPD. If the program was designed against wave64, the compiler might very well have applied the feature for optimization purposes ...
 