That RDNA 1.8 Consoles Rumor *spawn*

Nah, he is just paid by Sony!

I cited his comments because he is not from Epic. ;)

At least there is no Sony Corp investment in id Software.

I think my PS4 Pro can last a bit longer [emoji28]

The high frequency of the APU was taken into consideration together with the cooling system and the PS5's size.

It is in French, no problem if you don't understand, but this guy used an AR app to compare the sizes of the PS5 and XSX, and the PS5 is huge.

And you will be even better off if, like me, you have the OG PS4 Pro, a horrible console noise-wise.
 
To me it seems that the PS5 Geometry Engine (as described in the patent) is integrated into the traditional geometry pipeline. My suspicion is that when work on it started, it was being added/integrated into the RDNA1 CUs in development at the time. And because of that big customization, only a very limited set of RDNA2 features was later back-ported to the PS5 CUs. It may be that, feature-wise, the PS5 GPU is more like RDNA 1.2 than RDNA 1.9.
 
Praise from other people... I think praise has little bearing on how good or bad any console is, especially in comparisons. But if we want to go down that road again, there was an ex-GG dev who thought the PS5 was underwhelming compared to the XSX.

https://www.criticalhit.net/gaming/...xbox-series-x-is-a-beast-compared-to-the-ps5/

Billy Khan has access to PS5 and XSX devkits*, unlike this guy, who hasn't worked in the industry in a decade, Gavin Stevens levels of bullshit. And he was not at GG in Amsterdam but at the Cambridge studio, and all the studios he worked for were closed by Sony; that is how he quit the game industry, so maybe he holds a grudge against Sony. ;)

* He later said that the XSX is a good console too.

Edit: Having devkits is much more reliable than not having them (Gavin Stevens), and it is even worse for people who haven't worked in the industry in a decade (the ex-GG Cambridge guy).
 
Mesh shader is the Microsoft/Nvidia name for the same thing as AMD's primitive shader.
As of our current understanding, primitive shaders are enabled on RDNA 1 cards, but the Direct3D feature evaluation shows no support. Which means that, despite how close the functionality is, the mesh shader code path won't run on a 5700 XT. Scroll down to the output: https://forum.beyond3d.com/posts/2138571/
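
For anyone who wants to reproduce that feature evaluation on their own card, the check boils down to a single CheckFeatureSupport call. A minimal C++ sketch, assuming you already have a valid ID3D12Device (device creation elided):

Code:
#include <d3d12.h>

// Queries D3D12_FEATURE_D3D12_OPTIONS7, the struct that reports the mesh
// shader tier. On a 5700 XT this comes back as
// D3D12_MESH_SHADER_TIER_NOT_SUPPORTED, even though primitive shaders exist
// in the hardware.
bool SupportsMeshShaders(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                           &options7, sizeof(options7))))
        return false; // Older runtime: OPTIONS7 is not even recognized.

    return options7.MeshShaderTier != D3D12_MESH_SHADER_TIER_NOT_SUPPORTED;
}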

Perhaps coincidentally, all the features that MS marketed for the XSX and DX12U are unsupported on RDNA1.
 
Nah, but his friends work there, and he is still in contact with them. With 'devs' and journalists praising things and the other way around, we could go on all day long.
 
Again, he can say what he wants; like I said, GG Cambridge and all the studios he worked for were closed by Sony. You can verify his LinkedIn account. ;)

He can say what he wants; he can lie. Billy Khan has access to both consoles' devkits, and he said the consoles are good. That carries much more weight than a dev who quit gamedev after Sony closed some studios. :)

This is why I cited no journalists, no one from Epic, and no one without a devkit.
 
I think you're overcomplicating what Sampler Feedback is. The feature, as I like to think about it, is really nothing more than a feedback texture containing the requested MIP level for each sampled location.

Having actually looked at the spec for myself, in the streaming case there is also a MinMip/'residency' texture describing the current MIP level. Herein lies the problem with the "MinMip texture": since it reflects the current tiled resource mappings, updating your MinMip texture involves calls to the UpdateTileMappings API to change those mappings, which pretty much kills any hope of this texture streaming system being performant ...
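
To make that cost concrete, here is a rough C++ sketch of servicing a single MinMip change; the helper name and parameters are illustrative, not a real API, but the UpdateTileMappings call in the middle is the actual D3D12 entry point every such update has to go through:

Code:
#include <d3d12.h>

// Maps one 64 KB tile of a reserved (tiled) texture after the feedback/MinMip
// data requests a more detailed mip. "streamingHeap" and "tileOffsetInHeap"
// are hypothetical; a real streamer would batch many of these per frame.
void MapOneTile(ID3D12CommandQueue* queue,
                ID3D12Resource*     reservedTexture, // made via CreateReservedResource
                ID3D12Heap*         streamingHeap,
                UINT x, UINT y, UINT mip, UINT tileOffsetInHeap)
{
    D3D12_TILED_RESOURCE_COORDINATE coord = {};
    coord.X = x; coord.Y = y; coord.Z = 0;
    coord.Subresource = mip;

    D3D12_TILE_REGION_SIZE region = {};
    region.NumTiles = 1;

    D3D12_TILE_RANGE_FLAGS rangeFlags = D3D12_TILE_RANGE_FLAG_NONE;
    UINT heapRangeStart = tileOffsetInHeap;
    UINT rangeTileCount = 1;

    // The expensive part: this executes on the queue timeline, so bursts of
    // mapping updates contend with rendering work.
    queue->UpdateTileMappings(reservedTexture,
                              1, &coord, &region,
                              streamingHeap,
                              1, &rangeFlags, &heapRangeStart, &rangeTileCount,
                              D3D12_TILE_MAPPING_FLAG_NONE);
}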

Seeing as AMD specifically discouraged using that API before, and they don't recommend using tiled resources on RDNA, I'm going to assume that their recommendation holds for RDNA2 until they show a robust reduction in their binding costs otherwise ...

Sampler feedback is about as valuable as tiled resources are when it comes to texture streaming, which amounts to nearly nothing ...

The MinMip is the texture indirection I was talking about.
RDNA does not support sampler feedback, a feature which explicitly makes use of tiled resources, while RDNA 2 does.
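
On PC, at least, that support difference is directly queryable; a minimal sketch (assuming a valid device), using the same OPTIONS7 struct as the mesh shader check above:

Code:
// RDNA1 parts report D3D12_SAMPLER_FEEDBACK_TIER_NOT_SUPPORTED here.
D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                          &options7, sizeof(options7))))
{
    bool hasSamplerFeedback =
        options7.SamplerFeedbackTier != D3D12_SAMPLER_FEEDBACK_TIER_NOT_SUPPORTED;
}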
 
And what were you expecting the guy to say? He develops games for a living and has nothing to gain by delving into fanboy power talk (like Mr RDNA 1.9 or the SIE Engineer).
 
Like many devs, he could have said nothing. ;) He spontaneously said on Twitter that the PS5 is good, and after complaints from people on Twitter, he said the Xbox Series X is good too.

He never worked for any Sony studio; he is from a third-party studio.
 
Just because RDNA2 has sampler feedback doesn't mean that it'll have a robust implementation of tiled resources. It's safe to assume that they still probably haven't found a way to do cheap tile mapping updates on RDNA2 unless they explicitly state or show otherwise ...

Again, I'm not holding out much hope that they figured out how to make tiled resources performant when the issue has persisted across many hardware generations, from GCN to RDNA ... (and likely RDNA2 as well)
 
Don't use guidelines about RDNA to ascertain things about RDNA 2. There are major differences in the architectures as illustrated by the mesh shading example.
 
Yes, it's too late, but they (Sony) could try to repair the damage with a decent interview with a decent tech site.
Sony should repair what, one comment that says the PS5 is missing one feature (among hundreds/thousands) from the complete RDNA2 ISA?
And to satisfy whom? The 13 forum dwellers that actually care that the PS5 is missing this one unnamed feature, most (if not all) of whom already decided they'd buy a SeriesX a decade ago?

Finding out which feature is missing might be interesting, but it isn't the least bit important. Cerny confirmed that Sony's collaboration with AMD on the GPU is deep(er than many thought possible), so it's only natural that they'd leave a couple of features behind in order to pursue their custom optimizations.


Hmm ... I'll take a punt on maybe VRS.
Considering Sony's success and current/future investment in VR, and the fact that VRS's "origins" are in VR optimizations, if there's no VRS on the PS5 then it's because Sony preferred to implement one or a couple of other features that allow/accelerate foveated rendering in a different way.
There are patents from Sony addressing foveated rendering (which they call "region of interest adjustment"), so I have no doubt this will be hardware accelerated. It could be less flexible than VRS, though.
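
For context on what the PC-side feature amounts to, here is a minimal D3D12 sketch of Tier 1 VRS (device and command list creation elided; a real foveated renderer would use a Tier 2 screen-space rate image instead, but the idea of spending less shading where the eye isn't looking is the same):

Code:
#include <d3d12.h>

void ShadePeripheryCoarsely(ID3D12Device* device,
                            ID3D12GraphicsCommandList5* cmdList)
{
    // Tier check: RDNA1-class hardware reports NOT_SUPPORTED here.
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options6 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &options6, sizeof(options6))) ||
        options6.VariableShadingRateTier == D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED)
        return;

    // Shade one pixel per 2x2 quad for everything drawn until the rate is reset.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
    // ... draw peripheral geometry here ...
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
}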

Or perhaps this "one thing" the PS5 is missing from RDNA2 is quad-rate INT8 and octo-rate INT4 processing for neural network inferencing. I haven't heard Sony talking about it. But I'm guessing it would be harder to take away these capabilities, since they're embedded in the ALUs themselves.


So there are some real problems versus the XSX... That, combined with this console GPU's extreme clock and the weird solution found to control temperature spikes, leads me to a prudent wait-and-see stance on buying it...
Today I learned dynamic clock adjustments that have been in AMD GPUs for over 13 years are weird and high GPU core clocks are bad.
What was it called? Concern trol.. nah, it can't be.
 
Exactly, even the Xbox One had a strong point (ExecuteIndirect) compared to the PS4, even though it was massively outgunned everywhere else. We shall know what's up in four months' time.

And weaknesses are very relative; the consoles are much more balanced and powerful compared to the PS4 and Xbox One. The Jaguar was a bad CPU from the get-go.

Before seeing the specs, no one thought we would get SSDs inside consoles, let alone fast ones; raytracing was a pipe dream, and Vega 7 sent many people into despair. Consoles based around Zen 2 and RDNA 2, this is huge. The two consoles' GPUs are based on the same AMD architecture, which will release in the same timeframe as the consoles.

I will buy a PS5 because of the games. But if the PS5 is noisy, I will not buy one before a revision. I am eagerly awaiting the noise-level tests of the consoles.

I have TLOU2 but I have only played three hours, because the weather is hot and my PS4 Pro makes a crazy noise.
 
It is. It's known.
 
That's just deflection on your part ...

This discussion between us is not about RDNA2's mesh shader implementation. This discussion is about sampler feedback and how useful it is with tiled resources ...

In no way should we expect RDNA2 to have a different tiled resources implementation from RDNA, seeing as 95%+ of what was stated in AMD's RDNA optimization guide applied to GCN too, despite similarly major architectural changes like primitive shaders and a wave32 execution mode ...

Why get so insecure over a potentially worthless feature?
 