AMD: RDNA 3 Speculation, Rumours and Discussion

If Nvidia take the piss with Ada pricing, as recent indicators seem to suggest, and AMD go the high-perf/dollar route like RV770, then as long as RT performance holds up in line with Ada at the same performance tier (a big if, I know), with the recent FSR2 improvements I can see myself being tempted over to AMD this generation.
 
It would take a lot for me to be lured away from Nvidia.

But I wish AMD the best. They lured me away from Intel by releasing a kickass product at the right time... so there's nothing that says it's impossible, but I'd need another 9700 Pro-level beast at this point.
 
Same. But the stars are possibly starting to align for me, it seems. It all comes down to Ada pricing. This gen, for the first time in 25 years, I'm willing to spend "crazy" money on an x080-level GPU, and I've even accepted a possible increase of up to $150 over the 3080's MSRP. But if NV think they can charge me $1000+ for x080-level performance, then I'm done with them.
 
So I think OREO (opaque random export order) is required to support distributed vertex shading combined with coarse rasterisation.

My theory:

Vertices are distributed by a central scheduler, in groups of hardware threads, to any WGP that's available. Using a cut-down vertex shader, which only exports position, the resulting triangles are then coarse-rasterised. Only after this has been done and the screen-space tiles covered by a triangle have been identified, is the full vertex shader evaluated for each triangle's vertices (to generate all relevant attributes).
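
A minimal sketch of that two-phase idea, assuming square screen tiles and a conservative bounding-box test (tile size, names and data layout are all my own illustration; the real hardware presumably does this in fixed-function logic):

```python
# Sketch of phase 1: position-only vertex shading plus coarse rasterisation.
# TILE and all structures here are illustrative, not RDNA 3 specifics.

TILE = 32  # coarse screen-space tile size in pixels (assumed)

def position_only_shade(vertex):
    # Cut-down vertex shader: exports position only, no other attributes.
    return vertex["position"]

def coarse_rasterise(positions):
    # Conservatively list every tile the triangle's bounding box touches.
    xs = [p[0] for p in positions]
    ys = [p[1] for p in positions]
    x0, x1 = int(min(xs)) // TILE, int(max(xs)) // TILE
    y0, y1 = int(min(ys)) // TILE, int(max(ys)) // TILE
    return {(tx, ty) for tx in range(x0, x1 + 1) for ty in range(y0, y1 + 1)}

# Any available WGP can run this; only afterwards is the full VS scheduled
# on the shader engine(s) owning the touched tiles.
tri = [{"position": (10, 10)}, {"position": (40, 12)}, {"position": (20, 50)}]
print(coarse_rasterise([position_only_shade(v) for v in tri]))
```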

To perform the full evaluation of the vertex shader, each triangle is sent to the shader engine that owns the screen space tile touched by the triangle. So the shader engine has to construct hardware threads for the vertices received and assign them to WGPs.

If a triangle touches more than one screen space tile, then each such shader engine will separately evaluate the full vertex shader for the triangle's vertices.

Once each shader engine has evaluated the full vertex shader, the triangles can be finally assembled and fine-grain rasterised.

As a result of the varying workloads of shader engines, fully-assembled triangles will be pixel shaded in an ordering that no longer corresponds with developer intent. This is because adjacent or overlapping triangles will originally have been position-only shaded on any shader engine, and only arrive at the final shader engine for pixel shading after a journey that takes an indeterminate amount of time relative to other relevant triangles.

I believe this is the problem OREO solves: it allows the GPU to pixel shade triangles in an arbitrary order while the result in the render target (and depth buffer) remains in agreement with developer intent.
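
One way to picture that, purely as an illustration of the ordering problem rather than AMD's actual mechanism, is a reorder buffer: shading finishes in arbitrary order, but exports only commit once every earlier triangle has committed:

```python
# Toy model: pixel shading completes out of order, but exports retire in
# submission order, so the render target matches developer intent.
# This is my guess at the behaviour OREO provides, not AMD's design.

def retire_in_order(completion_order):
    # completion_order: triangle ids in the arbitrary order shading finished.
    finished = set()
    next_id = 0          # oldest triangle not yet committed
    committed = []
    for tri in completion_order:
        finished.add(tri)
        while next_id in finished:   # commit as far as the oldest gap allows
            committed.append(next_id)
            next_id += 1
    return committed

# Triangles finish 2, 0, 1, 4, 3 but hit the render target as 0..4.
print(retire_in_order([2, 0, 1, 4, 3]))  # -> [0, 1, 2, 3, 4]
```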

All of this rests upon "next gen geometry" ("primitive shaders"), which has been confirmed for RDNA 3: the DirectX/OpenGL vertex-processing pipeline is no longer executed as the set of shaders separated by fixed-function hardware that we've known for decades.

Naturally, this makes tessellation and geometry shading more complex, as both of these techniques generate vertices as output from shaders. AMD has solved that problem.

In theory, distributed final vertex shading takes us back to the old problem of multi-GPU rendering (alternate-line, split-frame, or screen-space tiled rendering): the vertex shader has to be run by multiple shader engines for some vertices, so there is an overhead to distributed final vertex shading when triangles span screen-space tiles.
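
A rough feel for that overhead (a toy model with numbers picked out of the air): a triangle whose bounding box has side E, dropped uniformly onto a grid of side-T tiles, avoids crossing a tile boundary with probability ((T − E)/T)², so the redundant-VS fraction grows quickly as triangles get large relative to the tile:

```python
# Toy estimate of how many triangles span tiles and so get their vertices
# fully re-shaded on more than one shader engine. All numbers illustrative.

def spanning_fraction(tile, extent):
    # Fraction of uniformly placed bounding boxes (side `extent`) that
    # cross a boundary of a square tile grid with period `tile`.
    if extent >= tile:
        return 1.0
    return 1.0 - ((tile - extent) / tile) ** 2

for extent in (2, 8, 32):
    f = spanning_fraction(tile=128, extent=extent)
    print(f"{extent}px triangles on 128px tiles: ~{f:.0%} span 2+ tiles")
```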

Once you've got a combination of:
  • next gen geometry shading
  • vertex-position-only shading
  • coarse-grained rasterisation
  • multiple shader engines each aligned to an exclusive set of screen space tiles
  • final vertex shading
  • fine-grained rasterisation
  • opaque random export order
You then have, in my opinion, all the ingredients required to support a GPU that consists of multiple compute chiplets, each functioning as a shader engine, each aligned with a set of screen space tiles.
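
To make the "exclusive set of screen space tiles" ingredient concrete, the simplest ownership scheme I can imagine is a static interleave, e.g. a checkerboard, so no two chiplets ever write the same tile (again, my own illustration, not anything rumoured):

```python
# Hypothetical static tile-to-chiplet ownership: a checkerboard interleave
# so adjacent tiles land on different chiplets, spreading clustered geometry.

NUM_CHIPLETS = 2  # e.g. the two-compute-chiplet scenario discussed below

def owning_chiplet(tile_x, tile_y):
    return (tile_x + tile_y) % NUM_CHIPLETS

# Each coarse-rasterised triangle is forwarded to the owner of every tile
# it touches; that owner runs the full VS and fine-grained rasterisation.
for ty in range(2):
    print([owning_chiplet(tx, ty) for tx in range(4)])
# -> [0, 1, 0, 1]
#    [1, 0, 1, 0]
```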

This could work OK if there are, say, only 2 compute (WGP) chiplets for the highest end. You split the biggest compute chip into two, where the die-space savings are greatest anyway, and get two SKUs out of it (one with 1 chiplet, one with 2). Then the constrained one, the multi-chiplet version, is presumably going to be running at such high resolutions anyway that triangle overlap will be minimized.

This should hit forward-shaded games like Doom Eternal/Call of Duty the most, while UE5 titles and any engine that looks similar shouldn't care at all. Heck, with visibility buffers coming to Call of Duty, Unity, and probably much else in the way of still-independent AAA engines, the overhead should be minimized. And most of the frame time is spent on pixel/compute/RT, so I don't see such a scheme incurring too much of a penalty if done this way. Obviously if done with that "6 WGP chiplet" setup it would start constraining a lot more, a point in favor of just two big WGP chiplets.
 
Same. But the stars are possibly starting to align for me, it seems. It all comes down to Ada pricing. This gen, for the first time in 25 years, I'm willing to spend "crazy" money on an x080-level GPU, and I've even accepted a possible increase of up to $150 over the 3080's MSRP. But if NV think they can charge me $1000+ for x080-level performance, then I'm done with them.
Thing about Nvidia is... they'll come out with some new proprietary must-have feature, which puts the ball back in their court and has the rest of the industry on the back foot again.

Every time AMD seems to catch up to them and potentially release a great competing product... Nvidia's got something new to push lol.
 
The only thing with that is, if the price is too high, at some point no amount of "must-have" features will get some people to buy it. If it's really high, then nothing might get most people to buy it. This last round it helped a lot that the US government helped people buy things like TVs and graphics cards by telling them that they couldn't be evicted from their apartments for non-payment of rent during covid. You should hear the stories from my friend who manages 3 apartment complexes. In one of them, a couple of tenants convinced most other tenants that if they stopped paying rent they could buy all sorts of things. So, most stopped paying rent, and suddenly couriers from Amazon, NewEgg, and all sorts of online shops were delivering boxes of stuff to those apartments.

So, what's the next thing that will help people afford high priced graphics cards in the same numbers as the 3xxx series now that governments aren't subsidizing their purchase?

Regards,
SB
 
I'll say the same thing I said in the other Nvidia thread about it... they'll have a GPU for you that you can afford. That's what the 30 series will be for, until affordable 40-series SKUs hit the market.

And truthfully... what you mentioned above is something that has always happened and will continue to happen, regardless of what the economy looks like. Some people just have messed up priorities... and they'll do whatever it is they have to.
 
This could work OK if there are, say, only 2 compute (WGP) chiplets for the highest end. You split the biggest compute chip into two, where the die-space savings are greatest anyway, and get two SKUs out of it (one with 1 chiplet, one with 2). Then the constrained one, the multi-chiplet version, is presumably going to be running at such high resolutions anyway that triangle overlap will be minimized.
With the 2-or-1 compute chiplet scenario: there's no rumoured GPU with a maximum of 24 WGPs that's also rumoured to be chiplet-based. Rumours say 48, 30 and 16 are the maxima for Navi 31, 32 and 33 respectively.

This should hit forward-shaded games like Doom Eternal/Call of Duty the most, while UE5 titles and any engine that looks similar shouldn't care at all. Heck, with visibility buffers coming to Call of Duty, Unity, and probably much else in the way of still-independent AAA engines, the overhead should be minimized. And most of the frame time is spent on pixel/compute/RT, so I don't see such a scheme incurring too much of a penalty if done this way. Obviously if done with that "6 WGP chiplet" setup it would start constraining a lot more, a point in favor of just two big WGP chiplets.
Overall, I agree. It's worth remembering that vertex shading costs a tiny proportion of ALU cycles per frame anyway (5% these days?), so in a compute-heavy architecture, which RDNA 3 appears to be, it's probably mostly a question of L3 cache hit rates for vertex shading across multiple chiplets when running the full VS.
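
Putting rough numbers on that (every input here is a guess): if the full VS is ~5% of frame ALU time and boundary-spanning triangles mean it runs on, say, 1.3 shader engines per triangle on average, the duplicated work is small:

```python
# Back-of-envelope cost of duplicated full-VS work across chiplets.
# Both inputs are guesses for illustration, not measured figures.

vs_share = 0.05     # fraction of frame ALU time spent on the full VS
duplication = 1.3   # average shader engines running the full VS per triangle

extra = vs_share * (duplication - 1.0)
print(f"added ALU cost per frame: ~{extra:.1%}")  # ~1.5%
```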
 
I seriously doubt your friend's story is true, by the way. People in apartments don't typically get involved with each other to such a degree in the first place. And it lines up with a lot of the nonsense I've read in fake stories about the 'suffering' of landlords and whatnot.

Of course, the stimulus checks surely helped some people buy luxuries when they were otherwise comfortable, but the idea that masses of people all stopped paying rent in order to buy GPUs and whatnot is ridiculous.

I think the thing that really helped people justify high-priced GPUs this last generation was "well, I'll mine when I'm not gaming on it". That seems like a much better argument for why high-priced GPUs won't be tolerated this go-around. We've already seen this with Turing, which didn't sell well at all. Nvidia may hope their much bigger performance improvement will make for a totally different situation, but I think there are absolute mental barriers for most people in terms of what they're willing to spend, regardless of performance, as you say.
 
You may think that, but they invited me over to take a look at the apartment complex and, yup, delivery guys were just leaving tons of boxes for residents who hadn't paid rent in months.

A lot of them were evicted after they didn't pay their multi-thousand-USD rent bills once the covid lockdown was over. Generally between 10 and 30 thousand USD, depending on how long the tenant had been abusing the system and how many bedrooms their apartment had (studio, 1 BR or 2 BR).

Regards,
SB
 
I seriously doubt your friend's story is true, by the way. People in apartments don't typically get involved with each other to such a degree in the first place. And it lines up with a lot of the nonsense I've read in fake stories about the 'suffering' of landlords and whatnot.
You have a lot more faith in people than reality warrants. There are many lower-income tenants who would use any opportunity to not pay rent and instead buy a big-screen TV or a new pair of Nike shoes.
 
You may think that, but they invited me over to take a look at the apartment complex and, yup, delivery guys were just leaving tons of boxes for residents who hadn't paid rent in months.
You would have stated this part to begin with, not just said you 'heard it' from a friend.

And why would they invite you over to see such a thing? Come on now. This isn't at all believable unless you really want to believe it for confirmation bias reasons.

You have a lot more faith in people than reality warrants. There are many lower-income tenants who would use any opportunity to not pay rent and instead buy a big-screen TV or a new pair of Nike shoes.
It's not about 'having faith' in people, it's about countering nonsense claims intended to denigrate less well-off people, acting like they're just leeches who are only out for a free check. It's not that such people don't exist whatsoever, but this idea that entire apartment complexes had all stopped paying rent and were just trying to get free stuff is absolutely ridiculous.

Disappointed to see people here try and reinforce such notions. It's pure 'poor people bad, rich people good' mentality.

Again, there are FAR better explanations for why people were buying overpriced GPUs the past couple of years than this crap.
 
Sorry for continuing the off-topic... but both mining and government stimulus likely contributed to the sizeable increase in gaming-GPU purchases over the past 2 years. I know anecdotally that the U.S. COVID stimulus payments (and other such payments, like PPP funds) went to many people who (1) didn't lose their jobs or suffer pay cuts and so treated the stimulus they nonetheless received as "pocket change", or (2) leveraged landlords' inability to evict to spend money that would otherwise have gone to rent. Of course, mining also contributed heavily (as discussed in many other topics on this board). No one is saying "poor people bad, rich people good."
 
I have no idea what people spent their money on or what their motivations were, but I can confirm that many people stopped paying rent in my area.
 
It's not about 'having faith' in people, it's about countering nonsense claims intended to denigrate less well-off people, acting like they're just leeches who are only out for a free check. It's not that such people don't exist whatsoever, but this idea that entire apartment complexes had all stopped paying rent and were just trying to get free stuff is absolutely ridiculous.

Disappointed to see people here try and reinforce such notions. It's pure 'poor people bad, rich people good' mentality.

The extrapolation that your fellow forum users, or the culture in general, are unable to understand that generalizations don't apply to every single individual is entirely your own, and far harder to believe than the story at hand. Don't insult our intelligence, nor assume you can predict our thought process.

I know this preemptive safeguarding of the culture from possible negative stereotypes about whatever group one chooses to consider "the disenfranchised" is a common practice, especially among the sheltered middle class riddled with privilege guilt. I know this behaviour well, since I was exactly like that until my mid-twenties. Once I befriended people outside my bubble, and had been poor myself (in poor cities in a poor country), working and living with blue-collar people, I learned this "cultural nannying" is more demeaning than helpful. The vast majority of poor people don't feel protected or respected by this type of mentality; on the contrary, they feel patronized, and the "protector" is seen as out of touch and arrogant.

In the US, for example, I often hear Democrats ask "why do blue-collar workers vote Republican?" Nine times out of ten, this is why, and when the same Democrat then writes it off with a lame excuse like "must be them racists", it only adds insult to injury.
 