AMD Execution Thread [2024]

Will RDNA4 compete with Blackwell? As far as I've heard, AMD isn't aiming to get high-end GPUs out for RDNA4, so they will be competing with mid-tier Lovelace products.

Yes but there will presumably be mid-tier Blackwell parts that will replace the mid-tier Lovelace parts in the same price bracket (I would hope).
 
I mean, Nvidia will gladly sell you a 150mm² GPU for $500, named like some upper midrange part. And AMD seem to have no motivation to actually compete with Nvidia, they just want to sell you a clearly worse GPU at a slightly better price since they're obsessed with margins rather than competition or marketshare.

It's not promising that AMD are avoiding any high-end products for RDNA4. It strongly suggests they don't believe it has the performance/efficiency to be viable (à la RDNA1). Which likely means it's going to have clear technological inferiority compared to Blackwell, and they'll simply try to get by on the 'clearly worse GPU at a slightly better price' thing yet again.
 
No, N4C was canned for purely practical reasons.
 
It's like they both are setting prices which make these pesky margins manageable enough for the products to bring in solid profits and thus make sense to, you know, make. Which is what all commercial companies do.

The sad part is that competition still isn't strong enough for the average buyer to stop worrying about VRAM. It's not that VRAM is expensive overall, but tiering it out across higher-end products means that even if "16 GB" is the magic sweet spot the way 8 GB was last gen, it seems likely to be stuck at the $500+ price point for too long. Hopefully Battlemage brings change; they're aiming for this year, but I don't know if they'll make it.
 
We've discussed this a number of times. Adding VRAM above what's actually necessary (and thankfully you can work out that figure easily by comparing 7600s and 4060 Tis) just makes perf/price worse for no reason, at a moment when progress on that front is already low enough to be considered zero by many.
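To make the perf/price point above concrete, here's a minimal back-of-the-envelope sketch in Python. Every number in it (prices, performance deltas) is invented purely for illustration; none of it comes from the thread or from real BOM data.

```python
# Toy perf/price comparison: the same hypothetical GPU in 8 GB and 16 GB variants.
# Assumption: the extra VRAM raises the retail price but barely moves average
# performance when games already fit comfortably in 8 GB. All numbers are made up.

def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Relative performance (1.0 = baseline) divided by retail price."""
    return relative_perf / price_usd

base_8gb  = perf_per_dollar(relative_perf=1.00, price_usd=330)
with_16gb = perf_per_dollar(relative_perf=1.02, price_usd=380)  # ~2% gain from the few VRAM-bound cases

print(f"8 GB  variant: {base_8gb:.4f} perf/$")
print(f"16 GB variant: {with_16gb:.4f} perf/$")  # slightly worse perf/$ despite double the VRAM
```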
 
I'm just going to say the VRAM issue with Nvidia is a bit more complex and nuanced. VRAM on Nvidia isn't the same as VRAM on AMD or Intel at the moment. People might not like the implications here, but the reality is that you can do more with VRAM on Nvidia, both in gaming (it's really console+; currently it's mostly RT, and to some extent things like FG, that push VRAM well above 8/12 GB) and outside of gaming (due to the growing popularity of creation/AI workloads on the consumer end).

And yes, the non-gaming implications are significant here as well, because Nvidia is selling the idea of content creation to consumers, but the VRAM segmentation complicates the market choice, particularly for those who also want to game.
 
As for the margin discussion, I feel it's often had without looking at the whole picture.

Fixed costs are rising for new products, and overall market-size growth has stalled to a large extent (gaming isn't going any more mainstream, and the next "China" isn't here yet). Without any prospect of significant volume growth, the only real option is to extract more per unit sold.
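As a rough illustration of that "flat volume, rising fixed costs" arithmetic, here's a toy sketch in Python. The figures (unit volume, fixed costs, unit cost, target profit) are entirely made up; the point is only how the ASP has to move when volume can't.

```python
# Toy model: with unit volume flat, higher fixed costs (R&D, masks, validation)
# can only be recovered through a higher average selling price (ASP) per unit.
# All figures below are invented for illustration.

def required_asp(fixed_costs: float, unit_cost: float, units: int, target_profit: float) -> float:
    """ASP such that units * (ASP - unit_cost) - fixed_costs == target_profit."""
    return unit_cost + (fixed_costs + target_profit) / units

UNITS = 10_000_000  # flat unit volume across both generations

old_gen = required_asp(fixed_costs=500e6, unit_cost=200.0, units=UNITS, target_profit=1e9)
new_gen = required_asp(fixed_costs=900e6, unit_cost=220.0, units=UNITS, target_profit=1e9)

print(f"old generation needs an ASP of ~${old_gen:.0f}")   # ~$350
print(f"new generation needs an ASP of ~${new_gen:.0f}")   # ~$410: more per unit just to stand still
```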

Also, it's always a bit strange that users clearly place value in the software stack/features associated with each IHV, which can't be directly captured in the unit BOM, yet also lament margins. I do find PC gamers (well, the enthusiasts) have always been a bit strange in this respect, in that they seem to feel that, beyond the base cost of the game itself, any other gaming-related software value-add should basically be "free."
 
Steam has been hitting new ATHs regularly over the past few years. The idea that the PC gaming market has 'stalled' is completely false.

I've genuinely never seen any other place on the internet go to the lengths that this place does to justify Nvidia's blatant greed.
 
I mean, Nvidia will gladly sell you a 150mm² GPU for $500, named like some upper midrange part. And AMD seem to have no motivation to actually compete with Nvidia, they just want to sell you a clearly worse GPU at a slightly better price since they're obsessed with margins rather than competition or marketshare.

It's not promising that AMD are avoiding any high-end products for RDNA4. It strongly suggests they don't believe it has the performance/efficiency to be viable (à la RDNA1). Which likely means it's going to have clear technological inferiority compared to Blackwell, and they'll simply try to get by on the 'clearly worse GPU at a slightly better price' thing yet again.
The only way to regain strategic parity is if diverging approaches contend against each other. Only then could we determine who ultimately had the right vision...
 
They're divergent in that AMD's just gonna bruteforce it.
Otherwise, fair playground.
It makes you really wonder who's going to blink first ...

Will one vendor implement more exotic hardware, or will the other abandon said hardware in the end?
 

Depends on whether they can deliver the performance and IQ that people want. If both approaches work out, then great. However, the wind is currently blowing in the direction of more dedicated hardware, not less, across PC IHVs and consoles.
 
How can you be absolutely certain of that last statement, especially in the latter case, when perf per unit of logic complexity matters more than ever in an era where the leading integrated graphics circuit designer showed us just yesterday that they're still stuck on the very same process node technology as before?
 
The only way to regain strategic parity is if diverging approaches contend against each other. Only then could we determine who ultimately had the right vision...
I don't think any 'approach' matters if AMD simply don't deliver on architectural performance. RDNA2 was brilliant in this regard, and it's what had many hopeful for RDNA3, but RDNA3 turned out to be the biggest dud they've ever produced instead. I really don't think people appreciate how much of a lead weight this put on the GPU market. AMD had started to make real progress and momentum was very much within reach, but RDNA3 killed it. Dead. In its tracks. All the positivity and expectations AMD had built up as a potential challenger were done, and now nobody believes in AMD on the GPU side whatsoever, and not unjustifiably so. I'll say it again: RDNA3 is beyond bad. It's one of the worst architectures AMD has ever produced.

AMD's chiplet strategy could have been competitive and perhaps significant in changing the market had RDNA3 been good.
 