RDNA4

Say AMD does provide significant value improvements with RDNA4 but doesn't see much uptick in sales/market share: do you think they stop that strategy and go back to mirroring Nvidia's prices more closely ("we tried and it didn't work, so maximise $/card"), or carry on for a few gens to see if they can get a foothold?
 
do you think they stop that strategy and go back to mirroring Nvidia's prices more closely ("we tried and it didn't work, so maximise $/card"), or carry on for a few gens to see if they can get a foothold?
They're not going back to anything; the strategy is to build a big pile with hundreds of WGPs and hopefully win it all!
(Yes, it's an exact copy of how the CPU guys did it, but hey, it works.)
 
Better ones!

It's good for Gaming segment bottom line (and thus, me).

Large, no. Better ones, oh yes.

What was it, 128 MB of Infinity Cache for 4K? I'm not sure that will be enough, but we'll have to see.
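The reason cache size matters at 4K is that effective bandwidth is roughly a hit-rate-weighted blend of cache and VRAM bandwidth, and hit rates fall as the working set grows with resolution. A minimal sketch of that blend; the hit rates and bandwidth figures below are illustrative assumptions, not measured RDNA numbers:

```python
# Effective bandwidth ~ hit_rate * cache_bw + (1 - hit_rate) * vram_bw.
# All figures here are illustrative assumptions, not measured RDNA specs.
def effective_bandwidth(hit_rate: float, cache_bw_gbps: float, vram_bw_gbps: float) -> float:
    """Hit-rate-weighted blend of cache and VRAM bandwidth, in GB/s."""
    return hit_rate * cache_bw_gbps + (1 - hit_rate) * vram_bw_gbps

# A bigger cache keeps the hit rate up at 4K, which is the whole point of a MALL.
print(effective_bandwidth(0.6, 2000, 600))  # plausible hit rate at lower resolutions
print(effective_bandwidth(0.4, 2000, 600))  # hit rate drops as resolution rises
```

The takeaway is just that a modest drop in hit rate at 4K costs a lot of effective bandwidth, which is why the cache size question matters.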

For me to consider them, I'd need 4080 performance at $600.
 
Yeah, but there are other caches besides MALL.

Slower, but maybe (hopefully) for less.
They need to move units.
Eh, might be too slow then compared to the competition. Certainly too slow for me. A 4080 is the minimum I'd consider to replace my 3080 with. Maybe it will last me another gen until RDNA5.
 
Fitting 64 high-clocked CUs plus L3 and MCs into 240 mm² would be very impressive. If it can't get to the 4080, it needs to hit close to the 7900 XT. Nvidia already has a <300 mm² die that's not too far behind.
 
I was thinking that too, but looking into it, the fastest mass-production GDDR6 is Samsung's 20 Gbps; SK Hynix and Micron are at 18 Gbps. Samsung is sampling 24 Gbps and SK Hynix is (I assume) sampling 20 Gbps, so it's not as bad as I first thought. I guess GDDR7 is around the corner, so effort is being put into that instead, but it would've been nice if 20 Gbps+ were more plausible for RDNA4.
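For a rough sense of what those per-pin speeds translate to, peak bandwidth is just data rate per pin × bus width / 8. A quick sketch, assuming a 256-bit bus purely for illustration (not a confirmed RDNA4 configuration):

```python
# Peak theoretical bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8.
# The 256-bit bus below is an assumption for illustration, not a confirmed RDNA4 spec.
def gddr_bandwidth_gbps(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

for rate in (18, 20, 24):  # Hynix/Micron, Samsung mass production, Samsung sampling
    print(f"{rate} Gbps x 256-bit -> {gddr_bandwidth_gbps(rate, 256):.0f} GB/s")
```

So the jump from 18 to 20 Gbps parts buys roughly 11% more bandwidth on the same bus, which is why the sampling speeds matter.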

The plus side is more even arch/gen comparisons.

The problem with GDDR7 on a GPU designed with GDDR6 in mind is that you'd have a mismatch between bandwidth and compute, as well as between memory controller area and compute size. Combine that with a GPU that's going to be dropped in 18 months and GDDR7 being more expensive per GB/s as it initially comes out, and "eh, forget it" sounds like a good strategy.
 
Is RDNA4 going to show up in laptops next year? Intel is starting to put up a fight in the iGPU space, and I'm sure AMD would like to continue its dominance of the iGPU gaming market. "Battlemage" is showing what I'm assuming to be a 40% uplift (it says 56 EU vs. the 128 EU benchmarks, but I'm guessing they're going for the doubled-up floating-point pipeline thing just like everyone else). Admittedly that's just in SiSoft, so not a great benchmark, but RDNA4 plus a 16 MB LLC would certainly be welcome.
 
AMD wanted to do a major laptop push with both RDNA2 and RDNA3, and, well, crickets. GeForce is not just plain better in a limited power envelope, it also has such massive brand power that laptop makers don't want to tie their expensive products to the brand the market feels is inferior.

I think that outside APUs, laptop GPU penetration will significantly lag desktop GPU market penetration. Which is probably why Strix Halo exists: it lets AMD build a single product that's both the best CPU on the market and the best GPU AMD can fit in the power budget.
 
AMD wanted to do a major laptop push with both RDNA2 and RDNA3, and, well, crickets.
RDNA2 did very well in laptops.
3, though. Oops!
Ouch. But it happens.
Which is probably why Strix Halo exists: it lets AMD build a single product that's both the best CPU on the market and the best GPU AMD can fit in the power budget.
Nah, it's just that Apple has proven big APUs are workable/viable in laptops, and now certain OEMs get major stiffies for any IHV emulating them.
Also why LNL exists!
 