AMD: RDNA 3 Speculation, Rumours and Discussion

Status
Not open for further replies.
Are they maybe counting the V-cache as a "chiplet"?
9 chiplets = 4 MCD + 4 V-cache + GCD
13 chiplets = 6 MCD + 6 V-cache + GCD
If the 9-chiplet part is indeed the Navi 32 V-cache version and has started mass production, that would suggest Navi 32 is ready earlier than previously speculated, no?
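The counting above can be sanity-checked with a quick sketch. This is pure arithmetic on the rumored configurations; the assumption (from the post, not confirmed) is one stacked V-cache die per MCD plus a single GCD:

```python
def chiplet_count(mcds, vcache_per_mcd=1, gcds=1):
    """Total dies in the package: GCD(s) + MCDs + stacked V-cache dies."""
    return gcds + mcds + mcds * vcache_per_mcd

print(chiplet_count(4))  # 4 MCD + 4 V-cache + 1 GCD = 9
print(chiplet_count(6))  # 6 MCD + 6 V-cache + 1 GCD = 13
```

So both leaked counts fall out of the same layout, differing only in MCD count.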
 
The 13-chiplet part is a year away according to the tweet, so it doesn't add up for Navi 31.
(Also, 'just started developing' definitely doesn't add up with something coming in a year.)
 
The initial version of Navi 31 shouldn't be equipped with V-cache, so the V-cache variant (13 chiplets) could be prepared as a refresh, in the case that they are counting layers as chiplets.

If they don't count V-cache layers as chiplets, then the GPU with 9 chiplets could be the rumored professional model based on 2 GCDs from Navi 22 (2 GCDs + 3 MCDs with V-cache per GCD + a bridge? => 192 MB Infinity Cache + 384-bit bus). And the GPU with 13 chiplets would be some next-gen product.
 
Unless it was a slip-up, Su said in the Ryzen 7000 launch show that RDNA3 uses "5 nanometer chiplets", which contradicts most of the recent rumors (1x 5nm GCD + 6x 6nm MCD).
edit: in theory it could just mean there's more than one RDNA3 N5 chiplet, just not in the same GPU
 
Navi 33 will likely be laptop-first for quite a while.

There was an actual, reliable-looking leak (I think it was patch notes?) mentioning 4 models, with one being APU-only. Regardless of the code number, that's likely the one, maybe a chiplet that slots in, that will initially be mobile-only and maybe some sort of desktop APU for quite a while.

It could also be sold really cheaply. GDDR6 is $12-15 a GB or so, so cutting that out could save $50-100 (depending on how much memory you'd have otherwise), and then you get savings from a unified memory system, etc.

A 5-8 teraflop GPU included in a desktop 6-core APU for $349 or so could look really tempting to the right crowd. Perfectly solid 1080p gaming in a tiny form factor for cheap.
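The savings estimate above can be made explicit. Note the per-GB price range is the post's own rough figure, not a sourced number, and this ignores the PCB, PHY, and power-delivery savings the post also mentions:

```python
# Rough BOM-savings sketch; $12-15/GB is the post's own guess for GDDR6.
GDDR6_USD_PER_GB = (12, 15)

def memory_savings_usd(gb_dropped):
    """Low/high estimate of savings from dropping dedicated GDDR6."""
    lo, hi = GDDR6_USD_PER_GB
    return gb_dropped * lo, gb_dropped * hi

print(memory_savings_usd(4))  # dropping 4 GB -> (48, 60) USD
print(memory_savings_usd(8))  # dropping 8 GB -> (96, 120) USD
```

That spread lines up with the $50-100 range quoted, depending on whether the card would otherwise have carried 4 or 8 GB.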

Meanwhile a bigger chip, something similar to the 6600 XT, still has desktop/mass-market standalone appeal. The 6600 XT had problems, but fix those up and add good raytracing support, and people would buy an 11-teraflop GPU for $299 or so. It's been too long since the GTX 1060 / RX 470/480/570/580 crowd has had a good deal, and there are going to be people waiting.
 
Since AMD were conservative with some Zen 4 information early on (e.g. the >15% 1T perf claim at FAD), could they be conservative with their RDNA3 information too? >1.5x perf/W is already significant, but it's a fun thought.
 
24 Gbit ICs for G6 do not exist, so what else could they do?

200-something mm² dies aren't that by definition.
1. 192-bit with 12 GB VRAM. ;) I would even be willing to compromise on 160-bit and 10 GB VRAM. :)

2. Polaris was 232 mm² and, in my opinion, pretty future-proof with 8 GB VRAM.
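The two configurations in point 1 follow directly from how GDDR6 is attached: one device per 32-bit channel, and (as noted above) 16 Gbit (2 GB) is the largest G6 IC that actually exists. A quick sketch of that relationship:

```python
def vram_gb(bus_width_bits, gbit_per_ic=16):
    """VRAM capacity given a bus width, one GDDR6 device per 32-bit channel."""
    channels = bus_width_bits // 32
    return channels * gbit_per_ic / 8  # Gbit -> GB

print(vram_gb(192))  # 6 chips x 2 GB = 12.0 GB
print(vram_gb(160))  # 5 chips x 2 GB = 10.0 GB
```

So 12 GB requires the full 192-bit bus, and the 160-bit compromise caps out at 10 GB without clamshell mounting.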
 
Yea it made them no money.
You gotta earn some cash first before making better stuff, simple as.
You have a habit of making strong, authoritative statements with no actual proof of anything. The idea that AMD made no money on Polaris 10 is baseless.

It also makes no sense to suggest that AMD can't afford to make decently specced 200mm² GPUs now that they actually do have cash.

"Well, AMD's goal is to make money" is a convenient cop-out to excuse any sort of criticism about value or specs at a given tier.

"I'm just explaining the reasoning" is similarly a cop-out response doing the same thing. No one is confused about the notion of companies wanting to make money. But your argument was that a 200mm² GPU is 'by definition' not going to have any future-proofing. Then, when presented with an example that contradicts this, you're just doing the whole cop-out thing I was referring to.

If Navi 33 is gonna come in at $400+, with only 8GB of RAM, in 2023, marketed as a '1080p' GPU or something, people are understandably not gonna be thrilled. It's OK to acknowledge that. Even $300 for a 2023 GPU that's only targeting 1080p isn't gonna be that great. Can you imagine if Nvidia/AMD had tried to sell a $240 GPU in 2016 with only 2GB, targeted at 720p? Because that's basically the equivalent situation.
 