That's why I underlined rumored. There's an old 505mm2 rumor and the latest 536mm2. Both could turn out to be complete BS.
Of course, but rumors usually have at least some grain of truth to them. And I'm sure none of us expect Big Navi to land around a 3080 at half the die size (different processes, I know). It should be somewhere in that ballpark by all accounts.
In theory, with a 384-bit bus you can get to 16 GB using 8x 1 GB and 4x 2 GB GDDR6 chips.
Though that would not have much benefit over 12 GB, as the extra 4 GB sits on only a 128-bit bus.
Still very intrigued as to why they would go with 256-bit, when the XSX has 320-bit.
Theoretically they could even do 16 GB with a 320-bit interface just like the XSX. But there must be some downsides to it, or we'd have seen more of these asymmetric memory configurations over the years. Heck, Nvidia could have given the 3080 12 GB and totally avoided the "10 GB is less than the 11 GB of the 1080 Ti/2080 Ti" criticism.
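The arithmetic behind these asymmetric configs can be sketched quickly. This is a hypothetical illustration only, assuming one 32-bit channel per GDDR6 chip (the standard per-device width) and simple interleaving up to the smallest chip's capacity; the function name and layout are my own, not from any vendor documentation:

```python
# Sketch of asymmetric memory-config arithmetic. Assumes each GDDR6 chip
# occupies its own 32-bit channel and the memory controller interleaves
# across all chips only up to the smallest chip's capacity.

def describe_config(chips_gb):
    """chips_gb: list of per-chip capacities in GB, one 32-bit channel each.
    Returns (total GB, total bus bits, fast GB, slow GB, slow-region bus bits)."""
    total_bus = 32 * len(chips_gb)
    total_cap = sum(chips_gb)
    # The "fast" region stripes across every chip up to the smallest
    # chip's capacity; anything beyond lives only on the larger chips,
    # so it sees a narrower effective bus.
    base = min(chips_gb)
    fast_cap = base * len(chips_gb)
    slow_cap = total_cap - fast_cap
    slow_bus = 32 * sum(1 for c in chips_gb if c > base)
    return total_cap, total_bus, fast_cap, slow_cap, slow_bus

# The 384-bit case above: 8x 1 GB + 4x 2 GB
print(describe_config([1] * 8 + [2] * 4))  # (16, 384, 12, 4, 128)

# The XSX-style 320-bit case: 4x 1 GB + 6x 2 GB
print(describe_config([1] * 4 + [2] * 6))  # (16, 320, 10, 6, 192)
```

The second line matches the Series X's known split: 10 GB at full 320-bit width and 6 GB on an effectively 192-bit path, which is exactly the kind of uneven-bandwidth pool that likely makes these configs unattractive for PC cards.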
Let me be the pessimistic one and say those numbers are from their high-end offering. Have we forgotten "Poor Volta" already?
And that performance is more than good enough for the majority. If it's akin to RV770 then it will win based on the price alone. The Radeon HD 4850 and 4870 were dirt cheap for the performance they offered.
RDNA2 supposedly has a less complex PCB (only a 256-bit memory bus) and uses GDDR6 (cheaper).
For some reason $700 graphics cards have been normalised and talked about as being midrange.
I don't think most people would refer to a $700 card as midrange, for sure. Up until Pascal, we had a reasonable price-to-performance ratio from both parties, with better performance at each price point every generation. With Turing it stagnated, and this is where we saw $700 graphics cards being "normalised", as you say. AMD's GPUs at the time could not compete beyond the upper mid-range, which relegated the high end to Nvidia. We saw the "mid-range" moving from a $199 GTX 960 to a $249 GTX 1060 to a $349 GTX. One has to keep in mind that inflation and the price of silicon (on a $/mm² basis) have been going up rather significantly, so the rising prices are not just pure profiteering.
And FWIW, I think AMD underpriced RV770 (not that I'm complaining, I happily bought an HD 4850 to replace my 8800GT at the time), and could have easily priced it a little higher and made some more money. It's unlikely they will repeat this. As evidenced by Zen 2, and more so with Zen 3, AMD will price at a premium if they can.
It's not necessarily the same process on the same production lines at TSMC. And AMD can't sell four chiplets for 1600 USD to the same guy, but they can sell two chiplets plus a $700 GPU to one guy for 1500 USD.
(yes, I know there's Threadrippers, but the point should be obvious)
Unless Navi is on 7nm+, they are both going to be on the same 7nm process. Even otherwise, it's more about the total wafer allocation AMD has secured, and obviously they are competing against other players for it as well.
Aside from Threadripper, there is also of course EPYC. With Milan, AMD has a very strong product at a time when Intel has badly faltered. With Ice Lake server reportedly delayed and underwhelming, AMD has to make hay until Sapphire Rapids in late 2021/early 2022. To add, AMD is also experiencing record demand for their APUs at the moment. All of these will certainly drive their wafer allocation more towards CPUs/APUs.
If we follow that logic, there's no reason for AMD to produce anything except EPYCs and Threadrippers, as the frequencies there are much closer to the V/F sweet spot (variation in silicon quality is less pronounced), the margins are far greater, and so on. There's also the professional GPU market: the former Quadro/FireGL cards (or whatever they're called now) usually sell for thousands of dollars, or even $10k+ (as in the case of the RTX 8000).
IIRC Nvidia once said that the R&D for the professional parts is paid for by the consumer parts. The professional lineup would not be able to sustain itself on a standalone basis, or at least it couldn't at that point (circa 2016-2017). Today perhaps Nvidia might be able to survive on professional alone, but gaming is still the majority of their revenue. It's also good to diversify your revenue sources, obviously. If AMD had decided to focus only on Opteron back in the Athlon 64 days, they'd likely have died out by mid-2015, without a consumer line to keep them going as their server market share plummeted. Either way, the current supply situation is likely short term, exacerbated by the console ramp for the launches. It should ease by the next quarter with the capacity vacated by Huawei and Apple.