Indeed, RDNA 1 / Navi seems like an architecture that was originally developed for a late 2017 / early 2018 release, probably targeting one or more nodes that were later canceled. E.g. in early 2016 they had planned it for a late 2017 release on GlobalFoundries' 10nm, before knowing the fab would skip directly to 7nm; then, until mid 2017, they had planned it for Q3/Q4 2018 on GlobalFoundries' 7nm DUV, before knowing the fab would drop out of leading-edge nodes altogether.
Mix that with the incredibly low R&D budget they had for GPUs until 2018, and they just couldn't keep up with the repeated redesigns, so the GPU kept getting delayed over and over, forcing AMD to compete in the gaming segments with GCN GFX9 GPUs.
The end result is a GPU that (finally) competes well on performance and power because it's gaming-focused and built on a recent node, but it isn't bundling any of the technologies or standards you'd expect from a GPU released in Q3 2019, like HDMI 2.1, VirtualLink, variable rate shading, or hardware acceleration for DXR.
The pricing is a bit of a let-down to me, but it seems to be based solely on nvidia's offerings. I can't see anything that says the 5700 XT couldn't be sold for $300, and it probably will be in three or more quarters, once the higher-end RDNA 2 card and nvidia's 7nm cards show up.
Without any new hardware features, I wonder whether people will find the currently discounted Vega 10 cards to be a better price/performance deal than the 5700 family, and whether reviewers will call out the poor price/performance relative to their predecessors, like they did with the RTX series against Pascal.
Every single AMD RTG graphics card release has been a monkey's paw wish (i.e. there's always some negative factor that takes center stage), and I'm thinking those prices might be this one's undoing, especially with the rumors of heavy price cuts on nvidia parts.
I wonder what the originally planned "January 2018 Navi" looked like. There would have been no 7nm and no GDDR6, and GDDR5X seems to have been almost exclusive to nvidia. 384-bit GDDR5 and more CUs to compensate for the lower clocks?
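As a back-of-envelope sanity check on the memory side (the data rates here are my assumptions, not anything AMD published: 8 Gbps GDDR5 like Polaris shipped with, versus the 14 Gbps GDDR6 the 5700 XT actually uses), here's a quick bandwidth comparison:

```python
# Rough memory-bandwidth sketch (assumed per-pin data rates, not official specs):
# bandwidth (GB/s) = bus width (bits) / 8 * per-pin data rate (Gbps)
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# Hypothetical "January 2018 Navi": 384-bit GDDR5 at 8 Gbps (Polaris-class speed)
print(bandwidth_gbs(384, 8.0))   # 384.0 GB/s

# Shipping RX 5700 XT: 256-bit GDDR6 at 14 Gbps
print(bandwidth_gbs(256, 14.0))  # 448.0 GB/s
```

So even a 50% wider bus at typical GDDR5 speeds would have landed at roughly 384 GB/s, short of the 448 GB/s the shipping 256-bit GDDR6 configuration delivers.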
What exactly in a high-resolution picture of a single-chip GPU makes you believe it's using a chiplet setup?