AMD RDNA5 Architecture Speculation

RDNA4 is out in less than 6 months! And we've got info on RDNA5 already:

DGF: A Dense, Hardware-Friendly Geometry Format for Lossily Compressing Meshlets with Arbitrary Topologies (a hardware-compressed mesh format that's raytracing friendly, already praised by the lead designer of Nanite; see the toy sketch below)

"the largest technical leap you will have ever seen in a hardware generation" (PR, but multiple leaks point towards Xbox going w/AMD again)

"RDNA5 to feature entirely chiplet arch" (according to patents)

"Medusa Point Zen 6/RDNA5 in 2026" (Zen 6/RDNA5 already scheduled for laptops/mobile in 2026)

Supposedly targeting late 2025 for desktop versions, though who knows if that date will be hit or if the CES-to-Computex 2026 window is more likely. Still, that's a heck of a lot of info, implied and leaked (yes, Medusa as a codename has come up more than once and looks all but guaranteed to be the actual codename). Feels like enough to start a thread.
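As for what "hardware-friendly lossy meshlet compression" even means, here's a toy sketch of the usual trick: store each vertex as a fixed-point offset from a per-meshlet anchor. To be clear, the bit widths and layout below are my own invention for illustration, not DGF's actual bitstream:

```python
# Toy sketch of lossy meshlet compression: per-meshlet anchor plus
# fixed-point vertex offsets. Invented widths/layout, NOT the real DGF.
import numpy as np

def encode_meshlet(positions, bits=11):
    """Quantize float3 vertex positions onto a per-meshlet grid."""
    anchor = positions.min(axis=0)                    # meshlet-local origin
    extent = np.maximum(positions.max(axis=0) - anchor, 1e-8)
    scale = (2**bits - 1) / extent                    # grid steps per axis
    quantized = np.round((positions - anchor) * scale).astype(np.uint16)
    return anchor, scale, quantized                   # ~3*bits per vertex

def decode_meshlet(anchor, scale, quantized):
    return anchor + quantized / scale

# 64-vertex meshlet somewhere off-origin:
pos = (np.random.rand(64, 3) * 0.1 + 5.0).astype(np.float32)
anchor, scale, q = encode_meshlet(pos)
err = np.abs(decode_meshlet(anchor, scale, q) - pos).max()
print(f"max position error: {err:.2e}")               # bounded by grid step
```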

Another leak points towards some sort of increased AI instruction throughput. Maybe it's that block FP16 (a block floating-point format where an entire set of matrix multiplication operands shares the same exponent) that AMD is so fond of (already shipping in XDNA2, with confirmed support in MI350).
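For reference, a toy sketch of that shared-exponent idea; the widths and rounding below are made up and won't match AMD's actual block formats:

```python
# Toy block floating point: one shared exponent per block of values,
# small per-value mantissas. Invented parameters, for illustration only.
import numpy as np

def quantize_block(values, mantissa_bits=8):
    """Quantize one block of floats to a shared-exponent format."""
    shared_exp = int(np.ceil(np.log2(np.abs(values).max() + 1e-30)))
    step = 2.0 ** (shared_exp - (mantissa_bits - 1))  # value of one LSB
    lo, hi = -(2 ** (mantissa_bits - 1)), 2 ** (mantissa_bits - 1) - 1
    mantissas = np.clip(np.round(values / step), lo, hi).astype(np.int16)
    return shared_exp, mantissas                      # 1 exponent + N mantissas

def dequantize_block(shared_exp, mantissas, mantissa_bits=8):
    return mantissas * 2.0 ** (shared_exp - (mantissa_bits - 1))

block = np.random.randn(32).astype(np.float32)        # one tile of a matrix
exp, m = quantize_block(block)
rel_err = np.abs(dequantize_block(exp, m) - block).max() / np.abs(block).max()
print(f"shared exponent: {exp}, worst relative error: {rel_err:.3f}")
```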
 

Both Microsoft and Sony are likely to go with AMD again for the next-gen consoles for backwards-compatibility reasons. In terms of technical leaps, I would say the current-gen consoles, with the move to a Zen-based CPU instead of low-power cores, RDNA 2 graphics, and SSD storage, brought about the biggest technical leap to date. Not really sure what further leap is possible, with the exception of a huge increase in ray tracing performance.

The Medusa APU name has leaked before, and yes, it's rumoured to be RDNA 5 based, skipping RDNA 4 on mobile altogether.

And RDNA5 is supposed to bring high-end/halo parts like RDNA 2/3 did, though late 2025 seems a bit optimistic given that they haven't even released RDNA 4 yet. There were rumours that the higher-end RDNA 4 parts were cancelled to accelerate RDNA 5 instead, but even then I wouldn't expect them before 2026.
 
MI325 (GFX940): 24Q4
MI350 (GFX950): 25Q4
MI400 (???): 26Q4

RDNA4 (GFX12): 25Q1
“Navi 5x” (???): 26Q4 allegedly

Hot news from the rumour mill says MI400 and “Navi 5x” would be the first “UDNA” products.

If “Navi 5x” is indeed a late 2026 product, the timeline does seem plausible. If true, that implies the “unification” decision was likely taken a couple of years ago, since it is now (Nov 2024) ~2 years away from the alleged launch date for both MI400 and “Navi 5”. Supposedly at this point, the SoC/chip lineup should have been decided, and the SoC physical designs should have commenced.

-

While said hot news also suggested this to be “GCN-based”, I find it hard to picture AMD architects wanting to ride the GCN-classic pipeline (Wave64 on SIMD16 over 4 cycles) for yet another 5 years. That would leave all the ILP and the lower execution latency on the table.

So it is nice to hear a similar opinion that UDNA is probably gonna be an RDNA descendant/look-alike with CDNA features bolted on.
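Some back-of-envelope math on why that Wave64-on-SIMD16 cadence leaves ILP and latency on the table (pipeline numbers simplified, treat it as a sketch):

```python
# Back-of-envelope math for the cadence argument above. Pipeline shapes
# are the commonly cited ones, heavily simplified.

def cycles_per_op(wave_size, simd_lanes):
    """Cycles one VALU instruction occupies the SIMD for one wave."""
    return wave_size // simd_lanes

gcn = cycles_per_op(64, 16)    # Wave64 on SIMD16 -> 4 cycles per op
rdna = cycles_per_op(32, 32)   # Wave32 on SIMD32 -> 1 cycle per op

# A chain of N ops from one wave can't finish faster than N * cycles_per_op,
# so on GCN extra ILP inside a wave buys nothing: the SIMD cadence is the cap.
N = 100
print(f"GCN : {N} ops from one wave take >= {N * gcn} cycles (ILP wasted)")
# On RDNA the same chain can issue every cycle, and independent ops can
# slot into latency gaps, so both ILP and per-op latency actually matter.
print(f"RDNA: {N} ops from one wave take >= {N * rdna} cycles")
```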

 
It's pretty hard to believe that story, because it would make the whole RDNA family GPU roadmap pointless. A very recent decision seems more likely.

On the architecture side, I agree that an RDNA descendant is the most plausible scenario: they use RDNA5 as the base of the architecture, make the additional changes needed to fully support all the CDNA features, and call it UDNA. That would also explain how they could get UDNA out so quickly, since a new GPU architecture usually takes up to four years to develop.
 
On the Wave64 point: if the idea is to push as many flops as possible and use them for graphics (expecting that graphics moves further onto compute each cycle), then going back to the GCN approach may in fact be beneficial despite its relative issues. GCN h/w in gaming was never really constrained by its compute performance; all the issues stemmed from its purely graphical frontends/backends. If those became unnecessary, GCN would likely do just fine. Even if they were merely adequate it would do fine, although I wonder whether it would still need to rely heavily on async compute to fill in state-change bubbles whenever those occurred.
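A toy model of that bubble-filling point, with invented numbers, just to show why async compute helps when the frontend keeps stalling the SIMDs:

```python
# Toy model of filling frontend bubbles with async compute. All numbers
# are invented purely for illustration.

def simulate(gfx_pattern, async_work_cycles):
    """gfx_pattern: per-cycle 1 (SIMDs busy) or 0 (state-change bubble)."""
    busy, remaining = 0, async_work_cycles
    for cycle_busy in gfx_pattern:
        if cycle_busy:
            busy += 1
        elif remaining > 0:            # async queue fills the bubble
            busy += 1
            remaining -= 1
    return busy / len(gfx_pattern)

pattern = ([1] * 3 + [0]) * 25         # a 1-cycle bubble every 4th cycle
print(f"graphics alone : {simulate(pattern, 0):.0%} ALU utilization")
print(f"+ async compute: {simulate(pattern, 15):.0%} ALU utilization")
```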

As for the timelines, I'd say chances are high that the whole idea came along with the cancellation of the RDNA4 high end - and probably while looking at what they can provide to Sony/MS for the next console gen h/w. So in this regard it does seem more plausible that "UDNA" means "cancel RDNA5 and use CDNA3 to make the next gaming lineup".
 
Since RDNA 5 is pretty much canned, wouldn't this thread be better renamed to UDNA?

This thread kind of reads as if some believe there will be an RDNA 5 after RDNA 4, despite UDNA being known to exist.

I think posters should instead create a new thread to explicitly discuss UDNA.
 
That wouldn't be surprising. Trying to compete with whatever monster the RTX 6090 ends up being with the first UDNA lineup doesn't seem worth it. In truth, the 4090 and the xx90 cards after it really seem to be a new tier: ultra-ultra-enthusiast. They should have been called Titans.
 

Yeah the halo cards like the 4090 have distorted the perception of the market a bit. Especially when people become more concerned with relative positioning of SKU names instead of actual performance. Most people should just ignore the 4090 exists.

If you look at what it takes to get good performance in the latest games, the situation isn't nearly as bad as some make it out to be. The numbers below are at ultra settings too.

1080p/60: $300 (4060 or 7600)
1440p/60: $450 (4060 Ti or 7700 XT)
2160p/60: $800 (4070 Ti)

I don’t know that $450 for 1440p/60 at ultra settings warrants the amount of bellyaching we see online these days.
 
Most of the games benched are PS4 era, though, which heavily skews the results upwards.
 
Isn't that representative of the games people actually play, though? I get the impression most of the complainers online don't actually play games.

Well, you said the latest games. Most of the games benched are old. In the latest games I would say the 4070 Ti is a 1440p/60 card with DLSS.
 

You're right, TPU has a few older games in there. The 4070 Ti doesn't need DLSS to hit 1440p/60 in the latest games, though. With the exception of Black Myth: Wukong, the 4070 Ti got you 1440p/60 native at epic/ultra settings in 2024 releases. Adding DLSS and lowering some pointless ultra settings would get you well north of 60.

STALKER 2 - 70 FPS
Dragon Age: The Veilguard - 81 FPS
COD Black Ops 6 - 108 FPS
Silent Hill 2 - 60 FPS
God of War Ragnarok - 108 FPS
Final Fantasy XVI - 60 FPS
Warhammer 40,000: Space Marine 2 - 95 FPS
Star Wars Outlaws - 68 FPS
Black Myth: Wukong - 60 FPS (66% upscaling)
The First Descendant - 74 FPS
Hellblade 2 - 64 FPS
Ghost of Tsushima - 88 FPS
Homeworld 3 - 148 FPS
Horizon Forbidden West - 94 FPS

All that to say my point stands: you don't need to spend an enormous amount of money to play at 1440p in 2024, and that's doubly true if you're playing older games (which most people are, according to Steam). Yes, we should expect more value for our money over time, but the complaining about flagship GPU prices just doesn't match reality. Nobody "needs" a 4090, so AMD doesn't necessarily "need" a 4090 competitor if they play their cards right.
 
@trinibwoy tagging you to avoid a large quote window.

Game performance varies heavily, and various other benchmarks of different areas show markedly lower performance. Also, I'm factoring in RT being enabled: not full RT, but a reasonable level.

I consider titles like Ghost of Tsushima old even though they are new to PC, and similarly for titles like Call of Duty; I was thinking more along the lines of games with modern rendering tech. No one needs a 4090, I agree, but native 1440p/60 on a 4070 Ti is going to require sacrifices. 60 to me means a locked 60; I hold the same standard I would use to call a console title 60 fps, so drops should be exceedingly rare. An average of 60 is something different.
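To put that last distinction in numbers, a throwaway example with made-up frame times:

```python
# Why "average 60" and "locked 60" differ: a run with stutter can
# average above 60 fps while its worst frames sit far below it.
frame_times_ms = [14.0] * 95 + [45.0] * 5          # mostly fast, a few spikes

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
one_pct_low_fps = 1000 / (sum(worst) / len(worst))  # ~1% worst frames

print(f"average: {avg_fps:.0f} fps, 1% low: {one_pct_low_fps:.0f} fps")
```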
 

You may want to look only at games with modern tech at ultra settings, but that's not what the average gamer is doing. Some people may want 4K @ 240 fps, etc. Those people aren't representative of the wider market and therefore aren't relevant to the success or failure of AMD's next lineup.

Intel actually did a very good job explaining that the B580 is a good GPU even though it didn’t bring anything new to the table. The performance isn’t special. The price isn’t even that special. It was the messaging (thanks TAP) and in this case reviewers amplified that positive message.
 
Intel B580 is almost like Intel's Ryzen moment in the GPU space. Right performance, right time, right price to shake things up. But will there be takers?
 