AMD RDNA3 Specifications Discussion Thread

that AMD does not use elevated fan-out bridges as first reported by some outlets, but indeed some InFO-R variant without a local silicon interconnect
Yeah it uses a dogshit cheap (costs like 6 bucks to make) InFO-R fanout with class leading 30um pitch.
The entire N3x lineup gimmick is that each part is dogshit cheap to make.
the quoted 0.4 pJ/bit would be on the very high end for relatively low-clocked silicon bridge connections with a lot of lines
Pretty sure Intel quoted higher figures for MDIO over EMIB in SPR, and that's a lot of lines extending a rather low-clocked (low-3 GHz) mesh.
 
Yeah it uses a dogshit cheap (costs like 6 bucks to make) InFO-R fanout with class leading 30um pitch.
The entire N3x lineup gimmick is that each part is dogshit cheap to make.

Pretty sure Intel quoted higher figures for MDIO over EMIB in SPR, and that's a lot of lines extending a rather low-clocked (low-3 GHz) mesh.
Isn't it 0.3 pJ/bit for the old EMIBs with 55 µm pitch and 0.15 pJ/bit for Lakefield and Ponte Vecchio (Foveros)? I guess one can always make it worse.
For comparison, AMD's direct/hybrid bonding approach for the 3D V-Cache allegedly needs just 0.05 pJ/bit.
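To put those pJ/bit figures in watts: link power is just energy per bit times bits per second. Here's a minimal back-of-the-envelope sketch, taking AMD's quoted ~5.3 TB/s aggregate fan-out bandwidth for Navi 31 as given; the operating point and the comparison entries are assumptions pulled from this thread, not measurements.

```python
# Back-of-the-envelope link power: energy per bit * bits per second.
# The 5.3 TB/s aggregate fan-out bandwidth and the pJ/bit values are taken
# from marketing figures / this thread and are assumptions, not measurements.

PJ = 1e-12           # joules per picojoule
BITS_PER_TB = 8e12   # bits in one terabyte

def link_power_watts(pj_per_bit: float, bandwidth_tb_s: float) -> float:
    """Power burned by a die-to-die link at the given energy/bit and bandwidth."""
    return pj_per_bit * PJ * bandwidth_tb_s * BITS_PER_TB

for label, e in [("InFO-R fan-out, 0.4 pJ/bit", 0.4),
                 ("EMIB (Stratix 10), 0.3 pJ/bit", 0.3),
                 ("Hybrid bonding, 0.05 pJ/bit", 0.05)]:
    print(f"{label}: {link_power_watts(e, 5.3):.1f} W at 5.3 TB/s")
```

Even at the "high" 0.4 pJ/bit, that works out to roughly 17 W at peak bandwidth, which gives a feel for the trade the cheap fan-out makes versus a silicon bridge.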
 
AMD posted some performance comparisons... in rasterization the average increase for the 7900XTX over the 6950XT appears to be around 50%.

RE Village, 53%
Modern Warfare 2, 51%
Watch Dogs Legion, 47%
Cyberpunk, 67%


Ray Tracing numbers are less flattering:

RE Village, 43%
Hitman 3, 65%
Cyberpunk, 61%
Dying Light 2, ~2x (they used High RT, not Ultra)

AMD also kept comparing the card to the 4080.

videocardz.com: AMD compares its Radeon RX 7900 series to GeForce RTX 4080
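As a quick sanity check on that "around 50%" average, here's the arithmetic over the four rasterization uplifts listed above; the headline figure presumably covers a broader set of games than just these titles.

```python
# Arithmetic mean of the rasterization uplifts AMD listed (7900 XTX vs 6950 XT).
raster_uplifts = {"RE Village": 53, "Modern Warfare 2": 51,
                  "Watch Dogs Legion": 47, "Cyberpunk": 67}
mean = sum(raster_uplifts.values()) / len(raster_uplifts)
print(f"Mean rasterization uplift over the listed titles: {mean:.1f}%")  # 54.5%
```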
 
Isn't it 0.3 pJ/bit for the old EMIBs with 55 µm pitch
That was the quote for Stratix 10 tiles.
MDIO on SPR is higher iirc.
and 0.15 pJ/bit for Lakefield and Ponte Vecchio (Foveros)?
Think it's 0.2 on PVC, and who knows what on Lakefield, since it's a stacked PCH, aka very low power intensity.
AMD's direct/hybrid bonding approach for the 3D V-Cache allegedly needs just 0.05 pJ/bit.
Correct and guess what they're using next for their GPUs.
AMD also kept comparing the card to the 4080.
Yeah cost class ain't far off.
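Turning the same arithmetic around: a toy sketch of how much die-to-die bandwidth each quoted figure affords inside a fixed power budget. The 10 W budget is purely illustrative, and the ~0.2 pJ/bit for PVC is the rough number from the post above, not an official figure.

```python
# Toy inversion: bandwidth affordable within a fixed (hypothetical) 10 W budget
# for each quoted energy-per-bit figure. The budget is illustrative only.

BUDGET_W = 10.0
BITS_PER_TB = 8e12   # bits in one terabyte

for label, pj_per_bit in [("EMIB, Stratix 10 era", 0.3),
                          ("Foveros, Ponte Vecchio (rough)", 0.2),
                          ("Hybrid bonding, 3D V-Cache", 0.05)]:
    bits_per_s = BUDGET_W / (pj_per_bit * 1e-12)
    print(f"{label}: {bits_per_s / BITS_PER_TB:.1f} TB/s within a {BUDGET_W:.0f} W budget")
```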
 
AMD posted some performance comparisons... in rasterization the average increase for the 7900XTX over the 6950XT appears to be around 50%.

RE Village, 53%
Modern Warfare 2, 51%
Watch Dogs Legion, 47%
Cyberpunk, 67%


Ray Tracing numbers are less flattering:

RE Village, 43%
Hitman 3, 65%
Cyberpunk, 61%
Dying Light 2, ~2x (they used High RT, not Ultra)
How are bigger numbers less flattering?
AMD also kept comparing the card to the 4080.
Of course, NVIDIA hasn't released anything cheaper than that, and even the XTX is $200 under the 4080. If it were challenging the 4090, it wouldn't cost $999.
 
AMD posted some performance comparisons... in rasterization the average increase for the 7900XTX over the 6950XT appears to be around 50%.

RE Village, 53%
Modern Warfare 2, 51%
Watch Dogs Legion, 47%
Cyberpunk, 67%


Ray Tracing numbers are less flattering:

RE Village, 43%
Hitman 3, 65%
Cyberpunk, 61%
Dying Light 2, ~2x (they used High RT, not Ultra)

AMD also kept comparing the card to the 4080.

videocardz.com: AMD compares its Radeon RX 7900 series to GeForce RTX 4080
40-60% increases, and in Dying Light 2 twice the performance? That seems really good to me over one generation.
How are bigger numbers less flattering?

Of course, NVIDIA hasn't released anything cheaper than that, and even the XTX is $200 under the 4080. If it were challenging the 4090, it wouldn't cost $999.
Yea I don't really get that

Considering the 7900XTX is $200 cheaper than a 4080, of course they will compare it to that. I bet AMD wished the 4080 12 GB card was still releasing in December. That would have been great for them, since the 7900XT would have been the same price and the 7900XTX would have been just $100 more.
 
It would, that's the whole gimmick.
Maybe a hundred bucks more?
At the point where they could challenge the 4090 with this, it would be stupid not to take extra margins, something like "4090 performance for the price of a 4080".
 
At the point where they could challenge the 4090 with this, it would be stupid not to take extra margins
People would vomit and scream much the same way they're vomiting and screaming at the 4080 being a ~400 mm² die sold for $1200.

Mining season left your average gamer vewy angry.
 
Great. Now all they would need to do is cut the GPU in half and put a CPU plus HBM on their fabric, to get a PC on a chip.
But I guess they won't until somebody like Sony / MS instructs them. <: (

Cutting the GPU into thirds for the Navi 31 die would be better: 32 of their WGPs in one die, tiled out to 1-4 dies. Supposedly similar to Battlemage.
But cutting it in half seems more likely. The geo pipeline AMD patented looks like it'll make the main chiplet a bottleneck for the rest, and Battlemage's supposed die sizes look like there's a similar bottleneck there.

The question for thirds is where you put your memory controllers and media engines as well. So maybe halves with heterogeneous dies is just better.

Just the first thing that comes to my HW-noob mind because it's small. Any other memory type is fine.

Obviously, so I take that as confirmation my gaming APU dream can still live on. :D

Maybe not HBM, but there was this interesting paper suggesting using static analysis to pick out the most heavily accessed parts of a program and shifting those memory addresses from GDDR to a small amount of HBM (not a cache, so none of the overhead).

Maybe they could bring eDRAM back? 384 MB of on-package eDRAM, with static analysis to keep frame buffers/BVH stuff there. Better bandwidth/power efficiency because you're not going off package, better IPC as you're lowering latency. It would cost more silicon though, but maybe chiplets and packaging can minimize that cost.
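Not the actual mechanism from that paper, but a minimal sketch of the idea under stated assumptions: given per-page access counts (from profiling or static analysis) and a small directly addressed fast pool (HBM or eDRAM), greedily pin the hottest pages there and leave everything else in GDDR. The page numbers, counts, and budget below are made up purely for illustration.

```python
# Toy sketch of hot-data placement into a small, directly addressed fast pool
# (HBM/eDRAM) instead of using it as a cache. All names, sizes, and access
# counts here are illustrative assumptions, not from AMD or the paper.

def place_pages(access_counts: dict[int, int], budget_pages: int) -> set[int]:
    """Pin the most heavily accessed pages into the fast pool (up to its
    capacity in pages); everything else stays in GDDR."""
    hottest = sorted(access_counts, key=lambda p: access_counts[p], reverse=True)
    return set(hottest[:budget_pages])

# Example: frame-buffer / BVH pages get hammered far more often than bulk
# assets, so with room for only three pages they land in fast memory.
counts = {0: 900_000, 1: 850_000, 2: 120, 3: 45, 4: 780_000}
print(place_pages(counts, budget_pages=3))  # -> {0, 1, 4}
```

The appeal over a cache is that placement is decided ahead of time, so there's no tag lookup or fill traffic on the hot path.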
 
People would vomit and scream much the same way they're vomiting and screaming at the 4080 being a ~400 mm² die sold for $1200.

Mining season left your average gamer vewy angry.

I don't think gamers really care about the size of the die. They care about pricing. The 3080 was $700 and then the 3080 Ti was $1200, and now the 4080 is the same price but the performance isn't really there to justify purchasing in the same class of cards. I will wait for final benchmarks, but I don't think it's worth spending $1200 for the performance over my 3080. Also, pricing is crashing since the mining boom is done, so you can get used 3080s for sub-$500 (I've even seen a few go for $350ish). It's a huge jump in cost but not a huge leap in performance.
 
They don't care if the value is great (see 3080 or 6800XT) and go ballistic over it if it is not.
It's just a justification for them to get mad, albeit a fair one at that.
I mean, you are kinda agreeing with me, but I don't think people are mad that the 4080 is a large die; they are mad that it's not offering the performance they've come to expect for such a large price. The problem is the jump from a 3080 or 3080 Ti to a 4080 doesn't seem to warrant another $1200: $500 more than the 3080 release price and the same as the 3080 Ti, which many people already felt was extremely overpriced compared to the 3080. People were more willing to accept it because the shortages and demand for mining cards artificially lifted the prices of all the cards anyway. But that is all gone now, yet Nvidia wants to charge like it's still there.
 
But that is all gone now, yet Nvidia wants to charge like it's still there.
No they're charging because it's N5 now (and they're trying to bump their ASPs by upselling into 4090s).
The problem is the jump from a 3080 or 3080 Ti to a 4080 doesn't seem to warrant another $1200
Yes, and the kids need to vent their rage somehow, so they screech "small die, high price".
 
No they're charging because it's N5 now (and they're trying to bump their ASPs by upselling into 4090s).

Yes, and the kids need to vent their rage somehow, so they screech "small die, high price".

And so they get blowback because of it. If Nvidia really wanted to increase their ASPs, they should have announced new naming schemes across the board and changed the MSRPs with the new pricing.

I am not sure what kids you are talking about. Am I one of those kids?
 
And so they get blowback because of it. If Nvidia really wanted to increase their ASPs, they should have announced new naming schemes across the board and changed the MSRPs with the new pricing.

I am not sure what kids you are talking about. Am I one of those kids?

That assumes competence on basic stuff. AMD is going to have multiple 7XXX products with almost the exact same names. Nvidia is going to charge far more than they used to for things that imply the exact same names. Literally all they had to do was type a different set of names in some marketing doc and they'd have solved these unnecessary problems, but somehow they couldn't manage it. Baffling.
 
That assumes competence on basic stuff. AMD is going to have multiple 7XXX products with almost the exact same names. Nvidia is going to charge far more than they used to for things that imply the exact same names. Literally all they had to do was type a different set of names in some marketing doc and they'd have solved these unnecessary problems, but somehow they couldn't manage it. Baffling.

I don't disagree with you, really. But in all honesty, with AMD at least the 7900XT and 7900XTX are very similar in price. Nvidia has such a huge difference in their original 40x0 pricing. You had the 4090 at $1500, and then at $900 you had the 4080 12 GB, which has, what, half the RAM and half of everything else, which was out of whack with the 30x0 series of cards. Then you had the 4080 16 GB, which falls between the two, but at $1200. It's just a huge amount of gaps, and performance falls off a cliff. I'd be surprised if the 4080 12 GB wasn't actually slower than the 3080 Ti.
 
Nobody cares about names and die sizes. The only thing that matters is the performance increase you get for the same price you would pay for an older product.

Feel like you're overestimating the average consumer. My favorite gamedev story is about the WoW beta. They wanted to slow down obsessive player levelling, so they slowed XP gain if you played too long without logging off. People complained a lot. So they just renamed it as an "XP bonus" for people who hadn't logged on in a while. Suddenly everyone was happy, despite the math not changing one bit.

I wouldn't doubt the same principle applies here; the amount of grumbling I've seen about the "4080 costing $1200!!!" is off the charts. If they'd called it a 4080 Ti, I'd bet there'd be less grumbling and more buying.
 