Bondrewd
Veteran
Not even a HotChips keynote. First MCM GPU shipping and silence like sh...
AMD are primordial shitlords uhuhuhuhuh.
I've been wondering this several pages back. And now the same leaker says there's going to be two multi die... dies?
Fuck it, I severely doubt this leaker now. There's little reason to tape out a separate die a third smaller than your big die, when you can just have a GPU with one of those modular dies instead of two. Heck, go a bit further into binning and the multi-die setup becomes the relatively low-power, low-clocked bin, while the single-die bins are the ones where you can just amp the clock up a ton. Considering salvage gets you at least another 2 separate configs, making another MCD setup with an entire other die makes severely little sense. There's no hole in your lineup to fill.
If any leak is true, it's from the reliable Red Tech gaming: two 6900s on the same package, probably with better raytracing. It doesn't require any unbelievable leap in power efficiency, doesn't require the practical implementation of SRAM to shrink dramatically between generations for good die sizes and yields, and it even tracks with the relative compute power between CDNA and RDNA. CDNA was more power efficient for compute than RDNA; having RDNA both catch all the way up and go multi-die at the same time, without CDNA making the same relative advance, doesn't ring true.
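To put rough shape on that lineup argument: a single modular graphics die, binned and optionally paired up, already spans a full stack of SKUs before any extra die enters the picture. A minimal sketch, with every CU count and clock factor an invented placeholder rather than a leaked figure:
[CODE=python]
# Toy illustration of the "no hole in your lineup to fill" argument:
# one modular graphics die, used singly or in pairs, plus binning/salvage,
# already yields a ladder of SKUs. All numbers are invented placeholders.

FULL_CU = 48      # hypothetical CUs per die when fully enabled
SALVAGE_CU = 40   # hypothetical salvage bin with defective CUs fused off

configs = [
    # (name, number of dies, CUs per die, relative clock)
    # dual-die parts get the low-power, low-clock bins; single-die parts clock higher
    ("dual die, full",      2, FULL_CU,    0.85),
    ("dual die, salvage",   2, SALVAGE_CU, 0.85),
    ("single die, full",    1, FULL_CU,    1.00),
    ("single die, salvage", 1, SALVAGE_CU, 1.00),
]

for name, dies, cus, clock in configs:
    # crude relative-throughput proxy: dies x CUs x clock factor
    print(f"{name:20s} -> {dies * cus * clock:6.1f} relative throughput units")
[/CODE]
Four tiers out of one tapeout is the crux of the "why tape out a second, smaller die" objection.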
"I've been wondering this several pages back."
Apparently it's to be "faster", not cost less.
So we all should prepare for some records, if only in pricing I feel.
"...not for bragging rights."
Oh man.
"...this doesn't make business sense..."
dGPUs don't make business sense from an AMD POV at all.
"And now the same leaker says there's going to be two multi die... dies?"
Yes.
"There's little reason to tape out a separate die a third smaller than your big die, when you can just have a GPU with one of those modular dies instead of two."
You really, really need to understand what the fuck they are doing with the packaging there.
"...making another MCD setup with an entire other die makes severely little sense."
MCDs are not what you think they are.
"Two 6900s on the same package, probably with better raytracing."
It's an entirely different uArch that bears not much resemblance to RDNA1/2.
"...without CDNA making the same relative advance, doesn't ring true."
Chief, MI100 to MI200 is 15.5-ish to >42 TF DPFP per 500 W OAM.
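Taking those figures at face value (15.5-ish TF and >42 TF of FP64 per 500 W OAM; both numbers come from the post, not from spec sheets), the implied per-socket jump is roughly 2.7x. A quick back-of-envelope check:
[CODE=python]
# Back-of-envelope check of the quoted CDNA jump: FP64 throughput per 500 W OAM.
# Both TFLOPS figures are taken from the post above, not from official spec sheets.
MI100_TF_FP64 = 15.5   # "15.5-ish" TF per 500 W OAM, as quoted
MI200_TF_FP64 = 42.0   # ">42" TF per 500 W OAM, as quoted (lower bound)
OAM_POWER_W = 500.0

print(f"MI100: {MI100_TF_FP64 * 1000 / OAM_POWER_W:.0f} GFLOPS/W FP64")
print(f"MI200: {MI200_TF_FP64 * 1000 / OAM_POWER_W:.0f} GFLOPS/W FP64")
print(f"Relative advance: ~{MI200_TF_FP64 / MI100_TF_FP64:.1f}x per socket in one generation")
[/CODE]
By these numbers CDNA's per-socket FP64 nearly tripled in one generation, which is the counterpoint to the "without CDNA making the same relative advance" line.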
"'Yes' that makes no sense."
Of course it does.
"Like, I went through and laid out the entire reason the business case is silly..."
What even.
"What do you 'think' they are?"
Read the goddamn thread for once.
"...AMD already having homogenous dies for CPUs..."
Not anymore!
"...that making the most business sense..."
N O T A N Y M O R E.
"...why there isn't a further salvage die..."
Of course there will be chopped GCD configs for N31 and 32 both.
"More realistic 30-50%?"
Yea, but your VRAM gets cut.
Less than the 16/8 they already have in that range?
But the 450 buck range is 12GB now.
That's right, I was misremembering NV GPUs as the only ones with 12. But ya, 8GB in a 2022 $400-500 GPU is not good.
Gets down to 8 next year unless somehow lucky and JEDEC updates G6 spec and all.
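For reference on the capacity numbers in this exchange: GDDR6 devices have a 32-bit interface, so board capacity is simply the bus width divided by 32, times the per-device density, with 8Gb and 16Gb being the densities shipping today. A minimal sketch (the 24Gb row is the hypothetical "JEDEC updates the G6 spec" case, not an announced part):
[CODE=python]
# VRAM capacity = (bus width / 32 bits per GDDR6 device) * per-device density.
# 8Gb and 16Gb densities ship today; 24Gb is the hypothetical "JEDEC updates
# the G6 spec" case from the post above, not an announced part.
def vram_gb(bus_width_bits: int, density_gbit: int) -> int:
    devices = bus_width_bits // 32       # one GDDR6 package per 32 bits of bus
    return devices * density_gbit // 8   # gigabits -> gigabytes

for bus in (128, 192, 256):
    for density in (8, 16, 24):
        print(f"{bus:3d}-bit bus, {density:2d}Gb devices -> {vram_gb(bus, density):2d} GB")
[/CODE]
That is presumably where the 16GB/8GB and 12GB configurations being discussed come from, and why a narrow-bus 2022 part stays at 8GB unless higher densities show up.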
Oh man.
AMD runs an entirely redundant client HEDT lineup for that very purpose.
"I would also count the initial Threadripper in the 'Let's do this, it's fun' territory. And honestly, I think it is nice to see (sometimes unchecked) engineering spirit at work ... not the bean counters or propaganda ministers."
There was not much engineering to be done on those, though, to thoroughly kick Intel's behind.
Original Threadripper was supposedly an engineers' pet project they did in their free time.
"But ya, 8GB in a 2022 $400-500 GPU is not good."
$400 for a 25 TF GPU? If this happens, my worries about an expensive, high-end-only future would not be justified. Looks like a very good offer, even if 8GB requires toning down settings.
"Looks like a very good offer, even if 8GB requires toning down settings."
It can be 12GB too; we've qualified 24Gb DDR5/LP5 parts.
"I hope for future RDNA2 / Ampere refreshes to solve this..."
Not happening really.
"Original Threadripper was supposedly an engineers' pet project they did in their free time."
That really lets your enthusiast heart relate and bond, doesn't it?
"$400 for a 25 TF GPU? ... Looks like a very good offer, even if 8GB requires toning down settings."
For gamers, 8GB on a new card would not be desirable for 2022 and onward, given consoles have larger memories and it tends to get utilized there to good effect. Unless of course someone's aiming at the high-fps e-sports crowd exclusively. Which might not be the worst idea for brand-building.
But I assume it will be much more expensive in the end.
I hope for future RDNA2 / Ampere refreshes to solve this...