Bondrewd:
> Over 175 W depending on manufacturer.
That's GA103 only so far, iirc.

> That's GA103 only so far, iirc.
Most GA104/N22 variants run at ~165 W tops.
Model | Gaming Frequency | TBP | Specification
RX 6700 XT | 2424 MHz | 230 W | 40 CU; 192-bit 16 Gbps
RX 6800M | 2300 MHz | 145+ W | 40 CU; 192-bit 16 Gbps
> Comparing a laptop to a desktop SKU is not very accurate. For example, the RX 6800M has ~45-50% better perf/W compared to the desktop model.
They're completely identical in their power curves across the whole range. The desktop card uses 240 W at 2639 MHz and 150 W at 2243 MHz.
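To illustrate the reply above: the same desktop silicon run at the two quoted operating points already shows a large perf/W gap. This sketch uses clock as a crude linear proxy for performance (an assumption; real scaling is sublinear), only to show that much of the gap comes from the operating point on the power curve, not from better mobile silicon.

```python
# Two points on the same desktop card's power curve (numbers from the post).
high = {"watts": 240, "mhz": 2639}   # stock operating point
low = {"watts": 150, "mhz": 2243}    # power-limited operating point

# Clock-per-watt as a rough perf/W proxy (assumption: perf ~ clock).
eff_high = high["mhz"] / high["watts"]   # ~11.0 MHz/W
eff_low = low["mhz"] / low["watts"]      # ~15.0 MHz/W

gain = eff_low / eff_high - 1
print(f"{gain:.0%}")   # ~36% better perf/W just by moving down the curve
```

So a sizable chunk of the claimed ~45-50% mobile perf/W advantage can be explained by the lower operating point alone.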
> I have to wonder how high this N32 is actually clocked, because 80% higher performance at ISO power is not that much if I think about N32's specs.
It depends on a lot of factors. If the performance increase was measured at 1080p, for example (as gaming laptops mostly use that resolution), you can be more CPU-limited than at a higher resolution. E.g., at 1080p the 4090 is not that much faster than a 6950 XT, while at 4K it is much, much faster.
N32 has 50% more WGPs than N22 (30 vs 20), and each WGP should be significantly better.

Expected performance at ISO power (~150 W):
RX 6800M: 100%
N32 mobile: +80%

N32 has +50% WGPs, and if we say an RDNA 3 WGP is 50% better than an RDNA 2 WGP, then ideally that's 100 * 1.5 * 1.5 = 225%, or +125% higher performance. But it's actually only +80%, so I have to lower the clockspeed by 20%, from 2300 MHz to 1840 MHz, to arrive at the expected performance.

This leak from Greymon55 could actually be a cut-down version of N32, because I don't expect a mobile N32 to be downclocked from >3 GHz to <2 GHz just to stay within 145-165 W, when N22 was downclocked by only a few percent.
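The arithmetic in the post above can be written out explicitly. Note that all the inputs are the poster's assumptions from leaks (WGP counts, a +50% per-WGP uplift, linear clock scaling), not confirmed specs:

```python
# Back-of-envelope estimate from the post above; inputs are leaked/assumed.
baseline = 100.0              # RX 6800M at ~150 W, indexed to 100%
wgp_ratio = 30 / 20           # N32 vs N22 WGP count (+50%)
per_wgp_uplift = 1.5          # assumed RDNA 3 per-WGP gain (+50%)

ideal = baseline * wgp_ratio * per_wgp_uplift
print(ideal)                  # 225.0 -> +125% in the ideal case

observed = 180.0              # leaked +80% at ISO power
clock_factor = observed / ideal               # 0.8, i.e. -20% clock
print(round(clock_factor * 2300))             # 1840 -> implied MHz
```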
> It depends on a lot of factors. If the performance increase was measured at 1080p, for example (as gaming laptops mostly use that resolution), you can be more CPU-limited than at a higher resolution. E.g., at 1080p the 4090 is not that much faster than a 6950 XT, while at 4K it is much, much faster.
It's true we don't know at what resolution it has 6950 XT-level performance, but N32 has 256-bit GDDR6 + 64 MB IC, so even at higher resolutions it won't be bottlenecked; I don't think the resolution matters.
> It's true we don't know at what resolution it has 6950 XT-level performance, but N32 has 256-bit GDDR6 + 64 MB IC, so even at higher resolutions it won't be bottlenecked; I don't think the resolution matters. If that performance was for N33 (but that was already denied), then it would only apply at 1080p, because N33 has only 128-bit GDDR6 + 32 MB IC.
Are you just assuming some of the leaks are correct, or do you actually know the specs for fact like you claim?
> Are you just assuming some of the leaks are correct, or do you actually know the specs for fact like you claim?
I don't have any insider info; I am basing it on Angstronomics, and no one has disputed their leaked RDNA 3 specs as far as I know.
> I don't have any insider info; I am basing it on Angstronomics, and no one has disputed their leaked RDNA 3 specs as far as I know.
No, but considering how "reputable leakers" have been all over the place as usual, I'm not giving credit to a new name before they've earned it.
Do you have any info that their specs are incorrect?
If it has 2x better raster performance but also only 2x better RT performance, that would mean RT improved only in line with the raster scaling, i.e. per-unit RT performance is the same as with RDNA 2 and no dedicated RT improvement was made.
> Slight update to guesses, assuming more recent leaks are true:
I'm betting instead for:
> Just not very keen on this idea that they might be 'gimping' their normal flagship part in order to charge an extra premium for a V-Cache variant.
By this school of logic, is AMD somehow "gimping" all their Zen 3 CPU SKUs so that the 5800X3D can exist?
Can we presume that Navi 31 uses a heatspreader, like those seen on Ryzen? A heatspreader would be needed with chiplets, both to protect them from mechanical damage and to provide a known thermal interface.
> By this school of logic, is AMD somehow "gimping" all their Zen 3 CPU SKUs so that the 5800X3D can exist?
The V-Cache chip for the 5800X3D represents a very significant percentage increase in silicon usage.