AMD: Navi Speculation, Rumours and Discussion [2019-2020]

It’s funny though how leaky the partners are. No wonder they are completely left in the dark.

Yeah, complete silence all this time, and as soon as AIBs come into the picture the flood starts. There's really no way to stop that except for what AMD is doing: launch with reference cards first and bring in AIBs as late as possible.
 
Honestly, who cares about leaks? It's not like they affect anything. We don't have the entire picture yet, and none of it is confirmed, though it's likely close to the truth at this point.
 
Yes, exactly this. It's close to a 3080; if they did this with a lot of cache, then fine.
 
Leaks are juicy gossip for us but are absolutely awful for vendors. Months of careful competitive positioning can be disrupted because some asshole decided to win some internet points by spilling technical information.

Don’t get me wrong, leaks about poor working conditions or toxic cultures are important. But technical leakers are scum.

And yet, here I am jumping into and enjoying all this technical gossip based on leaks. Hypocrisy 101 I suppose. :(
 
Well, I get it. The timing of the leaks matters. A couple of weeks out from a launch event, it doesn't really make much difference, because the hardware is pretty much solidified at that point. Pricing is the big thing: if Nvidia had known AMD's pricing when they launched the 3080 and 3090, it might have made them tweak things a bit. Who knows.

Overall, at this point I'm not seeing anything leaking that really looks shocking. It all looks in line with the 50% performance-per-watt improvement AMD had articulated, and the doubling of the CUs was the most logical assumption. On top of that, we got a ton of it from the Series X and PS5 presentations.
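Quick back-of-the-envelope on what that perf/W claim implies; the power figures here are my own assumptions, not anything leaked:

```python
# Sketch of the +50% perf/W claim with assumed (not leaked) power figures.
base_power = 225.0    # W, assumed RX 5700 XT-class baseline board power
base_perf = 1.0       # normalized baseline performance
perf_per_watt = 1.5   # AMD's stated +50% perf/W improvement

navi21_power = 300.0  # W, assumed Navi 21 board power
navi21_perf = base_perf * perf_per_watt * (navi21_power / base_power)
print(f"Implied performance: {navi21_perf:.2f}x the baseline")  # ~2.0x
```

Which, under those assumptions, lands right where doubling the CUs would put it.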

In terms of the upcoming event, I really don't see it being spoiled.
 
For Navi 21 speculation, that is missing the point, though. What is interesting is that we now know the 6800 XT scores a minimum of 10k+ points in Fire Strike Ultra (FSU), which means the 6900 XT is probably ~10-15% faster, so approximately 11-12.5k FSU points. Purely conjecture, though.
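As a rough sketch of that math (the 10k floor is the leaked bit; the uplift range is conjecture as above):

```python
# Projecting a 6900 XT Fire Strike Ultra score from the leaked 6800 XT floor.
xt_floor = 10_000                    # leaked lower bound for the 6800 XT
uplift_lo, uplift_hi = 1.10, 1.15    # assumed 6900 XT advantage

print(f"From the 10k floor: {xt_floor * uplift_lo:.0f} - {xt_floor * uplift_hi:.0f}")
# The 12.5k upper end implies an actual XT score closer to ~10.9k:
print(f"XT score implied by a 12.5k top card: {12_500 / uplift_hi:.0f}")
```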

There is a comment in the thread which is interesting:

Higher bins usually end up 20-25% faster, thus putting it ahead of a 3090 in certain cases. This tracks with the 6080 being a 72 CU GPU; just lower the clock speed about 10% as well and you've got your two cards.

I did want to guess $600 for the 6080, but surely AMD would want to up their own profit margin, right? Well, maybe not.

Either way, all these rumors would explain why the 3090 is called that and not a Titan, and why Nvidia launched as early as possible with very little stock: gotta bilk the fanboys for as much as possible before the competition comes in. If so, I hope Intel gets its shit together within the next year for their launch, because heavy competition is most welcome.
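The binning math quoted above checks out if you treat throughput as roughly CUs × clock (a crude assumption that ignores memory bottlenecks, and the 80 CU full die is itself a rumor):

```python
# Crude throughput model: performance ~ CU count x clock speed.
top_cus, top_clock = 80, 1.00   # assumed full-die bin, normalized clock
cut_cus, cut_clock = 72, 0.90   # 72 CUs at ~10% lower clock, per the quote

cut_relative = (cut_cus / top_cus) * cut_clock   # ~0.81 of the top bin
top_advantage = 1 / cut_relative - 1             # ~23.5%
print(f"Top bin ahead by ~{top_advantage:.0%}")  # inside the quoted 20-25% window
```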
 
Two 8-pin connectors, a humongous number of VRM phases, and a mediocre 256-bit bus to the VRAM, with the best-case scenario using 16 Gbps chips.

It fits all the rumors being told about Navi 21.
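For what it's worth, those rumored specs work out as follows (standard formulas, rumored inputs):

```python
# Peak bandwidth for a 256-bit bus with 16 Gbps GDDR6 (per-pin data rate).
bus_width_bits = 256
pin_rate_gbps = 16
bandwidth_gbs = bus_width_bits * pin_rate_gbps / 8
print(f"Peak memory bandwidth: {bandwidth_gbs:.0f} GB/s")  # 512 GB/s

# Power ceiling per PCIe spec: two 8-pin connectors plus the slot.
power_ceiling_w = 2 * 150 + 75
print(f"Spec board power ceiling: {power_ceiling_w} W")    # 375 W
```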

Also looks an awful lot like a Sapphire PCB to me, from the colour, shape, BIOS switch, and component markings.
Interesting to see that the upper-right choke and capacitor have had some manual rework, judging from all the solder flux up there - and given the look of that upper-right choke in its power stage, I'm not entirely certain it didn't let the smoke out altogether.

The BIOS switch and the fan connectors are a dead ringer for what Sapphire likes to use on their high-end cards - compare to:

https://www.techpowerup.com/review/sapphire-radeon-rx-5700-xt-nitro-special-edition/3.html

EDIT: I completely missed the faint 'Sapphire' hologram on the HDMI connector in the top left, so that confirms that it's one of theirs.
 
Looks like a 14-phase vcore + 2-phase vmem power delivery setup. There are a lot of oddly darkened solder balls underneath the Kapton tape, mostly where the GDDR6 power delivery pins are.
I suspect the memory VRM blew up and took the GDDR6 with it on that particular card. That might be how some enterprising individual managed to snag it and take pictures.
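Rough per-phase load for a setup like that - the core power and voltage here are pure assumptions on my part:

```python
# Rough per-phase current for a 14-phase vcore VRM; inputs are assumptions.
core_power_w = 250.0  # assumed GPU core power draw
vcore_v = 1.0         # assumed core voltage under load
phases = 14

amps_per_phase = core_power_w / vcore_v / phases
print(f"~{amps_per_phase:.0f} A per phase")  # ~18 A, easy for typical power stages
```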

The larger photo is also cropped very conveniently and precisely over by the fan connectors on the bottom right, obscuring the Sapphire part number that normally goes there, as you can see in my earlier link to the 5700 XT PCB. Interesting that they managed to obscure that perfectly but left the Sapphire logo on the HDMI port visible.
 
On pricing: AMD did spend two generations of Ryzen priced low versus Intel. They might do that here just to gain market share.
 
We would need to consider that, for the situation to be similar, the following would also need to hold:

Ryzen's aggressive pricing was in large part a necessity, owing to deficiencies against Intel's offerings that mattered to the market segments in question; it wasn't purely to offset a mind-share gap (in general I think people overstate whatever mind-share deficit AMD has, if it even has one, particularly if we break it down to specific demographics, but that's another matter). I'm not sure what plausible analogue to this could occur in the graphics market.

Ryzen also had a significant cost advantage over Intel, from both design and manufacturing. Intel had a supply problem of its own and essentially had to cede market share because its priorities were higher elsewhere: it chose to preserve the enterprise/OEM segments, which is why it's still a slow slog on those fronts for AMD even though DIY retail has been completely upended.

Just as commentary on AMD pricing in general: it needs to be kept in mind that, historically, the Terascale vs. Tesla situation was the only time the market actually got upended outside of a generational transition. Given that it happened once, there is the possibility of it happening again, but I don't understand why there seems to be a sentiment of it happening with every new release when it never has outside of that one case. Even with Terascale, it's important to note that AMD did cede absolute performance to Nvidia against both Tesla and Fermi. And the last time AMD had a pretty universal advantage and lead time, with GCN, it also delivered pretty poor generational perf/$ improvements.

This is also veering off topic, as it's somewhat Nvidia-centered, but for whatever reason, dating back to Kepler, Nvidia's top xx104 GPU has had value/$ issues even within its own stack, and I don't think the RTX 3070 changes that. The generational comparison against the RTX 2080 Ti sounds great mostly because the latter was very poor in that respect. Typically you expect significant value/$ improvements going down the stack and regressions going up, but the top xx104 has been an outlier for a while now. So beating the RTX 3070 in that respect isn't a very high bar to clear.
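To illustrate the usual stack pattern with completely made-up numbers (not real benchmarks or prices):

```python
# Hypothetical stack showing the usual perf/$ pattern: value improves downward.
cards = {
    "xx102 flagship": (1.00, 1500),  # (normalized perf, price in $) - invented
    "xx104 part":     (0.75, 700),
    "xx106 part":     (0.55, 400),
}
for name, (perf, price) in cards.items():
    print(f"{name}: {1000 * perf / price:.2f} perf per $1000")
# The odd thing about the top xx104 historically is that it hasn't improved
# on the flagship's perf/$ by as much as its position in the stack suggests.
```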
 
There are also other scores going around, even higher than that (and for the whole 3DMark suite, with the 3080 winning some and losing some) for the "XT", but it is unknown whether a) they are for a reference or AIB card, and b) AMD really does have this very secret "XTX" board going on.

Edit: written at the same time as the comment above, lol
 