AMD: Navi Speculation, Rumours and Discussion [2019-2020]

If you:

1. schedule a "big date";
2. vaguely "sneak peeked" your value prop/positioning 2 weeks ahead; and
3. end up having zero punch (well, other than the boring "it's cheaper" angle) on the actual event day;

This is abysmal marketing and a textbook failed cliffhanger. Not saying this couldn't happen (cough Vega cough), and we should keep our hopes low, but I thought RTG had moved on from this crap after R... Vega.

:runaway:
 
There is nothing NV is able to do to "react" at this point; they've already played all of their cards between $500 and $1500 for this gen. There is zero reason to keep anything under wraps because of NV at this point. NV's next high-end update will happen no sooner than a year from now. (Well, there will be 16 and 20 GB cards, but I hardly consider them an "update" even, more like a way to kill your perf/price to make some people happy.)
Why else would they have moved the 3070 launch to just one day after the event on the 28th? Never underestimate nVidia. They are extremely adaptable. I think AMD learned that lesson, and even if they think nVidia can't do anything, they will not risk it. Even a smear campaign is not beyond nVidia.
 
Why else would they have moved the 3070 launch to just one day after the event on the 28th?
To build up some stock before launch. Considering that they are already shipping these to stores (which is how you build up stock), there's basically nothing they can do with these cards between now and the 29th.
 
1. schedule a "big date";
2. vaguely "sneak peeked" your value prop/positioning 2 weeks ahead; and
3. end up having zero punch (well, other than the boring "it's cheaper" angle) on the actual event day;
There's more to a GPU lineup than the top GPU tho. Lots of people will be interested in other angles, be it price or w/e else they show. Also, are you telling AMD not to have an event if they don't have the GPU crown? Like, seriously?
 
There's more to a GPU lineup than the top GPU tho. Lots of people will be interested in other angles, be it price or w/e else they show. Also, are you telling AMD not to have an event if they don't have the GPU crown? Like, seriously?
I didn't say "not to have an event", mate. I am saying there is no point in building suspense if there isn't a punch.

Launches for products that aren't the market flagship have always been simpler and more direct, usually without drawn-out marketing sequences like this. IMO you only do this when you have a strong punch to deliver, for a top-down halo effect.

Though on the flip side, you could also say they are doing a sneak peek simply for expectation management, slowing down the hype trains.
 
While the comparisons between those benchmark numbers and the 3080 are valid and interesting to speculate with, it should be remembered that AMD claimed to want to disrupt the 4K gaming market with RDNA2, much like Zen/+/2 did with high-core-count CPUs.

So really the purpose of the benchmarks was to show that smooth 60 fps 4K gameplay has been achieved. The next element of disruption will be enabling as many people as possible to obtain it through pricing. So based on that, either this is the top die and it is priced very well, or it is a lower-stack die that can achieve 4K 60+ fps and is also priced well.

Nvidia may have priced the 3080 at $699 to try to paint AMD into a corner if they believed the top die would perform equal to or worse than a 3080 yet would cost substantially more to make at a rumored ~500mm² on TSMC 7nm. But if the rumored memory configurations and clock speeds are accurate at all, perhaps AMD was able to achieve this level of performance at a much lower cost.
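
For a very rough sense of how die size feeds into cost, here's a back-of-the-envelope sketch. The wafer price and defect density in it are made-up placeholder values, not known TSMC figures, so only the shape of the maths matters:

```python
import math

# Back-of-the-envelope die cost sketch. Wafer price and defect density
# below are purely illustrative assumptions, not known TSMC figures.
WAFER_DIAMETER_MM = 300.0
WAFER_PRICE_USD = 9000.0      # hypothetical 7nm wafer price
DEFECT_DENSITY_CM2 = 0.1      # hypothetical defects per cm^2
DIE_AREA_MM2 = 500.0          # the rumoured "big Navi" die size

def dies_per_wafer(area_mm2, diameter_mm):
    """Classic approximation: gross wafer area over die area, minus edge loss."""
    radius = diameter_mm / 2.0
    return int(math.pi * radius**2 / area_mm2
               - math.pi * diameter_mm / math.sqrt(2.0 * area_mm2))

def poisson_yield(area_mm2, defects_per_cm2):
    """Simplified Poisson yield model."""
    return math.exp(-(area_mm2 / 100.0) * defects_per_cm2)

candidates = dies_per_wafer(DIE_AREA_MM2, WAFER_DIAMETER_MM)
good = candidates * poisson_yield(DIE_AREA_MM2, DEFECT_DENSITY_CM2)
print(f"{candidates} candidate dies/wafer, ~{good:.0f} good, "
      f"~${WAFER_PRICE_USD / good:.0f} per good die")
```

With those (invented) inputs a ~500mm² die works out to roughly $130-140 per good die, which is why the relative wafer pricing between N7 and 8N matters so much to this argument.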
 
There is nothing NV is able to do to "react" at this point; they've already played all of their cards between $500 and $1500 for this gen. There is zero reason to keep anything under wraps because of NV at this point. NV's next high-end update will happen no sooner than a year from now. (Well, there will be 16 and 20 GB cards, but I hardly consider them an "update" even, more like a way to kill your perf/price to make some people happy.)
"Super" cards were quite an impressive reaction to 5700 series, I'm sure AMD is expecting NVidia to do the same again. Given that 3080 10GB is configured with an entire GPC turned off as well as two unused memory channels, I'd say NVidia has lots of "performance and configuration" room to play with by the end of November.

A differential of 10% between two competing cards is meaningless (especially with variable refresh rate tech). Sadly graphics card reviews make a mountain out of a 10% molehill. So it's entirely rational for AMD to keep NVidia guessing, because NVidia's going to have to decide how much of 3090's performance it's going to give gamers for a "Super" spoiler for RDNA2. All while keeping the price below $1000.
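
To put that 10% in perspective, here's a trivial frame-time calculation; the 60 fps baseline is just an arbitrary example:

```python
# Purely illustrative: what a 10% performance gap means in frame times,
# using a 60 fps baseline as an arbitrary example.
base_fps = 60.0
fast_fps = base_fps * 1.10

base_ms = 1000.0 / base_fps   # ~16.7 ms per frame
fast_ms = 1000.0 / fast_fps   # ~15.2 ms per frame
print(f"{base_fps:.0f} fps -> {base_ms:.1f} ms/frame")
print(f"{fast_fps:.0f} fps -> {fast_ms:.1f} ms/frame "
      f"({base_ms - fast_ms:.1f} ms saved per frame)")
```

About 1.5 ms per frame, which variable refresh rate tech absorbs without you noticing.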

With RDNA2 being at the very least similar in performance to 3080 and being very likely to have lower power consumption, I'm concerned that RDNA2 is going to be spectacular for ETH mining and tens of thousands of cards will disappear into mines for the next six months.
 
At the very least, AMD will deliver their greatest generational improvement in the history of the company. The Nov 5 availability date for Zen 3 was a little disappointing; I hope RDNA2 is available much sooner than that after its announcement.
 
"Super" cards were quite an impressive reaction to 5700 series, I'm sure AMD is expecting NVidia to do the same again. Given that 3080 10MB is configured with an entire GPC turned off as well as two unused memory channels, I'd say NVidia has lots of "performance and configuration" room to play with by the end of November.

A differential of 10% between two competing cards is meaningless (especially with variable refresh rate tech). Sadly graphics card reviews make a mountain out of a 10% molehill. So it's entirely rational for AMD to keep NVidia guessing, because NVidia's going to have to decide how much of 3090's performance it's going to give gamers for a "Super" spoiler for RDNA2. All while keeping the price below $1000.

With RDNA2 being at the very least similar in performance to 3080 and being very likely to have lower power consumption, I'm concerned that RDNA2 is going to be spectacular for ETH mining and tens of thousands of cards will disappear into mines for the next six months.

I would assume Nvidia already knew what performance Navi 2 would offer when they launched their 3000 series. It seems incredibly unlikely that Nvidia is going to release refreshes a mere month or 2 later.
 
So it's just a coincidence? Yeah right... I don't buy it. At the very least, nVidia wouldn't have wanted to give AMD two weeks to adapt their presentation in their favor.

Don't buy it then; who cares? AMD has had their presentation locked for probably weeks by now, if not months. Where do you think many leakers get their info from? Why do you think the slides have an NDA expiration date in the first place? You think the marketing team can make them, get them approved by higher-ups, and distribute them all around the world in two weeks? Come on...
 
At the very least, nVidia wouldn't have wanted to give AMD two weeks to adapt their presentation in their favor.
See, that would make sense if the 3070 were a mystery. But it's not. Specs, performance tier and price are known. There's nothing left to adapt. I also don't buy this narrative that Nvidia is leaving room for price-adjustment within 1 day. Historically have they ever done something this reactionary? (It's a genuine question, I don't know the answer.)
 
See, that would make sense if the 3070 were a mystery. But it's not. Specs, performance tier and price are known. There's nothing left to adapt. I also don't buy this narrative that Nvidia is leaving room for price-adjustment within 1 day. Historically have they ever done something this reactionary? (It's a genuine question, I don't know the answer.)

The only thing left to adapt is price, and they can lower it all they want. They haven't sold a single one; they can just slash the price by a hundred and nothing is there to stop them. They probably have a good idea of what the RDNA2 performance tiers are, but AMD can set its prices to whatever it wants over the next few weeks as well, and there's no reason Nvidia can't adapt to them the day after. It's what AMD did last year.

Shockingly, width scaling never leads to perfectly linear performance gains. That's only been the case every single time from the VSA-100 until now...

Except this is a last-gen, 60 fps title running at native 4K. If anything should be width-friendly, it should be that. Thus either this is somehow their mid-high-end bin, the 12GB one, priced probably at $600-650 MSRP (and probably sold at $700 this year given shortages), or just as likely their highest end simply doesn't look like the supposed leaks; those are my best guesses anyway.

In fact, going through it, Modern Warfare is just 50% faster and Gears just 65% faster. It seems highly unlikely that this is an 80 CU part at all, let alone one hitting 2.2 GHz.
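
For a rough sanity check on that, here's the naive width-times-clock scaling from a 40 CU 5700 XT (assuming a typical ~1.9 GHz boost) to the rumoured 80 CU / 2.2 GHz configuration, set against those 50%/65% numbers (taking them as gains over a 5700 XT); real games never scale this linearly, of course:

```python
# Naive width * clock scaling vs the quoted gains -- illustrative only,
# taking the 50%/65% figures as gains over a 5700 XT.
base_cu, base_clock_ghz = 40, 1.9     # 5700 XT, assuming ~1.9 GHz typical boost
big_cu, big_clock_ghz = 80, 2.2       # rumoured "big Navi" configuration

naive = (big_cu / base_cu) * (big_clock_ghz / base_clock_ghz)
print(f"Naive throughput scaling: {naive:.2f}x (+{(naive - 1) * 100:.0f}%)")

for game, gain in [("Modern Warfare", 0.50), ("Gears 5", 0.65)]:
    print(f"{game}: +{gain * 100:.0f}% observed, "
          f"~{(1 + gain) / naive * 100:.0f}% of the naive figure")
```

That's only 65-70% of the naive ~2.3x, which is poor even for imperfect width scaling, hence my doubts about the 80 CU / 2.2 GHz reading.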
 
See, that would make sense if the 3070 were a mystery. But it's not. Specs, performance tier and price are known. There's nothing left to adapt. I also don't buy this narrative that Nvidia is leaving room for price-adjustment within 1 day. Historically have they ever done something this reactionary? (It's a genuine question, I don't know the answer.)

When AMD's (still branded as ATI at the time) 4xxx series launched shortly after Nvidia's 2xx series, it forced a very significant price cut on the GTX 260 and 280 just weeks after the 2xx series launch, to the point that Nvidia/AIBs even offered rebates to existing buyers. To me this is the only time historically that AMD (or ATI) has really disrupted Nvidia's pricing.

After that it's debatable with AMD's Hawaii launch and R9 2xx series refresh. This did cause a significant price cut to Nvidia's GTX 770 and 780 (Nvidia also announced the 780 Ti in reaction), but it took place roughly six months after Nvidia launched those.

Outside of those two you're really just talking about standard generational pricing changes, as the time periods are in the one-year-or-more range, and it also becomes debatable whether the changes were due to AMD's product stack or just Nvidia updating its own stack through generational progression.

Which is why (and we know how controversial these types of comments end up being, but I'll risk it) I personally don't really understand the viewpoint some have on AMD's pricing "friendliness" (so to speak) with respect to Nvidia (or Intel, as we see today with Zen 3). There's a somewhat popular belief, particularly among certain demographics, that AMD is very proactively aggressive on pricing. I feel that is more a result of successful marketing messaging: in practice the aggression comes from reactionary necessity (whether against Nvidia or Intel), and it isn't even as aggressive as it might seem.

This is getting a bit long-winded, but to follow up on that last point: in general I think people often overstate how much the two sides have historically differed in perf/$ from a practical standpoint. The truth is that, historically, perf/$ roughly settles to parity between the two sides across the entire product stack (especially in the bulk middle). One side having a clear advantage that obliterates the other in this respect is the minority occurrence, unless we overstate things and start treating <10% differentials as some dramatic difference in practice.
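
As a toy illustration of how small these stack-wide perf/$ gaps typically end up being, here's a made-up comparison; the names, performance indices, and prices are invented purely for the example:

```python
# Toy perf/$ comparison -- every number here is invented for illustration.
cards = {
    "Vendor A, x70-class":  {"perf_index": 100, "price_usd": 500},
    "Vendor B, x700-class": {"perf_index": 95,  "price_usd": 450},
}

for name, card in cards.items():
    perf_per_100usd = card["perf_index"] / card["price_usd"] * 100
    print(f"{name}: {perf_per_100usd:.1f} perf per $100")
```

The nominally "cheaper" card ends up only about 5% ahead on the metric once its lower performance is counted, which is exactly the kind of differential I mean.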
 
And it had much worse quality than DLSS 2.0, as a result of being little more than the usual TAA supersampler/upscaler.

It doesn't matter, unless you are saying that Nvidia lied by calling it DLSS. You asked for examples where DLSS was made without tensor cores. Here is one that DF considered decent enough. So it is possible, end of story.
 
"Super" cards were quite an impressive reaction to 5700 series, I'm sure AMD is expecting NVidia to do the same again. Given that 3080 10MB is configured with an entire GPC turned off as well as two unused memory channels, I'd say NVidia has lots of "performance and configuration" room to play with by the end of November.
Super cards came a year after the Turing launch. They had the luxury of building up their margins on the initial products, which allowed them to do a price cut essentially in the form of the Super refresh. There is no such luxury with the 30 series right now. And the 3090 shows you what they can do by unlocking anything on the 3080 - it's +10% at best due to power constraints. With the 3070 already shipping, the only thing they can change relatively easily on it is its MSRP, but again - it is low enough already and I have doubts about AMD being willing to go even lower with Navi 2 here. From everything we know about N7 and 8N it's NV who is likely to dictate prices again due to lower production costs.
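
The rough arithmetic behind that +10%: going by the public launch specs, the 3090 has about 21% more SMs and about 23% more memory bandwidth than the 3080 10GB, yet the real-world gap sits around the 10% mentioned above, largely because both cards are already power-limited. A quick restatement of those numbers:

```python
# Public launch specs for the two GA102 cards, restated as percentage deltas.
specs = {
    "RTX 3080 10GB": {"sms": 68, "bandwidth_gbs": 760, "tdp_w": 320},
    "RTX 3090":      {"sms": 82, "bandwidth_gbs": 936, "tdp_w": 350},
}

base, big = specs["RTX 3080 10GB"], specs["RTX 3090"]
for key, label in [("sms", "SMs"), ("bandwidth_gbs", "Bandwidth"), ("tdp_w", "Power limit")]:
    delta = (big[key] / base[key] - 1) * 100
    print(f"{label}: +{delta:.0f}%")
```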

In fact, going through it, Modern Warfare is just 50% faster and Gears just 65% faster. It seems highly unlikely that this is an 80 CU part at all, let alone one hitting 2.2 GHz.
You've answered yourself there I think.

It doesn't matter, unless you are saying that Nvidia lied by calling it DLSS. You asked for examples where DLSS was made without tensor cores. Here is one that DF considered decent enough. So it is possible, end of story.
Of course. There was no DL in DLSS 1.9.
 
Super cards came a year after the Turing launch. They had the luxury of building up their margins on the initial products, which allowed them to do a price cut essentially in the form of the Super refresh. There is no such luxury with the 30 series right now. And the 3090 shows you what they can do by unlocking anything on the 3080 - it's +10% at best due to power constraints. With the 3070 already shipping, the only thing they can change relatively easily on it is its MSRP, but again - it is low enough already and I have doubts about AMD being willing to go even lower with Navi 2 here. From everything we know about N7 and 8N it's NV who is likely to dictate prices again due to lower production costs.


You've answered yourself there I think.


Of course. There was no DL in DLSS 1.9.

Yeah there was! There was still AI training.... I have no idea why you are being so obtuse with this. AI training and inferencing existed long before tensor cores were brought about.
 
Do we have a list of DXR games?

The other silly season reason for AMD being vague with the Big Navi numbers is that it was running RT...
 
Yeah there was! There was still AI training....
There wasn't. They used "hints" gathered from AI reconstruction running offline and used them to make a purely shader-based temporal SS solution in DLSS 1.9.

I have no idea why you are being so obtuse with this.
I'm not the one being obtuse here.

AI training and inferencing existed long before tensor cores were brought about.
What does this have to do with the topic we're discussing?
 