Nvidia Post-Volta (Ampere?) Rumor and Speculation Thread

Status
Not open for further replies.
Lastly, AMD is on 7nm with RDNA, and NVIDIA is still on 12nm, though this will change too: 7nm with the new lineup. So I don't know where you got the process disadvantage from. Or have I missed something and NVIDIA is jumping to 5nm?

Since their 7nm cards are pretty much equal to NVIDIA's Turing lineup in terms of power and performance, it's more like they are a generation or two behind NVIDIA from an architecture standpoint. I think this is what he meant to say.
 
The halo effect is important, that's one thing: NVIDIA has 4 GPUs above AMD's highest Navi choice (2070S, 2080, 2080S, 2080Ti), not counting the Titan RTX of course.

Secondly, looking at the market right now, that's not true. AMD still doesn't provide competition when their GPUs lack essential hardware features; NVIDIA has way more options than them at every price point and is selling way more GPUs. The 5500 XT had a poor reception, same for the 5600 XT, and the 5700 series is being outmatched by the Super series in sales, especially with the current driver woes. The 5700 XT is the only successful Navi choice for AMD right now, but the recent driver problems have cast a big shadow over it.

Thirdly, on the process front NVIDIA is a node behind AMD as well. That doesn't matter to consumers right now, but it matters a hell of a lot more next gen: moving to 7nm gives NVIDIA headroom to experiment and push their advantage further.

Essential hardware features? Seriously, what are they lacking? RT support, which is supported by one architecture and a handful of games, and where performance is lacking, and VRS, which is used in what, 1 or 2 games?
The 5500 is a bit overpriced for its performance, but the 5600 as a product didn't get a poor reception, only the hassle around the launch did. As for sales, looking at a big player in Germany (because their sales data is easily available), 42.7% of GPUs sold in January were AMD, 57.3% NVIDIA. When, as you pointed out, NVIDIA has 4 products without a competing AMD part, it doesn't really look that good, does it? In fact, the 5700 and 5700 XT both outsold the closest Super, the RTX 2060 Super; the 2070S was the only Super selling more.

Process node is irrelevant; what matters is what's on the market. NVIDIA might take a big lead with their next gen again, or they might not. We'll know when they're out.
 
As for sales, looking at a big player in Germany (because their sales data is easily available), 42.7% of GPUs sold in January were AMD, 57.3% NVIDIA. When, as you pointed out, NVIDIA has 4 products without a competing AMD part, it doesn't really look that good, does it? In fact, the 5700 and 5700 XT both outsold the closest Super, the RTX 2060 Super; the 2070S was the only Super selling more.
Is this where you are getting your Germany numbers?
 
Mindfactory is just one retailer in Germany. And the fact that there are more SKUs of Turing (the RTX 2060 alone has as many SKUs as the 5700(XT)) sold by more retailers makes this a biased comparison.

And downplaying Turing's superior architecture is just strange. GeForce Now offers ray tracing (Google doesn't), Turing is dominating the DL/AI market (the T4 is 4x more efficient than the MI60...), the notebook market is in NVIDIA's hands (an 18-month lead over AMD...), and the workstation market has adopted DXR and OptiX within 18 months, making AMD hardware and software obsolete.

I think AMD wanted to play it safe for the console ports, but nVidia just did an Apple and reinvented the market with Turing (bringing Tensor Cores and ray tracing to the mass market). And the biggest push will come with the next console generation supporting VRS and ray tracing.
 
Mindfactory is just one retailer in Germany. And the fact that there are more SKUs of Turing (the RTX 2060 alone has as many SKUs as the 5700(XT)) sold by more retailers makes this a biased comparison.

And downplaying Turing's superior architecture is just strange. GeForce Now offers ray tracing (Google doesn't), Turing is dominating the DL/AI market (the T4 is 4x more efficient than the MI60...), the notebook market is in NVIDIA's hands (an 18-month lead over AMD...), and the workstation market has adopted DXR and OptiX within 18 months, making AMD hardware and software obsolete.

I think AMD wanted to play it safe for the console ports, but nVidia just did an Apple and reinvented the market with Turing (bringing Tensor Cores and ray tracing to the mass market). And the biggest push will come with the next console generation supporting VRS and ray tracing.
I'm not questioning Turing's superior architecture, I'm questioning whether RT & VRS are "essential hardware features" at this point in time. They surely will be someday in the future, but that's still at least a couple of years ahead (even with the new consoles it won't happen overnight). And Tensor cores are next to useless in a gaming card.

And yes, it's just one big retailer in Germany, but at least it's data versus just words. And the Mindfactory data has been coming in solid.

edit:
The data is originally from here https://www.3dcenter.org/artikel/mindfactory-grafikkarten-verkaufsreport-januar-2020
And they gather it from Mindfactory's publicly available data, so anyone can verify the numbers.
 
Essential hardware features? Seriously, what are they lacking? RT support, which is supported by one architecture and a handful of games, and where performance is lacking, and VRS, which is used in what, 1 or 2 games?
Yes, those are essential forward-looking DX features that are going to be featured in consoles one year from now, which makes RDNA1 solutions not future-proof at all.

Also, as mentioned above, professional Turing cards made Vega and Navi professional cards irrelevant to pro customers due to their lack of hardware RT; same thing for AI and compute.

As for sales, looking at big player in Germany (because their sales data is easily available), 42.7% of GPUs sold in January were AMD, 57.3% NVIDIA.
Half of those AMD sales are cheap-ass Polaris GPUs which AMD is forced to sell for pennies anyway, while 80% of NVIDIA sales are Turing GPUs, and according to that retailer NVIDIA is selling truckloads of high-end GPUs and walking away with all the money. The 2070 Super is outselling the 5700 XT almost 2 to 1 despite its higher price.

And according to Steam, which AMD themselves use to estimate product reception, Turing super cards are outselling the 5700 series 2 to 1.

Process node is irrelevant; what matters is what's on the market.
Of course it matters. This is a technical forum, not Walmart; we analyse the technical aspects of products and their architecture.
 
Yes, those are essential forward-looking DX features that are going to be featured in consoles one year from now, which makes RDNA1 solutions not future-proof at all.

Also, as mentioned above, professional Turing cards made Vega and Navi professional cards irrelevant to pro customers due to their lack of hardware RT; same thing for AI and compute.
They are future-proof enough for their lifetime; most don't upgrade their GPUs that often, and it takes a few years for new tech to really catch on to the level where it matters. Specialized professional workloads are a completely different story, of course.
Half of those AMD sales are cheap-ass Polaris GPUs which AMD is forced to sell for pennies anyway, while 80% of NVIDIA sales are Turing GPUs, and according to that retailer NVIDIA is selling truckloads of high-end GPUs and walking away with all the money. The 2070 Super is outselling the 5700 XT almost 2 to 1 despite its higher price.
How is 23% more "almost 2:1"? The 2070S sold 2905 units and the 5700 XT 2360. You're quite good at exaggerating things out of proportion overall, though.
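That 23% figure is easy to verify from the unit counts quoted in this thread. A quick sketch, using only the numbers cited above (one retailer, one month):

```python
# Unit counts as cited in this thread (Mindfactory, January 2020);
# one retailer's data, not market-wide figures.
units_2070s = 2905   # RTX 2070 Super
units_5700xt = 2360  # RX 5700 XT

ratio = units_2070s / units_5700xt   # ~1.23, i.e. nowhere near 2:1
lead_pct = (ratio - 1) * 100         # ~23% more units

print(f"2070S : 5700XT = {ratio:.2f} : 1 (~{lead_pct:.0f}% more units)")
```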
And according to Steam, which AMD themselves use to estimate product reception, Turing super cards are outselling the 5700 series 2 to 1.
Steam surveys are so broken, no matter who refers to them, that you can't seriously take them as solid numbers. There have been countless irregularities in them over the years, of which only one has ever been acknowledged and fixed; it's ridiculous.
Of course it matters, this a technical forum, not walmart, we analyse technical aspects of products and their architecture.
Before we have data from NVIDIA's 7nm chips we can't really read that much into the numbers. It's not a given they will improve leaps and bounds more than AMD, even if their current architecture is doing so well on an older process. Heck, we don't even know if they'll use N7 or N7+, while AMD is known to be moving to N7+.
 
They are future-proof enough for their lifetime; most don't upgrade their GPUs that often, and it takes a few years for new tech to really catch on to the level where it matters. Specialized professional workloads are a completely different story, of course.
There are 8 RTX-enabled games right now, with more to come this year. These are DX visual features that are not available on current 5700 XT or Radeon VII cards despite those being high-end models. If buyers keep their cards for 3 or 4 years, they will not experience RT on AMD hardware for that entire time; that's a failure in and of itself.
How is 23% more "almost 2:1"? 2070S sold 2905 units and 5700XT 2360 units.
I meant according to Steam Survey. Must have gotten the phrases mixed up.

Steam surveys are so broken, no matter who refers to them, that you can't seriously take them as solid numbers. There have been countless irregularities in them over the years, of which only one has ever been acknowledged and fixed; it's ridiculous.
If you're downplaying the importance of Steam, then why hold the Mindfactory results in such high regard? Those results cover only one month, compared to an entire year for the Steam survey, which makes the Mindfactory numbers completely worthless in comparison: severely limited in both scope and sample size.
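For what it's worth, the scope argument can be put in rough statistical terms. A sketch with purely illustrative sample sizes (neither Mindfactory's monthly unit volume nor Steam's yearly response count is public):

```python
from math import sqrt

def margin_of_error(share, n, z=1.96):
    """95% margin of error for a market share estimated from n sampled units."""
    return z * sqrt(share * (1 - share) / n)

# Sample sizes below are invented for illustration only.
for label, n in [("one retailer, one month", 20_000),
                 ("survey, full year", 2_000_000)]:
    moe = margin_of_error(0.427, n)
    print(f"{label}: 42.7% +/- {moe * 100:.2f} points")
```

Note that this quantifies only statistical noise; it says nothing about selection bias, which is the bigger problem with both sources.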

AMD themselves referenced Steam when they were bragging about jebaiting NVIDIA during the 5700 launch, which means the Steam survey matters much more than you think.

Before we have data from NVIDIA's 7nm chips we can't really read that much into the numbers.
I disagree; yes, we can. Process node is one of the prime factors in determining the trajectory of architectures. That's tech 101.
 
I'm not questioning Turing's superior architecture, I'm questioning whether RT & VRS are "essential hardware features" at this point in time. They surely will be someday in the future, but that's still at least a couple of years ahead (even with the new consoles it won't happen overnight). And Tensor cores are next to useless in a gaming card.

So, why would Sony and Microsoft support ray tracing (and, at least in Microsoft's case, VRS) if these features "are [not] 'essential hardware features' at this point in time"? Wouldn't it make more sense for them to just use Navi as we know it? Ray tracing is essential because it will be used more and more from this point on. I just found this free-to-play game through Twitter with DXR support: https://davethefreak.itch.io/beyond-evolution

Tensor Cores are like the first iPhone: the introduction of something new that will change the way graphics are generated. Hardware has always come first.
 
I think AMD wanted to play it safe for the console ports but nVidia

Imagine if the next NV arch was to be the base for the next gen consoles.

I'm questioning whether RT & VRS are "essential hardware features" at this point in time.

Essential? Maybe not right now, it depends, but both are the future, and NVIDIA already having them in their 2018 GPUs ensures support will be solid going forward.



They are future-proof enough for their lifetime; most don't upgrade their GPUs that often, and it takes a few years for new tech to really catch on to the level where it matters. Specialized professional workloads are a completely different story, of course.

Most upgrade their boxes when they really need to, usually sometime after a generational leap has happened in the console space, and we're close to that point now.

Steam surveys are so broken, no matter who refers to them, that you can't seriously take them as solid numbers.

The Steam survey has been used against me multiple times on this forum, most recently regarding how many users have a GPU more powerful than the premium machines. It goes both ways, I assume, but I'll keep your comment in mind ;)

Before we have data from NVIDIA's 7nm chips we can't really read that much into the numbers. It's not a given they will improve leaps and bounds more than AMD, even if their current architecture is doing so well on an older process. Heck, we don't even know if they'll use N7 or N7+, while AMD is known to be moving to N7+.

True, but seeing how the current Turing outperforms AMD's 7nm parts rather massively, one can assume, to an extent, how NV's next 7nm product might perform. Besides the die-size benefit, there's also an increase in performance to be had. It's been almost two years since Turing debuted.

There are 8 RTX enabled games right now, with more to come this year

That's without including mods to games that enable ray tracing, with probably many more to come.

And Tensors are next to useless in gaming card.

DLSS has improved a lot lately. In its best form, it does a much better job than any other upscaling hardware or software out there. In fact, next-gen consoles could get a nice feature out of something like that; 4K is and will remain rather taxing.
 
Some people in this thread seem to be in denial.

The fact is that nV currently offers a number of SKUs with performance higher than anything from AMD. The situation is even worse given that the Radeon VII is a weird, EOLed product. I recall the R600 days, when people were shocked that both nV's 8800 GTX and Ultra topped AMD's flagship...

The Mindfactory data shows the high end (>900 EUR) having a 2.7% unit share but directly making 9.1% of the revenue, not counting the marketing and mindshare fluff. Not bad.

//EDIT
Just a reminder, RTX 2080Ti launched over 16 months ago.
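The unit-share vs revenue-share gap follows directly from average selling prices. A sketch with invented ASPs, chosen only to land near the 2.7%/9.1% split quoted above:

```python
# Why 2.7% of units can be ~9% of revenue: average selling prices.
# ASPs below are assumptions for illustration, not Mindfactory data.
total_units = 10_000
highend_units = round(total_units * 0.027)   # 2.7% of units
other_units = total_units - highend_units

asp_highend = 1200.0   # EUR, assumed (>900 EUR segment)
asp_other = 330.0      # EUR, assumed (everything else)

revenue_highend = highend_units * asp_highend
revenue_total = revenue_highend + other_units * asp_other
print(f"high-end revenue share: {revenue_highend / revenue_total:.1%}")
```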
 
Some people in this thread seem to be in denial.

The fact is that nV currently offers a number of SKUs with performance higher than anything from AMD. The situation is even worse given that the Radeon VII is a weird, EOLed product. I recall the R600 days, when people were shocked that both nV's 8800 GTX and Ultra topped AMD's flagship...

The Mindfactory data shows the high end (>900 EUR) having a 2.7% unit share but directly making 9.1% of the revenue, not counting the marketing and mindshare fluff. Not bad.

//EDIT
Just a reminder, RTX 2080Ti launched over 16 months ago.
No one is denying that NVIDIA currently has the better lineup, just that the situation isn't as dire as @DavidGraham makes it out to be.
 
No one is denying that NVIDIA currently has the better lineup, just that the situation isn't as dire as @DavidGraham makes it out to be.
Yes, it is dire: outmatched in all segments (pro, AI, data centers, mobile, gaming, features, node, etc.). For all intents and purposes NVIDIA has free rein over their pricing structure, with no one forcing them to correct course except in very limited situations. What you'd call proper competition was the era of the HD 4800 and HD 5800 series, when AMD forced NVIDIA to adopt lower prices and keep them low. That isn't the situation here; this is just a glimpse of competition.
 
True, but seeing how the current Turing outperforms AMD's 7nm parts rather massively, one can assume, to an extent, how NV's next 7nm product might perform. Besides the die-size benefit, there's also an increase in performance to be had. It's been almost two years since Turing debuted.

That's without including mods to games that enable ray tracing, with probably many more to come.

DLSS has improved a lot lately. In its best form, it does a much better job than any other upscaling hardware or software out there. In fact, next-gen consoles could get a nice feature out of something like that; 4K is and will remain rather taxing.

What are you talking about? Per mm² and per watt, RDNA is about level with Turing in game performance, and we have to wait for both next-gen architectures, which appear to be launching this year (unless the coronavirus kills us all or whatever). Comparing a hypothetical next-gen NVIDIA arch against AMD's last-gen arch is facile.
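The per-area and per-watt claim can be sanity-checked on a napkin. Die sizes and board powers below are the published figures; the relative performance index (the 2070 Super roughly 5% ahead of a 5700 XT at 1440p) is a rough assumption, and comparing perf/mm² across different nodes conflates process and architecture:

```python
# Napkin check of the perf/area and perf/W comparison.
# Relative perf is an assumed ballpark, not a measured benchmark.
cards = {
    # name: (relative perf, die area mm^2, board power W)
    "RX 5700 XT (Navi 10, 7nm)":    (1.00, 251, 225),
    "RTX 2070 Super (TU104, 12nm)": (1.05, 545, 215),
}
for name, (perf, area, power) in cards.items():
    print(f"{name}: perf/mm2 = {perf / area:.4f}, perf/W = {perf / power:.4f}")
```

On these numbers RDNA leads clearly per mm² (partly a node effect, since density isn't normalized here) while the two land within about 10% of each other per watt, which is roughly the "about level" picture.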

DLSS is never compared to good temporal upscaling, which is what it should be compared against because they aim for the same thing, and that's entirely doable outside NVIDIA's ecosystem. It's not as if NVIDIA holds some magic patent on "running a neural net".

AMD already confirmed RDNA2 has DXR support, so I don't get at all what that has to do with requirements. This whole line of questioning seems utterly pointless.
 
According to Igor's Lab, NVIDIA's board partners are gearing up for a massive overhaul of their PCB designs in anticipation of NVIDIA's next-gen cards. A technique called back drilling will be used to allow much higher operating frequencies than current board designs permit; it's also more expensive. The implication is that NVIDIA is gearing up to introduce really fast new GPUs.

Interesting. Not sure how this has any bearing on NVIDIA's next architecture, though. Off-chip communication is currently limited to PCIe, NVLink and GDDR6.

Maybe NVLink is turning the dial up to 11. Can't see why GDDR6 or PCIe 4.0 would require exotic PCBs.
 