NVIDIA discussion [2024]

Wafers are NOT the limiting factor. Stop.
I can't believe we have to keep repeating this: NVIDIA could triple their current AI output and wafer allocation still wouldn't be the limiting factor.
At whose expense? AMD increased its AIB market share last year by nearly 60%, from 12% to 19%.
That's merely a return to semi-normalcy; they were at an all-time-low market share before that.
That only goes back to 2020... Let's look even further back than 2014, which I posted before...

[Chart: dgpuship.png — dGPU shipments per quarter, 2005 to present]
From 2005 to 2015 it's showing the effect of ultra-low-end dGPUs, stuff like (on the GeForce side) the FX 5200, 6200, 7100, 8400, etc. People were buying those GPUs in droves to run their monitors and do some light gaming, as iGPUs at that time were much more limited and were integrated on the motherboard side, not the CPU side (meaning you could buy a board without an iGPU, which was quite common). Such low-end GPUs have all disappeared in favor of more capable iGPUs. So in a sense they are still there, just in the form of the modern iGPUs inside the CPU.

As for the other dGPUs, they have been stable around ~10 million units (plus or minus) per quarter ever since 2015.
 
It is a scary drop compared to the times of 20+ million. You are right that this isn't just some bubble. But is it caused by Nvidia? Why not other factors, like longer upgrade cycles? More people on integrated graphics? Many people are now content with laptops.

The posts with just desktop numbers are somewhat misleading, because desktop PC sales in general peaked back around 2010 and had been in continual year-on-year decline until covid hit. The numbers would be even worse for consumer desktops. Do also note that desktop PCs do not mean DIY PCs. I tend to notice this in enthusiast discussions: enthusiasts seem to think PC gaming hardware revolves around DIY, and a lot of the business-related commentary seems to be primarily framed around that (and really around a sub-segment of enthusiast DIY).

Once you really dig into the business-side complaints, it's really that the market has been stretched out on the consumer side, and in terms of optics that just doesn't sit well with people psychologically. Despite all the complaints, in practice a $500 GPU today will beat the experience of a $500 GPU from 8 years ago (the so-called last golden era with Pascal) in any type of blind test, so in terms of actual user experience GPUs (and tech in general) are still vastly outpacing general inflation. But that isn't what is upsetting certain people; it's about where $500 sits in the product stack compared to 8 years ago.
 
I basically started this line of the thread.
This discourse is incredibly popular on the internet in general, peddled by techtubers for whom any Nvidia bashing essentially equals money from additional views.
It didn't make any sense back when it started. "AI will result in rising prices! We are doomed!!!" was a thing with them a year ago; nothing happened, yet here we are still discussing how inevitable it is.

When I look at the GPU market where I live, the RTX 4080 is still about $1300 CAD, which is $300 more than I paid for my 3080 right as the crypto boom took off.
Does 4080 have the same performance as 3080? No? Then it's not a "price increase" by any measure.
Also, 1000 CAD in 2020 is about 1170 CAD in 2024 thanks to inflation alone. I don't know if Canada added import tariffs as the US did, but that can be another part of the retail price added between these years.
The whole blind comparison by name alone makes no sense whatsoever and shouldn't even be done at all.
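As a quick sanity check on the inflation point above, here's a minimal back-of-the-envelope sketch; the ~4% average annual rate is an assumption picked to illustrate the quoted figure, not an official CPI number.

```python
# Back-of-the-envelope: compound a 2020 price forward by an assumed
# average annual inflation rate to estimate its 2024 equivalent.
# The 4%/year figure is an illustrative assumption, not official CPI data.
price_2020 = 1000.0      # CAD
annual_inflation = 0.04  # assumed average annual rate, 2020-2024
years = 4

price_2024 = price_2020 * (1 + annual_inflation) ** years
print(f"{price_2020:.0f} CAD in 2020 ~= {price_2024:.0f} CAD in 2024")
# prints: 1000 CAD in 2020 ~= 1170 CAD in 2024
```

Compounding at that assumed rate lands right around the 1170 CAD figure mentioned above.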

Now we have an AI boom, and people who want to work on AI buy up the highest-end model gaming GPUs.
I don't see these people having any effect on said GPUs' pricing, do you? This is a relatively small market, likely way smaller than the gaming one for that same GPU. The 4090 is not production constrained; it is being produced in exactly the quantities needed for the demand at hand. Why would this change with the 50 series?
The crypto problem was that one person could be buying thousands of GPUs, and then those GPUs instantly printed money for them. AI devs/researchers won't be buying more than one per person, and even then they will think twice about whether they even need it, or whether it would be cheaper for their work/research to just rent a server with the same AI capabilities. AI on a 4090 does not print any money, and I don't think it ever will.

Nvidia never released a desktop 4050, or anything really low-end. Maybe that'll change with the 50 series.
Nvidia sells RTX GPUs starting with the 3050 6GB at $170 now (everyone was laughing when that launched, yet it's a great GPU for an HTPC-like config which doesn't really have any competition on the market). And IIRC there are cards being made for prices below that, the GT 1030 and such. The fact that they didn't release anything from the 40 series specifically doesn't mean they don't have any products for the <$300 market.

The trend seems to be that prices are staying high, and Nvidia is less interested in the mid-range to budget market. They can build bigger and more expensive GPUs and sell them to professionals and AI people. The trend looks like it's towards lower volume and higher margins. We'll see if that continues, but if there's going to be any price competition and sanity, I think it's going to have to come from Intel and AMD offering super competitive low to mid-range GPUs, because Nvidia doesn't look interested anymore.
The prices haven't really changed since the (re)introduction of SLI on the GeForce FX. Triple-SLI configs were putting you at about the same $1500-2000 price range where current top-end GPUs are sitting. The only range which has become somewhat stale and is being filled with old chips is the lowest end, where iGPUs became enough for business users. That segment is the most likely candidate for the loss of sales volume since ~2005 that you can see on the graph above. It also was basically unprofitable for the GPU makers.
 
Yup, prices are basically the same today as when you could buy three high-end GPUs for the same price! That argument makes perfect sense.
 
It’s hard to talk about pricing without specifying the price of what. E.g. “high end” by itself isn't really meaningful when looking at price changes across generations. Perf/$ is what matters.

If some IHV decides to launch a $2500 card with twice the performance of a 4090 that doesn’t necessarily mean prices have increased. It could just be a net new tier.
 
It’s hard to talk about pricing without specifying the price of what. E.g. “high end” by itself isn't really meaningful when looking at price changes across generations. Perf/$ is what matters.

If some IHV decides to launch a $2500 card with twice the performance of a 4090 that doesn’t necessarily mean prices have increased. It could just be a net new tier.
If Nvidia decides to launch a $2500 card with twice the performance of a 4090, and they call it the 5090... then that means prices have increased.

We've been over this before. The cards ALL follow basic naming schemes so people can easily tell what tier of card they are within their generations. This has remained true for MULTIPLE generations.

So if there's a 5060, 5070, 5080, and 5090 lineup... then prices HAVE increased. It doesn't matter if you can buy a 5070 cheaper and it outperforms a 4090... that's literally the point of generational increases in performance. The point is that price has increased relative to where it falls within the lineup.

If Nvidia wants to add a $2500 GPU on top of the 5090 and call it a 6000... then that's fine... so long as the price of the 5090 isn't increased over the 4090. It will be tho. 🤷
 
We've been over this before. The cards ALL follow basic naming schemes so people can easily tell what tier of card they are within their generations. This has remained true for MULTIPLE generations.
Yes, we've been over this. That doesn't mean that the conclusion was what you want it to be, or that there is any agreement about that.
While the naming schemes have that general intention, that doesn't mean they succeed at it, or that there aren't any other factors in the naming.

If Nvidia decides to launch a $2500 card with twice the performance of a 4090, and they call it the 5090... then that means prices have increased.
[..]
If Nvidia wants to add a $2500 GPU on top of the 5090 and call it a 6000... then that's fine... so long as the price of the 5090 isn't increased over the 4090. It will be tho. 🤷
This shows how arbitrary your reasoning sounds. You are determining whether there's a price increase or not purely by how Nvidia decides to name an SKU.
 
Yes, we've been over this. That doesn't mean that the conclusion was what you want it to be, or that there is any agreement about that.
While the naming schemes have that general intention, that doesn't mean they succeed at it, or that there aren't any other factors in the naming.


This shows how arbitrary your reasoning sounds. You are determining whether there's a price increase or not purely by how Nvidia decides to name an SKU.
Don't care if you agree or not. I'm stating my stance on it.

There's nothing arbitrary about my reasoning. Nvidia has set a clear naming convention which, for generations now, has remained the same. It's not "the name" it's the tiers of their GPUs within a generation. They're clearly defined.

If you were to compare an improvement gen over gen... you would compare the 4060 to the 3060.. the 3060 to the 2060, the 2060 to the 1060... That exists for a reason.

-xx50 (Entry level)
-xx60 (Mid range)
-xx70 (High end)
-xx80 (High end)
-xx90 or Titan (Enthusiast)

It's expected that lower-end cards of a future generation will outperform higher-end cards of previous generations for less money... That's why there are tiers of cards. If the new "highest tier" card costs more than the previous highest-tier card... then they are increasing prices.
 
Nvidia has set a clear naming convention which, for generations now, has remained the same. It's not "the name" it's the tiers of their GPUs within a generation. They're clearly defined.
Yes Nvidia has established a branding convention to differentiate tiers of products within a generation. However nothing about that convention mandates that the x70 part of the next generation should be the same price as the x70 part of this generation. We aren’t paying for the numbers on the box. We’re paying for performance and features.

You are defining a price increase as “box with same suffix now costs more”. That only works if the numbers on the box are perfectly correlated with performance increases (the thing we actually should be measuring).

Which of these is the better upgrade from a $300 4060?

1. $300 5060, 20% faster
2. $350 5060, 50% faster
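For what it's worth, here's a minimal perf-per-dollar sketch of those two hypothetical options (the prices and uplift percentages are the illustrative numbers from this post, not real SKUs):

```python
# Compare the two hypothetical upgrade options by performance per dollar,
# using the $300 4060 as the baseline (relative performance = 1.0).
# All prices and uplifts are the illustrative numbers from the post above.
baseline_price, baseline_perf = 300.0, 1.00

options = {
    "hypothetical $300 5060, 20% faster": (300.0, 1.20),
    "hypothetical $350 5060, 50% faster": (350.0, 1.50),
}

baseline_value = baseline_perf / baseline_price
for name, (price, perf) in options.items():
    value = perf / price
    print(f"{name}: {value / baseline_value:.2f}x the baseline perf/$")
# Option 1 comes out at 1.20x, option 2 at ~1.29x the baseline perf/$.
```

By that measure the $350 option is the better value even though the sticker price went up.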
 
Yes Nvidia has established a branding convention to differentiate tiers of products within a generation. However nothing about that convention mandates that the x70 part of the next generation should be the same price as the x70 part of this generation. We aren’t paying for the numbers on the box. We’re paying for performance and features.

You are defining a price increase as “box with same suffix now costs more”. That only works if the numbers on the box are perfectly correlated with performance increases (the thing we actually should be measuring).

Which of these is the better upgrade from a $300 4060?

1. $300 5060, 20% faster
2. $350 5060, 50% faster
Yes it does. It should fall within an expected range limit. If that number creeps higher and higher generation after generation... then prices are being increased.

Again... it's expected that lower end products of future generations will perform better than higher end products from past generations. This has absolutely nothing to do with performance per dollar... it has everything to do with performance relative to the other products in the new stack.

Nvidia creates a GPU architecture, they bin the chips into different tiers and create products out of them. They can only reasonably create so many different tiers of products from those designs. We've already got a well known and established convention which easily illustrates where each GPU falls within the stack. THAT is what you compare it to. If the mid range cards from this generation cost more than the mid-range cards from last generation... then they are increasing the cost of that range of GPUs.
 
xx80 (High end)
xx90 or Titan (Enthusiast)
We've already got a well known and established convention

If you calculate it like that, then you should know that in NVIDIA's convention a Titan used to cost between $1000 and $3000; the latest Titan was the Titan RTX, which launched at $2500.

If the xx90-class hardware is the replacement for Titans, then NVIDIA have effectively slashed prices, not increased them. The 3090 launched at $1500, and the 4090 launched at $1600.
 
This seems like a really superficial complaint, because based on that reasoning, if they changed the name every single generation you would no longer have an issue?

As for the product stack argument, why is that set in stone? Is Intel, for example, forever restricted such that their most expensive product follows the same naming convention as the A770 (so I guess B770) and is $350? That can literally never change now, based on your viewpoint.

Or what if AMD, based on the rumours, goes for a smaller release next generation? That isn't allowed either. It has to be an 8900 XTX at $1000. The product stack can literally never change.

I mean, if we want to go to the earliest precedent, then all these GPU companies had one product per generation. We need to go back to that; it's precedent and what people were used to!
 
then prices are being increased.

Prices of what exactly? The only thing you seem to have referenced is the number on the box. You’re arguing for the right to own the “middle” card for the same price as last generation’s “middle” card. That implies we have a right to own the “best” card for some fixed price too which of course is silly.

Why not just look at actual performance? The number printed on the box is then irrelevant to your purchasing decision.
 
Prices of what exactly? The only thing you seem to have referenced is the number on the box. You’re arguing for the right to own the “middle” card for the same price as last generation’s “middle” card. That implies we have a right to own the “best” card for some fixed price too which of course is silly.

Why not just look at actual performance? The number printed on the box is then irrelevant to your purchasing decision.
Prices of tiers relative to each other..

We know Nvidia bins chips based on "how much" of the chip is fully active... we know prices are determined by many, many other factors... It's not saying you have the right to the best card for some fixed price... it's saying that what THEY classify their bins as has an actual meaning... Their pricing should reflect what they're getting out of the chip. If before the xx90 chip cost $1500 and the next generation's xx90 chip costs $1800... they're raising the costs... it's that simple.
 
This seems like a really superficial complaint, because based on that reasoning, if they changed the name every single generation you would no longer have an issue?

As for the product stack argument, why is that set in stone? Is Intel, for example, forever restricted such that their most expensive product follows the same naming convention as the A770 (so I guess B770) and is $350? That can literally never change now, based on your viewpoint.

Or what if AMD, based on the rumours, goes for a smaller release next generation? That isn't allowed either. It has to be an 8900 XTX at $1000. The product stack can literally never change.

I mean, if we want to go to the earliest precedent, then all these GPU companies had one product per generation. We need to go back to that; it's precedent and what people were used to!
Never said it had to be set in stone. Regardless of what they call it... it will be classified by them as low-end, mid-range, high-end, and enthusiast. Expectations will remain what they are.
 
Prices of tiers relative to each other..

We know Nvidia bins chips based on "how much" of the chip is fully active... we know prices are determined by many, many other factors... It's not saying you have the right to the best card for some fixed price... it's saying that what THEY classify their bins as has an actual meaning... Their pricing should reflect what they're getting out of the chip. If before the xx90 chip cost $1500 and the next generation's xx90 chip costs $1800... they're raising the costs... it's that simple.

So all Nvidia needs to do is name their slowest product next generation as the 5090 for $300 and the fastest the 5099 for $30000 and that would mean prices went down?

And when they released the Geforce 260 and 280 there was no way to know at all if prices went up or down.
 
So all Nvidia needs to do is name their slowest product next generation as the 5090 for $300 and the fastest the 5099 for $30000 and that would mean prices went down?

And when they released the Geforce 260 and 280 there was no way to know at all if prices went up or down.
No, because we'd know the slowest product relative to the highest product regardless of what they are called......

They COULD call the slowest product the 5090... but then the 5090 would be an entry level product....
 
Their pricing should reflect what they're getting out of the chip.

That's not how prices are determined. At best, costs create a floor for minimum pricing. They definitely do not determine maximum pricing; competition and consumer demand drive that. The average person has zero idea what the manufacturer is "getting out of the chip", and it's not part of the value proposition.

We’re not buying tiers or brand names. Your entire framework for thinking about this is based on immaterial attributes of the product.
 