Nvidia GeForce RTX 50-series product value

The header of that article says its “neutrality is in question”, lol.

Your salary not going up 1:1 with company revenue is not only not ‘greedflation’, that’s not even what inflation is.

To bring it back to GPUs: there’s no malevolent force that’s making them more expensive, it’s just that people are willing to pay more for GPUs now than they did 6-8 years ago.
I don't know who was making that argument. I just said that greedflation exists and it's driving GPU prices upwards. Whether you believe greedflation is malevolent or not, that's a strawman you created yourself.

As for wages not increasing 1:1 with revenue and profits (I like how you excluded this part), that is another wonderful strawman you constructed. It's not an argument I made. I pointed out that my salary has only increased 7% and that this was disproportionate to the increases in revenue, profits, and stock price seen at my company. I never suggested that it should be 1:1.

Finally, the link was shared to show that the topic of greedflation exists and that people discuss it all the time. The neutrality of the author is irrelevant, as you initially asserted that it doesn't exist.
 
The type of person that would purchase a 2080 Ti at launch is not a price-sensitive customer in the first place.

I bought a 3090 and I’m price sensitive and I also chose to skip the 4090. What category do I fall into?

To many, if not most, people it does matter: past experiences (including the last time they bought) are what they compare against. All of the actual changes in the world, technology, etc. that led to what's available now are, if not completely irrelevant, at least of very little concern compared to that. Psychology is a funny thing.

Are you saying that someone who wants to upgrade may decide not to because it's only 2x the performance of their current card and not 3x? When you guys say the 5070 Ti is poor value, what card do you envision the shopper currently owns?
 
I bought a 3090 and I’m price sensitive and I also chose to skip the 4090. What category do I fall into?
I don't know, really. What I can say is that, in the context of people purchasing GPUs, someone who spends $1499 MSRP on a 3090 is not traditionally considered a price-sensitive customer. Especially when that person pays more than 2x the MSRP of a $699 3080 for 14% more performance.
 
The type of person that would purchase a 2080 Ti at launch is not a price-sensitive customer in the first place. We've seen historically that those who buy high-end products upgrade frequently. Now, if this niche high-end consumer that you hypothetically invented exists, then it's likely that they would have been just as enticed by the prospect of buying the 4080 Super last year at the exact same price they paid for their 2080 Ti. So if they chose not to buy a 4080 Super at $1000, I don't see how a $900-$1000 5070 Ti would change that narrative, especially since we know the $750 price point is fake.

Then again, I think we both know the general buying patterns of those who purchase high end gpus and they don't traditionally align with the scenario you've presented.
A 2080 Ti user who hasn't upgraded is likely waiting until they are forced to upgrade by new games. And when the 4080 Super arrived, neither Black Myth: Wukong nor Indiana Jones had launched, nor had id announced that the next Doom would use path tracing. When ray tracing was the max-settings baseline, a 2080 Ti could still do OK with DLSS, but that's not the case with path tracing.

But even when comparing with a 2080 Super, a $900 5070 Ti would be expected to deliver 87% more performance per dollar, relative to the original MSRP. Or comparing with the 2080, it would deliver 96% more performance per dollar. Presumably you are now going to say that 2080 Super and 2080 users don't exist either?
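Spelled out, that's just a performance-per-dollar ratio. A minimal sketch (the relative performance figure below is an assumption picked to illustrate the ~87% number, not a benchmark result):

```python
# Perf-per-dollar comparison sketch; the 2.41x relative performance of a
# $900 5070 Ti over a 2080 Super is an assumed figure for illustration.
def perf_per_dollar(perf, price):
    return perf / price

perf_2080_super, price_2080_super = 1.00, 699  # original MSRP
perf_5070_ti, price_5070_ti = 2.41, 900        # assumed relative performance

gain = (perf_per_dollar(perf_5070_ti, price_5070_ti)
        / perf_per_dollar(perf_2080_super, price_2080_super)) - 1
print(f"~{gain:.0%} more performance per dollar")
```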
 
I don't know, really. What I can say is that, in the context of people purchasing GPUs, someone who spends $1499 MSRP on a 3090 is not traditionally considered a price-sensitive customer. Especially when that person pays more than 2x the MSRP of a $699 3080 for 14% more performance.

Bingo. You can’t determine if someone is price sensitive without knowing their value system. I’m confident most people in the market for a 5070 Ti don’t care that the 1080 Ti was an amazing upgrade back in the day. It’s a completely irrelevant factoid to their purchase decision “today”.

The decision to buy a 3090 was an easy one. The 3080 was nowhere close to its MSRP at the time and its 10GB buffer is a significant handicap for someone playing at 4K and higher (DSR). Looking back I personally would’ve seriously regretted getting a 3080 while other people consider it one of the best GPUs ever made.
 
A 2080 Ti user who hasn't upgraded is likely waiting until they are forced to upgrade by new games. And when the 4080 Super arrived, neither Black Myth: Wukong nor Indiana Jones had launched, nor had id announced that the next Doom would use path tracing. When ray tracing was the max-settings baseline, a 2080 Ti could still do OK with DLSS, but that's not the case with path tracing.

But even when comparing with a 2080 Super, a $900 5070 Ti would be expected to deliver 87% more performance per dollar, relative to the original MSRP. Or comparing with the 2080, it would deliver 96% more performance per dollar. Presumably you are now going to say that 2080 Super and 2080 users don't exist either?
I dunno, you keep reaching but it's not going anywhere. The 5070 Ti is priced between $900-$1000 and is marginally slower than the 4080 Super. Now you're shifting goalposts from your hypothetical 2080 Ti scenario to a 2080 Super. The problem you face is that the 5070 Ti is relatively close in price and performance to last year's 4080 Super. So whatever "discount" you get vs the 4080 Super needs to be balanced against the additional utility you'd have received from having that level of performance for an extra year. The additional utility you'd gain by purchasing the 4080 Super and using it for a year to play your games far surpasses the $70-$100 discount that you may find by purchasing the 5070 Ti this year.

If you were not convinced by the 4080 Super, I don't see how you'd be convinced by the 5070 Ti, since it's basically similar performance at a similar price a year later. I'm sure that if people here had known, at the launch of the 4000 Super series, that the 5000 series would be this disappointing, many would have hopped aboard that train. I mean, I purchased my 4080 Super at the equivalent of $950 USD brand new before tax. Based on the current pricing of the 5070 Ti, there's been absolutely no improvement whatsoever.
 
I've moved the tail end of the Blackwell consumer GPU reviews thread here because it had (very predictably) descended into yet another terrible back and forth about consumer GPU value. I'm broadly sick and tired of this kind of discussion now because it never goes anywhere useful. Some folks view a product or products as good value and like to say so a lot. Some other folks view them as bad value and like to say so a lot. Both sides talk past each other and tend to dismiss the other out of hand. The conversation eventually stops being civil, becomes a bit personal, and bad feeling creeps in. Posts are reported in bitter fashion, demanding moderation.

I'm very keen on discussions focused on the broader economic substrate that currently underpins semiconductor manufacturing (especially consumer products on advanced nodes). But the kind of discussion that formed this thread -- hot on the heels of a product announcement, gatekeeping from both sides about what consumer GPU value is or definitely means to people, the US-centricity of views, increasingly negative and personal tone -- is not something I'm keen on.

Please let's move on from that.
 
The problem with this argument is that you'd actually have a point if there were a 1:1 relationship between the rise in input costs and the rise in GPU prices. Where this argument falls flat on its face is the fact that from just Jan 2020 to Jan 2025, Nvidia's gross margin increased from ~62% to ~75%.
You've handwaved off a lot of logic here.

Let's ignore chip, board, and power complexity for a moment... Accounting for inflation and literally nothing else, today's cards are cheaper than anything you can rationally compare them to. 1080 Ti in today's money? More than a thousand dollars... 2080 Ti? Way more than a thousand dollars. Inflation is real, and whether you like it or not, none of it is NVIDIA's fault. Secondly, between 2020 and 2024, the total units shipped by NVIDIA grew by more than 5x. Care to guess where those shipments were going? Wanna guess where the REAL money is being made? It's in the datacenter, not in the consumer cards.

If you think DIY home customers have it so bad paying $1k or $2k or even $3k for the top-end cards from NVIDIA, just imagine paying a middling 5-digit sum for an H-series. The profit NVIDIA makes on the DIY consumer side of things is laughable compared to the absolute profit insanity of the datacenter parts. Gross profit on those is deep into the double digits even after considering resellers (no, not scalpers) in the datacenter space. That 13-point jump in gross margin wasn't consumers; it was datacenter by a mile.

After the incredible increase in chip, board, and power complexity, along with the very real effects of inflation (including more recent tariffs), the cost of ANY modern GPU is pretty tame when you think about it. The "value" is in the eye of the beholder, but that's always been true.
 
I've moved the tail end of the Blackwell consumer GPU reviews thread here because it had (very predictably) descended into yet another terrible back and forth about consumer GPU value. I'm broadly sick and tired of this kind of discussion now because it never goes anywhere useful. Some folks view a product or products as good value and like to say so a lot. Some other folks view them as bad value and like to say so a lot. Both sides talk past each other and tend to dismiss the other out of hand. The conversation eventually stops being civil, becomes a bit personal, and bad feeling creeps in. Posts are reported in bitter fashion, demanding moderation.

I'm very keen on discussions focused on the broader economic substrate that currently underpins semiconductor manufacturing (especially consumer products on advanced nodes). But the kind of discussion that formed this thread -- hot on the heels of a product announcement, gatekeeping from both sides about what consumer GPU value is or definitely means to people, the US-centricity of views, increasingly negative and personal tone -- is not something I'm keen on.

Please let's move on from that.
Ok, but the problem is that people like to make bad-faith arguments along the lines of "the economics of advanced nodes, increases in input prices, etc. are leading to price increases for GPUs"... Albuquerque made a similar argument, but it's just false. People keep regurgitating stuff they hear on YouTube from people who absolutely don't know what they're talking about. It's true that GPU input costs have increased, but they're certainly not the primary driving factor in the price increases.

iPhone 11 (2019) to iPhone 16 (2024): TSMC 7nm+ to TSMC 3nm, 2x RAM, 2x storage, 2x performance, IPS to OLED, improved cameras, 14% MSRP increase. 3080 (2020) to 5080: Samsung 8nm to TSMC 4N, 1.6x RAM, 1.66x performance, 42% MSRP increase.

Even on a more advanced and more expensive TSMC node, we see how Apple can still deliver better improvements at a more palatable cost increase over a similar time period compared to Nvidia.
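For the record, the MSRP deltas in that comparison are simple ratios (launch MSRPs assumed from public pricing: iPhone 11 $699 to iPhone 16 $799, RTX 3080 $699 to RTX 5080 $999):

```python
# MSRP increase sketch; launch prices are assumed from public listings.
def msrp_increase(old, new):
    return new / old - 1

print(f"iPhone 11 -> iPhone 16: {msrp_increase(699, 799) * 100:.1f}%")
print(f"RTX 3080 -> RTX 5080:   {msrp_increase(699, 999) * 100:.1f}%")
```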

You've handwaved off a lot of logic here.

Let's ignore chip, board, and power complexity for a moment... Accounting for inflation and literally nothing else, today's cards are cheaper than anything you can rationally compare them to. 1080 Ti in today's money? More than a thousand dollars... 2080 Ti? Way more than a thousand dollars. Inflation is real, and whether you like it or not, none of it is NVIDIA's fault. Secondly, between 2020 and 2024, the total units shipped by NVIDIA grew by more than 5x. Care to guess where those shipments were going? Wanna guess where the REAL money is being made? It's in the datacenter, not in the consumer cards.

If you think DIY home customers have it so bad paying $1k or $2k or even $3k for the top-end cards from NVIDIA, just imagine paying a middling 5-digit sum for an H-series. The profit NVIDIA makes on the DIY consumer side of things is laughable compared to the absolute profit insanity of the datacenter parts. Gross profit on those is deep into the double digits even after considering resellers (no, not scalpers) in the datacenter space. That 13-point jump in gross margin wasn't consumers; it was datacenter by a mile.

After the incredible increase in chip, board, and power complexity, along with the very real effects of inflation (including more recent tariffs), the cost of ANY modern GPU is pretty tame when you think about it. The "value" is in the eye of the beholder, but that's always been true.
I'm not handwaving anything away. I think the problem is that you only really look at the cost of GPUs and not other products. If you looked at any other electronic product subject to the same input costs as Nvidia, you'd see the flaws in your argument.
 
iPhone 11 (2019) to iPhone 16 (2024): TSMC 7nm+ to TSMC 3nm, 2x RAM, 2x storage, 2x performance, IPS to OLED, improved cameras, 14% MSRP increase. 3080 (2020) to 5080: Samsung 8nm to TSMC 4N, 1.6x RAM, 1.66x performance, 42% MSRP increase.
One of those sells ~150M units over its sales lifetime, almost like clockwork, where the silicon production and the parts the vendor doesn't make are all negotiated using that incredible volume, often years in advance, with billions of dollars laid down up front to secure the cost base. Inflation-adjusted, it actually got cheaper. It doesn't have a traditional sales channel, rarely changes price over its sales lifetime, is (almost) never supply constrained after launch, and has a consumer upgrade cycle that GPU vendors can only dream of.

That vendor is therefore not "subject to the same input costs as Nvidia". You might as well compare consumer GPU pricing to the cost of sandwiches.

Plus, as you yourself have pointed out, GPU pricing is not ever just BOM + fixed margins. Many of the models that everyone gets so hot and bothered about are almost Veblen goods.

We need a different way to discuss consumer GPU value that attacks the heart of the matter: that it's part (consumer GPU-specific) semiconductor economics, and part effectively convincing the customer to hand over their money for the overall perceived value.
 
Something has clearly shifted in the value equation. Observable consumer behavior is out of sync with what we're being told consumers should be doing. I find it incredible that gamers successfully pushed back on the $1200 4080 and the silly 4080 12GB, yet are now camping outside in the cold to buy $1500 5080s.
 
Let's ignore chip, board, and power complexity for a moment... Accounting for inflation and iiterally nothing else, today's cards are cheaper than anything you can rationally compare them to. 1080Ti in today's money? More than a thousand dollars... 2080Ti? Way more than a thousand dollars. Inflation is real, and whether you like it or not, none of it is NVIDIA's fault. Secondariliy, between 2020 and 2024, the total units shipped by NVIDIA grew by more than 5x. Care to guess where those shipments were going? Wanna guess where the REAL money is being made? It's in the datacenter, not in the consumer cards.
Actually, the 1080 Ti would be $910.79, not over a thousand dollars. The 2080 Ti is obviously more than a thousand, since it was $999 to begin with; more precisely, it would be $1257.15.
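For anyone checking the math, those figures are just CPI scaling. A sketch (the CPI-U values below are approximations for the launch months and early 2025, so exact BLS figures may differ slightly):

```python
# Inflation adjustment sketch; CPI-U values are approximate assumptions.
def adjust_for_inflation(price, cpi_then, cpi_now):
    return price * cpi_now / cpi_then

CPI_2025 = 317.671  # assumed early-2025 CPI-U
print(round(adjust_for_inflation(699, 243.801, CPI_2025), 2))  # 1080 Ti, $699 at launch (2017)
print(round(adjust_for_inflation(999, 252.439, CPI_2025), 2))  # 2080 Ti, $999 at launch (2018)
```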
 
Ok, but the problem is that people like to make bad-faith arguments along the lines of "the economics of advanced nodes, increases in input prices, etc. are leading to price increases for GPUs"... Albuquerque made a similar argument, but it's just false. People keep regurgitating stuff they hear on YouTube from people who absolutely don't know what they're talking about. It's true that GPU input costs have increased, but they're certainly not the primary driving factor in the price increases.
Okay, so why do you propose the prices are going up? If it were just Nvidia making a power play to raise margins, AMD could come in and undercut them, along with Intel. The fact that neither does tells me this is probably a supply-side issue.
 
One of those sells ~150M units over the sales lifetime, almost like clockwork, where the silicon production and parts that the vendor doesn't make are all negotiated for using that incredible volume, often years in advance with billions of dollars laid down up front to secure the cost base.
That's the difference between selling a component vs. a full standalone product. Nevertheless, we do not know how many GPU units Nvidia moves. We do know it's in the tens of millions on a yearly basis when factoring in datacenter, consumer discrete, laptops, enterprise, etc. Nvidia also has the balance sheet to negotiate deals in advance, secure priority at TSMC, and negotiate with suppliers years in advance. Let's not act like this is some low-volume production run.
Inflation adjusted it actually got cheaper, doesn't have a traditional sales channel, rarely changes price over its sales lifetime, is (almost) never supply constrained after launch, and has a consumer upgrade cycle that GPU vendors can only dream of.
That vendor is therefore not "subject to the same input costs as Nvidia". You might as well compare consumer GPU pricing to the cost of sandwiches.
Let's just agree to disagree on this. While you can make the argument for better volume discounts on component pricing, arguing that the sales channel or the lack of price reductions over the product's lifetime (not true at all) plays a role in the MSRPs is just false. You can't really prove that claim, and the balance sheet/financial statements do not support any of these speculative assertions at all.
Plus, as you yourself have pointed out, GPU pricing is not ever just BOM + fixed margins. Many of the models that everyone gets so hot and bothered about are almost Veblen goods.
GPUs are not Veblen goods. I don't know where this notion originates. Maybe you can argue that an x90-series card is a Veblen good, and even then I'd disagree, because I've seen absolutely no data/evidence to back this up.
We need a different way to discuss consumer GPU value that attacks the heart of the matter: that it's part (consumer GPU-specific) semiconductor economics, and part effectively convincing the customer to hand over their money for the overall perceived value.
With the way people brush data they don't like under the rug in favour of arguments that aren't backed by data or evidence, I cannot envision how that will happen. A genuine conversation about value and proper appraisal can only be had in an environment that recognizes the importance of evidence-based discussion, one that recognizes the importance of curtailing and acknowledging biases.

A lot of the time, you can very clearly see the biases in certain discussions, and the obliviousness to said bias. Once you leave this environment for one where people aren't as invested in the GPU vendors, you can start to have discussions more in line with the perceptions of the general public.

Again, maybe this environment is just not capable of such discussions.
 
Something has clearly shifted in the value equation. Observable consumer behavior is out of sync with what we're being told consumers should be doing. I find it incredible that gamers successfully pushed back on the $1200 4080 and the silly 4080 12GB, yet are now camping outside in the cold to buy $1500 5080s.
It's a pithy contradiction but the behavior of early adopters just can't be compared to that of general consumers, and I'm sure you know that.

But generally speaking, I think we can assume that a large part of it is just pent-up demand.
As much as reviewers are now holding up the various 4070s and 4080s as proof for their arguments that Blackwell is a poor generation in terms of both overall perf and value improvements, we shouldn't forget that those were also savaged by the same reviewers on release.
Some very vocal consumers continue to pay lip service to cheering on the poor reviewers who are sounding ever more exasperated that Nvidia just keeps ignoring their demands for another 3080 MSRP. But most people just want to be able to play the latest games. A handful will pay a premium now. Most will be saving up for Christmas time and buy the best Nvidia they can afford then.
A lot of people forget brand loyalty is still a thing. It's perfectly possible to be unhappy about general pricing but still be generally pleased with the Nvidia end user experience.
 
Yeah, I meant like for like. Was there a mad rush for 4080s in the first few weeks too? Maybe I forgot.
I don't believe so. The 4080 and Super are still sitting below the 4090 in the Steam survey, haha.
Of course, right now, I just watched 4 video reviews that boiled down to 'you coulda been playing on a 4080 Super for a while and not been worse off, instead of waiting on this here 5070 Ti'.
 
I don't believe so. The 4080 and Super are still sitting below the 4090 in the Steam survey, haha.

Yeah my recollection is that buyers were pretty sour on the 4080 from day one.

Of course, right now, I just watched 4 video reviews that boiled down to 'you coulda been playing on a 4080 Super for a while and not been worse off, instead of waiting on this here 5070 Ti'.

For someone with $1000 burning a hole in their pocket that’s 100% true. The question is whether a rational person would choose to defer gratification for another year or two because their prior decision to defer gratification didn’t pan out. If anything it may spur them to buy now to avoid further disappointment down the road. The other option is to hope 3nm cards offer a huge price/perf improvement 24 months from now.
 
Okay so why do you propose the prices are going up? If it was just Nvidia making a power play to raise margins, AMD could come in and undercut them, along with Intel. The fact that neither do tells me this is probably a supply side issue.
Yeah, that's not true at all. AMD has undercut several times. I referenced several deals where AMD sold the 7900 XTX at $820, $850, etc., while the 4080 Super held firm at $1000. That's a card with a bigger die, more RAM, a better-designed power connector/delivery, and higher input costs selling at a lower price.

Intel is undercutting with the B580, so suggesting that they don't undercut is just not true at all.
 
The problem you face is that the 5070 Ti is relatively close in price and performance to last year's 4080 Super. So whatever "discount" you get vs the 4080 Super needs to be balanced against the additional utility you'd have received from having that level of performance for an extra year. The additional utility you'd gain by purchasing the 4080 Super and using it for a year to play your games far surpasses the $70-$100 discount that you may find by purchasing the 5070 Ti this year.
That may be true, but the question I am addressing is whether the 5070 Ti would deliver a worthwhile upgrade for owners of the 2000 series. And there are a number of factors affecting the timing of an upgrade that have nothing to do with whether a given level of performance is available at a given price, such as available budget, the choice of games played, increasing hardware requirements over time, and new features (e.g. PT, RR).

If that were not the case then graphics cards would only experience sustained demand during product launches.

It is reasonable to expect increased value over time at a given price point, but we should note that the long term street prices of the 5070 Ti are not known yet. Since it's based on a cut down GB203 with 16GB, it is a very close match for the existing 4070 Ti Super, so similar prices should be possible.
 