Nvidia shows signs in [2023]

Instead, I feel they're trying to obfuscate the price increase by rearranging the naming scheme, essentially trying to fool less-informed customers into thinking they're getting a higher-end GPU (relative to the current and last-gen stacks) than they actually are.
That's exactly what's happening and anybody with a brain can realize that.

Yes, price increases were inevitable, but NOT to the insane extent we're seeing with Nvidia's 40 series. That's just sheer greed and everybody knows that. Some people here are just arguing completely dishonestly.

It's also hilarious for one of these dishonest people to get all semantic about how 'die sizes don't matter' while pushing the argument that cost per transistor matters, which it literally doesn't, since prices are based on overall wafer pricing, not on how many transistors a specific chip has. Chips on the same general process node can have widely varying transistor densities. And to be clear, I'm not saying transistor density doesn't matter; it's just hypocritical to say die sizes don't matter but then bring up cost per transistor as if that's important.
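To put a toy number on that (everything below is a made-up illustration, not actual pricing): two chips of the same area on the same node cost the same per die, yet their cost per transistor differs purely because of density.

```python
import math

# All numbers below are illustrative placeholders, not real pricing.
WAFER_PRICE = 15_000   # assumed $ per 300 mm wafer
WAFER_DIAMETER = 300   # mm

def dies_per_wafer(die_area_mm2: float) -> int:
    """Common approximation for usable die candidates on a round wafer."""
    r = WAFER_DIAMETER / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * WAFER_DIAMETER / math.sqrt(2 * die_area_mm2))

# Two hypothetical 300 mm^2 chips on the same node with different
# transistor densities (MTr/mm^2):
for name, density in [("chip A", 45), ("chip B", 65)]:
    area = 300
    cost_per_die = WAFER_PRICE / dies_per_wafer(area)
    transistors_bn = density * area / 1000  # total transistors, in billions
    print(f"{name}: ${cost_per_die:.0f}/die, "
          f"${cost_per_die / transistors_bn:.2f} per billion transistors")
```

Same per-die cost either way; only the cost per transistor moves with density, which is exactly why the two metrics shouldn't be swapped mid-argument.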
 
People should really stop seeing various "tiers" in marketing names and start looking at what the same price gets you.
Except you yourself have argued that it's also not reasonable to expect significant improvements in performance per dollar anymore, either! lol

Imagine if Nvidia had sold the 1070 at $700 in 2016. That would technically have been an improvement in performance per dollar over the 980 Ti, but absolutely nobody would have defended it. That's what Nvidia is doing right now.
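As a rough illustration of that hypothetical (the ~10% performance edge of the 1070 over the 980 Ti is a loose ballpark, not a benchmark result):

```python
# Perf normalized to the 980 Ti (= 1.0); 1.10 for the 1070 is a rough
# ballpark, and the $700 price is the hypothetical from the post above.
cards = {
    "980 Ti @ $649 (2015)":       (1.00, 649),
    "1070 @ $379 (actual MSRP)":  (1.10, 379),
    "1070 @ $700 (hypothetical)": (1.10, 700),
}
base_perf, base_price = cards["980 Ti @ $649 (2015)"]
for name, (perf, price) in cards.items():
    gain = (perf / price) / (base_perf / base_price) - 1
    print(f"{name}: {gain:+.0%} perf/$ vs 980 Ti")
```

The actual $379 MSRP delivered a massive perf/$ jump; the hypothetical $700 card would have been "technically an improvement" of a couple of percent.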

In the end, all your arguments can be summed up as: "Just accept Nvidia's pricing." You don't care what arguments you have to make to get there; that's always your ultimate goal.
 
A new company called Inflection AI has raised $1.3 billion in funding and is using it to build the largest AI cluster in the world, containing 22K H100 GPUs.

So this supercomputer would have a peak of ~1.3 ExaFlops of FP64 performance, only ~15% slower than Frontier.

It must be the first time a private company has been able to rival the top spot within just two to three years. I think it tells us how far ahead nVidia is with their datacenter business...
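A quick sanity check on that estimate (the per-GPU figure is Nvidia's published peak FP64 tensor spec for the H100 SXM; Frontier's peak is approximate):

```python
GPUS = 22_000
FP64_TENSOR_TFLOPS = 67      # peak FP64 via tensor cores, H100 SXM
FRONTIER_PEAK_EFLOPS = 1.68  # approximate Rpeak

cluster_eflops = GPUS * FP64_TENSOR_TFLOPS / 1e6
print(f"cluster peak: ~{cluster_eflops:.2f} EFLOPS FP64")
print(f"share of Frontier peak: ~{cluster_eflops / FRONTIER_PEAK_EFLOPS:.0%}")
```

That lands in the same ballpark as the quoted ~1.3 ExaFlops; sustained (Rmax-style) numbers would of course be lower.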
 
There is no connection between the two. lol?
Of course there is, but there is nothing stopping Nvidia from offering much better performance per dollar than they are now except greed.

Again, you keep doing this extremely obnoxious thing of ignoring that I accept price increases were inevitable; it's the LEVEL of price increase we've gotten that is wholly unreasonable and unjustifiable.

There was (and is) still plenty of room to make the 40 series much better value, offering a notable increase in performance per dollar instead of the pitiful gains we're getting right now. They aren't selling 150mm², 115W low-end budget parts for $300 because they HAVE to, as you laughably keep claiming. They are simply seeing what they can get away with, hoping to set a precedent where people accept exploitative prices as normal.

 

Of course there is
No, there isn't.
The fact that the only proper way to compare product "tiers" is at the same price doesn't mean you will get a good perf/price gain in those tiers.

but there is nothing stopping Nvidia from offering much better performance per dollar than they are now except greed
You have been repeating this for some time now with zero information to back that claim, without even providing any sort of reason why it's Nvidia specifically who is in the wrong here.
 
You have been repeating this for some time now with zero information to back that claim, without even providing any sort of reason why it's Nvidia specifically who is in the wrong here.
Their past generations seem to support the claim, and no one is saying nVidia is the only one doing wrong in the GPU market; they're just the most popular and the most flagrant about it currently. If the 8GB 4060 didn't convince you, I don't think anything will.
 
We've already discussed this at length. Past generations bear only a loose relation to current production and market realities.
We've discussed it, but at the end of the day we still hold very different views on it; a sort of "agree to disagree" ending on that one. You still feel the tier names carry no meaning between nVidia generations, and I feel they should. I can't say for sure which of us is right, as it's subjective.

Not saying it's conclusive evidence, but I see where they're coming from in their argument.
 
You have been repeating this for some time now with zero information to back that claim
Good lord, it only takes a quick look at wafer pricing, plus the knowledge that the chips themselves make up only a minority of the overall build cost, to see that Nvidia's effective 100%+ price increase at nearly every tier is just absolutely absurd.

But again, you're proving that you'll literally say ANYTHING to defend Nvidia (and your own purchase of a hugely overpriced 40 series part, totally uncoincidentally, of course...).
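For what it's worth, here's the kind of back-of-the-envelope math being alluded to. Every number below (wafer prices, die sizes, the non-die BOM) is a placeholder assumption, not a known figure:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Common approximation for usable die candidates on a round wafer."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

OTHER_BOM = 250  # assumed non-die cost per card (board, VRAM, cooler, ...)

# Hypothetical old-node vs new-node chips; all inputs are made up.
for node, wafer_price, die_mm2 in [("older node", 6_000, 390),
                                   ("newer node", 17_000, 295)]:
    die_cost = wafer_price / dies_per_wafer(die_mm2)
    total = die_cost + OTHER_BOM
    print(f"{node}: ~${die_cost:.0f} die + ${OTHER_BOM} rest = ~${total:.0f}")
```

Under these made-up inputs, even a near-tripling of wafer price moves the per-card cost by tens of dollars, not hundreds, because the die is a minority of the build cost. Whether the real inputs look anything like this is exactly what the next post asks.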
 
Good lord, it only takes a quick look at wafer pricing, plus the knowledge that the chips themselves make up only a minority of the overall build cost, to see that Nvidia's effective 100%+ price increase at nearly every tier is just absolutely absurd.
Can you provide us with a quick look at wafer prices for 8N and 4N processes?
 

Mining Companies Consider Repurposing Idle GPUs to HPC and AI Markets

July 6, 2023
Crypto companies that loaded up on GPUs in data centers for coin mining are now investigating whether to sell or repurpose the idle hardware to the exploding artificial intelligence market.

Hive Blockchain, which has 38,000 Nvidia GPUs, is repurposing the hardware for high-performance computing and artificial intelligence applications. The company started rethinking what to do with its GPUs — which are still used to mine Bitcoin and Ethereum — after the crypto market crashed.
...
Crypto companies are realizing that GPUs could generate more money running AI or cloud operations. Hive ran a pilot program where it dedicated its GPUs to AI and high-performance computing, which was a success.

“We have much to accomplish to utilize our full fleet of GPU cards, however we are very pleased that our beta project with only approximately 500 GPU cards generated $230,000 revenue this quarter,” said Frank Holmes, the company’s executive chairman, in a filing within the last month.
Hopefully the AI craze won't grow beyond what these companies' existing GPU inventory can absorb...
 
I think, maybe, there is a fundamental disconnect here with regard to the state of the discrete GPU market. I could be off, but my reading of the last few pages suggests there is a camp arguing, in effect, that prices reflect what the market will bear. In other words, the elevated GPU prices represent the balance of the vendors' costs and expectations and all those market considerations we've covered (rising transistor costs, ballooning design costs, general and industry-specific inflation, the opportunity cost of discrete GPUs vs. datacenter silicon, etc.). On the other hand, there seems to be a camp arguing, at least implicitly, that the market is a broken duopoly, and that the market leader is taking undue advantage of its position to aggressively inflate prices and offer scant performance-per-dollar upgrades over last gen.

So, is the market "broken" or is this just the market reality? That's what I'm trying to noodle out.
 
I think, maybe, there is a fundamental disconnect here with regard to the state of the discrete GPU market. I could be off, but my reading of the last few pages suggests there is a camp arguing, in effect, that prices reflect what the market will bear. In other words, the elevated GPU prices represent the balance of the vendors' costs and expectations and all those market considerations we've covered (rising transistor costs, ballooning design costs, general and industry-specific inflation, the opportunity cost of discrete GPUs vs. datacenter silicon, etc.). On the other hand, there seems to be a camp arguing, at least implicitly, that the market is a broken duopoly, and that the market leader is taking undue advantage of its position to aggressively inflate prices and offer scant performance-per-dollar upgrades over last gen.

So, is the market "broken" or is this just the market reality? That's what I'm trying to noodle out.

It's a little of both. The price is set at a level only about three quarters of the previous market can bear, roughly matching how far discrete GPU shipments are down. The price is also wherever the market leader chooses to set it. They could lower it and sell more consumer discrete GPUs, but they have no reason to when they could instead sell more AI silicon.

So, that allows them to maintain their high margins and even increase them, because there's no downside to the consumer market shrinking due to inflated pricing (they're still sticking to the inflated covid/crypto-mining-bubble pricing levels). Basically, any shrinkage in the consumer dGPU market due to high prices is more than offset by increased demand from the even-higher-margin, burgeoning AI market.

The competition also isn't interested in, or able to, significantly challenge that, as they too would prefer to shift silicon allotments to more lucrative AI buyers and to their console partners.

Regards,
SB
 
<snip> because there's no downside to the consumer market shrinking due to inflated pricing (they're still sticking to the inflated covid/crypto-mining-bubble pricing levels). Basically, any shrinkage in the consumer dGPU market due to high prices is more than offset by increased demand from the even-higher-margin, burgeoning AI market.</snip>
I wouldn't necessarily draw any connection between current IHV-set MSRPs and past market-inflated covid/crypto prices. You can see Nvidia and AMD deriving the present-day MSRPs from first principles instead of historical momentum. But that's almost an academic issue.

The important question (as you also contemplated) is whether the IHVs are indeed willing to discard an aging and shrinking market segment in pursuit of lush green AI pastures. Because once it's gone it's GONE. 25 years of history. Maybe I'm projecting my own emotional attachment but to me it doesn't seem to be a great idea for them to put all their eggs in one basket called AI. They need some hedges. IOW I'm disagreeing with you that "there's no downside". There is one, just not in the short term. I do think VERY highly of all 3 CEOs in charge of our favorite IHVs, and I believe they have enough foresight to make the right calls here.

All that said, I do think the xx50- and xx60-class discrete GPU market is done for. In the past we had "console-killer" xx60 or xx70 GPUs within 1-2 years of a console launch. That isn't going to happen again; the fundamentals just aren't there. No, I think the only interesting discrete GPU market is now relegated to the high-end ($500+) and ultra-high-end ($1000+) segments. Despite the well-specced configs of current-gen consoles, we're still seeing console games targeting 30fps and with visual compromises. So the traditional story of a high-end PC targeting higher fps and fidelity is still intact, just at a higher price bracket. The question is, is the market robust enough at these brackets to sustain the discrete GPU business without the cost-amortizing volume provided by the lower tiers? I will say this: I do not understand the laptop GPU market *at all*, and that's becoming an increasingly big blind spot preventing me from making any reasonable assessments.
 
So, that allows them to maintain their high margins and even increase them
[attachment: chart of Nvidia gross margins]

Where can we see these "high margins" and an increase in them?
 
[attachment: chart of Nvidia gross margins]

Where can we see these "high margins" and an increase in them?
What's nVidia's current market cap again?
 
Where can we see these "high margins" and an increase in them?
That data seems wrong. AFAIR GM has been consistently in the 60s over the past several years (ignoring the crypto busts). But that's a moot point because none of this is useful. Nvidia GM today is largely attributable to AI/DC business, but we don't have access to specific numbers. Without separate margin numbers specifically for discrete GPU products it's hard to draw any meaningful conclusion on whether profit margins have shrunk, increased or stayed the same for this segment.
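For illustration, a toy example of why the blended company-wide number can't settle this (all figures are invented): two very different segment-margin splits can produce the exact same blended GM.

```python
# (revenue in $B, gross margin) per segment; all figures invented.
def blended_gm(segments):
    revenue = sum(r for r, _ in segments.values())
    gross = sum(r * gm for r, gm in segments.values())
    return gross / revenue

scenario_a = {"datacenter": (10.0, 0.75), "gaming": (6.0, 0.45)}
scenario_b = {"datacenter": (10.0, 0.60), "gaming": (6.0, 0.70)}

for name, s in [("A", scenario_a), ("B", scenario_b)]:
    print(f"scenario {name}: blended GM = {blended_gm(s):.1%}")
```

Both scenarios print the same blended margin, yet gaming margins differ wildly between them. That's the ambiguity we're stuck with absent per-segment disclosure.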

What's nVidia's current market cap again?
Market cap is irrelevant. It's based on speculation about future earnings potential, which is betting almost entirely on explosive AI growth. It's certainly not betting on an explosion in desktop PC gaming!
 
Nvidia GM today is largely attributable to AI/DC business, but we don't have access to specific numbers.
Yes, but even with that, we're looking at GM falling over the course of the Lovelace launch, not increasing or sitting at historical heights. Nvidia's current margins are about on par with the 2011-2016 period, i.e. the Kepler/Maxwell/Pascal generations, which are generally lauded as some of the best in perf/price gains. The rise after that is linked to the first crypto mining boom, then covid, then the second boom. I'm struggling to find any indication of Lovelace bringing unusually high margins in any of the data on hand, yet people keep repeating it as if it were a proven fact.
Thus the questions (a multitude of them, really) remain.
 
 