> Why are you assuming that the pricing of both GPU vendors is because of "sky high margins"? Where can we see these margins?

There is no way margins on these Ada GPUs aren't sky high.
> There is no way margins on these Ada GPUs aren't sky high.

People on Twitter were estimating the cost of AD102, and it was like the most expensive GPU chip that went into a gaming product, ever.
> People on Twitter were estimating the cost of AD102, and it was like the most expensive GPU chip that went into a gaming product, ever.

What were the speculated costs?
Any reason to suspect that it's different down the line?
> What were the speculated costs?
So $288-320 for a $1,600 GPU.
> So $288-320 for a $1,600 GPU.

Yes, and? AFAIR this is about 10x more for the chip than what chips used to cost back in the days when the high end was at the ~$700 mark.
> Yes, and? AFAIR this is about 10x more for the chip than what chips used to cost back in the days when the high end was at the ~$700 mark.

Their margins have been steadily climbing for over a decade. And their R&D and other costs are not solely covered by GeForce sales. Many of those same costs cover their even higher margin sales in other sectors.
It's also the cost of the chip alone. To that you have to add Nvidia's "sky high margin" (which excludes the amortization of the R&D expenses that went into the architecture and chip development), the cost of chip packaging, then the rest of the board components, the cost of assembly, logistics, marketing, and the margins for partners (including those supplying all the components and doing the assembly) and retail.
> Their margins have been steadily climbing for over a decade.

Last time I checked, Nvidia's margin still hasn't changed much over the last several years. Where can we see these "sky high margins"?
> And their R&D and other costs are not solely covered by GeForce sales.

GeForce sales have to cover their part in all of this according to their revenue share. There is no reason why Nvidia would want to sponsor the gaming business just to have cards selling for cheap. They are not a charity organization.
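To put rough numbers on that stack-up, here is a purely illustrative sketch; every figure in it is an assumption invented for the example, not a known cost:

```python
# Purely illustrative cost stack-up from GPU die to shelf price.
# EVERY number below is an assumption made up for this sketch, not a known figure.
die_cost = 300                       # speculated AD102 die cost (~$288-320 above)
chip_price = die_cost * 2.5          # assumed chip margin + R&D amortization
board_bom = 450                      # assumed memory, VRM, PCB, cooler
assembly_and_logistics = 100         # assumed
partner_cost = chip_price + board_bom + assembly_and_logistics
partner_price = partner_cost * 1.10  # assumed 10% partner margin
retail_price = partner_price * 1.10  # assumed 10% retail margin
print(f"chip sold at ${chip_price:.0f}, card on the shelf at ~${retail_price:.0f}")
```

With those made-up layers, a ~$300 die still lands near a $1,600 shelf price, which is the point the post is making.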
> So $288-320 for a $1,600 GPU.

+ overhead
Totally haven't been steadily climbing.

NVIDIA Profit Margin 2010-2023 | NVDA
www.macrotrends.net

Well, from these we can see that the period when it was "steadily climbing" a) was when Nv was selling Pascal chips, and b) ended around the Turing launch?
> Well, from these we can see that the period when it was "steadily climbing" a) was when Nv was selling Pascal chips, and b) ended around the Turing launch?

Or there was a temporary dip because not many people wanted the highly overpriced Turing GPUs that offered little value over their predecessors. Margins continued to climb once more reasonably priced models that offered better value were released. Pascal was released in mid-2016. The gain during its run pales in comparison to the 25% gain seen leading up to its release. What an odd point to focus on.
> Or there was a temporary dip because not many people wanted the highly overpriced Turing GPUs that offered little value over their predecessors. Margins continued to climb once more reasonably priced models that offered better value were released.

Or maybe (just maybe) the dip was due to the same thing as right now and happened because of the crypto crash of late 2018. There is no "stable climbing" of margins after that.
I'm curious, what do you suggest? Selling at a loss and having an even worse quarterly report?
The 4090 is significantly cut down, however, so the yield will be much higher than that estimate.
Considering NV's hefty greater-than-60% margins, they would have to price their GPU kits massively lower to even start selling at a loss.
Regards,
SB
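As a toy version of that calculation (treating the quoted >60% as a flat gross margin on a $1,600 card; both numbers are assumptions):

```python
# Toy headroom calculation: how far the price could fall before dipping below cost.
# Assumes a flat 60% gross margin on a $1,600 card; both numbers are assumptions.
price = 1600
gross_margin = 0.60
cost_floor = price * (1 - gross_margin)  # $640 of cost per card
headroom = price - cost_floor            # $960 of possible cuts
print(f"cost floor ~${cost_floor:.0f}; up to ${headroom:.0f} in cuts before selling at a loss")
```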
| MSRP | Price Cut | Margin | Margin Loss % | Unit sales needed for $1M | Increase over base |
| --- | --- | --- | --- | --- | --- |
| $800 | $0 | $500 | 0% | 2000 | 0% |
| $750 | $50 | $450 | 10% | 2222 | 11% |
| $700 | $100 | $400 | 20% | 2500 | 25% |
| $600 | $200 | $300 | 40% | 3333 | 67% |
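The arithmetic behind the table is straightforward; a minimal sketch that reproduces it, assuming a fixed $300 unit cost (implied by the $800 MSRP / $500 margin base row) and a $1M gross profit target:

```python
# Reproduces the table above. Assumes a fixed $300 unit cost (implied by the
# $800 MSRP / $500 margin base row) and a $1M gross profit target.
UNIT_COST = 300
TARGET = 1_000_000
BASE_MARGIN = 800 - UNIT_COST      # $500
BASE_UNITS = TARGET / BASE_MARGIN  # 2000 units

for msrp in (800, 750, 700, 600):
    margin = msrp - UNIT_COST
    units = TARGET / margin
    print(f"${msrp}: cut ${800 - msrp}, margin ${margin}, "
          f"margin loss {1 - margin / BASE_MARGIN:.0%}, "
          f"units {units:.0f} ({units / BASE_UNITS - 1:+.0%} vs base)")
```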
> It's not really that simple though. Let's use the numbers from Dr Ian Cutress as an example. If the cost per die is $288 (it could be lower if yields are better, but not likely significantly lower), with a 60% margin they are making ~$170 per chip. Now of course there are other costs, including packaging, shipping, operating costs, etc., but it's unlikely that the chip for a $1,600 product costs something like $500 or $600. If we accept that number, then it's clear that even if NVIDIA sells at cost, the whole thing is likely to be just $200 or maybe at most $300 less. Is a $1,300 4090 better than a $1,600 4090? Of course it's better, but probably not that better.

While the example is for the 4090, this discussion is not really about it, is it? Especially because the 4090 did not see as much of a price increase as the 4080 and 4070 did. Plus those chips are smaller, so they can get more per wafer and probably higher yields too. So if an AD103 is 295 mm² and they can get 1.5x more per wafer (134) with a yield of, say, 90%, that's 120 good dies, and the cost per good die would be ~$141.
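A minimal sketch of that per-die arithmetic, assuming a ~$17,000 wafer price (inferred from $141 x 120 good dies; the candidate counts and yields are the thread's guesses, not official figures):

```python
# Hedged sketch of the die-cost arithmetic in the posts above.
# The ~$17,000 wafer price is inferred from $141 x 120 good dies;
# candidate counts and yields are the thread's guesses, not official figures.
WAFER_COST = 17_000

def cost_per_good_die(candidates: int, yield_rate: float) -> float:
    good_dies = int(candidates * yield_rate)
    return WAFER_COST / good_dies

# AD103 (295 mm^2 per the post): ~134 candidates at ~90% yield -> 120 good dies
print(f"AD103: ${cost_per_good_die(134, 0.90):.0f}")  # ~$142 (~$141 in the post)
# AD102: ~89 candidates (134 / 1.5, per the post); the ~$288 estimate
# implies a yield of roughly two thirds
print(f"AD102: ${cost_per_good_die(89, 0.67):.0f}")   # ~$288
```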