Nvidia GeForce RTX 4070 TI Reviews

So $288-320 for a $1,600 GPU.
Yes, and? AFAIR this is about 10X more for the chip than what chips used to cost back in the days when high end was at the ~$700 mark.
It's also the cost of the chip alone, to which you have to add the "sky high margin" of Nvidia (which excludes the amortization of the R&D expenses that went into the architecture and chip development), the cost of chip packaging, then the rest of the board components, the cost of assembly, logistics, marketing, and the margins for partners (including those supplying all the components and doing the assembly) and retail.
Last time I checked, Nvidia's margin still hasn't changed much over the last several years. Where can we see these "sky high margins"?
 
Their margins have been steadily climbing for over a decade. And their R&D and other costs are not solely covered by GeForce sales. Many of those same costs also support their even higher-margin sales in other sectors.
 
Their margins have been steadily climbing for over a decade.
[Attached image: 1672888783989.png]

Checked the reports. Got bored around 2018.
Where are the "sky high margins"?

And their R&D and other costs are not solely covered by GeForce sales.
GeForce sales have to cover their part in all of this according to their revenue share. There is no reason why Nvidia would want to subsidize the gaming business just to have cards selling for cheap. They are not a charity organization.
 

Totally haven't been steadily climbing.
 


Net margins right around AMD’s recently. Are you saying companies should operate at a loss?
 
Well, from these we can see that the period when it was "steadily climbing" a) was when Nv was selling Pascal chips and b) ended around the Turing launch?
Or there was a temporary dip because not many people wanted the highly overpriced Turing GPUs that offered little value over their predecessors. Margins continued to climb once models that were more reasonably priced and offered better value were released. Pascal was released in mid-2016. The gain during its run pales in comparison to the 25% gain seen leading up to its release. What an odd point to focus on.
 
Or there was a temporary dip because not many people wanted the highly overpriced Turing GPUs that offered little value over their predecessors. Margins continued to climb once models that were more reasonably priced and offered better value were released.
Or maybe (just maybe) the dip was due to the same thing as right now and happened because of the crypto crash of late 2018. There is no "steady climbing" of margins after that.
 
Considering NV's hefty greater-than-60% margins, they would have to price their GPU kits massively lower in order to even start selling at a loss (see the quick illustration below).

Regards,
SB
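A quick illustration of that point, with placeholder numbers: the ~60% gross margin is the figure quoted in the thread, and the $1,600 price is just an example, not a claim about NVIDIA's actual cost structure.

```python
# Rough illustration only: with a ~60% gross margin, the implied cost of
# goods is ~40% of the selling price, so the price could fall by roughly
# 60% before the product sells below cost. Numbers are placeholders.
gross_margin = 0.60
price = 1600.0                      # example card price (e.g. a 4090 at MSRP)

cogs = price * (1 - gross_margin)   # implied cost of goods: ~$640
headroom = price - cogs             # ~$960 of price-cut room before a loss

print(f"implied cost of goods:            ${cogs:.0f}")
print(f"price cut possible before a loss: ${headroom:.0f}")
```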

It's not really that simple though. Let's use the numbers from Dr Ian Cutress as an example. If the cost per die is $288 (it could be lower if yields are better, but it's not likely to be significantly lower), then with a 60% margin on that cost they are making ~$170 per chip, i.e. selling it for roughly $460. Now of course there are other costs, including packaging, shipping, operating costs, etc., but it's unlikely that the chip for a $1,600 product costs something like $500 or $600.
If we accept that number, then it's clear that even if NVIDIA sold the chip at cost, the whole thing would likely be just $200 or maybe at most $300 less. Is a $1,300 4090 better than a $1,600 4090? Of course it's better, but probably not that much better.
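A back-of-the-envelope version of that arithmetic, treating the 60% figure as a margin applied to the $288 die-cost estimate; that interpretation, and ignoring packaging/shipping/etc., are assumptions made for illustration only.

```python
# Sketch of the per-chip math above. The $288 die cost is the estimate
# attributed to Dr Ian Cutress; applying 60% to that cost is an assumed
# reading of "60% margin", not NVIDIA's actual pricing.
die_cost = 288.0                       # estimated AD102 cost per die, USD
margin_on_cost = 0.60

chip_profit = die_cost * margin_on_cost        # ~$173 per chip
chip_price = die_cost + chip_profit            # ~$461 charged for the chip

msrp_4090 = 1600.0
price_if_chip_at_cost = msrp_4090 - chip_profit

print(f"profit per chip:                ${chip_profit:.0f}")
print(f"implied chip selling price:     ${chip_price:.0f}")
print(f"4090 if the chip were at cost: ~${price_if_chip_at_cost:.0f}")
# Roughly: $173 profit, $461 chip price, ~$1,427 card price -- a saving
# on the order of $200, in line with the post's $200-$300 range.
```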
 
The issue, I think, is that people are looking at businesses from a very simple, basically "mom and pop" restaurant type of perspective, in which your goods are not durable and you might very well just be pricing at cost-plus. Whereas technology products, and GPUs in particular, are durable goods for one (more on this later). More importantly, the consideration is not just immediate per-unit profitability but opportunity.

Let's make some assumptions with the RTX 4070 Ti. Let's assume that out of the $800 base MSRP, $700 goes to Nvidia (the remaining $100 covers the AIB, retailer, distribution, etc.). Let's give it a very generous margin of $500 (or about 71.4% on $700 of revenue).

MSRP | Price Cut | Margin | Margin Loss % | Unit Sales Needed for $1M | % Increase Over Base
$800 | $0        | $500   | 0%            | 2000                      | 0%
$750 | $50       | $450   | 10%           | 2222                      | 11%
$700 | $100      | $400   | 20%           | 2500                      | 25%
$600 | $200      | $300   | 40%           | 3333                      | 67%
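A minimal sketch that reproduces the table's numbers under the same assumptions ($700 to Nvidia per card, $500 base margin, and an arbitrary $1M gross-profit target):

```python
# Reproduces the table above. The $1M target is arbitrary; only the
# ratios matter: every dollar cut from the price comes straight out of
# the per-unit margin, so the required unit volume grows faster than
# the price cut itself.
base_price = 800.0
base_margin = 500.0
target_profit = 1_000_000.0

base_units = target_profit / base_margin           # 2000 units

print("MSRP  | cut  | margin | margin loss | units for $1M | vs base")
for cut in (0, 50, 100, 200):
    margin = base_margin - cut
    units = target_profit / margin
    print(f"${base_price - cut:>4.0f} | ${cut:>3} | ${margin:>4.0f}  | "
          f"{cut / base_margin:>6.0%}      | {units:>6.0f}        | "
          f"{units / base_units - 1:>4.0%}")
```

The last two columns are the point: a $50 (6%) price cut already needs ~11% more unit sales just to hold gross profit flat, and a $200 (25%) cut needs ~67% more, which a shrinking market may simply not supply.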

It's counterintuitive, but in a contracting market in which increased sales opportunities aren't present, chasing them via aggressive pricing might not make sense.

Now, back to GPUs being durable goods: this adds additional considerations, as they aren't perishable and people do not constantly consume them (unlike the enthusiast-circle groupthink in which you must upgrade every new release, that is not how it works for the vast majority of the market). As such, a sale now effectively also means that a future sale is gone. The majority of customers who buy a GPU will not look at upgrading it again for at least 4+ years, and given the current longevity from a functional standpoint I can see it being even longer.

However, back to pricing: I think the main issue people are having this generation is that we are not seeing the typical value scaling down the product stack. The RTX 4090 is actually fine and in line with past expectations. It's the 4080 and 4070 Ti that are not. So the actual question is: why has the business model shifted to no longer offering more value down the stack as opposed to up the stack?

Lastly, I still have a huge issue with people saying current prices are actually pricing people out of PC gaming. Can the people who actually believe this give some substantial supporting arguments and numbers on what they feel the current GPU price of entry is for PC gaming? What constitutes a GPU that acts as an entry into PC gaming? And if they believe it's, say, $800 with the RTX 4070 Ti (or even $1,600 for the 4090), why is that level of GPU the entry point to PC gaming? Also, what is the actual price for a console-equivalent GPU currently? If they think it's a 4070 Ti at $800, explain why that is so as opposed to something lesser (e.g. the 6700 XT).
 
It's not really that simple though. Let's use the numbers from Dr Ian Cutress as an example. If the cost per die is $288 (it could be lower if yields are better, but it's not likely to be significantly lower), then with a 60% margin on that cost they are making ~$170 per chip, i.e. selling it for roughly $460. Now of course there are other costs, including packaging, shipping, operating costs, etc., but it's unlikely that the chip for a $1,600 product costs something like $500 or $600.
If we accept that number, then it's clear that even if NVIDIA sold the chip at cost, the whole thing would likely be just $200 or maybe at most $300 less. Is a $1,300 4090 better than a $1,600 4090? Of course it's better, but probably not that much better.
While the example is for the 4090, this discussion is not really about it, is it? Especially because the 4090 did not see as much of a price increase as the 4080 and 4070 Ti did. Plus those chips are smaller, so they can get more per wafer and probably higher yields too. So if an AD104 is ~295 mm² and they can get 1.5x more dies per wafer (134) with a yield of, say, 90%, so ~120 good dies, the cost per good die would be ~$141.

$1,600 for a $288 die (5.5x) vs. $799 for a $141 die (5.6x). What does this tell us? That Nvidia is trying to extract as much margin from the lower-end parts as from the top halo part. From a lower-end part that traditionally sold more, so economies of scale apply, margins are usually lower, and high profits come from sheer volume. I'm sorry, but I don't buy that this is all just because of high wafer costs. Unless there is so much competition at the foundry that they cannot get as many wafers as they would like, so they can't scale production accordingly? Or worse, Ada has bad yields, although we didn't hear anything about that.
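A quick sanity check of that wafer math; the ~$17,000 wafer price, the ~134 candidate dies per wafer, and the 90% yield are all assumptions taken from (or implied by) the post, not published figures.

```python
# Sanity check of the AD104 numbers above. Wafer price, die count and
# yield are assumptions from the post, not disclosed figures.
wafer_price = 17_000.0                 # assumed price of a TSMC "4N" wafer, USD
ad104_candidates = 134                 # ~1.5x the AD102 die count, per the post
ad104_yield = 0.90

good_dies = int(ad104_candidates * ad104_yield)    # ~120 good dies per wafer
cost_per_good_die = wafer_price / good_dies        # ~$142 (the post says ~$141)

ratio_4090 = 1600 / 288                            # MSRP / estimated die cost
ratio_4070ti = 799 / cost_per_good_die

print(f"good AD104 dies per wafer: {good_dies}")
print(f"cost per good AD104 die:   ${cost_per_good_die:.0f}")
print(f"4090 price / die cost:     {ratio_4090:.2f}x")     # ~5.56x
print(f"4070 Ti price / die cost:  {ratio_4070ti:.2f}x")   # ~5.64x
# Both ratios land in the same ~5.5-5.6x range, which is the post's point:
# the smaller die is priced at roughly the same multiple as the halo part.
```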
 