NVIDIA discussion [2024]

The internal target is over $110 billion this year... in DC revenue alone (Q1/Q2/Q3/Q4 of $22B / $26B / $30B / $34B).
So this year NVIDIA would capture roughly a quarter of the $400 billion TAM Dr. Lisa Su projected for 2027. Her market predictions seem pretty accurate. :yep2:
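For anyone who wants to check the arithmetic, a quick back-of-the-envelope in Python (the quarterly split is the rumored internal target above; the $400B is Dr. Lisa Su's 2027 AI accelerator TAM figure):

```python
# Back-of-the-envelope: rumored quarterly DC revenue vs. the $400B 2027 TAM.
dc_quarters_b = {"Q1": 22, "Q2": 26, "Q3": 30, "Q4": 34}  # DC revenue targets, $B
tam_2027_b = 400  # Dr. Lisa Su's 2027 AI accelerator TAM estimate, $B

total_dc_b = sum(dc_quarters_b.values())
print(f"Full-year DC target: ${total_dc_b}B")                    # $112B
print(f"Share of the 2027 TAM: {total_dc_b / tam_2027_b:.1%}")   # 28.0%
```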
 
[Image: NVIDIA Q1 FY25 earnings segment breakdown]
 
The Gaming division has long been more of a side business for the company than something Nvidia stands or falls on, and gamers matter less and less to it. Even among this division's highlights, the company mentions not only the introduction of RTX technologies into games such as Star Wars Outlaws and Black Myth: Wukong, but above all its AI successes: AI in games (Nvidia ACE), AI optimization within Windows, and support for Google's Gemma models and the ChatRTX feature. The division brought in "only" USD 2.65 billion, so the segment that was once Nvidia's most important now accounts for roughly 10% of the company. That is an 18% year-on-year increase (from USD 2.24 billion), but weakness in the mobile market caused an 8% decline versus the previous quarter.
Gone are the days when the server division and gaming each made up about half of the revenue.
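A quick sanity check on those figures in Python; the only number not quoted above is the company total, which is simply what a ~10% share implies:

```python
# Sanity check on the gaming figures quoted above.
gaming_now_b = 2.65     # this quarter's gaming revenue, $B
gaming_yr_ago_b = 2.24  # same quarter last year, $B
qoq_decline = 0.08      # stated ~8% drop vs. the previous quarter

print(f"YoY growth: {gaming_now_b / gaming_yr_ago_b - 1:.0%}")                  # ~18%
print(f"Implied previous quarter: ~${gaming_now_b / (1 - qoq_decline):.2f}B")   # ~$2.88B
print(f"Company total implied by a ~10% share: ~${gaming_now_b / 0.10:.1f}B")   # ~$26.5B
```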
 
In the same period, AMD's gaming revenue is down 48% YoY. Their stated reason is a weak gaming market, yet NVDA is up 18%.
In that context, it's a good result...
 
Crazy that gaming is just 10% of overall revenue now. Just over two years ago DC overtook gaming, and today gaming is almost a blip on NV's radar. And automotive hasn't fully taken off yet; it should also grow substantially next year with Drive Thor.

They also hit their highest gross margin ever: 78.4% GAAP and 78.9% non-GAAP. I'm curious why NV has guided margin down for the next quarter, though. They project revenue of $28B ±2% and GAAP and non-GAAP margins of 74.8% and 75.5%, respectively, plus or minus 50 basis points.
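Running the guidance numbers, just to see what they imply in absolute terms (figures are from the guidance above; the calculation itself is mine):

```python
# What the next-quarter guidance implies in absolute gross profit.
rev_mid_b = 28.0        # guided revenue midpoint, $B (+/- 2%)
gaap_gm_mid = 0.748     # guided GAAP gross margin (+/- 50 bp)
q1_gaap_gm = 0.784      # this quarter's record GAAP gross margin

gp_low = rev_mid_b * 0.98 * (gaap_gm_mid - 0.005)
gp_high = rev_mid_b * 1.02 * (gaap_gm_mid + 0.005)
print(f"Guided GAAP gross profit: ${gp_low:.1f}B to ${gp_high:.1f}B")   # ~$20.4B to $21.5B
print(f"Margin step-down vs. this quarter: {(q1_gaap_gm - gaap_gm_mid) * 100:.1f} pp")  # 3.6 pp
```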
 

They mentioned component pricing was lower over the last few quarters, which reduced NV's costs. Those discounts are presumably over.
 
Nvidia just made $14 billion worth of profit in a single quarter thanks to AI chips, and it’s hitting the gas from here on out: Nvidia will now design new chips every year instead of once every two years, according to Nvidia CEO Jensen Huang.

“I can announce that after Blackwell, there’s another chip. We’re on a one-year rhythm,” Huang just said on the company’s Q1 2025 earnings call.
Until now, Nvidia’s produced a new architecture roughly once every two years — revealing Ampere in 2020, Hopper in 2022, and Blackwell in 2024, for example.
 
I wonder what percentage of AD102 (and AD104) gross profit comes from AI/datacenter inference (L40/L4) and RTX Ada versus gaming/GeForce. I suspect they are selling a *lot* fewer L40/L4s than H100s, but it might still be a very significant percentage (or even a majority!) of AD102, which would make those chips extremely profitable overall when you add both segments together.
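To put that intuition into numbers, here's a toy sketch; every input (unit volumes, ASPs, margins) is an invented placeholder, not a real NVIDIA figure, and it just shows how far fewer units at datacenter pricing can still dominate a die's gross profit:

```python
# Toy illustration only: every number here is a made-up placeholder, not an
# actual NVIDIA figure. It shows how a much smaller unit count at datacenter
# pricing/margins can still end up as the majority of a die's gross profit.
def gross_profit_m(units_m: float, asp_usd: float, margin: float) -> float:
    """Gross profit in $M given units (millions), ASP ($) and gross margin."""
    return units_m * asp_usd * margin

geforce = gross_profit_m(units_m=5.0, asp_usd=1_200, margin=0.55)   # hypothetical
l40_l4 = gross_profit_m(units_m=1.0, asp_usd=8_000, margin=0.80)    # hypothetical

total = geforce + l40_l4
print(f"GeForce share of AD102 gross profit:   {geforce / total:.0%}")  # ~34%
print(f"L40/L4-style share of gross profit:    {l40_l4 / total:.0%}")   # ~66%
```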

So my prediction is that NVIDIA will continue to invest strongly in gaming chips, but that they will aggressively push developers towards more AI tech in games in order to maximise the overlap between the two markets, whether that's DLSS/denoising or full-blown GPT-4o-level AI(/LLM for now) NPCs.

I have very mixed feelings about AGI NPCs; we might be closer to this happening organically than we think ;)

 
Probably, but that has nothing to do with AI or datacenter. It's due to the lack of competition in gaming hardware.
Are you trying to convince me, or yourself? ;) From a business point of view, and a purely economic one, if one product makes up only 1/10 of income, it is completely logical that the company will prioritize the products that bring in the other 90% of revenue, and if it still wants to keep the minority product on the market, its price will either skyrocket or it will be discontinued. (If I were in Huang's place I would double the price of gaming GPUs and there you have it; as you said, there's no competition.)
 
From a business point of view, and a purely economic one, if one product makes up only 1/10 of income, it is completely logical that the company will prioritize the products that bring in the other 90% of revenue, and if it still wants to keep the minority product on the market, its price will either skyrocket or it will be discontinued
What business point of view? Amateur business?

Automotive has been giving NVIDIA peanuts in revenue for over a decade; did you see them end it or skyrocket its prices?

Console SoCs are a very small part of NVIDIA's revenue; did you see them raise prices like crazy for those SoCs? Did you see NVIDIA end development of such SoCs?

Custom SoCs for partners like MediaTek are a small part of NVIDIA's revenue; do you see them offering those SoCs to MediaTek at absurd prices?

The fact is, real business is about placing each product in its own bubble, not in other products' bubbles. Gaming GPUs exist in their own bubble and overlap with multiple other products (pro visualization, automotive, consoles, custom SoCs for PCs, etc.); they will be treated according to the rules governing that bubble. They will not be discontinued or given exorbitant prices; they will be priced according to competition and demand, just like any other product.
 
Are you trying to convince me, or yourself? ;) From a business point of view, and a purely economic one, if one product makes up only 1/10 of income, it is completely logical that the company will prioritize the products that bring in the other 90% of revenue, and if it still wants to keep the minority product on the market, its price will either skyrocket or it will be discontinued. (If I were in Huang's place I would double the price of gaming GPUs and there you have it.)
You can't just spend 10x as much on AI GPU R&D and get 10x better results; The Mythical Man-Month applies here, among other things. As long as these are independent projects and don't decrease efficiency or increase risk for their primary cash cow (AI), it's just good diversification, and it's what every company does.

I am just speculating here, but what I would expect from an engineering perspective is that the "shared" parts of the design, like the SM processor, will be optimised more and more towards AI, and the graphics engineering teams will struggle more to get the features they want into the SM processor because they have to compete with AI features for the time of a relatively small number of specialised engineers/architects. On the other hand, if they want to design a new TPU with practically the same interface to the rest of the SM as the current one? Go wild. Want to improve raytracing, again without changing the rest of the SM too much? Again, go wild. The return on investment for improving perf/mm2 and perf/watt of the graphics-specific parts is still extremely good with ~$10B of revenue per year... as long as it doesn't affect their AI roadmap.

We have actually already seen this happen: Turing was a massive change for graphics, with lots of graphics-related changes in the SM processor as well, but the Ampere and Ada changes to the SM were practically all focused on AI, with the graphics-focused changes mostly being things like raytracing and the cache hierarchy.
 
Well, one thing is certain: Nvidia gaming graphics will continue to get extremely expensive.
"Continue" as in it already is getting "extremely expensive" somewhere?
Right now you can buy an Nvidia GPU for $45 and an RTX GPU for $170.
I don't expect this to change much - unless iGPUs start being capable of competing at these price points.

Also yeah, the fact that gaming perf/price hasn't increased much lately has zero to do with AI.
 
NVIDIA doubled compute throughput with Ampere and increased clock rates by ~50% with Lovelace. I think these two changes are huge for gaming.

In the end it doesn't matter when the baseline is the outdated GPUs in the consoles, which are pre-Turing level...
 
I am just speculating here, but what I would expect from an engineering perspective is that the "shared" parts of the design, like the SM processor, will be optimised more and more towards AI, and the graphics engineering teams will struggle more to get the features they want into the SM processor because they have to compete with AI features for the time of a relatively small number of specialised engineers/architects.
I dunno about that. The fact that AI can now sustain its own R&D says to me that the divergence between gaming and AI h/w will likely increase, because there will essentially be separate teams working on h/w for these two markets instead of one team working on h/w for everything. Maybe not to the degree of AMD's RDNA/CDNA split (which never made much sense to me, really), but enough to warrant a vastly different SM design between HPC/AI and gaming h/w.
It is already the case, really: Volta vs. Turing, the different versions of Ampere, Hopper vs. Lovelace. Whether this will break down the synergy to the point where the AI side would prevent the gaming team from implementing something in gaming h/w is hard to tell, but I wouldn't expect that to happen any more than I'd expect Nvidia to suddenly stop caring about gaming.
 