Acres and acres of Nvidia GPUs. Can you imagine a field of multi-million-dollar Nvidia GPU racks powering the next wave of super-advanced AI? That's what Oracle just described in a bombshell earnings report. I can't remember many earnings reports where a company literally flexes another company's technology as the reason it is accelerating and generating record-breaking revenue, but here we are.
But here is the thing that people don't get about Nvidia's revenue model for data centers, and I think it needs repeating. It came up on Nvidia's last earnings call, when an analyst asked Jensen why Nvidia doesn't distribute its own chips directly. Jensen's answer was essentially no: "we work directly through our OEM/ODMs to fulfill our distribution and that's how it will always be."
But another analyst question was even more telling, because it showed the analyst doesn't realize how Nvidia's data center business model actually works. When major cloud providers, including Oracle, purchase Nvidia GPU hardware, they have two options for what to do with those GPUs:
- They use the hardware for their own compute needs and sell the output through an offering such as an LLM, cloud gaming, or other accelerated compute services. An end product, if you will.
- They lease the GPUs out as raw hardware to enterprises and businesses, such as startups, that want to run their own accelerated compute. This offering is called DGX Cloud.
The second delivery method is, in many cases, recurring revenue. A startup doesn't have to worry about building out a data center and installing on-prem Nvidia hardware when it can simply lease a node directly from any of the major cloud providers. And here's the thing: when you use that Nvidia hardware, you are also paying for the underlying DGX platform capabilities, including CUDA.
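To make that concrete, here is a minimal sketch of what a startup's workload sees on a leased GPU node. It assumes a Python environment with PyTorch installed; nothing here is tied to any particular cloud provider's API, it simply shows that the leased node exposes the same CUDA stack the code would target on-prem, which is exactly the platform layer being paid for.

```python
# Minimal sketch: what a workload sees on a leased Nvidia GPU node.
# Assumes PyTorch is installed; provider-specific details are omitted.
import torch

if torch.cuda.is_available():
    device = torch.device("cuda")
    print("GPU:", torch.cuda.get_device_name(0))    # e.g. an H100 on the leased node
    print("CUDA runtime:", torch.version.cuda)      # the platform layer Nvidia licenses

    # The actual work runs on the leased hardware exactly as it would on-prem.
    x = torch.randn(4096, 4096, device=device)
    y = x @ x  # matrix multiply dispatched through CUDA kernels
    print("Result shape:", tuple(y.shape))
else:
    print("No CUDA device visible; falling back to CPU.")
```

The point is that the developer experience is identical whether the GPU sits in your own rack or in Oracle's: the CUDA/DGX software layer travels with the hardware, and that's where the recurring piece lives.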
...
Why are the "G" and the "B" so important? Think about this: all of the revenue Nvidia has booked TO DATE has come solely from the H100, not even the H200/GH200 AI factory systems. lol, think about that. ALL OF THOSE BILLIONS and BILLIONS OF DOLLARS have come via the H100 chip alone. The H200/GH200 only recently came out, so while customers are lining up to purchase H200s, the real platform, the GH200 SuperPOD server systems, has probably not even begun to take hold, and there is already a lot of anticipation for the more powerful GB200 systems.
So you see, when Jensen told that analyst that no, Nvidia won't sell direct as a cloud vendor, it's because they don't have to: they are already effectively delivering as a cloud provider through stronger contractual agreements, while letting others profit and eat from the hardware purchase, which is exactly what Oracle reported today.
Others buy the hardware, and Nvidia reaps the benefit of that sale plus the platform instance leasing across the entire stack, including software, which will always be recurring revenue.
In this way, Nvidia won't have a hard landing; in fact, it will be one of the largest companies the world has ever seen, and it already is.
People just don't realize that Nvidia is a cloud company in its own way. It's just doing it in a way where everyone eats at its table. It's really amazing when you think about it.
There you have it, folks: acres and acres of recurring revenue through DGX Cloud and CUDA software licensing.