NVIDIA discussion [2025]

Here in Germany there was maybe one 5090 available to buy. I don't know what the point is of a GPU that, even at these prices, can't actually be bought by gamers.

/edit: It is even more absurd. In Germany it's nearly impossible to buy a new 4090. This makes the 5080 the fastest available GPU on the market...
 
Source? Everyone is anxious about where the returns will come from, but I haven't seen any signs of capex slowing down.
Microsoft CFO Amy Hood said capital spending in the third and fourth quarters would remain around the $22.6 billion level seen in the second quarter.
"In fiscal 2026, we expect to continue to invest against strong demand signals. However, the growth rate will be lower than fiscal 2025 (which ends in June)," she said.


I may be misinterpreting that statement, but the way I read it is that FY26 will not see as heavy an investment in AI hardware... Unfortunately I wasn't able to attend the call, but I'll definitely be attending the next investor call.

For me, Nvidia has 2-3 quarters left and then we should start seeing revenue either consolidate or regress. I don't see the next accelerant to keep the momentum going, and if they're going to be on Blackwell/refresh until 2027, I don't have the highest of hopes.

As an aside, Nvidia's gross margin is now ~75%... Absolutely disgusting, imo. It would be great if they actually paid out good dividends, but they don't.
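To put that margin figure in perspective, here is a minimal sketch of what a ~75% gross margin implies, using entirely hypothetical numbers (not Nvidia's actual cost structure):

```python
# Rough sketch of what a ~75% gross margin implies; all numbers are made up.
# Gross margin = (revenue - cost of goods sold) / revenue.

def gross_margin(revenue: float, cogs: float) -> float:
    """Gross margin as a fraction of revenue."""
    return (revenue - cogs) / revenue

# Hypothetical: a data-center GPU sold for $30,000 at a 75% gross margin
# would carry roughly $7,500 in cost of goods sold.
price = 30_000.0
margin = 0.75
implied_cogs = price * (1 - margin)

print(f"implied COGS: ${implied_cogs:,.0f}")                      # $7,500
print(f"check margin: {gross_margin(price, implied_cogs):.0%}")   # 75%
```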
 
Microsoft CFO Amy Hood said capital spending in the third and fourth quarters would remain around the $22.6 billion level seen in the second quarter.
"In fiscal 2026, we expect to continue to invest against strong demand signals. However, the growth rate will be lower than fiscal 2025 (which ends in June)," she said.


I may be misinterpreting that statement, but the way I read it is that FY26 will not see as heavy an investment in AI hardware... Unfortunately I wasn't able to attend the call, but I'll definitely be attending the next investor call.

For me, Nvidia has 2-3 quarters left and then we should start seeing revenue either consolidate or regress. I don't see the next accelerant to keep the momentum going, and if they're going to be on Blackwell/refresh until 2027, I don't have the highest of hopes.

As an aside, Nvidia's gross margin is now ~75%... Absolutely disgusting, imo. It would be great if they actually paid out good dividends, but they don't.

Lower growth rate means they're still growing their spend, just more slowly, which makes perfect sense. The red flag for Nvidia is when spending starts declining. These companies can't keep spending forever, of course, so Nvidia will need to find other customers to maintain their rosy growth picture.
 
Growth rate slowing means that instead of going from $80 billion (in 2025) to $120 billion in 2026, it will be $95 billion instead.
Lower growth rate means they're still growing their spend, just more slowly, which makes perfect sense. The red flag for Nvidia is when spending starts declining. These companies can't keep spending forever, of course, so Nvidia will need to find other customers to maintain their rosy growth picture.
I know what growth rate traditionally means... However, in this instance, I just think Amy is being coy. We'll see in Q1 FY26.
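To make the growth-rate distinction in this exchange concrete, here is a quick sketch using the hypothetical capex figures from the reply above (illustrative only, not actual guidance):

```python
# Illustrative only: the $80B / $120B / $95B figures come from the post above,
# not from any actual Microsoft guidance.

def growth_rate(prev: float, curr: float) -> float:
    """Year-over-year growth as a fraction of the previous year."""
    return (curr - prev) / prev

capex_2025 = 80.0   # $B, hypothetical baseline
fast_2026 = 120.0   # $B, spend if the growth rate were maintained
slow_2026 = 95.0    # $B, spend with a "lower growth rate"

print(f"maintained growth: {growth_rate(capex_2025, fast_2026):.0%}")  # 50%
print(f"slower growth:     {growth_rate(capex_2025, slow_2026):.0%}")  # ~19%
# In both cases the absolute spend still rises; only the rate of increase falls.
```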
 
For me, Nvidia has 2-3 quarters left and then we should start seeing revenue either consolidate or regress. I don't see the next accelerant to keep the momentum going, and if they're going to be on Blackwell/refresh until 2027, I don't have the highest of hopes.
I highly doubt it. Rubin has already been taped out and brings another big gain in AI performance over Blackwell NVL72, both in training and inference. The leading AI models will benefit tremendously from it, so they won't lower their investment in the next hardware cycle, unless they want to lose to the competition...
 
I highly doubt it. Rubin has already been taped out and brings another big gain in AI performance over Blackwell NVL72, both in training and inference. The leading AI models will benefit tremendously from it, so they won't lower their investment in the next hardware cycle, unless they want to lose to the competition...
Wait, who's Nvidia's competitor? It's not AMD, it's not Intel... I'm not an expert in the AI field, but the leading AI models use the transformer architecture. If we choose to believe the experts, then that will certainly not lead to the creation of AGI, which everyone is chasing. Furthermore, transformer architectures are reaching their scaling limit.

So like I said, until there is a new accelerant to fuel new hardware purchases, I don't see how this momentum will continue. Instead, what will happen is a period of consolidation until a new breakthrough arises. When that new breakthrough arises, will Nvidia still be at the forefront? I don't know and cannot say...
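For context on the "scaling limit" claim, here is a rough sketch of the diminishing-returns shape that Chinchilla-style scaling-law fits describe; the constants below are illustrative assumptions, not published values:

```python
# Chinchilla-style parametric loss: L(N, D) = E + A / N**alpha + B / D**beta,
# where N is parameter count and D is training tokens.
# The constants here are assumptions chosen for illustration, not fitted values.

E, A, B = 1.7, 400.0, 410.0
alpha, beta = 0.34, 0.28

def loss(n_params: float, n_tokens: float) -> float:
    return E + A / n_params**alpha + B / n_tokens**beta

# Each 10x increase in model size (with tokens scaled in proportion) buys a
# smaller absolute drop in loss, which is the "diminishing returns" behind the debate.
for n in (1e9, 1e10, 1e11, 1e12):
    print(f"N = {n:.0e}, D = {20 * n:.0e} -> loss ~ {loss(n, 20 * n):.3f}")
```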
 
Wait, who's Nvidia's competitor? It's not AMD, it's not Intel...
Nvidia's competition is itself, just like in the consumer GPU space. They must continue to innovate to grow and please their shareholders.

I'm not an expert in the AI field, but the leading AI models use the transformer architecture. If we choose to believe the experts, then that will certainly not lead to the creation of AGI, which everyone is chasing. Furthermore, transformer architectures are reaching their scaling limit.
So like I said, until there is a new accelerant to fuel new hardware purchases, I don't see how this momentum will continue. Instead, what will happen is a period of consolidation until a new breakthrough arises. When that new breakthrough arises, will Nvidia still be at the forefront? I don't know and cannot say...
Transformers are far from their limit. We are just at the very beginning of code optimization. Just look at DeepSeek and what they did to lower the hardware requirements. And contrary to what uneducated opinion thinks, this kind of breakthrough won't slow down the market but will accelerate AI adoption, i.e. the Jevons paradox.
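As a rough numerical illustration of the Jevons paradox argument above (all numbers are hypothetical): if an efficiency breakthrough cuts the cost per unit of AI work but unlocks an even larger increase in usage, total compute spend still rises.

```python
# Hypothetical illustration of the Jevons paradox argument: an efficiency gain
# lowers the cost per unit of AI work, but if demand grows by more than the
# efficiency gain, total compute spend still rises. All numbers are made up.

cost_per_million_tokens = 10.0   # $ before the efficiency breakthrough
monthly_tokens = 100.0           # millions of tokens served per month

efficiency_gain = 10.0           # e.g. a DeepSeek-style cost reduction (assumed)
demand_growth = 20.0             # extra usage unlocked by the lower price (assumed)

spend_before = cost_per_million_tokens * monthly_tokens
spend_after = (cost_per_million_tokens / efficiency_gain) * (monthly_tokens * demand_growth)

print(f"spend before: ${spend_before:,.0f}/month")  # $1,000
print(f"spend after:  ${spend_after:,.0f}/month")   # $2,000, despite 10x efficiency
```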
 
Nvidia's competition is itself, just like in the consumer GPU space. They must continue to innovate to grow and please their shareholders.
There's very little competition going on with Nvidia, especially since they released a 5070 masquerading as a 5080. You can't even in good faith propose that argument after Nvidia put out the worst GPU generation they've released in over a decade.
Transformers are far from their limit. We are just at the very beginning of code optimization. Just look at DeepSeek and what they did to lower the hardware requirements. And contrary to what uneducated opinion thinks, this kind of breakthrough won't slow down the market but will accelerate AI adoption, i.e. the Jevons paradox.
DeepSeek is less hardware-intensive, but the outcome in terms of accuracy is still similar. There are several videos and papers discussing how the transformer model scales and how it's reaching its limit. I'll take the word of people who worked on and designed these models over yours.
 
DeepSeek is less hardware-intensive, but the outcome in terms of accuracy is still similar. There are several videos and papers discussing how the transformer model scales and how it's reaching its limit. I'll take the word of people who worked on and designed these models over yours.
1. You should read more, and 2. it's not my words but the view from the IC industry, which I've been working in for more than 30 years.
one quick example:

And it's very naïve to think that, with billions of dollars' worth of transformer GPUs in the field, researchers will stop optimizing and increasing the performance of the hardware. (edit: Sorry for this Captain Obvious moment, I can't believe I'm writing this.)
 
1. You should read more, and 2. it's not my words but the view from the IC industry, which I've been working in for more than 30 years.
one quick example:

And it's very naïve to think that, with billions of dollars' worth of transformer GPUs in the field, researchers will stop optimizing and increasing the performance of the hardware. (edit: Sorry for this Captain Obvious moment, I can't believe I'm writing this.)
Optimizing and increasing the performance of the transformer model will not overcome the inherent flaws of the transformer architecture. I don't have time to watch the whole video you linked, but I imagine you didn't watch it at all; you were just looking to confirm your own bias. The video's author, right in the intro, calls this paper "5% ideas, 95% smoke and mirrors". If you had watched the video at any length, you'd realize that it does not significantly change the fundamental structure of the transformer architecture in terms of outcomes; the inherent limitations remain. As for further AI discussions with you, I'll be bowing out. It's not my field of expertise, and it doesn't appear to be yours either. I don't have the time to invest in a conversation that consists of you posting YouTube videos you don't even bother to watch.
 
This selloff is just a knee-jerk panic reaction from the less educated masses.
The "less educated masses" don't move markets; market movements are almost always driven by institutional players. For a mega-cap like Nvidia, dropping 20% in one day means an institutional selloff, not retail nonsense.
 
There's very little competition going on with Nvidia, especially since they released a 5070 masquerading as a 5080...

I love this new world where content creators churn out talking points that immediately get picked up by people. The HUB video comes out today, and here we are.
 
I love this new world where content creators churn out talking points that immediately get picked up by people. The HUB video comes out today, and here we are.
Well, to be frank, I've been calling this the worst Nvidia generation for a while, and I raised several concerns about these GPUs based on their configurations prior to release. HUB calling it a 5070 is just another way of verbalizing the concerns I raised as soon as I saw the specs. HUB is not wrong, and the data backs it up... Imagine a new x80-series card not beating the prior 90-series card but instead being several percentage points slower...
 