NVIDIA discussion [2024]

  • Thread starter Deleted member 2197
"Good" would be a quality product that you need/would be useful at a fair price. It doesn't exclude anykind of AI, the question would be what value does that AI application have to the product.
 
"Good" would be a quality product that you need/would be useful at a fair price. It doesn't exclude anykind of AI, the question would be what value does that AI application have to the product.

Right, and my point is that AI isn’t special in this regard. Products get new features and capabilities all the time, but the hype and scrutiny around AI is unique. Asking someone how much more they’re willing to pay for AI is like asking them how much they’re willing to pay for “better software”. It’s a pointless question.
 
Right, and my point is that AI isn’t special in this regard. Products get new features and capabilities all the time, but the hype and scrutiny around AI is unique. Asking someone how much more they’re willing to pay for AI is like asking them how much they’re willing to pay for “better software”. It’s a pointless question.
Right now, to me, AI is more akin to RGB, where everyone is just slapping it on everything to add a marketing bullet point.

EDITED BITS: Also reminds me a bit of "VR READY"
 
Right, and my point is that AI isn’t special in this regard. Products get new features and capabilities all the time, but the hype and scrutiny around AI is unique. Asking someone how much more they’re willing to pay for AI is like asking them how much they’re willing to pay for “better software”. It’s a pointless question.
It has definitely become difficult to talk about AI. People tend to retreat into camps where something can only be good or bad, and AI is a particularly tricky topic to cover in this regard, especially when a lot of the benefits that have come or might come from it won't necessarily be transparent to end users. But also because AI is definitely being used in many insidious ways (also in ways that aren't transparent to end users), and is being rammed down everybody's throats through cringey marketing efforts. It's easy for people to get a bad taste in their mouth from it and turn against it.

I personally don't think AI is going to be as 'transformative' as it's been drummed up to be. It won't be as revolutionary as the internet or anything. As you kind of hinted at before, I think its best use cases will simply come from enhancing existing technologies to varying degrees.

It is here to stay, though. It's not a fad, and companies aren't likely to stop investing heavily in it for quite a while. What we really need to hope for is that better competition in AI hardware comes online so that Nvidia isn't as dominant a force in it. Though that's a general 'we', as I get the impression that some people in this forum in particular are quite happy with Nvidia's domination for some bizarre reason.
 
Right, and my point is that AI isn’t special in this regard. Products get new features and capabilities all the time, but the hype and scrutiny around AI is unique. Asking someone how much more they’re willing to pay for AI is like asking them how much they’re willing to pay for “better software”. It’s a pointless question.
One reason could be that AI is taking up quite a bit more die space than most new features, RT included (based on estimations from Fritzchens Fritz's die photos).
And the die-space hogging isn't even limited to GPUs.
 
But also because AI is definitely being used in many insidious ways (also in ways that aren't transparent to end users), and is being rammed down everybody's throats through cringey marketing efforts. It's easy for people to get a bad taste in their mouth from it and turn against it.

Yeah I assume this is where digi is coming from. But I don’t think the “AI” in washing machines and thermostats has anything to do with deep learning or GPUs so I’m ignoring the ridiculous AI marketing for the purpose of this convo.

I personally don't think AI is going to be as 'transformative' as it's been drummed up to be. It won't be as revolutionary as the internet or anything. As you kind of hinted at before, I think its best use cases will simply come from enhancing existing technologies to varying degrees.

It’s hard to say since it’s still early days. I’ve spoken to some very smart and very sober people who are genuinely excited about the potential to generate significant value from large data sets. Language is just an obvious first step. The underlying science is all about pattern detection and prediction. None of that is new of course - we’ve had algorithms, knowledge bases, regression analysis, correlation engines, etc. for a long time that attempt to do the same things. “AI” is just a way to throw massive compute at massive data to tease out those patterns faster than existing methods. It can be tremendously powerful if it works out.
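To make the "pattern detection and prediction" point concrete, here's a minimal sketch (assuming Python with scikit-learn; the synthetic data and model choices are purely illustrative). The classic statistical tool and the small neural net are both just fitting a function to data; the latter simply spends more compute to pick up a pattern the former can't.

Code:
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

# Synthetic data with a hidden nonlinear pattern.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(2000, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=2000)

# Classic tool: a straight line captures very little of the pattern.
linear = LinearRegression().fit(X, y)
print("linear R^2:", round(linear.score(X, y), 3))

# Same goal, more compute: a small neural net recovers it.
mlp = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000,
                   random_state=0).fit(X, y)
print("MLP R^2:", round(mlp.score(X, y), 3))

Scale the data and the model up by many orders of magnitude and that's essentially the "massive compute at massive data" idea.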


It is here to stay, though. It's not a fad, and companies aren't likely to stop investing heavily in it for quite a while. What we really need to hope for is that better competition in AI hardware comes online so that Nvidia isn't as dominant a force in it. Though that's a general 'we', as I get the impression that some people in this forum in particular are quite happy with Nvidia's domination for some bizarre reason.

Nvidia is dominant today because they were the first mover. They are rightfully reaping the rewards of their early investment, foresight, luck, etc. But the market will eventually grow into an ecosystem that’s much bigger than Nvidia or CUDA.
 
Don't take the individual tensor and RT contributions as fact, only the combined figure, since the breakdown is largely speculation on my part.
Not as fact, but probably close (and for CPUs the hogging is measurable)
 
Nvidia is dominant today because they were the first mover. They are rightfully reaping the rewards of their early investment, foresight, luck, etc. But the market will eventually grow into an ecosystem that’s much bigger than Nvidia or CUDA.

And this is not even the first time. There have been many AI hype-crash cycles, at least four. Yet in each cycle, we started to take something for granted without even noticing.
For example, the first well-known so-called "AI" systems were the ones that could play chess-like games. They were probably the first to "shock" ordinary people, because the idea was apparently hard to grasp (how can a machine that only knows how to do calculations play chess and even defeat very good human players?!). Now every smartphone can play chess better than anyone in the world. Chess players cheat by using smartphones in the toilet.
Another example is machine translation, which was considered nearly impossible. If you read AI history, you can find all kinds of jokes about machine translation. Today, although most machine translations are not perfect, people certainly don't think of it as impossible.
Other examples, such as object recognition (that was part of my research during my college days), have also vastly improved. Today's children probably don't think it's extraordinary for a computer to "know" what's in an image. Just 20 years ago it was almost as impossible as machine translation.
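Just to show how mundane it has become, here's a rough sketch (assuming Python with torchvision; "cat.jpg" is a placeholder path, so treat it as illustrative only): labeling a photo with a pretrained ImageNet classifier is now a handful of lines.

Code:
from PIL import Image
import torch
from torchvision.models import resnet50, ResNet50_Weights

# Pretrained ImageNet classifier plus its matching preprocessing.
weights = ResNet50_Weights.DEFAULT
model = resnet50(weights=weights).eval()
preprocess = weights.transforms()

# "cat.jpg" is a placeholder for whatever photo you want labeled.
img = preprocess(Image.open("cat.jpg").convert("RGB")).unsqueeze(0)

with torch.no_grad():
    probs = model(img).softmax(dim=1)[0]

top = probs.argmax().item()
print(weights.meta["categories"][top], float(probs[top]))

Twenty years ago getting even this far was a research project; now it's a dependency install.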

All in all, I think the whole AI thing is just a manifestation of the fact that people need more computation power. I believe that in this age it's computational power that defines the advances of our civilization. It's just that sometimes people forget.
 
NVIDIA orders 25% more capacity from TSMC due to heightened demand from Amazon, Super Micro, Dell, Alphabet, Meta, and Microsoft.
But on what timeframe? It's not like TSMC has spare capacity on its newest processes, and they can't just push back already agreed deals with other big clients either.
 
But on what timeframe? It's not like TSMC has spare capacity on its newest processes, and they can't just push back already agreed deals with other big clients either.
Blackwell is still on a 5nm-family process. More of that capacity is opening up as Qualcomm, AMD, Mediatek, etc. move to 3nm. Heck, Nvidia themselves are moving to 3nm for future iterations as well.

Given Nvidia's plan to release new flagship AI products every year, I imagine they're planning on having more overlapping production schedules.

It really is just a total frenzy of buying from all these companies, and Nvidia is in a position to sell everything they can possibly make. They don't want to have to turn down companies waving billion-dollar orders at them today because they lack capacity.
 
But on what timeframe? It's not like TSMC has spare capacity on its newest processes, and they can't just push back already agreed deals with other big clients either.
Likely by H1'25, I would think, as by then some of the large players mentioned above (Qcom, Mediatek, AMD, Apple, etc.) will be ramping 3nm for new products, which should free up 5nm supply. In addition, the Arizona fab, which is a 5nm fab, may also come online by then, so there should be additional 5nm capacity available.

So capacity at TSMC shouldn't be a problem; rather, HBM supply may be more of a limiting factor, as the memory fabs have been running full, so I'm not sure how they will manage that.
 