NVIDIA discussion [2024]

I don't understand how this is suddenly news; it was common knowledge not too long ago.

It's probably because Nvidia is one of the news topics in vogue right now. It's also the corporate equivalent of the "if you'd bought this stock years ago" story that draws clicks.

Nvidia has reportedly been an acquisition target of both AMD and Intel (with Intel on and off, I think, until the late 2010s, when the rumour flipped to Nvidia possibly acquiring Intel), with the reported sticking points being who would be CEO as well as regulatory concerns.

Also, in this case, I think people may have been aware of the Intel/Nvidia acquisition discussions (with the same rumoured sticking point), while the actual AMD/ATI acquisition ended up overshadowing that awareness. At this point I'd guess the majority just assume (or immediately think) that AMD has always been in graphics, given how long it's been since the ATI branding was completely removed.
 
Musk shared the latest details for xAI's 100,000 GPU cluster in response to a media report that outlined that talks between the AI firm and Oracle to expand their existing agreement have ended.
...
Musk shared that xAI is building its 100,000 GPU AI system internally to achieve the "fastest time to completion." He believes this is necessary to "catch up" with other AI companies, as, according to him, "being faster than any other AI company" is very important for xAI's "fundamental competitiveness."
...
Today's details follow Musk's statements early last month, which revealed xAI's plans to build a multi-billion-dollar system with NVIDIA's Blackwell chips. He had outlined that the system would use roughly 300,000 B200 GPUs. When taken together with price details shared by NVIDIA CEO Jensen Huang, the system could cost as much as $9 billion.
...
Musk believes that by building the H100 system instead of working with Oracle, xAI can achieve the "fastest time to completion." The system will start training this month and will be the "most powerful training cluster in the world by a wide margin," believes the executive. Before models like Grok or ChatGPT are ready to respond to queries, they are trained on existing data sets. These allow them to mathematically predict what the response to a user's question will be based on what they have already learned.
So he is spending an additional chunk on top of that for the 100,000 H100s. Seems like they aren't even considering alternatives from Intel or AMD.
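If anyone wants a concrete picture of what "trained on existing data sets" and "mathematically predict the response" in that quote boil down to, here's a rough toy sketch of the next-token-prediction objective in Python. The corpus, the tiny GRU (standing in for the transformer used in practice), and all the sizes are made up for illustration; the real workloads run this same loop at an enormously larger scale, which is what the 100,000 GPUs are for.

Code:
# Toy next-token prediction: the basic training objective behind LLMs.
# Everything here (corpus, model, sizes) is invented for illustration only.
import torch
import torch.nn as nn

text = "the quick brown fox jumps over the lazy dog "
vocab = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(vocab)}
data = torch.tensor([stoi[ch] for ch in text])

class TinyLM(nn.Module):
    def __init__(self, vocab_size, dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)  # stand-in for a transformer
        self.head = nn.Linear(dim, vocab_size)

    def forward(self, x):
        h, _ = self.rnn(self.embed(x))
        return self.head(h)

model = TinyLM(len(vocab))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

# Training objective: given characters 0..n-1, predict characters 1..n.
x = data[:-1].unsqueeze(0)   # input sequence, shape (1, n-1)
y = data[1:].unsqueeze(0)    # targets shifted one position ahead
for step in range(200):
    logits = model(x)                                   # (1, n-1, vocab)
    loss = loss_fn(logits.view(-1, len(vocab)), y.view(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
print("final next-token loss:", loss.item())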
 
The launch of HBM4 memory is going to be huge for the AI segment, since companies are now shifting their strategies around it. SK hynix is one of the first firms to implement a "multi-function HBM." In today's implementations, advanced memory semiconductors are placed close to other dies, such as the GPU die, for computational efficiency, and to tie everything together the industry relies on packaging technologies such as the well-known CoWoS.

Since this isn't an optimal route, SK hynix previously revealed that it plans to integrate memory and logic semiconductors into a single package, which would remove the need for separate packaging technology; and because the individual dies would sit much closer together in this implementation, it should prove far more performance-efficient. To achieve this, SK hynix plans to establish a strategic "triangular alliance" involving TSMC for the semiconductors and NVIDIA for the product design, resulting in an end product that could be revolutionary.

While we aren't sure yet how SK hynix plans to implement HBM4 memory, with TSMC and NVIDIA now involved alongside the Korean giant, it appears they have figured it out. The upcoming SEMICON is important in this regard, as it will set the tone for future AI accelerators built around the new memory standard. Beyond that, the alliance shows that the firms involved are ready to capitalize on the market, leaving competitors little room for growth or exposure.
 
The Russian-founded tech giant Yandex has left Russia after finalizing one of the country’s most significant foreign corporate exits since the Russo-Ukraine war started.

The company claims its deployment in Finland is the most powerful commercially available supercomputer on the continent, but it plans to triple its footprint with new Nvidia GPUs to compete with Amazon, Google, and Microsoft in the AI sphere.

"It's in Nvidia's interest to diversify their client base; they're interested in growing guys like us," Volozh told the Financial Times. "We've had a working relationship with them for years. They know and trust us," said the Yandex founder.

 
The Russian-founded tech giant Yandex has left Russia after finalizing one of the country’s most significant foreign corporate exits since the Russo-Ukraine war started.
Not entirely accurate: they haven't left Russia; they've split the operation into two parts and moved/sold assets so that the two parts can now be independent. The "Yandex" name remains with the Russian part; the non-Russian part will get a different name.
 
If anything I'm more surprised that's supposedly the largest cluster today - it's "only" $5B or something, right? You'd think with NVIDIA's current quarterly revenue, there'd be multiple ones of those already. The implication is hyperscalers are making a bunch of 10k-25k clusters instead of one big one, which makes me wonder what GPT5 is being trained on and how many more flops than GPT4 it will be...
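Rough back-of-envelope on that "$5B or something" figure. The per-GPU price and the infrastructure overhead multiplier below are my assumptions, not reported numbers, but it lands in the same ballpark.

Code:
# Back-of-envelope check on the "~$5B" figure for a 100,000-GPU H100 cluster.
# The per-GPU price and overhead multiplier are assumptions, not reported numbers.
gpus = 100_000
price_per_h100 = 30_000   # assumed USD per H100; public estimates range roughly $25k-$40k
overhead = 1.5            # assumed multiplier for networking, storage, power, facility

gpu_cost = gpus * price_per_h100
total = gpu_cost * overhead
print(f"GPUs alone: ${gpu_cost / 1e9:.1f}B, with infrastructure: ${total / 1e9:.1f}B")
# -> GPUs alone: $3.0B, with infrastructure: $4.5B, i.e. in the ballpark of ~$5B.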
 
implication is hyperscalers are making a bunch of 10k-25k clusters instead of one big one
Well, Elon is building a 300K B200 cluster next year, OpenAI is also building another cluster of 100K B200s, and Microsoft is planning a million-GPU cluster somewhere down the line (as part of their $100 billion AI investment). So this trend is definitely catching on.
 
And what do they do with it so that the expenditure pays off? So far I haven't seen much in the way of AI that can make billions of dollars. Of course I've seen some interesting things in my field, geo-remote sensing. For example, you can now use deep learning to estimate crop yield instead of sending people out to the fields. But you also have to be careful about overfitting.
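To make that overfitting worry concrete, here's a minimal sketch of the kind of sanity check you'd run in a yield-estimation pipeline: hold out a validation set and compare it against the training score. The data here is synthetic and only stands in for real per-field features; a large train/validation gap is the classic overfitting symptom.

Code:
# Minimal overfitting check for a (toy) yield-estimation model.
# The features and yields below are synthetic placeholders, not real remote-sensing data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))   # fake per-field spectral features (e.g. NDVI statistics)
y = 2.0 * X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=500)   # fake yield values

X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# A large gap between these two numbers is the classic sign of overfitting.
print("train R^2:", r2_score(y_train, model.predict(X_train)))
print("val   R^2:", r2_score(y_val, model.predict(X_val)))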
 
And what do they do with it so that the expenditure pays off? So far I haven't seen much in the way of AI that can make billions of dollars. Of course I've seen some interesting things in my field, geo-remote sensing. For example, you can now use deep learning to estimate crop yield instead of sending people out to the fields. But you also have to be careful about overfitting.

Why does everyone keep saying that AI needs to provide some specific, discrete payoff that we can all point to and say, “Aha, there's the AI!”? It's just another technology advancement that makes products and services more capable over time.

The other way to look at it is what would these companies do to entice people to keep buying stuff without AI? What’s the alternative?
 
And what do they do with it so that the expenditure pays off? So far I haven't seen much in the way of AI that can make billions of dollars. Of course I've seen some interesting things in my field, geo-remote sensing. For example, you can now use deep learning to estimate crop yield instead of sending people out to the fields. But you also have to be careful about overfitting.
Shhh, this is tech, baby, we don't talk about "earning money" here. We talk about "earning scale," "dominating the competition," and "not getting sued for anti-competitive practices." All that "actually making money" stuff can come later; this is a strategy that has always worked out 100% of the time in the past and always will.
 
And what do they do with it so that the expenditure pays off?
Well, for Twitter, nothing. It's a colossal waste of money, energy, and silicon/GPUs. It's just Elon thinking that Grok can in any way contribute or compete in the space, and it just won't. Especially as its training gets fed the ever-devolving garbage that is Twitter posts.

Money absolutely being thrown down the drain, all so Elon can feel like he's playing in the big-boy party. He already massively overspent on Twitter, big advertisers have slunk away, and now he's spent another, what, $5B on this AI supercomputer? It's especially ridiculous when Tesla is right there and all that money/power could be used to expand its AI training in a way that might actually contribute something to the world.
 
Hold up, all his bluster about AI spending is for Twitter and not Tesla? Lol, what a joke. Nobody should be training anything on Twitter content. It's 99% garbage.
So far xAI seems to be doing mainly LLMs, not Tesla Autopilot-type stuff. They also mention partnering closely with X Corp, but there's no mention of Tesla in the same vein.

 