NVIDIA discussion [2024]

The NVIDIA/MediaTek collaboration has resulted in two things so far:
- An automotive chip with RTX graphics (Ada-based), supporting ray tracing + DLSS frame generation.
- A display scaler with integrated G-Sync capabilities.

Rumors point to a third thing: a laptop SoC with ARM + RTX graphics.
It might also mean a renewed push of something Tegra-like in phones and gaming handhelds.
They've got to hit that $1T+ TAM somehow.
 
That's one way to read it; the other would be Bloomberg crying about having designed their workloads for NVIDIA hardware and not being able to just drop in replacement hardware from AMD or Intel when they can't get enough GPUs from NVIDIA.
That would be the wrong way of looking at it and the wrong way of understanding the situation: roughly 90% of the libraries out there are optimized only for CUDA and NVIDIA. No one is interested in redoing all the work, the debugging, the bug fixing, and all the support that entails; it's easier and more cost-effective to stick with NVIDIA even despite the GPU scarcity.

This sentiment is shared even among the hyperscalers, who can afford the time and money to switch, as evidenced by the recent comments from former Google CEO Eric Schmidt.


Schmidt said it will be difficult for competitors to catch up with Nvidia because many of the most important open source tools that AI developers use are based on the company’s CUDA programming language. He said AMD’s software that translates Nvidia’s CUDA code for its own chips “doesn’t work yet.”
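For anyone wondering what that "translation" actually involves: tooling along the lines of AMD's HIP and its hipify scripts does a largely mechanical rename of the CUDA runtime surface (cudaMalloc -> hipMalloc and so on). Below is a minimal sketch, assuming nothing beyond the standard CUDA runtime API, of the kind of code being translated; the rename is the easy part, and the lock-in described above comes from the years of per-architecture tuning, debugging, and validation baked into large CUDA libraries.

```cpp
// Minimal sketch (illustrative only, not taken from any particular library):
// a trivial CUDA SAXPY kernel plus the host code around it. Tools such as AMD's
// hipify mostly just rename this surface API (cudaMalloc -> hipMalloc,
// cudaMemcpy -> hipMemcpy; the <<<...>>> launch syntax is kept by HIP).
#include <cuda_runtime.h>
#include <cstdio>

__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];   // y = a*x + y, one element per thread
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMalloc(&x, n * sizeof(float));    // hipMalloc under HIP
    cudaMalloc(&y, n * sizeof(float));
    cudaMemset(x, 0, n * sizeof(float));  // dummy data; real code would copy inputs in
    cudaMemset(y, 0, n * sizeof(float));

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();              // hipDeviceSynchronize under HIP

    cudaFree(x);                          // hipFree under HIP
    cudaFree(y);
    printf("saxpy launched over %d elements\n", n);
    return 0;
}
```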

 


I mean, he has some points. Nvidia snuck in a closed ecosystem as a foundation, and is of course milking it to the point of getting sued for breaking anti-competition laws. And the sunk cost fallacy is a huge deficit most executive types can't overcome, because overcoming it would be tantamount to admitting they made a mistake, and that's not something the type of person who becomes an executive generally does.
 
Schmidt is an exec, aka he doesn't actually know what he's talking about. Most don't; they're there for the "business", and engineers are there for that "knowledge" thing. And most engineers directly familiar with the ecosystems agree that AMD's software support has gotten much better, and they expect it to keep ramping up.
That's a stereotype I don't agree with. Eric's sentiment is echoed across the industry; he is a software engineer himself, is well informed, and was speaking to a gathering of well-informed engineers. He recognizes that AMD has made strides in software, but still nowhere near enough to catch up to NVIDIA's enormous head start. He also mentions details not commonly covered in mainstream media, and his lecture detailed the situation of every competitor in the field.
 
I mean, he has some points. Nvidia snuck in a closed ecosystem as a foundation, and is of course milking it to the point of getting sued for breaking anti-competition laws. And the sunk cost fallacy is a huge deficit most executive types can't overcome, because overcoming it would be tantamount to admitting they made a mistake, and that's not something the type of person who becomes an executive generally does.

What’s the alternative solution that wouldn’t have been a mistake?
 
I mean, he has some points. Nvidia snuck in a closed ecosystem as a foundation
You seem to have edited your original post. Anyway, I remind you that this is the former CEO of Google we are talking about; Google is the only other company with successful AI hardware and software (TPUs) that is widely deployed in the field. By saying those things, Google just admitted defeat in the face of an overwhelming NVIDIA advantage.

NVIDIA is perhaps the only company praised by the CEO of almost every competitor in the field, and for good reasons.

 
I mean, he has some points. Nvidia snuck in a closed ecosystem as a foundation, and is of course milking it to the point of getting sued for breaking anti-competition laws. And the sunk cost fallacy is a huge deficit most executive types can't overcome, because overcoming it would be tantamount to admitting they made a mistake, and that's not something the type of person who becomes an executive generally does.
When will Intel license x86 to Nvidia instead of keeping it as a "closed ecosystem"? Is Arm "snucking" a closed Arm ecosystem into every mobile device out there too?
 
Have there been other industries where a similar thing happened? Like a company puts in many years pushing something its competition doesn't seem overly interested in, until it actually becomes useful, then everyone wants in, and then they're accused of being a monopoly? Would they have been able to build the moat so successfully if the others had started to want in seriously a little bit sooner? Should they be penalised somehow for it? Part of me thinks they gambled, won, and deserve the spoils. But I'm not sure that a future where the others never manage to catch up, and in 30 years one company is the only option, is ideal either (even if it probably won't be something I'm caring about by then).

I don't think they should be taken to task for monopoly so quickly just because the market is booming, yet it could put even more strain on competition if they aren't. It's going to be hard for regulators to get this right. Going slightly off on a tangent: if they do something to rein NVIDIA in, I would like to see a similar approach taken with drug companies; maybe they shouldn't be able to lock down their formulas for so long either.
 
Going after Nvidia for anti-competitive behavior is fair game if they're using their dominant position to box out the competition. But going after them for reaping the benefits of a long-term investment is just crazy. It's not like they have patents blocking other people from writing ML libraries. The barrier to entry here is time and money.

The argument seems to be that Nvidia should have built their software ecosystem around open source tools and languages (e.g. OpenCL) so that other hardware vendors would have a fair shot at competing now that Nvidia has proven there's money to be made. Regulators don't like proprietary interfaces and they don't like bundling. But you can't force companies to spend billions of dollars developing technology just to hand it over on a silver platter. CUDA libraries are free to use. All the money comes from hardware sales and enterprise support contracts. If those libraries were written in a hardware-agnostic language, Nvidia would need some other way to recoup their investment.
 
Very interesting and unique interview with the founders of 3dfx about the history of the company, the competition, and the battles with NVIDIA, including some rare and modest admissions about how NVIDIA won these battles.

0:00 Intro
0:28 Scott Sellers on founding 3dfx
6:03 Focus on arcade games
8:55 Original Voodoo chipset
14:12 Outsourcing Geometry-Processing onto the CPU
21:10 3dfx vs. NVIDIA
25:32 Voodoo Improvements over Time vs the competition
28:00 Rampage - too late for a Gamechanger vs NVIDIA?
29:32 Rampage - rumored Specifications
31:58 Deferred Rendering
35:05 Rampage Fabrication Process
35:28 Glide API
40:58 What was the Killer-App for Voodoo? (Scott)
48:50 Are you (still) a Gamer? (Scott)
50:03 Ross Smith on the Names '3dfx' and 'Voodoo'
54:20 Business Plan and Game Recruitment
58:30 Entering the PC Gaming Market
1:02:05 Quake on Glide
1:04:30 Game Knowledge is important!
1:05:45 What was the Killer-App? (Ross)
1:07:30 Bad Decisions
1:11:30 Voodoo Rush
1:16:09 Success of Voodoo 2
1:17:15 Entering the Board-Market
1:20:10 Canceled Deal with Sega
1:24:21 Voodoo Impact on the Gaming World?
1:28:40 Are you (still) a Gamer? (Ross)

 
Going after Nvidia for anti-competitive behavior is fair game if they're using their dominant position to box out the competition. But going after them for reaping the benefits of a long-term investment is just crazy. It's not like they have patents blocking other people from writing ML libraries. The barrier to entry here is time and money.

The argument seems to be that Nvidia should have built their software ecosystem around open source tools and languages (e.g. OpenCL) so that other hardware vendors would have a fair shot at competing now that Nvidia has proven there's money to be made. Regulators don't like proprietary interfaces and they don't like bundling. But you can't force companies to spend billions of dollars developing technology just to hand it over on a silver platter. CUDA libraries are free to use. All the money comes from hardware sales and enterprise support contracts. If those libraries were written in a hardware-agnostic language, Nvidia would need some other way to recoup their investment.

The lawsuits are specifically about threats to delay or refuse shipments for anyone buying competitors' hardware. The free closed ecosystem, on the other hand, is arguably dominant anyway, and so a perfectly legitimate target for antitrust. Android is free, it's open source! They still charge a 30% app store fee and bully people out of competing app stores. They still bully people out of forks. No one can compete because it would take too much money and time that they don't have and can't get, because "Android is free".

Similarly, everyone outside China knows Google search is a monopoly (and a US judge has now legally ruled it one). Anyone working in browsers absolutely knows Chrome is a monopoly and a huuuuge bully. Both are free though, right? "Free" can still easily be abused, and actively is. Mark Zuckerberg is shouting to the heavens about how his "open" Llama AI model should be the standard. Do you really believe for even a time-travelling negative second that Mark Zuckerberg is somehow going to be a champion of open and fair competition?
 
NVIDIA has launched a series of new CUDA libraries aimed at expanding the capabilities of accelerated computing. According to the NVIDIA Blog, these libraries promise significant improvements in speed and power efficiency across a range of applications.

The new libraries target various applications, including large language models (LLMs), data processing, and physical AI.

Businesses around the world are increasingly adopting NVIDIA's accelerated computing solutions, resulting in remarkable speed increases and energy savings. For example, CPFD's Barracuda Virtual Reactor software, which is used in recycling facilities, runs 400 times faster and 140 times more energy efficiently on CUDA GPU-accelerated virtual machines than on CPU-based workstations.

A popular video conferencing application saw a 66x increase in speed and a 25x improvement in energy efficiency after migrating its live captioning system from CPUs to GPUs in the cloud. Similarly, an e-commerce platform reduced latency and achieved a 33x increase in speed as well as a nearly 12x improvement in energy efficiency by switching to NVIDIA's accelerated cloud computing system.
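For context on what "CUDA libraries" means in practice, here is a minimal sketch assuming only Thrust, the parallel-algorithms library that ships with the CUDA Toolkit; it is a toy stand-in for the drop-in style of GPU acceleration described above, not one of the newly announced libraries.

```cpp
// Minimal sketch (toy example): GPU-accelerated data processing with Thrust.
// The sort and reduction below run on the GPU without any hand-written kernels,
// which is the appeal of the library-first approach described in the post above.
#include <thrust/host_vector.h>
#include <thrust/device_vector.h>
#include <thrust/sort.h>
#include <thrust/reduce.h>
#include <cstdlib>
#include <iostream>

int main() {
    const size_t n = 1 << 24;                     // ~16M elements
    thrust::host_vector<int> h(n);
    for (size_t i = 0; i < n; ++i) h[i] = std::rand();

    thrust::device_vector<int> d = h;             // copy the data to the GPU
    thrust::sort(d.begin(), d.end());             // parallel sort on the device
    long long sum = thrust::reduce(d.begin(), d.end(), 0LL);  // parallel reduction

    std::cout << "sum of sorted data: " << sum << "\n";
    return 0;
}
```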
 

We are now seeking a Senior Research Scientist, Generative AI for Graphics!

NVIDIA is on the lookout for world-class researchers in deep learning, specializing in generative models such as GANs, Diffusion models, conditional image synthesis, and image reconstruction. Join our Applied Deep Learning Research team, renowned for revolutionizing real-time graphics with groundbreaking technologies like DLSS 2 Super Resolution, DLSS 3 Frame Generation, and DLSS 3.5 Ray Reconstruction.

Our team is dedicated to pushing the boundaries of what's possible in real-time graphics, using the latest advancements in GenAI. If you're passionate about generative models, real-time graphics, and delivering transformative technology to real users, you'll find a perfect match with our team. In this role, you'll be engaged in groundbreaking research, developing and training advanced deep-learning models to drive innovation in real-time graphics. Once you've demonstrated the promise of your research through prototype building, you'll collaborate with product teams to integrate your innovative ideas into industry-leading real-world applications.

What you will be doing:
  • Research on AI models that will improve the quality and/or performance of real-time rendering.
  • Research on deep learning-based approaches for real-time image generation.
  • Prototype your AI models in real-time game engines.
  • Construct and curate datasets for large-scale machine learning.
  • Stay current on the latest research for graphics and generative AI to look for disruptive technologies for production.
  • Work closely with product and hardware architecture teams to integrate your research and developments into products.
What we need to see:
  • PhD in Computer Science/Engineering, or a related field or equivalent experience (or an MS with equivalent relevant / meaningful experience).
  • 3+ years of on-the-job or research experience in deep learning, image generation, or related fields.
  • Excellent programming skills in rapid prototyping environments such as Python; C++ and parallel programming (e.g., CUDA) are a plus. Strong proficiency in PyTorch is required.
  • Experience with traditional graphics pipelines, game engines, shading models, and path tracing is a plus.
  • A track record of research excellence demonstrated in publications at leading conferences and journals.
NVIDIA's invention of the GPU in 1999 sparked the growth of the PC gaming market, redefined modern computer graphics, and revolutionized parallel computing. More recently, GPU deep learning ignited modern AI, the next era of computing, with the GPU acting as the brain of computers, robots, and self-driving cars that can perceive and understand the world. Today, we are increasingly known as "the AI computing company". Do you love the challenge of pushing what's possible in real-time interactive graphics and Generative AI? If so, we want to hear from you!
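DLSS itself is a proprietary trained network, so purely for orientation, here is a minimal sketch (an assumed toy example, not NVIDIA's method) of a naive bilinear upscaling kernel in CUDA; fixed filters of roughly this kind are the non-AI baseline that learned super-resolution models like the ones described in the posting are compared against.

```cpp
// Minimal sketch (assumed toy example, not NVIDIA's method): a naive bilinear
// upscale of a single-channel frame on the GPU. Learned super resolution replaces
// this kind of fixed-function filter with a trained network.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void bilinear_upscale(const float* src, float* dst,
                                 int srcW, int srcH, int dstW, int dstH) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= dstW || y >= dstH) return;

    // Map the destination pixel back into source coordinates.
    float sx = fmaxf(0.0f, (x + 0.5f) * srcW / dstW - 0.5f);
    float sy = fmaxf(0.0f, (y + 0.5f) * srcH / dstH - 0.5f);
    int x0 = min(srcW - 1, (int)sx);
    int y0 = min(srcH - 1, (int)sy);
    int x1 = min(srcW - 1, x0 + 1);
    int y1 = min(srcH - 1, y0 + 1);
    float fx = sx - x0;
    float fy = sy - y0;

    // Blend the four neighbouring source pixels.
    float top    = src[y0 * srcW + x0] * (1.0f - fx) + src[y0 * srcW + x1] * fx;
    float bottom = src[y1 * srcW + x0] * (1.0f - fx) + src[y1 * srcW + x1] * fx;
    dst[y * dstW + x] = top * (1.0f - fy) + bottom * fy;
}

int main() {
    const int srcW = 960, srcH = 540, dstW = 1920, dstH = 1080;
    float *src, *dst;
    cudaMalloc(&src, srcW * srcH * sizeof(float));
    cudaMalloc(&dst, dstW * dstH * sizeof(float));
    cudaMemset(src, 0, srcW * srcH * sizeof(float));  // dummy input frame

    dim3 block(16, 16);
    dim3 grid((dstW + block.x - 1) / block.x, (dstH + block.y - 1) / block.y);
    bilinear_upscale<<<grid, block>>>(src, dst, srcW, srcH, dstW, dstH);
    cudaDeviceSynchronize();

    cudaFree(src);
    cudaFree(dst);
    printf("upscaled %dx%d -> %dx%d\n", srcW, srcH, dstW, dstH);
    return 0;
}
```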
 
NVIDIA also posted its highest-ever gaming revenue during Q2 FY2025, raking in $2.880 billion.

"Gaming revenue was up 16% from a year ago and up 9% sequentially. These increases reflect higher sales of our GeForce RTX 40 Series GPUs and game console SOCs. We had solid demand in the second quarter for our gaming GPUs as part of the back-to-school season."


 
Nice results but shares are retreating. Market seems unimpressed. They probably need to get closer to $50B quarterly revenue for people to get excited again.
 