NVIDIA discussion [2024]

Are you trying to convince me or yourself? ;) From a business point of view, and a purely economic one, if one of the products makes up only 1/10 of income, it is completely logical that the company will prioritize the products that bring in the other 90% of revenue, and if it is keen to keep the minority product on the market, then its price will either skyrocket or the product will disappear entirely. (If I were in Huang's place I would double the price of gaming GPUs and there you have it; as you said, there's no competition.)

Like others, you’re jumping to the conclusion that Nvidia doesn’t care about gaming and is willing to hand it over to the competition by pricing their cards out of reach. This conclusion is based on their success in enterprise AI, even though the two markets are unrelated. However, there’s zero evidence to support this line of thinking. I’m not trying to convince you otherwise.
 
Like others, you’re jumping to the conclusion that Nvidia doesn’t care about gaming and is willing to hand it over to the competition by pricing their cards out of reach. This conclusion is based on their success in enterprise AI, even though the two markets are unrelated. However, there’s zero evidence to support this line of thinking. I’m not trying to convince you otherwise.
How do you interpret the state of their graphics division now that one of their recent research papers involves inventing a new texture filtering method that needs DLSS to keep image quality within acceptable noise levels? The authors also specifically mention fixing a flaw in one of their previous papers, where the texture sampling unit couldn't be used to perform filtering of the compressed neural texture format ...
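For those who haven't read it: my rough reading (the network size, weights and names below are my own illustration, not the paper's) is that filtering a neural-compressed texture means running the decoder once per filter tap, so the filter is instead made stochastic and the leftover per-frame noise is handed to a temporal accumulator like DLSS. A toy CPU sketch of that idea:

```cpp
// Hypothetical sketch, NOT the paper's code: sample a neural-compressed texture
// with a stochastic filter and leave the residual noise to a temporal denoiser
// such as DLSS. Network shape, weights and names are placeholders.
#include <algorithm>
#include <array>
#include <cstddef>
#include <cstdio>
#include <random>
#include <vector>

struct LatentTexture {                         // compressed form: a small grid of
    int width = 0, height = 0;                 // latent feature vectors
    std::vector<std::array<float, 8>> latents;
    const std::array<float, 8>& at(int x, int y) const {
        return latents[std::size_t(y) * width + x];
    }
};

// Tiny decoder standing in for the learned decompression network.
std::array<float, 3> decode(const std::array<float, 8>& latent) {
    std::array<float, 3> rgb{};
    for (int c = 0; c < 3; ++c) {
        float acc = 0.0f;
        for (float f : latent) acc += f * 0.125f;   // placeholder weights
        rgb[c] = std::max(0.0f, acc);               // ReLU
    }
    return rgb;
}

// Stochastic bilinear filter: instead of decoding all four neighbours and
// blending (four network evaluations per tap), decode ONE neighbour picked with
// probability equal to its bilinear weight. Unbiased but noisy per frame;
// temporal accumulation (e.g. DLSS) is what brings the noise down.
std::array<float, 3> sampleStochastic(const LatentTexture& tex, float u, float v,
                                      std::mt19937& rng) {
    float x = u * (tex.width - 1), y = v * (tex.height - 1);
    int x0 = static_cast<int>(x), y0 = static_cast<int>(y);
    float fx = x - x0, fy = y - y0;
    std::uniform_real_distribution<float> uni(0.0f, 1.0f);
    int xs = (uni(rng) < fx) ? std::min(x0 + 1, tex.width - 1) : x0;
    int ys = (uni(rng) < fy) ? std::min(y0 + 1, tex.height - 1) : y0;
    return decode(tex.at(xs, ys));               // one decode instead of four
}

int main() {
    LatentTexture tex;
    tex.width = 2; tex.height = 2;
    tex.latents.assign(4, std::array<float, 8>{});             // 2x2 latent grid
    for (int i = 0; i < 4; ++i) tex.latents[i][0] = 0.5f * i;  // some variation
    std::mt19937 rng(42);
    float r = 0.0f;
    for (int i = 0; i < 1024; ++i)
        r += sampleStochastic(tex, 0.3f, 0.7f, rng)[0];
    // The per-sample noise averages out over many samples/frames, which is the
    // job that gets handed to DLSS in a real renderer.
    std::printf("averaged red channel: %f\n", r / 1024.0f);
    return 0;
}
```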
 
How do you interpret the state of their graphics division now that one of their recent research papers involves inventing a new texture filtering method that needs DLSS to keep image quality within acceptable noise levels? The authors also specifically mention fixing a flaw in one of their previous papers, where the texture sampling unit couldn't be used to perform filtering of the compressed neural texture format ...

I imagine that one research paper doesn’t tell us anything at all about the state of their graphics division. What do you think it means?
 
They mentioned component pricing was lower in the last few quarters, which reduced costs to NV. Those discounts are presumably over.
Most of that must be HBM, I'm guessing. From recent reports, all the memory suppliers are also essentially sold out through 2025 and must have increased prices as well.
Are you trying to convince me or yourself? ;) From a business point of view, and a purely economic one, if one of the products makes up only 1/10 of income, it is completely logical that the company will prioritize the products that bring in the other 90% of revenue, and if it is keen to keep the minority product on the market, then its price will either skyrocket or the product will disappear entirely. (If I were in Huang's place I would double the price of gaming GPUs and there you have it; as you said, there's no competition.)
Gaming is also a very respectable revenue stream with a significant margin. There is absolutely no sense in doing anything to disturb that. It's also good to diversify, which is why even automotive, which is even smaller in revenue, continues to be a priority for Nvidia. The AI boom will not last indefinitely and NV wouldn't want to put all its eggs in one basket. While in gaming there is practically only one competitor, and not much incentive for anyone else to try to enter, AI is a much bigger market and everyone wants a piece of the pie. There are many companies working on AI chips of their own to get a share of this market, and there is also a limit to how much power usage we can really sustain in the long term for AI. I see things slowing down around 2026, so while the core focus for now is AI, it would make sense to keep some focus on the other divisions as well.
 
I imagine that one research paper doesn’t tell us anything at all about the state of their graphics division. What do you think it means?
Well, considering their other real-time rendering applications include neural denoising and neural radiance caching, is a pattern not starting to emerge?
 
Well, considering their other real-time rendering applications include neural denoising and neural radiance caching, is a pattern not starting to emerge?

Oh, there's a pattern, but it's not confined to NV. I think it used to be called 'synergy' in business-speak, but I'm showing my age; I'm not sure what the current buzzword-of-the-week is.

The point is that AI can help drive graphics and graphics can help drive AI. Getting out of either right now would be a massive risk, given that a) they have the money to do both, so why not? and b) trying to predict where AI will be in a couple of years is a bit like trying to predict what crypto is going to do.

Honestly, what's next from the peak-doom people? Which of the other big companies is going to do something ... radical?
 
Well, considering their other real-time rendering applications include neural denoising and neural radiance caching, is a pattern not starting to emerge?
What's the problem with neural rendering? UE5 can't do a basic reflection and yet nobody is calling EPIC out for leaving the graphics engine market. In fact, because UE5 is so limited, EPIC has moved away from the professional market, while nVidia has invested a lot more into it (real-time path tracing, AI, Omniverse, etc.).
 
Well, considering their other real-time rendering applications include neural denoising and neural radiance caching, is a pattern not starting to emerge?

Yep, people are researching applications of neural networks in rendering, just like they're doing in many other fields. AI is everywhere. I don't see how that implies they're abandoning gaming, though. If anything, it shows even more investment in game rendering tech.
 
Do you mean that AI applications to graphics shouldn't be researched? Why exactly?
Yep, people are researching applications of neural networks in rendering, just like they're doing in many other fields. AI is everywhere. I don't see how that implies they're abandoning gaming, though. If anything, it shows even more investment in game rendering tech.
The problem is not researching AI applications for graphics. The problem is moving heaven and earth so that their AI HW doesn't get left behind in graphics; they're stuck in that position precisely because of this, not knowing whether they've truly reached the end goal in graphics or whether they've trapped themselves in a possible dead end ...
 
The problem is not researching AI applications for graphics. The problem is moving heaven and earth so that their AI HW doesn't get left behind in graphics; they're stuck in that position precisely because of this, not knowing whether they've truly reached the end goal in graphics or whether they've trapped themselves in a possible dead end ...

Still not clear what problem you're highlighting. What's the specific risk in investing in AI, and what's the more compelling alternative? This is exactly the right time to take risks, when the competition is lagging. If they needed to throw every last transistor at classic (i.e. safe) rendering methods in order to stay competitive, then I would understand your point. But that's not the situation today.
 
Still not clear what problem you're highlighting. What's the specific risk in investing in AI, and what's the more compelling alternative? This is exactly the right time to take risks, when the competition is lagging. If they needed to throw every last transistor at classic (i.e. safe) rendering methods in order to stay competitive, then I would understand your point. But that's not the situation today.
You could potentially fall behind in other (gaming) performance metrics, and you risk creating a bubble for consumer AI HW (what will happen to DLSS w/o AI HW?) ...

I don't know if I'd describe the competition as 'lagging' when they've scored a major API win with GPU-driven rendering, and they could do it again with bindless or somewhere else, since the other player is distracted making AI work for rendering ...
 
The problem is not researching AI applications for graphics. The problem is moving heaven and earth so that their AI HW doesn't get left behind in graphics; they're stuck in that position precisely because of this, not knowing whether they've truly reached the end goal in graphics or whether they've trapped themselves in a possible dead end ...
What "heaven and hell"? It's a research paper on an ML based texture compression. I'm not sure how it requires DLSS exactly but DLSS isn't something which should be forced in use by such methods, it is very much the other way around right now. And there are gobs of research which they are doing on graphics without any AI. So what "dead end" are you even talking about and if what position is Nvidia "stuck" in? In a position of controlling 80% of the market?
 
You could potentially fall behind in other (gaming) performance metrics
How exactly would that happen as a result of a single research paper applying AI to graphics?

and you risk creating a bubble for consumer AI HW (what will happen to DLSS w/o AI HW?) ...
What "bubble"? This h/w is already sold; they got their revenue from it.

I don't know if I'd describe the competition as 'lagging' when they've scored a major API win with GPU-driven rendering
What "win" is that?

and they could do it again with bindless or somewhere else, since the other player is distracted making AI work for rendering ...
Could they maybe also do proper ray tracing and a DLSS equivalent while they're at it?
 
I guess some are probably worried that, since current generative AI will focus on low precision arithmetic, NVIDIA (and other vendors) might be tempted to make a lot of units for low precision computation while floating point units will be neglected. However, IMHO this is not what's going to happen, at least not within the next 10 years. The reason is that generative AI is only part of the equation. A lot of AI applications require higher precision floating point calculations, especially those involving simulations. While generative AI like LLMs is currently the hot topic, it is not the only game in town.

On the other hand, I believe rasterization is approaching its limit. We will need a new paradigm for real-time graphics. Real-time ray tracing and path tracing are just a start.
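To put a rough number on why simulation-style workloads can't live on low precision units alone, here's a trivial, vendor-agnostic example (bfloat16 emulated by truncating an fp32, which is obviously not how any real chip is built): accumulating a small time step a million times stalls far from the correct answer, while plain fp32 lands close to it.

```cpp
// Toy illustration of why accumulation-heavy work (integrators, solvers) needs
// higher precision: bfloat16 is emulated here by truncating an IEEE-754 float
// to its top 16 bits. Nothing vendor-specific.
#include <cstdint>
#include <cstdio>
#include <cstring>

float to_bf16(float x) {                       // keep sign, exponent, 7 mantissa bits
    std::uint32_t bits;
    std::memcpy(&bits, &x, sizeof bits);
    bits &= 0xFFFF0000u;
    std::memcpy(&x, &bits, sizeof bits);
    return x;
}

int main() {
    const int steps = 1000000;
    const float dt = 1e-6f;                    // tiny per-step increment
    float sum_fp32 = 0.0f, sum_bf16 = 0.0f;
    for (int i = 0; i < steps; ++i) {
        sum_fp32 += dt;
        sum_bf16 = to_bf16(sum_bf16 + dt);     // low-precision accumulator
    }
    // Exact answer is 1.0; the bf16-style accumulator stalls once dt falls below
    // the precision of the running sum, while fp32 lands close to 1.0.
    std::printf("fp32 = %f, bf16-emulated = %f\n", sum_fp32, sum_bf16);
    return 0;
}
```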
 
NVIDIA (and other vendors) might be tempted to make a lot of units for low precision computation while floating point units will be neglected
If you want to sell GPUs to gamers, then you can't "neglect floating point units". Nvidia hasn't really changed their balance of ALUs since Ampere (arguably since Turing, if we count the INT units there as usable by gaming code), so they seem to be happy with the split between vector and matrix math units. From what has been disclosed about Blackwell so far, I don't see that ratio changing there either.
 
NVIDIA (and other vendors) might be tempted to make a lot of units for low precision computation while floating point units will be neglected.

That would entail them pulling out not only of real-time graphics, but of HPC as well.
 
I guess some are probably worried that, since current generative AI will focus on low precision arithmetic, NVIDIA (and other vendors) might be tempted to make a lot of units for low precision computation while floating point units will be neglected. However, IMHO this is not what's going to happen, at least not within the next 10 years. The reason is that generative AI is only part of the equation. A lot of AI applications require higher precision floating point calculations, especially those involving simulations. While generative AI like LLMs is currently the hot topic, it is not the only game in town.

On the other hand, I believe rasterization is approaching its limit. We will need a new paradigm for real-time graphics. Real-time ray tracing and path tracing are just a start.
Compute performance isn't limiting rendering. We need better software.
 
don't know if I'd describe the competition as 'lagging' when they've scored a major API win with GPU-driven rendering
How do you classify something as a major win when there isn't even a single usable application based on it, when it's still in beta, and when NVIDIA supports it on more hardware (Ampere and Ada) than AMD does (only RDNA3)?

Compare that against the scores of "wins" NVIDIA already developed and integrated into real applications: Ray Tracing, a huge performance advantage in Ray Tracing, Path Tracing, integrating path tracing into old games, AI Upscaling, AI Downsampling, AI Frame Generation, AI Denoising, AI HDR, AI Anti-Aliasing, Reflex, etc.

If the competition is not lagging behind when they have none of that, then I don't know what criteria you have (if any) to classify something as a major win or a minor lag.
 
How do you classify something as a major win when there isn't even a single usable application based on it, when it's still in beta, and when NVIDIA supports it on more hardware (Ampere and Ada) than AMD does (only RDNA3)?
Given how well they've performed in games (Halo Infinite & Starfield) that exercise the more pathological cases of the predecessor API (ExecuteIndirect), and given that there's publicized preliminary data from a big ISV (Epic Games) too, it's not hard to see which vendor has the most robust implementation of GPU-driven rendering functionality ...

They've even implemented the device generated commands extension in their Vulkan driver. When is Nvidia going to do the same for their competitor's shader enqueue extension?
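For anyone who hasn't followed the API side of this: "GPU-driven rendering" just means the GPU itself decides what gets drawn and writes the draw parameters, with the host submitting a single indirect call instead of one draw per object. A toy CPU-side model of the idea (no real D3D12/Vulkan calls; every name below is invented):

```cpp
// Toy model of GPU-driven rendering: a "culling kernel" writes packed draw
// arguments plus a visible count, and the host consumes them with ONE indirect
// submission. No real D3D12/Vulkan calls here; names are invented.
#include <cstdint>
#include <cstdio>
#include <vector>

struct DrawArgs {                  // same five fields an indexed indirect draw expects
    std::uint32_t indexCount;
    std::uint32_t instanceCount;
    std::uint32_t firstIndex;
    std::int32_t  vertexOffset;
    std::uint32_t firstInstance;
};

struct Object { float boundsRadius; float distance; std::uint32_t indexCount; };

// Stand-in for a compute shader doing visibility culling on the GPU.
std::uint32_t cullAndEmit(const std::vector<Object>& scene,
                          std::vector<DrawArgs>& argBuffer) {
    std::uint32_t count = 0;
    for (const Object& o : scene) {
        if (o.distance - o.boundsRadius > 100.0f) continue;   // "culled"
        // firstInstance doubles as the compacted slot index for per-draw data.
        argBuffer[count] = DrawArgs{o.indexCount, 1, 0, 0, count};
        ++count;
    }
    return count;                  // would be written to a count buffer on the GPU
}

int main() {
    std::vector<Object> scene = {{1.0f, 10.0f, 36}, {1.0f, 500.0f, 36}, {2.0f, 99.0f, 72}};
    std::vector<DrawArgs> argBuffer(scene.size());
    std::uint32_t drawCount = cullAndEmit(scene, argBuffer);

    // Host side: the equivalent of a single ExecuteIndirect /
    // vkCmdDrawIndexedIndirectCount submission consuming (argBuffer, drawCount).
    std::printf("submitting %u indirect draws with one API call\n",
                static_cast<unsigned>(drawCount));
    return 0;
}
```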
Compare that against the scores of "wins" NVIDIA already developed and integrated into real applications: Ray Tracing, huge performance advantage in Ray Tracing, Path Tracing, integrating path tracing into old graphics, AI Upscaling, AI downsampling, AI Frame Generation, AI denoising, AI HDR, AI Anti Aliasing, AI downsampling, Reflex, etc
All of that is their own self-inflicted problem of keeping their HW from becoming redundant in gaming ...
If the competition is not lagging behind when they have none of that, then I don't know what criteria you have (if any) to classify something as a major win or a minor lag.
They're behind feature- and integration-wise, but by no means do they have a 'bad' HW design when they can get a smaller physical HW implementation for a given performance profile, and they have more freedom to improve that aspect as well ...
 