NVIDIA shows signs ... [2008 - 2017]

That said, Tegra was supposed to compensate for the shrinkage of the discrete graphics market under the influence of much better integrated graphics.

No, Tegra was created to address the advent and growth of low power mobile computing devices such as smartphones and tablets.

Worse, Tegra 4 appears to be much less successful than Tegra 3, since the only design win I'm aware of is one ZTE phone which may or may not actually make it to market.

NVIDIA claims that Tegra 4 already has more design wins than Tegra 3, and that there is much interest in Tegra 4i. Considering that Tegra 4/4i/i500 only started sampling relatively recently, it is no surprise that design wins haven't been announced yet. With 4G LTE-equipped tablets and smartphones in particular, there is a six month lag between carrier certification and device production.

Tegra 3 was in the Google Nexus 7, but apparently NVIDIA's out of that. It was also in the Surface RT, but I think Jen-Hsun Huang's recent comments about Windows RT being a disappointment pretty clearly point to Tegra 4 being out of the next Surface RT tablet, if such a tablet is indeed planned.

The Nexus 7 still seems to be selling reasonably well with Tegra 3, so why would Google rush to replace it? In my opinion, it would make more sense to offer a higher performance and higher priced variant (say, Nexus 7.7 or Nexus 8). Tegra 4 isn't production ready yet, so if Google needs something now, then they will need to go with S4 Pro or S600. This isn't the end of the world for NVIDIA, because product cycles come and go every 6-9 months. NVIDIA will be ready and waiting for the next refresh.

As for the comments about Windows RT, he never said that the operating system or the idea of creating a Windows on ARM operating system was bad, he simply said that sales of the RT tablets were disappointing. He said the exact same thing about Android one or two years ago, and now his view and outlook on Android have completely changed as the Android OS and ecosystem have evolved and grown. NVIDIA's partnership with Microsoft is absolutely critical in helping to ensure that Windows on ARM evolves and grows in the future, so don't construe short-term disappointment as a sign that NVIDIA would abandon this OS and this ecosystem in any way.

Generally speaking, NVIDIA is not a very large company, and Tegra is only about 20% of its business. Under such circumstances, competing against the much larger, and very much Snapdragon-focused Qualcomm is not going to get any easier.

All of a sudden, a company with more than 7000 employees is considered "not a very large company"? Right. NVIDIA has significantly more experience in visual computing than Qualcomm, while Qualcomm has significantly more experience in mobile computing than NVIDIA, but these two companies are moving more and more towards convergence. These are both fabless semiconductor companies too. Qualcomm currently has a large advantage in the smartphone space, while NVIDIA currently has a large advantage in the tablet space. The main piece of the puzzle that NVIDIA was missing to take advantage of the growth in the smartphone space was a 4G LTE baseband modem, and that will be resolved in the very near future.

In the short term, NVIDIA should do OK. In the long term, they might make enough money in pro graphics and fancy cloud stuff, but honestly I don't expect NVIDIA to still be around 10 years from now, maybe not even that long.

No need to promote FUD. NVIDIA is currently very well positioned to take advantage of the growth in mobile computing devices, and it appears that NVIDIA is investing more in Tegra and GPU businesses now than ever before. There are very very few companies on the face of the earth that possess the hardware and software technology to create very low power fully custom SoC's with CPU, GPU, and 4G LTE baseband modem all integrated on the SoC die.
 
Must have something to do with the end of the silicon age 10 years from now, as if that wouldn't affect every other semiconductor company as well, and as if there will be little to no innovation in that time frame, right?
 
Well, interestingly enough, all the recent posts, whatever their view on NVIDIA's fate, make me think that from an investor's point of view the guy who wrote the article is right :LOL:
 
The guy who wrote that article is not an industry analyst. In fact, there is very little analysis in the article at all, and the investment advice given is atrocious. Shorting NVIDIA's stock at the current valuation levels would be a colossal mistake unless one is hell bent on risking their own hard-earned money. The right strategy would be to hold for those who currently have a position, or to wait to buy into a position once there are more signs of upward price momentum.
 
How would that be a colossal mistake?
You don't short stuff unless you expect an imminent and colossal collapse. That's because shorting has limited upside and unlimited downside (just the opposite of buying stock.)

If you're bearish on Nvidia, it's good advice to sell what you have, but shorting requires a stronger case than what he's offering. Unless you just like gambling. Then it's fine.
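
To make that asymmetry concrete, here is a minimal sketch of per-share profit and loss for a long versus a short position; the entry and exit prices are purely hypothetical, and borrow fees, margin calls and dividends are ignored:

    # Per-share P&L for a long vs. a short position, ignoring borrow fees,
    # margin calls and dividends. All prices are hypothetical.
    entry = 12.0  # assumed entry price in dollars

    def long_pnl(exit_price):
        return exit_price - entry   # worst case: lose the entry price; upside unbounded

    def short_pnl(exit_price):
        return entry - exit_price   # best case: gain the entry price; downside unbounded

    for exit_price in (0.0, 6.0, 12.0, 24.0, 48.0):
        print(f"exit {exit_price:5.1f}  long {long_pnl(exit_price):+6.1f}  short {short_pnl(exit_price):+6.1f}")

The long position can never lose more than the entry price, while the short position's loss keeps growing as the price rises, which is why shorting needs a much stronger thesis than simply selling what you hold.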
 
If Google is putting a Qualcomm SoC instead of an nVidia SoC in the Nexus 7 successor, is that really such a big deal? Google has always been diverse in SoC selections for their Nexus devices. It makes sense that they would be; so long as an SoC has any market potential at all, support on a Google device will help build Android support for non-Google devices. And Google is going to care about Android's presence on more than just its own devices.

And it certainly won't mean that Qualcomm is the only sane choice for tablets and poised to take over that market.
 
No, Tegra was created to address the advent and growth of low power mobile computing devices such as smartphones and tablets.

NVIDIA claims that Tegra 4 already has more design wins than Tegra 3, and that there is much interest in Tegra 4i. Considering that Tegra 4/4i/i500 only started sampling relatively recently, it is no surprise that design wins haven't been announced yet. With 4G LTE-equipped tablets and smartphones in particular, there is a six month lag between carrier certification and device production.

The Nexus 7 still seems to be selling reasonably well with Tegra 3, so why would Google rush to replace it? In my opinion, it would make more sense to offer a higher performance and higher priced variant (say, Nexus 7.7 or Nexus 8). Tegra 4 isn't production ready yet, so if Google needs something now, then they will need to go with S4 Pro or S600. This isn't the end of the world for NVIDIA, because product cycles come and go every 6-9 months. NVIDIA will be ready and waiting for the next refresh.

As for the comments about Windows RT, he never said that the operating system or the idea of creating a Windows on ARM operating system was bad, he simply said that sales of the RT tablets were disappointing. He said the exact same thing about Android one or two years ago, and now his view and outlook on Android have completely changed as the Android OS and ecosystem have evolved and grown. NVIDIA's partnership with Microsoft is absolutely critical in helping to ensure that Windows on ARM evolves and grows in the future, so don't construe short-term disappointment as a sign that NVIDIA would abandon this OS and this ecosystem in any way.

I'm not convinced, but the coming months will answer all of these questions, so we shall see.

All of a sudden, a company with more than 7000 employees is considered "not a very large company"? Right. NVIDIA has significantly more experience in visual computing than Qualcomm, while Qualcomm has significantly more experience in mobile computing than NVIDIA, but these two companies are moving more and more towards convergence. These are both fabless semiconductor companies too. Qualcomm currently has a large advantage in the smartphone space, while NVIDIA currently has a large advantage in the tablet space. The main piece of the puzzle that NVIDIA was missing to take advantage of the growth in the smartphone space was a 4G LTE baseband modem, and that will be resolved in the very near future.

7,000 employees is a lot, but only ~20% or so ought to be doing something related to Tegra (assuming it's roughly proportional to the revenue it's bringing in). Let's round up and call it 2,000. That's pretty good, but not even close to Qualcomm's 26,600 employees. In fact, it's pretty similar to the AMD vs. Intel situation.

NVIDIA may have more experience in visual computing, but not in mobile visual computing.

I would say that the main piece of the puzzle NVIDIA was missing is power-efficiency, but we'll see how well Tegra 4i does.

No need to promote FUD. NVIDIA is currently very well positioned to take advantage of the growth in mobile computing devices, and it appears that NVIDIA is investing more in Tegra and GPU businesses now than ever before. There are very very few companies on the face of the earth that possess the hardware and software technology to create very low power fully custom SoC's with CPU, GPU, and 4G LTE baseband modem all integrated on the SoC die.

What's going to be different in 10 years?

The discrete graphics market will be a fraction of what it is now. This is pretty much a given, whereas NVIDIA ever making a profit in mobile devices is not. I mean, Qualcomm managed to make even TI give up on OMAP. Now ST-Ericsson is going tits up. Can anyone compete against Qualcomm? Marvell seems to be doing OK at the low end, and I think Broadcom is doing well too, but not in the SoC market.
 
The discrete graphics market will be a fraction of what it is now. This is pretty much a given, whereas nVidia ever making a profit in mobile devices is not.

I'm not so sure that's a given. There's no guarantee that APUs will continue their current march into "good enough" territory.

For one it could be cyclical, following the cadence of new console generations. It's no surprise that embedded graphics implementations are peaking at the end of a console generation. How will that change when UE4 defines the new baseline? Even the mighty Trinity struggles at the ubiquitous 1080p resolution in modern games.

Then there's the potential for disruptive technologies to increase demand for powerful graphics hardware in the PC space. Given the push toward general programmability it will be even easier to implement complex rendering methods that can scale with faster hardware.

nVidia could also try to lower prices on entry-level discrete cards in order to stay competitive but I don't know how much wiggle room they have at that end of the market.

With respect to the mobile market nVidia doesn't have to beat Qualcomm at its own game. The entire market is growing so any company with a piece of the pie will see increased revenues. nVidia just has to establish a foothold in mobile and ride the wave.
 
I'm not so sure that's a given. There's no guarantee that APUs will continue their current march into "good enough" territory.

For one it could be cyclical, following the cadence of new console generations. It's no surprise that embedded graphics implementations are peaking at the end of a console generation. How will that change when UE4 defines the new baseline? Even the mighty Trinity struggles at the ubiquitous 1080p resolution in modern games.

Then there's the potential for disruptive technologies to increase demand for powerful graphics hardware in the PC space. Given the push toward general programmability it will be even easier to implement complex rendering methods that can scale with faster hardware.

nVidia could also try to lower prices on entry-level discrete cards in order to stay competitive but I don't know how much wiggle room they have at that end of the market.

What's stopping APUs from continuing on this trend? Mainstream CPUs have been stuck at 4 cores since the Core 2 Quad in 2006, and neither Intel nor AMD seems interested in adding more cores. That means every process shrink from now on (22nm included, for Intel) will be to the benefit of graphics for the most part.

Then there's the bandwidth problem, but if NVIDIA is confident that they can stack DRAM on top of a ~200W Volta GPU with TSVs in 2015, surely Intel and AMD can do the same on 10~150W APUs in the same time frame, if not earlier.

For the time being, Intel will use Crystalwell, and Kaveri is rumored to use GDDR5.

Given these developments, discrete graphics will only really make sense on desktops, and even then only over 100W, which is not a very large market, and a shrinking one too.

With respect to the mobile market nVidia doesn't have to beat Qualcomm at its own game. The entire market is growing so any company with a piece of the pie will see increased revenues. nVidia just has to establish a foothold in mobile and ride the wave.

The entire market is growing, but Apple is not a potential customer for NVIDIA, and Samsung tends to mostly use its own chips as well. Samsung sometimes relies on Qualcomm, but if they want standard Cortex-A cores, they might as well use their own designs. Beyond that, there's the very cost-conscious Amazon, and I don't see NVIDIA getting into a Kindle any time soon.

A year ago, Samsung had about 25% of the Android market share. Now it's close to 50% (source). How high will that percentage be in 2014? 2015?

In other words, just how large is the market, and how fast is it really growing (if at all) if you take out Apple, Samsung and cost-conscious device makers like Amazon? Again, there's a reason why OMAP and NovaThor are no longer with us.
 
What's stopping APUs from continuing on this trend? Mainstream CPUs have been stuck at 4 cores since the Core 2 Quad in 2006, and neither Intel nor AMD seems interested in adding more cores. That means every process shrink from now on (22nm included, for Intel) will be to the benefit of graphics for the most part.

Yes, of course APUs will increase in performance as well. However, power consumption and upgradeability of highly integrated solutions will continue to be important factors.

We're moving toward even tighter integration so it will be interesting to see how consumers react. It's also going to be more difficult to target consumer needs when CPU, GPU, memory and motherboard are bundled together in one tightly integrated package.

In other words, just how large is the market, and how fast is it really growing (if at all) if you take out Apple, Samsung and cost-conscious device makers like Amazon? Again, there's a reason why OMAP and NovaThor are no longer with us.

How long did it take for Samsung to emerge as a leader in the Android handset space? What guarantee is there that either Apple or Samsung will remain in the SoC business? You're making a 10 year prediction based on the status quo. Why are you so sure that Amazon, Google, Samsung, Microsoft, HTC or anyone else won't use Tegra chips in the future?
 
NVIDIA has gone on the record at GTC 2013 saying that they are willing to license their GPU technology to other companies. So that opens up extra opportunities. Having an integrated 4G LTE baseband modem opens up extra opportunities. Having a high performance 64-bit fully custom ARM CPU integrated with a Geforce GPU opens up extra opportunities. At the moment, NVIDIA is largely shut out in licensed technology used by vertically integrated companies, NVIDIA is largely shut out in 4G LTE-equipped smartphones/tablets, and NVIDIA is largely shut out in low cost APU-equipped systems. This is all likely to change within the next one or two years. PC gaming is still a vibrant and thriving space, and PC gamers will always demand the fastest GPU and CPU hardware, irrespective of whether or not APU's get better over time. Android gaming continues to evolve and grow. So NVIDIA's future actually looks bright rather than bleak in my opinion. If the Windows on ARM operating system and ecosystem continues to evolve and grow and eventually flourish, then that would be icing on the cake.
 
You don't short stuff unless you expect an imminent and colossal collapse. That's because shorting has limited upside and unlimited downside (just the opposite of buying stock.)

If you're bearish on Nvidia, it's good advice to sell what you have, but shorting requires a stronger case than what he's offering. Unless you just like gambling. Then it's fine.
I guess it depends on what you mean by "short". Mostly only big investors (or financial institutions) can actually do short selling; it's plain usury to my mind, as they sell a bunch of shares they don't own, sometimes enough to affect the company's quote, and then buy the shares back (and circumstances can amplify the effect).

I'm not sure that's what the guy means; I read it more like "stay short vs. stay long", i.e. keep your shares or sell them (the latter in our case).
Big investors (or financial institutions) don't need any advice on the matter, so I think the guy is addressing the average private/personal investor.

The outlook is not promising. I see no reason for the value of the stock to go up (more likely down in the short term), so it could make sense to put one's money elsewhere.

Then I won't go into the ongoing crisis; if I had money I would be out of the stock market altogether and into precious metals, commodities and other real assets, in an effort to preserve my capital as much as possible. But it's all such a scheme right now, let's be bullish :LOL:
 
Yes, of course APUs will increase in performance as well. However, power consumption and upgradeability of highly integrated solutions will continue to be important factors.

We're moving toward even tighter integration so it will be interesting to see how consumers react. It's also going to be more difficult to target consumer needs when CPU, GPU, memory and motherboard are bundled together in one tightly integrated package.

Consumers, for the most part, probably won't care, or even notice. Some complained when Intel started slapping GPUs onto the package of their CPUs, but people still bought them. Now they're on-die, and I haven't read any complaints on the Internet in a long time.

People dissatisfied with integrated graphics will keep using discrete cards, and I will remain one of those people for the foreseeable future, but that doesn't mean there will be many of us.

Besides, on desktops, APUs will only be limited by demand. What I mean by that is that in early 2014, AMD could technically make a 4-core Steamroller APU with ~30 Compute Units and a 384-bit GDDR5 interface. It would be very big, draw a lot of power and require a big cooler, but no more than a traditional quad-core + discrete graphics would. It's just that the market is too small for such a design to be worth making.
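
As a rough sanity check on that hypothetical part, here is a back-of-the-envelope calculation; the clock speed and memory data rate below are my own assumptions purely for illustration, not anything AMD has announced:

    # Back-of-the-envelope throughput for the hypothetical APU above.
    # GCN Compute Units have 64 shaders; an FMA counts as 2 ops per clock.
    # The clock and memory data rate are assumptions for illustration only.
    compute_units = 30
    shaders = compute_units * 64            # 1920 stream processors
    gpu_clock_ghz = 0.9                     # assumed ~900 MHz
    gflops = shaders * 2 * gpu_clock_ghz    # ~3456 GFLOPS single precision

    bus_width_bits = 384
    gddr5_gbps_per_pin = 6.0                # assumed 6 Gbps effective data rate
    bandwidth_gbs = bus_width_bits / 8 * gddr5_gbps_per_pin  # ~288 GB/s

    print(f"~{gflops:.0f} GFLOPS single precision, ~{bandwidth_gbs:.0f} GB/s")
    # Roughly Radeon HD 7950/7970-class throughput, which is why such a
    # part would need a big die, a lot of power and a serious cooler.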

But at some point in the near-ish future, would it make sense to start producing ~150W APUs with powerful GPUs? Maybe, if there's enough demand for that. The point is that there's little discrete GPUs can do that APUs could not.

How long did it take for Samsung to emerge as a leader in the Android handset space? What guarantee is there that either Apple or Samsung will remain in the SoC business? You're making a 10 year prediction based on the status quo. Why are you so sure that Amazon, Google, Samsung, Microsoft, HTC or anyone else won't use Tegra chips in the future?

I can't see the future, all I can do is extrapolate current trends. And current trends say discrete graphics will shrink into a small niche, Apple will keep investing more and more into its own hardware IP and SoCs, and so will Samsung.

Speaking of trends, NVIDIA has been bragging about growing Tegra revenue, and that's true. Here are the numbers by fiscal year (remember that NVIDIA's calendar is a year ahead of everyone else's) in thousands of $:

2011: 197,613
2012: 591,166
2013: 764,447

Growth indeed. But what about net profits (losses)?:

2011: (49,238)
2012: (60,417)
2013: (157,923)
(See NVIDIA's latest yearly SEC filing, page 101.)

Oops.
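
Putting the two sets of figures together (same numbers as above, in thousands of dollars), the segment loss as a share of Tegra revenue looks roughly like this:

    # Tegra segment loss as a share of segment revenue, using the figures
    # quoted above (thousands of dollars, NVIDIA fiscal years).
    revenue = {2011: 197_613, 2012: 591_166, 2013: 764_447}
    loss    = {2011: -49_238, 2012: -60_417, 2013: -157_923}

    for fy in sorted(revenue):
        pct = loss[fy] / revenue[fy] * 100
        print(f"FY{fy}: revenue ${revenue[fy] / 1000:,.1f}M, result {pct:+.1f}% of revenue")
    # FY2011: about -24.9%, FY2012: about -10.2%, FY2013: about -20.7%.
    # Revenue is growing, but the absolute loss grew even faster in FY2013.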

Qualcomm is making a lot of money, TI and ST-E weren't, NVIDIA isn't either, and now Renesas is throwing in the towel as well: http://www.japantimes.co.jp/news/20...-to-sell-mobile-chip-operations/#.UU3MbjX2BQ4

How much money will the Tegra unit lose this year (2014)? How long can NVIDIA keep sinking money into this on the off chance that it might pay off later? Right now, GeForces/Quadros/Teslas make enough money that NVIDIA is still comfortably profitable overall. But for how long?

So once again I don't know what the future holds, all I can do is extrapolate, and it doesn't look good for NVIDIA. You can choose to believe that all of these trends will somehow be reversed in NVIDIA's favor. It could happen. I don't think it will.
 
Besides, on desktops, APUs will only be limited by demand. What I mean by that is that in early 2014, AMD could technically make a 4-core Steamroller APU with ~30 Compute Units and a 384-bit GDDR5 interface. It would be very big, draw a lot of power and require a big cooler, but no more than a traditional quad-core + discrete graphics would. It's just that the market is too small for such a design to be worth making.

Yes, that can happen but remains to be seen. I imagine that works if CPU performance becomes further commoditized and SKUs are primarily distinguished by compute/gpu throughput. Otherwise that segment of the market may not be too thrilled with forced CPU upgrades when they want faster graphics. Of course, people have gotten used to forced socket and motherboard upgrades so maybe this is just the next rung of that slippery ladder. In the end you still have to accommodate all the combinations of CPU and GPU performance that consumers demand.

How much money will the Tegra unit lose this year (2014)? How long can NVIDIA keep sinking money into this on the off chance that it might pay off later? Right now, GeForces/Quadros/Teslas make enough money that NVIDIA is still comfortably profitable overall. But for how long?

So once again I don't know what the future holds, all I can do is extrapolate, and it doesn't look good for NVIDIA. You can choose to believe that all of these trends will somehow be reversed in NVIDIA's favor. It could happen. I don't think it will.

As ams pointed out they have actually done pretty well considering they started from nothing a few years ago and haven't had a single integrated offering for smartphones until this year. They're sitting on a few billion in liquid assets so it looks like they can continue investing in Tegra for a long time. If anything their progress to date is positive given the behemoths that they're up against in the mobile space.
 
Yes, that can happen but remains to be seen. I imagine that works if CPU performance becomes further commoditized and SKUs are primarily distinguished by compute/gpu throughput. Otherwise that segment of the market may not be too thrilled with forced CPU upgrades when they want faster graphics. Of course, people have gotten used to forced socket and motherboard upgrades so maybe this is just the next rung of that slippery ladder. In the end you still have to accommodate all the combinations of CPU and GPU performance that consumers demand.

Yeah I think flexibility concerns will preclude 250W APUs. But 150W? It's not much of a stretch since Trinity is already at 100W. And I could be wrong, but I think that covers the bulk of the GPU market.

As ams pointed out they have actually done pretty well considering they started from nothing a few years ago and haven't had a single integrated offering for smartphones until this year. They're sitting on a few billion in liquid assets so it looks like they can continue investing in Tegra for a long time. If anything their progress to date is positive given the behemoths that they're up against in the mobile space.

Sure, considering the challenges they're facing, they're doing pretty damn well, at least in terms of market share. But as the Seeking Alpha article mentioned, some of that was due to TSMC's limited 28nm volume. I would imagine that some of it also has to do with aggressive pricing. Their very aggressive roadmaps helped too: Tegra 2 was the first dual-A9. It was the worst one, too, but for a while it didn't matter because it was the only one available. Same thing for Tegra 3 and being the first quad-A9, and probably the cheapest since it was on 40nm unlike all the others. Sadly, Tegra 4 didn't manage to be the first quad-A15, but it just might be, once again, the worst one. So this time it doesn't work quite as well.

Yes, come the end of the year, NVIDIA will have an integrated LTE offering. But how good will the modem be compared to Qualcomm's? And is there really a market for a quad-A9 in late 2013/2014? That might be a pretty tight squeeze between much cheaper and more power-efficient quad-A7 offerings from Marvell, and the much faster S800 from Qualcomm. And presumably, OMAP5 will still be offered, although that's not certain.

Intel is a bit of a wild card in this game. I suppose AMD might be a factor too, but whether they intend to compete for the Android market is unclear at this point. Such competition would probably be indirect, however, i.e. AMD might get into bigger/higher-power devices, Windows-based or otherwise, that ultimately compete with the Android devices that NVIDIA and Qualcomm compete for. But I don't see AMD trying to get into really low-power devices directly, I don't think Jaguar is really suitable for that.
 
Considering how much Tegra is costing Nvidia, AMD are wise to stay well out of it. I see no upturn in Nvidia's fortunes here for the near-mid future. It's just going to get worse when Intel starts to turn the screw. They will be looking to increase presence and share, and what easier target than Tegra?
 
Speaking of trends, NVIDIA has been bragging about growing Tegra revenue, and that's true. Here are the numbers by fiscal year (remember that NVIDIA's calendar is a year ahead of everyone else's) in thousands of $:

2011: 197,613
2012: 591,166
2013: 764,447

Growth indeed. But what about net profits (losses)?:

2011: (49,238)
2012: (60,417)
2013: (157,923)
(See NVIDIA's latest yearly SEC filing, page 101.)

IIRC, the increase in magnitude of the loss in FY 2013 (relative to prior fiscal years) is largely due to the cost it took to bring the Icera 4G LTE baseband modem to the market. These baseband-related costs are incurred in FY 2013, but revenues are not realized until FY 2014 at the earliest. So one should look at that as an investment made to enhance future revenues.
 
Considering how much Tegra is costing Nvidia, AMD are wise to stay well out of it. I see no upturn in Nvidia's fortunes here for the near-mid future. It's just going to get worse when Intel starts to turn the screw. They will be looking to increase presence and share, and what easier target than Tegra?

It is funny you say that, because it is precisely Intel's licensing payments to NVIDIA that help NVIDIA keep investing more and more in the Tegra business without seriously hurting their financial stability in the short term. Yes, Intel will be a serious competitor in the future, but it would be very short-sighted to think that NVIDIA's investment in mobile computing in the short term will not pay dividends in the long term.
 