NVIDIA Kepler speculation thread

What were the initial projections? They are the leading provider of SoCs for non-Apple tablets and have significant smartphone wins as well. Given the strength of the establishment (TI/Qualcomm/Imagination), it's damn near a miracle. The Transformer is doing 400k units a month all by itself.

Are they really? The Galaxy Tab, BlackBerry PlayBook and HP TouchPad all use other chips. They do have design wins with Motorola and LG, but I don't think they sell nearly as many units as Samsung et al.


I don't recall nVidia promising that CUDA will run your OS and serial apps. They claim GPUs are more efficient for parallel workloads and it seems that supercomputer designers and users agree. PhysX is available as a product for anyone to use and it does change the games that use it.

I remember them saying that the CPU didn't matter anymore, and that GPUs would provide the computational power from then on. The overwhelming majority of the Top 500 is still purely CPU-based, starting from the #1 machine, actually (K Computer). And as a Linpack-derived listing, it's biased towards GPUs to begin with.

nVidia makes more money in the consumer market even when AMD has all possible advantages. They'll be just fine. They talk big and stumble often, but they're ambitious and eventually get it done. That's what matters. Look at how difficult it is for others to do the same: AMD vs. Intel, everybody vs. Apple, etc.

I think most of the profits come from Quadros.
 
Are they really? The Galaxy Tab, BlackBerry PlayBook and HP TouchPad all use other chips. They do have design wins with Motorola and LG, but I don't think they sell nearly as many units as Samsung et al.

Actually, the Galaxy Tab 10.1 does use Tegra 2 (along with the Galaxy Z phone, btw), as do the Acer Iconia A500 and the Dell Streak. Meanwhile, the Asus Transformer is the best-selling tablet after the iPad. Being the reference model for Honeycomb hasn't hurt, either.
 
Are they really? The Galaxy Tab, BlackBerry PlayBook and HP TouchPad all use other chips. They do have design wins with Motorola and LG, but I don't think they sell nearly as many units as Samsung et al.

Like Florin said, the Galaxy Tab uses Tegra 2, and so do some variants of the Galaxy S II (though that may have more to do with the fact that Samsung's own Exynos is in short supply). Tegra 2, being the reference platform for Honeycomb, is definitely far ahead of anyone else in tablet sales. It may not be the best SoC in town, but NV executed well and it was available roughly a quarter before its competition. Tegra may not be selling as well as they had hoped, but it's still doing well regardless.

I remember them saying that the CPU didn't matter anymore, and that GPUs would provide the computational power from then on. The overwhelming majority of the Top 500 is still purely CPU-based, starting from the #1 machine, actually (K Computer). And as a Linpack-derived listing, it's biased towards GPUs to begin with.

While #1 does not use GPUs, #2, #4 and #5 do. Of course, this discussion is a moot point as you can't really compare. GPUs can only process certain workloads efficiently and are not faster than CPUs for all the workloads that supercomputers are used for (as the #1 machine aptly demonstrates by not using GPUs at all).
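The usual way to formalize that limit is Amdahl's law: if a fraction p of a workload is parallelizable and the rest is serial, the overall speedup from N parallel processors is bounded by

S(N) = 1 / ((1 - p) + p / N)

So even a workload that is 90% parallel (p = 0.9) tops out at a 10x speedup no matter how many shader cores you throw at it; the serial 10% still has to run on a CPU.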

I think most of the profits come from Quadros.

I don't recall them breaking out profits for GeForces from the rest of their products.

We've had this discussion multiple times before, so let's not go there again. Sure, the profit margins on Quadros are quite high, but the Quadro line would not be as profitable without the development cost being shared with the consumer lines. The professional line only did $200M last quarter, compared to $600M for consumer GPUs. NV wouldn't be able to sustain the professional line on its own.

AMD has executed brilliantly since RV670, but the fact is Nvidia has still been very profitable even with inferior hardware at times. I'm not an Nvidia fanboy, but I still acknowledge that they have performed well.
 
Not sure why you say that. IIRC, AMD has had >50% share for a while now. Besides, the DX11 numbers from Steam suggest AMD won big there.
With only 5.6% share of the total market in DX11, it is far too early to call that.

Edit: Looks like I read the numbers a bit wrong. 5.6% is the percentage of total systems that are DX11-capable. There are many more systems that use DX11 GPUs but an older OS (roughly 3-4 times as many as those that have both a DX11 GPU and Windows Vista/7!). However, DX11 isn't everything, and nVidia still has massive overall marketshare. nVidia has shown a remarkable capacity to pick itself up after each setback, so I definitely wouldn't count them out just because more people are buying ATI DX11 GPUs at present.
 
However, DX11 isn't everything, and nVidia still has massive overall marketshare. nVidia has shown a remarkable capacity to pick itself up after each setback, so I definitely wouldn't count them out just because more people are buying ATI DX11 GPUs at present.

It appears that you are confusing install base with market share. And nVidia is hardly out of the game.
 
My bad about the tablets; I just did a quick Google search on "leading tablets", but I really don't know which models are leading the market.

Still, I don't think NVIDIA sells Tegra 2 for more money when it's sold for a tablet than a smartphone, so aggregate (phones + tablets) volume should be the key metric.
 
Alexko said:
I think most of the profits come from Quadros.
If, as said above, Quadro is $200M and GeForce is $600M, then that's very unlikely. Even if GeForce gross margins are only a low 33%, its gross profit already exceeds Quadro's total revenue.
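To spell out the arithmetic with the figures above:

0.33 × $600M ≈ $200M of GeForce gross profit ≥ $200M of total Quadro revenue

So even at a pessimistic 33% margin, GeForce's gross profit alone matches everything Quadro brings in before any costs.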
 
That would still be a pretty big deal, considering that NVIDIA's net income last quarter was $135M. And the margins are so high for Quadros that $200M of revenue generates something like $150M in gross profit, at least.
 
Alexko said:
That would still be a pretty big deal, considering that NVIDIA's net income last quarter was $135M. And the margins are so high for Quadros that $200M of revenue generates something like $150M in gross profit, at least.
Sounds reasonable. But even with that, the gross profit for GeForce is still going to be much higher than Quadro's, not the other way around as you claimed.
 
I remember them saying that the CPU didn't matter anymore, and that GPUs would provide the computational power from then on. The overwhelming majority of the Top 500 is still purely CPU-based, starting from the #1 machine, actually (K Computer). And as a Linpack-derived listing, it's biased towards GPUs to begin with.

I suspect you remember things a bit incorrectly. nVidia's marketing has been laser-focused on specific workloads where GPUs excel. In terms of the Top500, CPUs have had a few decades' head start. You need to look at what's coming up, not what's been sitting around for years. K is impressive but will be eclipsed next year by a Bulldozer+Fermi setup.

Linpack seems to be doing very well on CPUs with performance very close to theoretical peaks. The primary GPU advantage today is perf/$.

I don't recall them breaking out profits for GeForces from the rest of their products.

No need to see an nVidia breakdown. We know that AMD is losing money on graphics even with all the perf/W and perf/$ advantages they enjoy. That speaks for itself.

It appears that you are confusing install base with market share. And nVidia is hardly out of the game.

http://www.maximumpc.com/article/ne...esktop_gpu_market_share_amd_notebook_graphics.

nVidia is doing better on desktops than conventional wisdom would indicate. Must be China.
 
While #1 does not use GPUs, #2, #4 and #5 do. Of course, this discussion is a moot point as you can't really compare. GPUs can only process certain workloads efficiently and are not faster than CPUs for all the workloads that supercomputers are used for (as the #1 machine aptly demonstrates by not using GPUs at all).

While it really is a moot point, I think raw GFLOPS, or rather PFLOPS, are the most useless metric to compare here. Supercomputers are, after all, only budget-limited; just look at the number of processing cores (not sure if everyone's counting GPUs in the same way, btw). More useful would be the Rmax/Rpeak ratio and total power per TFLOPS. But even those almost naturally lend themselves to more modern architectures, thus generally favoring systems in the Top 10.
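As a rough sketch of what comparing on those metrics looks like (the numbers below are approximate June 2011 figures, from memory, so treat them as illustrative only):

```python
# Compare Top500-style entries by Linpack efficiency (Rmax/Rpeak)
# and perf/W instead of raw PFLOPS.
# Approximate June 2011 figures: Rmax/Rpeak in TFLOPS, power in kW.
systems = {
    "K Computer (CPU-only)": {"rmax": 8162, "rpeak": 8774, "power_kw": 9899},
    "Tianhe-1A (CPU+GPU)":   {"rmax": 2566, "rpeak": 4701, "power_kw": 4040},
}

for name, s in systems.items():
    efficiency = s["rmax"] / s["rpeak"]  # fraction of peak sustained on Linpack
    gflops_per_watt = (s["rmax"] * 1000) / (s["power_kw"] * 1000)
    print(f"{name}: {efficiency:.0%} of peak, {gflops_per_watt:.2f} GFLOPS/W")
```

On those numbers, the CPU-only K machine comes out ahead on both metrics (~93% of peak and ~0.82 GFLOPS/W vs. ~55% and ~0.64 for Tianhe-1A), which rather supports the point that raw PFLOPS alone tell you very little.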
 
Sounds reasonable. But even with that, the gross profit for GeForce is still going to be much higher than Quadro's, not the other way around as you claimed.

I remember reading that Quadros amounted to 1/3 of NVIDIA's revenue and 2/3 of their profits, but I suppose that may well be outdated, or even wrong in the first place.

That said, $150M gross profit from Quadros would be enough to explain why AMD's graphics division is losing money while NVIDIA is making $135M. I guess Tegra should also be profitable by now, but the profits it generates are probably still dwarfed by those from Quadros.

I suspect you remember things a bit incorrectly. nVidia's marketing has been laser-focused on specific workloads where GPUs excel. In terms of the Top500, CPUs have had a few decades' head start. You need to look at what's coming up, not what's been sitting around for years. K is impressive but will be eclipsed next year by a Bulldozer+Fermi setup.

Linpack seems to be doing very well on CPUs with performance very close to theoretical peaks. The primary GPU advantage today is perf/$.

Well, I get older every day, so my memory may not be what it used to be, but I'm pretty sure JHH made some rather misleading statements about CPUs becoming of very little importance in the future of computing vs. GPUs. I'll try to find some quotes.

Now, I won't deny a certain amount of inertia in HPC (or any professional domain, really), but still, the "slowest" computer in the Top500 was built in 2010 around Nehalem-EP CPUs. GPUs have been available for a while now; if they were really doing so well in HPC, I think we'd see more of them among the 500 fastest machines.

No need to see an nVidia breakdown. We know that AMD is losing money on graphics even with all the perf/W and perf/$ advantages they enjoy. That speaks for itself.

As I pointed out above, that's probably mostly due to Quadros.

Another thing worth considering is that with APUs, it's really not clear what part of APU-related spending and revenue ends up on the graphics division's balance sheet, and what part ends up on the CPU division's. This could potentially generate discrepancies in the hundreds of millions.

That's not to say that it's happening or, even if it is, that it's not to the benefit of the graphics division, but it makes it very difficult to break down AMD's finances. It really depends on what AMD would like us to believe about their different divisions.


http://www.maximumpc.com/article/ne...esktop_gpu_market_share_amd_notebook_graphics.

nVidia is doing better on desktops than conventional wisdom would indicate. Must be China.

Yeah, these days the split seems to be 60/40, to NVIDIA's advantage on desktops and to AMD's on notebooks. I'm not entirely sure why, perhaps it has to do with better power-efficiency on AMD's parts, or simply sales strategies. After all, OEM deals have little to do with list prices—as conventional wisdom goes, anyway—so who knows what's really happening behind closed doors?
 
Now, I won't deny a certain amount of inertia in HPC (or any professional domain, really), but still, the "slowest" computer in the Top500 was built in 2010 around Nehalem-EP CPUs. GPUs have been available for a while now; if they were really doing so well in HPC, I think we'd see more of them among the 500 fastest machines.

The first GPU with ECC (a requirement for serious HPC) came to market less than 18 months ago and already has a strong presence in the fastest computers in the world. That's a very fast uptake by any measure.

Yeah, these days the split seems to be 60/40, to NVIDIA's advantage on desktops and to AMD's on notebooks. I'm not entirely sure why, perhaps it has to do with better power-efficiency on AMD's parts, or simply sales strategies. After all, OEM deals have little to do with list prices—as conventional wisdom goes, anyway—so who knows what's really happening behind closed doors?

Well whatever it is, they're currently making less from their half of the market than nVidia. At first blush it would seem they're trading margin for marketshare.

Anyhow, my point is that Tegra and Tesla successfully invaded two new markets previously unavailable to nVidia. They seem to know what they're doing and if they claim Kepler is 3x as power efficient as Fermi I'll take their word on it unless proven otherwise.
 
Anyhow, my point is that Tegra and Tesla successfully invaded two new markets previously unavailable to nVidia. They seem to know what they're doing and if they claim Kepler is 3x as power efficient as Fermi I'll take their word on it unless proven otherwise.
Could be. The comparison is against broken Fermi, though (some GF100-based product), not fixed Fermi, so that's more like a 2-2.5x increase over what we have today, which is a bit less impressive.
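Rough arithmetic behind that reading, assuming the headline 3x is perf/W versus GF100 and that GF110 already clawed back something like 1.2-1.5x over GF100:

3 / 1.5 = 2x and 3 / 1.2 = 2.5x

i.e. against the fixed Fermi we actually have today, the claimed 3x shrinks to roughly 2-2.5x.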
 
NVidia's still lying about the introduction date of Fermi (beginning of 2009 on that cute graph, well over a year earlier than reality), why the hell would anyone take anything else NVidia says seriously?
 
Could be. The comparison is against broken Fermi, though (some GF100-based product), not fixed Fermi, so that's more like a 2-2.5x increase over what we have today, which is a bit less impressive.

It depends on Kepler's SP/DP ratio. If it remains at 2:1 (and why wouldn't it?), then yeah, not so great given Fermi's less-than-stellar record w.r.t. power consumption. Any more than that and it would be a big deal. I'm reserving all judgement until we see GCN, though. If AMD can adopt all the advantages of nVidia's approach while leaving behind the excess baggage, it would change the landscape considerably.
 
Could be. The comparison is against broken Fermi, though (some GF100-based product), not fixed Fermi, so that's more like a 2-2.5x increase over what we have today, which is a bit less impressive.
Fermi is broken and unfixable, remember? :)

NVidia's still lying about the introduction date of Fermi (beginning of 2009 on that cute graph, well over a year earlier than reality), why the hell would anyone take anything else NVidia says seriously?
Hey, get out of Jawed's account Charlie! ;)

Seriously though: the graph seems to have data points aligned to calendar years; nothing indicates that they're meant to mark the beginning of the respective years. At ISC, however, Nvidia had updated their slides to show Fermi in 2010 and Kepler in 2012:
http://www.pcgameshardware.de/aid,8...1/Grafikkarte/News/bildergalerie/?iid=1537849
Don't know why they've reverted to their old slide deck at this event.
 