What were the initial projections? They are the leading provider of SoCs for non-Apple tablets and have significant smartphone wins as well. Given the strength of the establishment (TI/Qualcomm/Imagination), it's a damn near miracle. The Transformer alone is 400k units a month.
I don't recall nVidia promising that CUDA will run your OS and serial apps. They claim GPUs are more efficient for parallel workloads and it seems that supercomputer designers and users agree. PhysX is available as a product for anyone to use and it does change the games that use it.
nVidia makes more money in the consumer market even when AMD has all possible advantages. They'll be just fine. They talk big and stumble often, but they're ambitious and eventually get it done. That's what matters. Look at how difficult it is for others to do the same - AMD vs. Intel, everybody vs. Apple, etc.
"Financial results say otherwise."

I don't recall them breaking out profits for GeForces from the rest of their products.
Are they really? The Galaxy Tab, BlackBerry PlayBook and HP TouchPad all use other chips. They do have design wins with Motorola and LG, but I don't think they sell nearly as many units as Samsung et al.
I remember them saying that the CPU didn't matter anymore, and that GPUs would provide the computational power from then on. The overwhelming majority of the Top 500 is still purely CPU-based, starting with the #1 machine, the K Computer. And as a Linpack-derived listing, it's biased towards GPUs to begin with.
I think most of the profits come from Quadros.
"With only 5.6% share of the total market in DX11, it is far too early to call that."

Not sure why you say that. IIRC, AMD has had >50% share for a while now. Besides, the DX11 numbers from Steam suggest AMD won big there.
However, DX11 isn't everything, and nVidia still has massive overall market share. nVidia has shown a remarkable capacity to pick itself up after each setback, so I definitely wouldn't count them out just because more people are buying ATI DX11 GPUs at present.
Alexko said: "I think most of the profits come from Quadros."

If, as said above, Quadro is $200M and GeForce is $600M, then that's very unlikely. Even if GeForce gross margins are only a low 33%, its gross profit (~$198M) already rivals Quadro's entire revenue, never mind Quadro's gross profit.
Alexko said: "That would still be a pretty big deal, considering that NVIDIA's net income last quarter was $135M. And the margins are so high for Quadros that $200M of revenue generates something like $150M in gross profit, at least."

Sounds reasonable. But even so, the gross profit for GeForce is still going to be much higher than Quadro's, not the other way around as you claimed.
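To make the arithmetic explicit, here is a quick sketch using the revenue figures quoted above; the 33% GeForce margin is the deliberately low assumption from earlier, and the ~75% Quadro margin is what the $150M-on-$200M figure implies.

```python
# Back-of-the-envelope gross-profit comparison using the figures from this
# thread: $600M GeForce and $200M Quadro quarterly revenue. The 33% GeForce
# margin is the deliberately low assumption above; the ~75% Quadro margin is
# what "$200M of revenue -> ~$150M gross profit" implies.

geforce_revenue = 600e6   # quarterly revenue in dollars (figure from the thread)
quadro_revenue = 200e6

geforce_margin = 0.33     # assumed low-end gross margin
quadro_margin = 0.75      # implied by $150M gross profit on $200M revenue

geforce_gross = geforce_revenue * geforce_margin
quadro_gross = quadro_revenue * quadro_margin

print(f"GeForce gross profit: ${geforce_gross / 1e6:.0f}M")  # ~$198M
print(f"Quadro gross profit:  ${quadro_gross / 1e6:.0f}M")   # ~$150M
# Even at a low 33% margin, GeForce's ~$198M gross profit nearly equals
# Quadro's entire revenue and clearly tops Quadro's ~$150M gross profit.
```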
It appears that you are confusing install base with market share. And nVidia is hardly out of the game.
While #1 does not use GPUs, #2, #4 and #5 do. Of course, this discussion is a moot point, as you can't really compare: GPUs can only process certain workloads efficiently, and they are not faster than CPUs for all workloads that supercomputers are used for (as the #1 machine aptly demonstrates by not using GPUs at all).
I suspect you remember things a bit incorrectly. nVidia's marketing has been laser-focused on specific workloads where GPUs excel. In terms of the Top500, CPUs have had a few decades' head start. You need to look at what's coming up, not what's been sitting around for years. K is impressive but will be eclipsed next year by a Bulldozer+Fermi setup.
Linpack seems to be doing very well on CPUs, with performance very close to theoretical peaks. The primary GPU advantage today is perf/$.
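To put numbers on "very close to theoretical peaks", here is a quick Linpack-efficiency (Rmax/Rpeak) comparison; the figures are from the June 2011 Top500 list as best I recall, so treat them as approximate.

```python
# Linpack efficiency = Rmax / Rpeak (sustained vs. theoretical peak).
# Figures in petaflops, as reported on the June 2011 Top500 list as best
# I recall; treat them as approximate.
systems = {
    "K Computer (CPU-only)": (8.162, 8.774),
    "Tianhe-1A (CPU + GPU)": (2.566, 4.701),
}

for name, (rmax, rpeak) in systems.items():
    print(f"{name}: {rmax / rpeak:.0%} of theoretical peak")
# K Computer sustains ~93% of peak on Linpack, while the GPU-accelerated
# Tianhe-1A sustains ~55%: CPUs get much closer to their paper numbers.
```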
No need to see an nVidia breakdown. We know that AMD is losing money on graphics even with all the perf/W and perf/$ advantages they enjoy. That speaks for itself.
http://www.maximumpc.com/article/ne...esktop_gpu_market_share_amd_notebook_graphics.
nVidia is doing better on desktops than conventional wisdom would indicate. Must be China.
Now I won't deny a certain amount of inertia in HPC (or any professional domain, really) but still, the "slowest" computer in the Top500 was built in 2010 around Nehalem-EP CPUs. GPUs have been available for a while now, if they were really doing so well in HPC, I think we'd see more of them among the 500 fastest machines.
Yeah, these days the split seems to be 60/40, to NVIDIA's advantage on desktops and to AMD's on notebooks. I'm not entirely sure why; perhaps it has to do with better power-efficiency on AMD's parts, or simply sales strategies. After all, OEM deals have little to do with list prices (as conventional wisdom goes, anyway), so who knows what's really happening behind closed doors?
Could be. The comparison is against broken Fermi though (some GF100-based product), not fixed Fermi, so that's more like a 2-2.5x increase against what we have today, which is a bit less impressive.

Anyhow, my point is that Tegra and Tesla successfully invaded two new markets previously unavailable to nVidia. They seem to know what they're doing, and if they claim Kepler is 3x as power-efficient as Fermi, I'll take their word on it unless proven otherwise.
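A rough way to see where the 2-2.5x comes from: if the fixed GF110 is somewhere around 1.2-1.5x more power-efficient than the original GF100 (an assumed range, purely for illustration), the claimed 3x renormalizes as follows.

```python
# If Kepler's claimed 3x perf/W is measured against the original GF100
# ("broken Fermi"), renormalize it against the fixed GF110 instead.
# The 1.2-1.5x GF100 -> GF110 efficiency gain is an assumed range,
# used purely for illustration.
kepler_vs_gf100 = 3.0

for gf110_vs_gf100 in (1.2, 1.5):
    kepler_vs_gf110 = kepler_vs_gf100 / gf110_vs_gf100
    print(f"GF110 at {gf110_vs_gf100}x GF100 -> Kepler at {kepler_vs_gf110:.1f}x GF110")
# Prints 2.5x and 2.0x: the headline 3x shrinks to roughly 2-2.5x against
# the parts actually shipping today.
```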
"Could be. The comparison is against broken Fermi though (some GF100-based product), not fixed Fermi, so that's more like a 2-2.5x increase against what we have today, which is a bit less impressive."

Fermi is broken and unfixable, remember?
"NVidia's still lying about the introduction date of Fermi (beginning of 2009 on that cute graph, well over a year earlier than reality), why the hell would anyone take anything else NVidia says seriously?"

Hey, get out of Jawed's account, Charlie!
"Don't know why at this event they've reverted back to their old slide deck."

Corporate schizophrenia?