JPR Q3 2011 shipments

I'm guessing Intel got a large boost due to practically all their consumer-level CPUs containing an integrated GPU. And since they sell by far the most CPUs with integrated graphics...

AMD probably managed to hold the line, roughly speaking, with APUs shoring up the discrete losses. Although those same APUs are probably also why Nvidia gained so much in discrete desktop graphics sales. Each APU sale in the desktop space is one less entry/budget-level discrete graphics card for AMD to sell.

And Nvidia obviously had a huge loss (23%) in share due to losing the integrated market.

So basically the most interesting thing from that report is that AMD and Intel "APUs" have cannibalized much of the budget/OEM discrete card sales, thus lowering their share of the discrete desktop market. That's even more evident if you consider that AMD budget/OEM discrete desktop cards were likely sold at a discount when OEMs bought package (CPU + MB) deals, hence AMD losing more discrete desktop card sales in OEM land than Nvidia, and Nvidia gaining 10.9% in discrete desktop marketshare.

Regards,
SB
 
The real question is what's the double attach rate and how is it accounted for (if at all) in these statistics?
 
nVidia gained 30% overall in discrete, which means a healthy gain on laptops with integrated GPUs.

At least one and often two GPUs are present in every PC shipped. It can take the form of a discrete chip, a GPU integrated in the chipset, or a GPU embedded in the CPU. The average has grown from 115% in 2001 to almost 160% GPUs per PC.
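To make that figure concrete, here's a quick back-of-the-envelope sketch of what a ~160% average implies for the double-attach share. It's illustrative only, and it assumes for simplicity that every PC ships with either exactly one or exactly two GPUs:

```python
# Back-of-the-envelope attach-rate arithmetic (illustrative only).
# JPR's figure: the average has grown to roughly 1.6 GPUs per PC shipped.
gpus_per_pc = 1.60

# Simplifying assumption: every PC ships with either one or two GPUs, so
#   gpus_per_pc = 1 * (1 - double_attach) + 2 * double_attach = 1 + double_attach
# and the implied double-attach share is just the excess over 1.0.
double_attach = gpus_per_pc - 1.0
print(f"Implied share of PCs shipping with two GPUs: {double_attach:.0%}")  # ~60%
```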
 
nVidia gained 30% overall in discrete, which means a healthy gain on laptops with integrated GPUs.

Which in itself makes you wonder just how many discrete budget sales were lost overall. For AMD it wasn't really a loss, as evidenced by their relatively flat overall marketshare. Basically, whatever they lost in discrete was made up for by an equivalent increase in integrated.

For Nvidia, despite gaining 30% in discrete (probably because AMD lost virtually all budget discrete sales), they still lost a large 23% of the overall market.

So it's entirely possible (although probably unlikely) that discrete sales are down for Nvidia but they still made large marketshare gains due to large chunks of AMD's discrete marketshare migrating to integrated.

Regards,
SB
 
Yup, definitely possible and probable as well. I don't know how much Llano volume is out there, though, or whether it's enough to move the market significantly.
 
nVidia gained 30% overall in discrete, which means a healthy gain on laptops with integrated GPUs.
Right, but the 160% figure isn't as helpful as it seems because it doesn't tell you whether the integrated GPU is used for something like AMD Hybrid CrossFire/NVIDIA Optimus or if it's completely redundant. I'm also not sure what the attach rate for Hybrid CrossFire is for Llano, since that would in theory mitigate the impact on budget GPUs for AMD - I doubt it's very high though.
 
Well, there's one thing I can't understand. I took the comparison charts from the current Q3 article (here) and the Q2 article (here):

[Attached image: comparision_q2q3138b.png]


Why are the Q2/2011 numbers so much different? And why are the Q2/2011 numbers in the new (upper) chart almost identical to Q1/2011 numbers in the old (lower) chart? Am I missing anything?
 
Why are the Q2/2011 numbers so much different? And why are the Q2/2011 numbers in the new (upper) chart almost identical to Q1/2011 numbers in the old (lower) chart? Am I missing anything?
One is Qtr*, the other is quarter, that's why.

*Qtr stands for "questionable trumped-up results." :)
 
Yes, their numbers are often additionally adjusted by tenths of a percent. But this time it looks to me like they screwed up the chart and filled in adjusted Q1/11 values instead of the Q2/11 numbers. It's interesting that no article noticed :???:
 
Seen that before with those numbers when trying to go back ;) I guess they adjust the estimates...

Btw, Mercury doesn't really agree on that discrete shift either: http://investorvillage.com/smbd.asp?mb=476&mn=224193&pt=msg&mid=11104856

That certainly contradicts JPR quite a bit.

It's also interesting that, from the numbers, you can infer just how insignificant chipset-integrated graphics are compared to CPU-integrated graphics now.

15.9M out of Intel's 77.9M units. Amazing that those parts being phased out still almost equal all of AMD's discrete parts.
4M out of AMD's 32.1M, although that's still a large chunk of their overall integrated (15.3M).

CPU integrated mobile graphics for AMD almost equals their discrete mobile graphics at 9.6M versus 9.9M. Fascinating.

Also fascinating is that integrated chipset graphics still account for 2.9M units for Nvidia. Although are CUDA parts without display outputs still considered GPUs?
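For what it's worth, here's the quick back-of-the-envelope arithmetic behind those observations, just recomputing the shares from the unit figures quoted above (all numbers in millions, taken from the Mercury breakdown linked earlier):

```python
# Shares implied by the unit figures quoted above (millions of units, Q3 2011).
intel_chipset_igp, intel_total = 15.9, 77.9
amd_chipset_igp, amd_total, amd_integrated_total = 4.0, 32.1, 15.3
amd_mobile_cpu_igp, amd_mobile_discrete = 9.6, 9.9

print(f"Intel chipset IGPs as share of all Intel units: {intel_chipset_igp / intel_total:.0%}")     # ~20%
print(f"AMD chipset IGPs as share of all AMD units:     {amd_chipset_igp / amd_total:.0%}")         # ~12%
print(f"AMD chipset IGPs as share of AMD integrated:    {amd_chipset_igp / amd_integrated_total:.0%}")  # ~26%
print(f"AMD mobile CPU IGPs vs. AMD mobile discrete:    {amd_mobile_cpu_igp / amd_mobile_discrete:.0%}")  # ~97%
```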
 
Also fascinating is that integrated chipset graphics still account for 2.9M units for Nvidia. Although are CUDA parts without display outputs still considered GPUs?

I don't think those parts are considered, but their numbers are probably very small.

That 2.9M figure is indeed interesting (and worrying), because eventually, it's going to vanish (in a quarter or two, I'd expect). And if the low-end keeps being eroded by Llano and Sandy, and especially by Trinity and Ivy, then NVIDIA had better hold on really tight to their desktop discrete market share, because they won't have much more left.

Plus, if AMD does get to 28nm earlier, as is expected, even that might be tough.

I can't help wondering how relevant NVIDIA will be deemed by game developers if their share ends up around 10%.
 
Yes, their numbers are often additionally adjusted by tenths of a percent. But this time it looks to me like they screwed up the chart and filled in adjusted Q1/11 values instead of the Q2/11 numbers. It's interesting that no article noticed :???:

Ahh yes.. those Q1 numbers are awfully close to the "new Q2" numbers - looks like a plain screwup, which could also explain the difference to the Mercury trends.
 
That 2.9M figure is indeed interesting (and worrying), because eventually, it's going to vanish (in a quarter or two, I'd expect). And if the low-end keeps being eroded by Llano and Sandy, and especially by Trinity and Ivy, then NVIDIA had better hold on really tight to their desktop discrete market share, because they won't have much more left.
I'm willing to bet a lot of money that the majority of that 2.9M is MCP61 for the ultra-low-end desktop market. It does support AM2+, and combined with something like the 117mm² 45nm dual-core Regor chip, it's going to be a lot cheaper than Llano and have much faster CPU performance than Zacate.

So there's still a niche for NVIDIA to be selling AMD IGPs (believe it or not). It's certainly going to go down over time and eventually disappear but I'd expect that to take much longer than 1 or 2 quarters.

I can't help wondering how relevant NVIDIA will be deemed by game developers if their share ends up around 10%.
The total share is completely irrelevant for game developers that also target high-end GPUs rather than just the low-end.

Wake me up when NVIDIA has less than 30% discrete desktop share. AMD has been unable to make that happen even in the RV770 era where they had an extremely impressive die size/cost advantage, and NVIDIA has proven they are ready to fight for market share by cutting their margins drastically if that's their only choice. On the other hand, AMD could certainly make a lot of money if they deliver on their 28nm generation.
 
I'm willing to bet a lot of money that the majority of that 2.9M is MCP61 for the ultra-low-end desktop market. It does support AM2+, and combined with something like the 117mm² 45nm dual-core Regor chip, it's going to be a lot cheaper than Llano and have much faster CPU performance than Zacate.

So there's still a niche for NVIDIA to be selling AMD IGPs (believe it or not). It's certainly going to go down over time and eventually disappear but I'd expect that to take much longer than 1 or 2 quarters.

There was supposed to be a dual-core ASIC for Llano, with 160 SPs, I believe. I don't know if it's actually available yet, but if it is, I'd expect it to be around 130mm², so not that much more expensive.

Still, Jon Peddie seems to agree that most of NVIDIA's remaining IGP market is actually for AMD CPUs. I'm not sure how long it will take for AMD to phase AM2+ out, but once they do, that market is gone.

With Trinity coming up, along with the Brazos refresh, I don't see much point in keeping AM2+ around, unless they can't meet demand with 40/32nm chips alone. We'll see, but I'd be really surprised if the AM2+ market still amounts to anything substantial in a year.


The total share is completely irrelevant for game developers that also target high-end GPUs rather than just the low-end.

Wake me up when NVIDIA has less than 30% discrete desktop share. AMD has been unable to make that happen even in the RV770 era where they had an extremely impressive die size/cost advantage, and NVIDIA has proven they are ready to fight for market share by cutting their margins drastically if that's their only choice. On the other hand, AMD could certainly make a lot of money if they deliver on their 28nm generation.

It wasn't so long ago that integrated GPUs such as Intel's GMA were completely irrelevant for most game developers. They were just too slow for actually playing anything on them, and in fact, many games didn't even work on them, let alone play smoothly.

I think it's fair to say that this is no longer the case, and that future APUs (from both Intel and AMD) will make it even less true. At some point, developers will start considering APUs as the primary target for (at least mainstream) games and when that happens, they'll have to choose where to spend the largest amount of effort. Intel will be the biggest player, but with relatively slow graphics, while AMD will come second but with faster integrated GPUs, and much faster discrete GPUs based on the same (or a very similar) architecture. How much time will be left to optimize for NVIDIA?

I don't know if this will become reality in 2012, 2013 or even later, but eventually it seems unavoidable.
 
I think it's fair to say that this is no longer the case,
I don't. While things have certainly improved in the integrated arena, the absolute performance is still abysmal (especially at 1080p).
and that future APUs (from both Intel and AMD) will make it even less true. At some point, developers will start considering APUs as the primary target for (at least mainstream) games and when that happens, they'll have to choose where to spend the largest amount of effort.
Certainly it will happen eventually, but I can't see it happening with either Ivy or Trinity...

Intel will be the biggest player, but with relatively slow graphics, while AMD will come second but with faster integrated GPUs, and much faster discrete GPUs based on the same (or a very similar) architecture. How much time will be left to optimize for NVIDIA?
I suppose that will depend on how WoA does in the consumer space, and how well Nvidia does in mobile.
 
I don't. While things have certainly improved in the integrated arena, the absolute performance is still abysmal (especially at 1080p).

At 1080p, sure, but the gaming PC world is increasingly mobile. And Llano does an acceptable job with most games at 720p with medium settings. Sandy doesn't do quite as well, of course, but most games are still playable, provided you don't mind low settings.

Certainly it will happen eventually, but I can't see it happening with either Ivy or Trinity...

I don't either, but they'll bring us closer to that point.

I suppose that will depend on how WoA does in the consumer space, and how well Nvidia does in mobile.

Given that we're talking about different platforms that often feature different games, and that the GPU architecture in Tegra has little in common with Fermi—let alone Kepler and future desktop architectures—I think that's largely orthogonal.
 
I don't. While things have certainly improved in the integrated arena, the absolute performance is still abysmal (especially at 1080p). Certainly it will happen eventually, but I can't see it happening with either Ivy or Trinity...

Sure but 720p is adequate in most cases. Especially if it's used in conjunction with a large screen TV in the living room. Uncommon perhaps, but I see that increasing in popularity. On my 46" TV, I honestly cannot tell the difference between a game in 720p and 1080p unless I get uncomfortably close to the screen.

And while it's not going to be compelling with the bleeding-edge tech games (how many of those still exist or are in development?), it's still going to be very relevant for the majority of developers who aren't pushing the envelope.

Regards,
SB
 
Sure but 720p is adequate in most cases. Especially if it's used in conjunction with a large screen TV in the living room. Uncommon perhaps, but I see that increasing in popularity. On my 46" TV, I honestly cannot tell the difference between a game in 720p and 1080p unless I get uncomfortably close to the screen.
??? The typical usage for a desktop PC is with a widescreen monitor, not a TV. People typically sit relatively close to their screens and want to be able to play at their monitor's native resolution.

So is 720p adequate in most cases? Sure as long as you are talking about playing on a TV while sitting several feet away from the screen. I'm not sure how much of the Desktop PC market that covers though...

Given that we're talking about different platforms that often feature different games, and that the GPU architecture in Tegra has little in common with Fermi—let alone Kepler and future desktop architectures—I think that's largely orthogonal.
I don't really see it as orthogonal. Your post was about devs focusing on where the market share is (unless I misinterpreted it) and mobile is where the market share is. http://www.reuters.com/article/2011/11/02/us-rovio-idUSTRE7A137Q20111102

The current GPU arch in Tegra has little in common with its desktop counterparts, but I imagine that will change over time. Nvidia could probably fit 1 Fermi SM minus DP on a 28nm SoC if they really wanted to...
 