2015: Discrete = 10%

He seems quite confused over Sandy Bridge, electing to describe it as Embedded Processor Graphics while Ontario/Llano are described as Heterogeneous Processor Units. Huh?

Yes. It's not a very relevant distinction though. By 2015, about 80% of all graphics will have moved in with the CPU, according to JP, and discrete graphics will have been reduced to a third. If true, Intel and AMD will be fine, but what can nvidia do to avoid becoming a niche player in PC graphics?
 

Why nVidia? AMD will have bigger problems. They must beat Intel with their CPUs. They have a mobile CPU market share of 13%. With a dying discrete market, what will happen to a company that sells most of its GPUs in the low end (combined with Intel CPUs!) and has no workstation/HPC market?
 

AMD always had to beat Intel with their CPUs. Nothing has changed here. With the addition of embedded graphics, they should be better positioned though. I'm not saying it will be easy, but to me it's clear what they need to do.

I'm not sure what nvidia can do though. Will they be able to compete with embedded graphics? Or will their market simply shrink?
 

The discrete market will shrink, it's unavoidable. The low end will disappear pretty quickly, and then the mainstream segment will slowly erode, until only the high end remains.

AMD should be fine because they'll just sell APUs instead of graphics cards, and at a moderately higher price than what they can charge for their current CPUs. In fact, that puts them in a slightly better competitive position, because while Llano is likely not going to be able to match Sandy Bridge in CPU performance, it should do much better on the GPU side. That's much more than Athlon II could say about Nehalem/Westmere, or even Penryn, for that matter.

NVIDIA will just have to accept that the GPU market is going to shrink by a lot. To make up for it, they rely on Tegra and Tesla. The former hasn't been successful so far, but the latter seems to be doing a little bit better.

Protecting their workstation (Quadro) market-share will also be crucial for NVIDIA.
 
the mainstream segment will slowly erode

The mainstream discrete power budget is 100w with access to ~80GB/s of bandwidth. I don't see the integrated graphics of mainstream CPUs matching the performance of a 100w graphics card, even with Intel's superior manufacturing capabilities.
 

Matching it? Probably not, but as time passes, I think the importance of integrated GPUs relative to CPU cores, and therefore the silicon area and power allocated to them will grow.

After a while, the relative performance gap between a 100W "APU" and a standard (50W?) CPU + 100W discrete graphics card will shrink, perhaps to 30% or so (bit of a WAG, here).

And at that point, you have to ask yourself whether investing in the discrete graphics card is really worth it, especially considering that, by then, there won't be any "standard" CPUs in this segment, just APUs, so no matter what, you have the integrated graphics. "Pure" CPUs will be reserved for the high end, and obviously servers.

At that point 150+W graphics cards should be fine, but anything below 100W isn't safe, in my opinion. Of course this is a ~2015 phenomenon, I don't expect to see that in 2011-2012.
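
To put some rough numbers behind that ~30% guess (purely illustrative; the power split and the linear power-to-performance scaling are my assumptions, not anything from JPR):

```python
# Back-of-the-envelope sketch; the power split and the linear
# power-to-performance scaling are assumptions, not data.
apu_total_w = 100        # hypothetical 100W "APU" budget
apu_gpu_share = 0.6      # assumed fraction of that budget spent on graphics
discrete_gpu_w = 100     # mainstream discrete card budget

apu_gpu_w = apu_total_w * apu_gpu_share        # 60W for the integrated GPU
gap = discrete_gpu_w / apu_gpu_w - 1           # relative advantage of discrete

print(f"Discrete advantage: {gap:.0%}")        # ~67% with these numbers
# Push the GPU share of the APU to 0.7 or 0.8 and the gap drops to
# roughly 43% or 25%, i.e. toward the ~30% ballpark mentioned above.
```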
 
Ultimately it doesn't matter how much power headroom a discrete card has - integration will eventually kill it because the volume isn't there.

The pressing issue for volume of high-end discrete: who's going to have a box big enough to insert a 300W discrete card? Can you fit one in here:

http://www.engadget.com/2010/09/14/msis-wind-top-ae2420-3d-hits-us-shores-for-1-800-blu-ray-and/

or in an iMac? I'm not sure of the power consumption of the HD5750 that is an option for the 27" iMac. Total power consumption of that iMac is apparently 365W.
 
The "death" of discrete GPU in the PC realm ironically could be a rebirth for mass pc gaming.
 
Are these "projections" just in terms of sales or in terms of "what users will use"? If the former, it's an obviously irrelevant category in the long run, as everyone will have a CPU with some form of embedded graphics chip on it. Yay 100% market share...

If the latter, that seems hard to measure and even harder to predict. What about solutions like NVIDIA's Optimus too? How would that be counted?
 
The pressing issue for volume of high-end discrete: who's going to have a box big enough to insert a 300W discrete card? Can you fit one in here:
Why are all-in-ones going to affect the market for >225W boards? They are in an entirely different segment, they have already been around for years, and yet we still see new systems built with high-end discrete.

OEMs like to offer differentiation and upsell capabilities, and one of the areas they look to upsell is graphics performance.

Are these "projections" just in terms of sales or in terms of "what users will use"? If the former, it's an obviously irrelevant category in the long run, as everyone will have a CPU with some form of embedded graphics chip on it. Yay 100% market share...

What are the raw numbers as well? PCs are still projected to grow in this timeframe - so how is JP predicting the numbers to change? What class of devices is he including in these percentages?
 
Good point. The fact that there are likely to be way more "computer-like devices" with integrated graphics in this time-frame is not surprising to anyone, but it skews the percentages. If, however, they are literally saying that 2/3 of the people who currently buy discrete GPUs are going to stop doing it in the next 5 years, that's a much bolder prediction. I never really understood the sub-$100 discrete GPU market in the past few years, but above that, what is going to change that would make people stop wanting the performance that they are clearly already paying for? Or is there just a false economy somewhere and people are buying more power than they need (OEMs maybe)? Seems unlikely...

Among gamers, Steam seems to indicate that the demand for even high-end discrete cards continues to be fairly strong, and there's honestly no need for discrete graphics cards outside of gaming and professional applications. That has been true for years though, so I'm not sure what they imagine will change here.

The one thing that is clear is that processor-integrated-graphics will kill chipset graphics and <$100 discrete. I'm not sure how that really matters though...
 
Matching it? Probably not, but as time passes, I think the importance of integrated GPUs relative to CPU cores, and therefore the silicon area and power allocated to them will grow.

I'm not expecting a dramatic re-allocation though, especially not by Intel.

After a while, the relative performance gap between a 100W "APU" and a standard (50W?) CPU + 100W discrete graphics card will shrink, perhaps to 30% or so (bit of a WAG, here).

How fast do you think a hypothetical 100w Intel GPU would be in 2015 compared to AMD and Nvidia's stuff? It'll be great when everyone has a useful GPU in their machine but unless display tech or 3D APIs start to stagnate I don't see integrated catching up in just four years.

I'm not sure how that really matters though...

Fewer boring and inconsequential product reviews? :)
 
The mainstream discrete power budget is 100w with access to ~80GB/s of bandwidth. I don't see the integrated graphics of mainstream CPUs matching the performance of a 100w graphics card, even with Intel's superior manufacturing capabilities.

Mainstream discrete power budget may be 100W. Of course, the ever-increasing power draw of discrete graphics is a contributing reason why it's predicted to have a strongly declining market share. It is completely unacceptable for laptops, it's obviously unacceptable for anything in an office, it's unacceptable for htpcs, and so on. ATI and nVidia have evolved themselves into a shrinking niche. Not without reason, obviously, but the trends have been plain to see for a long time now.

Jon Peddie simply extrapolates - the actual numbers may be a bit off, but the trend is indisputable.
 
http://www.xbitlabs.com/news/multim..._Computers_Will_Lose_Market_Share_Report.html

Note the article is generally about enthusiast hardware, not specifically about enthusiast graphics cards.

JPR is forecasting a shift in product mix demand as the worldwide PC gaming user base continues to increase in size. By 2013 the enthusiast class will lose market share to the performance and mainstream classes from 46% to 35% of dollars spent. The good news for enthusiast hardware producers is that this "market share shrink" occurs in an expanding market and expenditures on the enthusiast class will grow from $9.5 billion to almost $12.5 billion in 2013.

"PC hardware has caught up to most of the software and people are able to play computationally intensive games on performance level systems. Performance systems now even support high-resolution for all but the most demanding simulations and FPSs. The frequency of Direct X updates is also driving some people toward mid-range GPUs. Some gamers are buying performance GPUs at a higher refresh rate to engage the latest Direct X version, instead of a longer term investment for Enthusiast GPUs,” said Ted Pollak, video game industry analyst for JPR.
What proportion of an enthusiast system's cost is the graphics? Is that proportion going up or down?

How many gamers are going to shift downwards to Fusion?

What will the next round of consoles do? And will cloud gaming take off?
 
It is completely unacceptable for laptops, it's obviously unacceptable for anything in an office, it's unacceptable for htpcs, and so on.
Right, but none of those uses have needed discrete graphics cards for at least the past 2-3 years, if not more. The question is whether or not consumers just haven't realized this yet, and that is unrelated to processor-integrated-graphics really, it's just simply a potential false economy if people are still buying discrete cards for these applications.

Interesting. That information indeed implies that there is a strong (if not dominant) "increase in other segments" effect rather than a decrease in enthusiast-level hardware though, which makes a hell of a lot more sense than people just wanting less graphics all of a sudden (which steam results definitely do not support).
 
The one thing that is clear is that processor-integrated-graphics will kill chipset graphics and <$100 discrete. I'm not sure how that really matters though...
The sub-$100 discrete graphics market matters little only if you assume that AMD and nVidia make no profit from it. AMD may recoup some of those losses from increasing CPU average selling prices. Then again, their CPU market share isn't as high as their GPU share, and frankly I doubt the market will accept significantly higher CPU ASPs going forward. Rather, I'd tend to see integrating graphics as a measure to keep ASPs from dropping faster than they otherwise would.

nVidia, of course, doesn't have any such compensating effect to bolster their bottom line.
 
Right, but none of those uses have needed discrete graphics cards for at least the past 2-3 years, if not more. The question is whether or not consumers just haven't realized this yet, and that is unrelated to processor-integrated-graphics really, it's just simply a potential false economy if people are still buying discrete cards for these applications.

Oh, I'm quite certain there is more than a little inertia in consumer habits.
In this particular case I'm a perfect example myself. :)
 
Good point. The fact that there are likely to be way more "computer-like devices" with integrated graphics in this time-frame is not surprising to anyone, but it skews the percentages. If, however, they are literally saying that 2/3 of the people who currently buy discrete GPUs are going to stop doing it in the next 5 years, that's a much bolder prediction. I never really understood the sub-$100 discrete GPU market in the past few years, but above that, what is going to change that would make people stop wanting the performance that they are clearly already paying for? Or is there just a false economy somewhere and people are buying more power than they need (OEMs maybe)? Seems unlikely...

Among gamers, Steam seems to indicate that the demand for even high-end discrete cards continues to be fairly strong, and there's honestly no need for discrete graphics cards outside of gaming and professional applications. That has been true for years though, so I'm not sure what they imagine will change here.

The one thing that is clear is that processor-integrated-graphics will kill chipset graphics and <$100 discrete. I'm not sure how that really matters though...

More than 2/3 of the people who currently buy discrete GPUs actually buy OEM designs, and, increasingly, laptops. Don't forget that discrete mobile graphics still counts as discrete.

So it's not as if those people were going out of their way to purchase a graphics card to put in their machine, they just ask the salesman which computer is the best choice to play games for less than $[whatever-they're-willing-to-spend] and then they just get that one.

And if you're not willing to spend, say, more than $500, you'll get more bang for your buck from an APU than from an APU + discrete graphics. If you're buying a laptop on top of it (which would put you in the current majority, and the 2015 vast majority) then it's also a matter of how much you can get for, say, less than 80W. Again, APU is the way to go.


I'm not expecting a dramatic re-allocation though, especially not by Intel.

How fast do you think a hypothetical 100w Intel GPU would be in 2015 compared to AMD and Nvidia's stuff? It'll be great when everyone has a useful GPU in their machine but unless display tech or 3D APIs start to stagnate I don't see integrated catching up in just four years.

I admittedly was thinking more of AMD's APUs than Intel's, mainly because I have no idea when Intel will catch up to AMD in graphics performance (in the broad sense, including the essential perf/mm² and perf/W metrics) if ever.
 