2015: Discrete = 10%

How fast do you think a hypothetical 100W Intel GPU would be in 2015 compared to AMD's and Nvidia's stuff? It'll be great when everyone has a useful GPU in their machine, but unless display tech or 3D APIs start to stagnate, I don't see integrated catching up in just four years.
GPUs costing less than $150 represent 54% of the current market for gamers' cards, if Steam's credible:

http://www.hardocp.com/image.html?image=MTI3ODg5Njg2NjdLMGdWc3Z6VEtfMV80X2wuanBn

So, ahem, practically all of those gamers will find Fusion good enough. Oh, and presumably that excludes laptops.
 
Oh, I'm quite certain there is more than a little inertia in consumer habits.
In this particular case I'm a perfect example myself. :)

Inertia, sure, and ignorance too. Some people have been taught that "dedicated graphics memory is good", and just assume that the salesman is trying to unload his cheap crap on them if he tries to sell them something with integrated graphics. And OEMs are quite happy to sell you discrete graphics cards, because that just means more revenue for them, and slightly higher margins to boot.

I trust that Intel will bang hard enough on the Sandy Bridge drum to change that mentality. AMD will do the same with Fusion, but with 1/20th of the budget… :p
 
If, however, they are literally saying that 2/3 of the people who currently buy discrete GPUs are going to stop doing it in the next 5 years, that's a rather bolder prediction. I never really understood the sub-$100 discrete GPU market in the past few years, but above that, what is going to change that would make people stop wanting the performance that they are clearly already paying for?
You were saying about the sub-$100 market? Obviously not in terms of revenue or margins, but that 2/3 of the discrete market in terms of unit sales (with 10 percent of the total market left in the swing) will be gone in 5 years doesn't seem that far-fetched to me at all.
 
GPUs costing less than $150 represent 54% of the current market for gamers' cards, if Steam's credible
That's not a fair analysis because that includes GPUs that have *become* <$150 since they were bought. You need the original purchase price to do that analysis.
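A quick way to see how much that skews things: re-bucket the survey by launch MSRP instead of current street price. A minimal sketch of the idea, with entirely invented model shares and prices for illustration:

```python
# Hypothetical illustration: re-bucket a GPU survey by launch MSRP
# instead of current street price. All shares and prices are invented.
survey_share = {                 # model -> % of surveyed machines
    "GeForce 8800 GT": 6.0,
    "Radeon HD 4850": 4.0,
    "GeForce 9600 GT": 5.0,
    "Radeon HD 4670": 3.0,
}
launch_msrp = {"GeForce 8800 GT": 249, "Radeon HD 4850": 199,
               "GeForce 9600 GT": 169, "Radeon HD 4670": 79}
street_price = {"GeForce 8800 GT": 90, "Radeon HD 4850": 110,
                "GeForce 9600 GT": 85, "Radeon HD 4670": 55}

def share_under(prices, cutoff=150):
    """Survey share held by models priced below the cutoff."""
    return sum(s for model, s in survey_share.items() if prices[model] < cutoff)

total = sum(survey_share.values())
print("sub-$150 by street price: %.0f%%" % (100 * share_under(street_price) / total))
print("sub-$150 by launch MSRP:  %.0f%%" % (100 * share_under(launch_msrp) / total))
```

With these made-up numbers the sub-$150 share drops from 100% to about 17% once you count what people actually paid, which is the objection in a nutshell.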

You were saying about the sub-$100 market? Obviously not in terms of revenue or margins, but that 2/3 of the discrete market in terms of unit sales (with 10 percent of the total market left in the swing) will be gone in 5 years doesn't seem that far-fetched to me at all.
Sure, but that's unrelated to processor-integrated graphics (and as you note, revenue obviously is the number we want). People still buying discrete cards in this price range are "doing it wrong" already :) They aren't good enough for any reasonable gaming and they are overkill for everything else.
 
Sure, but that's unrelated to processor-integrated graphics (and as you note, revenue obviously is the number we want). People still buying discrete cards in this price range are "doing it wrong" already :) They aren't good enough for any reasonable gaming and they are overkill for everything else.
Oh, yes, I agree. Those consumers are totally doing it wrong in terms of maximizing utility. :)

This, however, doesn't mean that part of the market isn't actually there to be contended. Were we to leave it purely up to the OEMs (who, I'm willing to bet, make up the bulk of these sales), they'd probably make do with the stickers alone... (Hey, stickers sell!) I'm fairly sure that would be illegal though, so they'll still be buying something, and my bet is not on GT 210s, HD 5450s, or whatever.
 
GPUs costing less than $150 represent 54% of the current market for gamers' cards, if Steam's credible:

http://www.hardocp.com/image.html?image=MTI3ODg5Njg2NjdLMGdWc3Z6VEtfMV80X2wuanBn

So, ahem, practically all of those gamers will find Fusion good enough. Oh, and presumably that excludes laptops.

Not sure what you're saying. Is the Fusion GPU going to be 67xx-class? Or even 65xx?


I admittedly was thinking more of AMD's APUs than Intel's, mainly because I have no idea when Intel will catch up to AMD in graphics performance (in the broad sense, including the essential perf/mm² and perf/W metrics), if ever.

Fusion only matters if AMD increases its CPU market share. What good is Fusion if Intel still has 80% of the market?
 
That's not a fair analysis because that includes GPUs that have *become* <$150 since they were bought. You need the original purchase price to do that analysis.

Then again, there are a lot of people who just play The Sims, Spore, World of Warcraft, or some other MMORPG, and don't use Steam because they have little interest in other games. And those people tend to have low-end graphics.

Sure, but that's unrelated to processor-integrated graphics (and as you note, revenue obviously is the number we want). People still buying discrete cards in this price range are "doing it wrong" already :) They aren't good enough for any reasonable gaming and they are overkill for everything else.

There's currently a Radeon HD 5750 available for $90 on Newegg, after MIR (here).

There should be some GeForce 9800 GTs in that price range as well. But the point is certainly valid for the <$60 market.
 
That's not a fair analysis because that includes GPUs that have *become* <$150 since they were bought. You need the original purchase price to do that analysis.
True - NVidia has great detail on sales history. What data is NVidia collecting from Valve?
 
Fusion only matters if AMD increases its CPU market share. What good is Fusion if Intel still has 80% of the market?

Well, the way I see it, if Fusion is really better than Intel's stuff, AMD's market share will increase. If Intel catches up to AMD in graphics, then it doesn't matter: just apply the same reasoning about APUs to Intel instead of AMD.
 
Wouldn't current information/speculation indicate roughly Mobility 5650-class (i.e. mostly bandwidth-constrained) performance for Llano?

No idea, what's the current info on Llano?

Well, the way I see it, if Fusion is really better than Intel's stuff, AMD's market share will increase. If Intel catches up to AMD in graphics, then it doesn't matter: just apply the same reasoning with Intel instead of AMD.

There's a third option, Intel doesn't catch up in graphics and AMD doesn't catch up in market share. :)
 
Not sure what you're saying. Is the Fusion GPU going to be 67xx-class? Or even 65xx?
AMD will be offering at least 2 levels of Fusion initially, though perhaps only the Llano-based one will make it to desktop (leaving Ontario for mobile).

I reckon Ontario is somewhere in the vicinity of Cedar performance; what's the next rung up? I don't know how good Llano is, but I'm not trying to suggest it has HD5770 performance. HD5570?

As the years go by the Fusion GPU will catch up with the mainstream segment, because the proportion of the die dedicated to CPU cores will diminish.

Power and bandwidth constrain performance growth for enthusiast cards. All the efficiency gains in the architecture apply to Fusion - though architectural lag will be relevant.

The Fusion GPU will lag the discrete GPU by one year in architecture, I guess. But it looks like it'll be on the same node, or half a node behind, the succeeding GPU. So while the architecture lags, density lags less so.
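As a sanity check on what that half-node lag costs, assume the usual ~2x transistor density per full node, so roughly 1.4x per half node; illustrative arithmetic only, not roadmap data:

```python
# Back-of-envelope: density cost of trailing the discrete GPU line.
# Assumes ~2x density per full node, so sqrt(2) ~ 1.4x per half node.
full_node = 2.0
half_node = full_node ** 0.5      # ~1.41x

print("half a node behind -> ~%.0f%% of the leading density" % (100 / half_node))
print("a full node behind -> ~%.0f%% of the leading density" % (100 / full_node))
```

So trailing by half a node costs roughly 30% density rather than the 50% a full generation would, which is the sense in which density lags less than the architecture does.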
 
No idea, what's the current info on Llano?
Me neither. I just seem to remember 400SP being thrown around, and having experienced the HD 4650 and Mobility 5650 both screaming for more bandwidth to stretch their legs... It was more of a rhetorical question, sorry. :)

Edit:
I'm guessing that one is similarly constrained, like the mobile 5650? What kind of memory is Llano expected to use? Dual-channel DDR3-1333?
 
Sure, but that's unrelated to processor-integrated graphics (and as you note, revenue obviously is the number we want). People still buying discrete cards in this price range are "doing it wrong" already :) They aren't good enough for any reasonable gaming and they are overkill for everything else.

Eh, while that may have been the case previously, it's really not true anymore. Cards like the 5670, GT 240 and even the 5570 offer a huge step up from integrated graphics and allow most games to be played comfortably at medium/high settings at resolutions like 720p/1280x1024/1440x900. They're a valid choice; crap like the 5450 and the G210, on the other hand, is a waste of anyone's money.

Obviously with stuff like Llano this might change quite rapidly in the next couple of years, but I feel it's a correction worth making, as it really doesn't reflect current market conditions.
 
I reckon Ontario is somewhere in the vicinity of Cedar performance, what's the next rung up? I don't know how good Llano is, but I'm not trying to suggest it has HD5770 performance. HD5570?

Hmmm, that would fall somewhere on the spectrum between amazing and incredible if it's indeed the case. Agreed that future APUs will dedicate more transistor budget to graphics, but one factor that hasn't been floated yet is that this new competition from the bottom will certainly spur innovation at the top. It's possible that the entry-level discrete market could disappear, or it could re-invent itself with better/faster/lower-power hardware.
 
Power and bandwidth constrain performance growth for enthusiast cards. All the efficiency gains in the architecture apply to Fusion - though architectural lag will be relevant.

That's going to become a problem if APUs are indeed going to supplant the GPU market in mainstream and below. It's certainly quite possible it'll eventually erase the budget market IF they can maintain feature parity. But with both Intel and AMD currently trying not to go over ~100 watts, and up to ~150 for high-end stuff, for their CPUs (even to the point of coming up with sometimes fanciful redefinitions of TDP), how much leeway is going to be available for GPU functions?

Or are we predicting that both will suddenly relax power constraints and we'll start seeing 200-300 watt APUs? At least on Intel's side, they already have a process advantage and their CPUs are still bumping ~130 watts. Then again, someone using one of the 130 watt CPUs is unlikely to be a customer for an APU. So even at ~73-95 watts for desktop Core i3 and i5, that isn't exactly leaving a lot of power to reach mainstream graphics levels without a serious bump in the CPU/APU power envelope.
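To put rough numbers on that, here's a back-of-envelope budget; the package TDP and load figures are assumptions pulled from the discussion, not measurements:

```python
# Back-of-envelope GPU headroom in an APU package. All numbers are
# assumptions for illustration, not measured figures.
package_tdp = 95        # assumed desktop Core i3/i5-class envelope (W)
cpu_load_power = 45     # assumed CPU draw under a gaming workload (W)
uncore_power = 10       # assumed memory controller, I/O, etc. (W)

gpu_headroom = package_tdp - cpu_load_power - uncore_power
print("GPU power headroom: ~%dW" % gpu_headroom)   # ~40W: budget, not mainstream
```

Even with generous assumptions, ~40W of graphics is budget-card territory, nowhere near the ~100W a mainstream discrete card burns.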

Sure, technology and process nodes will advance for a while. But that's going to apply to discrete GPUs as well as integrated APUs. Heck, I think it's even questionable whether they'll displace much of the budget-mainstream segment.

Regards,
SB
 
That's going to become a problem if APUs are indeed going to supplant the GPU market in mainstream and below. It's certainly quite possible it'll eventually erase the budget market IF they can maintain feature parity.
Absolutely no sign of D3D11.1. So, erm, feature parity is pretty much assured for first-generation Fusion.

Though Intel seems to be stuck at D3D10.1.

But with both Intel and AMD currently trying not to go over ~100 watts, and up to ~150 for high-end stuff, for their CPUs (even to the point of coming up with sometimes fanciful redefinitions of TDP), how much leeway is going to be available for GPU functions?
How is that different from discrete being constrained by 300W? Though I admit there have been some gas guzzlers (ATI and NVidia) this past year.

Or are we predicting that both will suddenly relax power constraints and we'll start seeing 200-300 watt APUs? At least on Intel's side, they already have a process advantage and their CPUs are still bumping ~130 watts. Then again, someone using one of the 130 watt CPUs is unlikely to be a customer for an APU. So even at ~73-95 watts for desktop Core i3 and i5, that isn't exactly leaving a lot of power to reach mainstream graphics levels without a serious bump in the CPU/APU power envelope.
When "gamer graphics" are involved, how much power is a CPU consuming? Put another way, when the GPU is the bottleneck in a midrange system's gaming performance, what's the CPU's power consumption? 20W?

Separately there's the question of the pros and cons of CPU and GPU sharing a socket, in terms of power and bandwidth.
 
How is that different from discrete being constrained by 300W? Though I admit there have been some gas guzzlers (ATI and NVidia) this past year.

That's the thing, though: even mainstream cards are using ~100 watts, more or less. Adding that to a CPU while not ballooning TDP is going to be a challenge. Granted, it'll be a bit less, as the CPU won't include the power used by memory, etc. But it'll still roughly double the power ceiling of a CPU, putting it significantly higher than what either company is currently comfortable with for their CPUs.

When "gamer graphics" are involved, how much power is a CPU consuming? Put another way, when the GPU is the bottleneck in a midrange system's gaming performance, what's the CPU's power consumption? 20W?

Separately there's the question of the pros and cons of CPU and GPU sharing a socket, in terms of power and bandwidth.

Well, that's obviously going to be a case-by-case situation. But it's not uncommon for games to be CPU-bound, although it might be argued that at mainstream and below that's a far more uncommon situation.

However, if more complex physics (Mafia II, for example) start to become the norm rather than the exception, it's quite possible that we may have more situations of a maxed or near-maxed CPU, even with a budget CPU. And where some might say you could just use the GPU for that, at the budget and mainstream levels of GPU performance it probably makes more sense for the CPU to do it.

Yeah, I still remain unconvinced that first-generation devices will exceed discrete budget GPUs, due to bandwidth sharing, memory sharing, etc...

Regards,
SB
 
That's the thing, though: even mainstream cards are using ~100 watts, more or less. Adding that to a CPU while not ballooning TDP is going to be a challenge.

If a card uses 100 watts, how much of that power can be attributed to the GPU chip? And how much of that power will be transferred to the embedded chip when the GPU transistors move in with the CPU?

If the TDP of a CPU chip is 50W and the TDP of a GPU chip is 50W, what will the TDP of the embedded package need to be? Not necessarily 100W.
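Putting both questions into one sketch: first strip out the board components that wouldn't move onto the die, then allow for the fact that CPU and GPU rarely hit peak load at the same instant. Every coefficient here is an assumption for illustration:

```python
# Why a 50W CPU die plus a discrete card's GPU die needn't simply add up.
# All coefficients are assumptions for illustration, not measurements.
card_board_power = 100   # whole discrete card (W)
mem_vrm_fan = 30         # assumed share for GDDR, VRM losses, fan (W)
gpu_chip_tdp = card_board_power - mem_vrm_fan   # ~70W attributable to the GPU die

cpu_chip_tdp = 50
overlap = 0.8            # assumed: peak CPU and peak GPU load rarely coincide

package_tdp = overlap * (cpu_chip_tdp + gpu_chip_tdp)
print("estimated package TDP: ~%dW, not %dW"
      % (package_tdp, cpu_chip_tdp + gpu_chip_tdp))
```

Under those assumptions the package comes in around 96W rather than 120W; and the card's GDDR moves to system DIMMs, whose power was never part of the CPU's TDP anyway.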
 