Intel Atom Z600

The same could go for other semis hypothetically buying SoCs from Intel. Intel's SoCs so far haven't been cheap, from what I recall.

They aren't cheap, but that was kind of the point. Intel wouldn't make Atom available for other manufacturers to use in an SoC if it allowed them to release something cheaper yet still comparable in quality, turning them into a direct competitor with what Intel offers. And of course, many SoC manufacturers would be more than willing to sell at much lower margins than Intel.
 
And what guarantees that it didn't launch earlier because of a lack of resources, as you seem to imply, and not simply because of a very specific design decision that was rated as adequate for Medfield's real timeframe?

Sorry, I didn't quite get what you meant there. You're right that I didn't explicitly say resources, but I was implying that they have more than enough to do whatever they want, and also the manufacturing/technical expertise to have launched Medfield sooner, had they put the same kind of effort into it that they do into their high end.

A slight but important difference is that Intel doesn't compete directly with ARM's CPU IP but with all of ARM's licensees, which aren't exactly small semiconductor manufacturers. It's no coincidence that in the long run Intel sees a much larger competitive threat in Qualcomm, for instance, than in AMD or anyone else.

True, but most of the competitors you're referring to use stock ARM Cortex designs, with the exception of Marvell and, as you state, Qualcomm.
I'm not suggesting Qualcomm is exactly 'small fry', as they are the market leader in their respective area ($4bn revenue last quarter is nothing to sniff at).
But most companies don't have Intel's resources, nor its engineers.

As chips get faster and likely consume more power/get more complex, only one company has the technology and the market sewn up at the high end, and that's Intel.
It would only take an advance in battery technology, for instance, to completely shred one of ARM's advantages overnight.

Regardless, the Qualcomms and Apples have to rely on third parties to manufacture their processors, and those third parties, i.e. TSMC/GlobalFoundries, are way behind Intel; a conservative estimate is two years, and the gap will likely grow rather than shrink. In TSMC's case, they actually lost 30% of their profit last quarter, probably down to poor yields or something like that.

Intel is the Microsoft of the chip world: it eats up the competition. You would be a brave man/woman to bet your house against Intel coming out on top within 5 years.
But for competition's sake, I really hope Intel loses this one; otherwise, in a matter of 3-5 years we could see competition completely eradicated and technology progression stagnate to a standstill, with Intel releasing slight clock speed improvements every 18 months, at comparatively bloated prices.
 
Sorry, I didn't quite get what you meant there. You're right that I didn't explicitly say resources, but I was implying that they have more than enough to do whatever they want, and also the manufacturing/technical expertise to have launched Medfield sooner, had they put the same kind of effort into it that they do into their high end.

That takes more resources, since additional "efforts" aren't free. Again, why should it be down to effort and not some simple design decision?

True, but most of the competitors you're referring to use stock ARM Cortex designs, with the exception of Marvell and, as you state, Qualcomm.
Qualcomm licenses ARM IP but develops their own custom CPUs out of it.

I'm not suggesting Qualcomm is exactly 'small fry', as they are the market leader in their respective area ($4bn revenue last quarter is nothing to sniff at).
Start counting how many design wins Qualcomm has on an annual basis in terms of smartphone SoCs. There shouldn't be any other semiconductor manufacturer with more.

But most companies don't have Intel's resources, nor its engineers.
Intel's core business is and will remain CPUs. But feel free to enlighten us as to why Intel, despite an ungodly amount of resources poured into the Larrabee GPU project, didn't manage to get anything worthwhile out of it for desktop GPUs and was forced to cancel it.

As chips get faster and likely consume more power/get more complex, only one company has the technology and the market sewn up at the high end, and that's Intel.
See the previous paragraph. Ask around how big Knights Corner for HPC will be even at 22nm, and how much power it burns, and it'll be easy to understand why it would have completely flopped on the desktop.

It would only take an advance in battery technology, for instance, to completely shred one of ARM's advantages overnight.
Well let's see it first.

Regardless, the Qualcomms and Apples have to rely on third parties to manufacture their processors, and those third parties, i.e. TSMC/GlobalFoundries, are way behind Intel; a conservative estimate is two years, and the gap will likely grow rather than shrink. In TSMC's case, they actually lost 30% of their profit last quarter, probably down to poor yields or something like that.
Qualcomm and Apple, among other involved IHVs, execute and have more than just a bit of success in the small form factor market. Intel has been struggling for years to enter it, and now that they've managed the first small step, they've already sent everyone else home...for sure :rolleyes:

Intel is the Microsoft of the chip world: it eats up the competition. You would be a brave man/woman to bet your house against Intel coming out on top within 5 years.
But for competition's sake, I really hope Intel loses this one; otherwise, in a matter of 3-5 years we could see competition completely eradicated and technology progression stagnate to a standstill, with Intel releasing slight clock speed improvements every 18 months, at comparatively bloated prices.
Intel is new to the smartphone market and x86 CPUs haven't proven to be exactly optimal for it so far. We'll see how it pans out in the long run, but I need a few more substantial arguments than what I'm reading here. Yes, Intel can make it happen as long as they don't make any idiotic design decisions as with Larrabee; otherwise they'll have wasted a crapload of resources again. Medfield is a good step, but nothing groundbreaking either.
 
I'm not sure I even want Intel to have any success in the smartphone market. The only platform they'll be able to run on for now will be Android and if Intel does prove to be successful on Android then that'll only create more fragmentation in the Android 'ecosystem'.
 
Well, you seem to be restating some of my comments, so I'll leave out the first few.

Intel's core business is and will remain CPUs. But feel free to enlighten us as to why Intel, despite an ungodly amount of resources poured into the Larrabee GPU project, didn't manage to get anything worthwhile out of it for desktop GPUs and was forced to cancel it.

I think you are getting mixed up. Intel actually has dominated the graphics side of the desktop: not in terms of performance (they suck), nor in terms of credibility (they suck), but in terms of units sold they win. Of course I'm talking about IGPs and not discrete, but I thought I'd throw that one in there.
And besides, we are talking about moving down into mobile. Intel owns shares in IMG TECH; they can just use those chips, which are widely considered to be the most advanced by every metric.
If you compare Intel's graphics portfolio to its MOBILE competitors, it actually has more powerful IP, or shares the same (IMG TECH).

Intel dominates anything above 15W; it would be churlish to suggest otherwise. ARM vendors dominate below 5W, and AMD sits somewhere in between.

So the battle then is about pushing into each other's markets: who has the most tools, who has the most to lose/the most to gain?
I would argue it is Qualcomm/TI/STE/Marvell/Renesas etc.
Intel has the best manufacturing, years ahead of everyone else, exclusively to themselves.
Intel has the most money. Intel has the most engineers.
Intel also makes most of its money where ARM can't hope to get near in the short term, whereas Intel looks like it could strike below 5W within 12 months if it wanted to. Not only that, it could afford to do it by selling Silvermont at break-even, just to knock the competition down.

I'll make this clear: I don't want that to happen, because it will suck. I also didn't think this was a likely scenario for years, until CES, when I saw the performance of Medfield, backed up by various credible sources.

Medfield looks like it would have been competitive with any ARM-derived chip currently released, and if they can get 22nm Silvermont out and start chipping away at ARM's market share, then it could get messy.
 
Intel's results were out today. One of the presentations still shows Silvermont on 22nm in 2013, which likely negates that Fudzilla report of Medfield on 22nm in 2012.
 
Intel's results were out today. One of the presentations still shows Silvermont on 22nm in 2013, which likely negates that Fudzilla report of Medfield on 22nm in 2012.

The explanation for that is that the "mid-2012" talk is about sampling, just like Medfield was sampling in early 2011.
 
So indeed Intel shouldn't be counted out, but assuming they'll succeed because they had successes in the past looks dubious.

Not saying they will dominate or even succeed. Just saying all the people discounting Intel at this early stage should be aware that Intel has in the past been in far worse positions versus the competition and still found a way to compete and succeed, even if it took them a decade or more (the server market: first with the Pentium Pro [meh], then Itanium [better, but more meh], and finally with performant and cost-effective Xeons).

Anyone that underestimates them had better be prepared. The past is littered with companies that underestimated Intel's willingness to keep pushing until it succeeds when it comes to CPU designs. Of course, many of their other ventures haven't fared as well, but they've rarely faltered when it comes to designing CPUs, although they can be caught off guard at times (Athlon 64/Opteron versus the P4 and P4-based Xeons, for example).

The question is simple. Do you think, if Intel wholeheartedly commits to the mobile platform on the basis that it's crucial to the company's success, that they can do it?

I don't think it's crucial. But it certainly is a large potential growth market to enhance their current core revenue generators.

Desktop computing isn't going anywhere. And interestingly enough, with the advent of tablets, it appears the notebook market is shrinking while the desktop market is growing, as consumers go with cheaper desktop computers for the home while ditching notebooks for on-the-go media consumption and internet access.

Unlike the professional market, consumers only really liked notebooks because they provided access to media and the internet while travelling. I find a lot of people locally have been ditching their notebooks in favor of a tablet + desktop computer.

Basically the large powerful CPU market isn't going anywhere.

The only danger to Intel at the moment, IMO, is that the rise in popularity of tablets and web-browsing-capable phones (iPhone, Android, and Windows Phone) is likely to start eating into their mobile CPU sales, but is unlikely to touch their desktop/server CPU sales. In fact, I'm predicting they'll see an increase in desktop CPU sales coinciding with the increase in tablet sales.

Regards,
SB
 
How likely do you think it is that Intel starts licensing CPU IP to other semiconductor companies, or that other semiconductor companies abandon SoC development and buy SoCs from Intel?

Highly unlikely. Intel went that route in the early days with x86 and hasn't been terribly happy with that decision since. Unless Intel's efforts in the ultra mobile market completely fail over the next 10+ years, it's unlikely they'll go that route again.

I'm willing to bet Intel is far more willing to be subpar over the span of a decade in order to have uncontested rights and control over their design when they succeed, than to potentially have to deal with competitors using their own designs to compete against them.

BTW - when I said "when they succeed" above, I'm not implying that I think they will or won't succeed. But that Intel is likely confident they'll succeed. The only question in their minds will likely be how long it will take.

As to Larrabee, GPUs have the potential to disrupt their core business but as of yet haven't significantly impacted it. Hence, while they are working on it to counteract the effects of GPU computing, it's not receiving the same priority as their mainline business or their efforts in the ultra portable arena.

Regards,
SB
 
Anyone that underestimates them had better be prepared.
Oh, I see what you mean, and you can be sure ARM and its partners are certainly not underestimating Intel's power (pun intended... or not).

Desktop computing isn't going anywhere. And interestingly enough, with the advent of tablets, it appears the notebook market is shrinking while the desktop market is growing, as consumers go with cheaper desktop computers for the home while ditching notebooks for on-the-go media consumption and internet access.
Interesting thought, but didn't desktop sales still go down in 2011? And wouldn't users only need a small CPU, performance-wise?

But is unlikely to touch their desktop/server CPU sales. In fact, I'm predicting they'll see an increase in desktop CPU sales coinciding with the increase in tablet sales.
I think it will more significantly increase their server sales, since you need servers to serve all these mobile devices.
 
Well, you seem to be restating some of my comments, so I'll leave out the first few.

No I don't.


I think you are getting mixed up. Intel actually has dominated the graphics side of the desktop: not in terms of performance (they suck), nor in terms of credibility (they suck), but in terms of units sold they win. Of course I'm talking about IGPs and not discrete, but I thought I'd throw that one in there.

Until Sandy Bridge it didn't take any particular talent to create a fancy office/Word renderer. You're admitting yourself that they aren't considered to have the best solution for integrated graphics, which actually supports my point that their field of concentration so far hasn't been graphics. It has gotten quite a bit better with Sandy Bridge, though, and it'll further improve with Ivy Bridge. That still doesn't mean Larrabee wasn't a flop for the desktop, despite them investing quite a high amount of resources. Experience is something that comes over time, and it helps NOT to make wrong design decisions.

As I said, Medfield might be a step in the right direction, but considering what they could have integrated versus what they actually did in terms of the GPU, asking a similar price for each Medfield SoC to what NV asks for Tegra 3, while the GPU in AP30 smartphones should be at least 50% faster than an SGX540@400MHz, isn't what I'd call the best solution.


And besides, we are talking about moving down into mobile. Intel owns shares in IMG TECH; they can just use those chips, which are widely considered to be the most advanced by every metric.

Yes, Intel owns a fair amount of shares in IMG, and yes, Intel has a multi-year, multi-license agreement with IMG for various generations of GPU and video IP. What's your point?

If you compare Intel's graphics portfolio to its MOBILE competitors, it actually has more powerful IP, or shares the same (IMG TECH).

I'm not comparing anything and the above hardly makes any sense anyway. Intel uses IMG GPU IP for the small form factor markets, full stop.

Intel dominates anything above 15W; it would be churlish to suggest otherwise. ARM vendors dominate below 5W, and AMD sits somewhere in between.

And? It's still Intel against a quite long list of competitors. If you can't understand that then I'm afraid we're wasting bandwidth here.

So the battle then is about pushing into each other's markets: who has the most tools, who has the most to lose/the most to gain?
I would argue it is Qualcomm/TI/STE/Marvell/Renesas etc.
Intel has the best manufacturing, years ahead of everyone else, exclusively to themselves.
Intel has the most money. Intel has the most engineers.
Intel also makes most of its money where ARM can't hope to get near in the short term, whereas Intel looks like it could strike below 5W within 12 months if it wanted to. Not only that, it could afford to do it by selling Silvermont at break-even, just to knock the competition down.

Any further nonsensical apples-to-oranges comparisons? Because I'm frankly getting tired of it; it's tiresome to state the same thing over and over again. ARM sells CPU IP to its partners, and those partners are the ones competing DIRECTLY with Intel.

I'll make this clear: I don't want that to happen, because it will suck. I also didn't think this was a likely scenario for years, until CES, when I saw the performance of Medfield, backed up by various credible sources.

Obviously Medfield kills any comparable platform in sight :rolleyes: It's a competitive solution, and that's about it.

Medfield looks like it would have been competitive with any ARM-derived chip currently released, and if they can get 22nm Silvermont out and start chipping away at ARM's market share, then it could get messy.

Let's see other semiconductor manufacturers' and Intel's future SoCs first. Other than that, good luck making more "points" with a bunch of "ifs" and "buts".
 
Highly unlikely. Intel went that route in the early days with x86 and hasn't been terribly happy with that decision since. Unless Intel's efforts in the ultra mobile market completely fail over the next 10+ years, it's unlikely they'll go that route again.

I'm willing to bet Intel is far more willing to be subpar over the span of a decade in order to have uncontested rights and control over their design when they succeed, than to potentially have to deal with competitors using their own designs to compete against them.

It obviously was a question aiming for answers like yours above. The problem is that some here just can't comprehend it, whether you put it as a point, a question, or even back it up with documentation.

BTW - when I said "when they succeed" above, I'm not implying that I think they will or won't succeed. But that Intel is likely confident they'll succeed. The only question in their minds will likely be how long it will take.

If they make the right choices and design decisions and the result is increasingly competitive, not very long. Albeit a really bad example, since there are other factors involved that Intel won't touch anytime soon: how long did it take for Apple to become a serious player in the small form factor market? With a well-rounded vision and a good strategy, it won't take all that long.

As to Larrabee, GPUs have the potential to disrupt their core business but as of yet haven't significantly impacted it. Hence, while they are working on it to counteract the effects of GPU computing, it's not receiving the same priority as their mainline business or their efforts in the ultra portable arena.

Regards,
SB

Not necessarily; as I said above, graphics will get far more aggressive with Ivy Bridge. The real point behind my whole reasoning is: experience in graphics efficiency means a correct philosophy for the field and leads to correct design decisions, visions, and strategies. All three were partially or entirely wrong for Larrabee, and so far for anything before Medfield for the smartphone market.

Further example: how much do you think Cedartrail would have ended up being a HUGE success even if they had released it on time with full DX10.1 graphics drivers?
 
Moving the discussion sideways a bit....


Not necessarily; as I said above, graphics will get far more aggressive with Ivy Bridge.

Further example: how much do you think Cedartrail would have ended up being a HUGE success even if they had released it on time with full DX10.1 graphics drivers?

Is there any expectation whatsoever that Intel's in-house graphics becomes more viable for them in mobile as the fab process shrinks? I know nothing about their in-house graphics other than the fact that the last couple of iterations have been better than the previous ones, and that it's obviously integrated along with the processor. Are both power and die area orders of magnitude beyond what's needed for mobile, or are they anywhere in the ballpark where, say, 14nm makes it doable for them on a smartphone SoC?
 
Is there any expectation whatsoever that Intel's in-house graphics becomes more viable for them in mobile as the fab process shrinks? I know nothing about their in-house graphics other than the fact that the last couple of iterations have been better than the previous ones, and that it's obviously integrated along with the processor. Are both power and die area orders of magnitude beyond what's needed for mobile, or are they anywhere in the ballpark where, say, 14nm makes it doable for them on a smartphone SoC?

Since Intel's embedded graphics is at the moment at the DX11 level, it would be an apples-to-oranges comparison, and I'd sense it's not going to be an easy one in terms of perf/W and perf/mm².
 
Me;
I'm not suggesting Qualcomm is exactly 'small fry', as they are the market leader in their respective area ($4bn revenue last quarter is nothing to sniff at).

Your reply;
Start counting how many design wins Qualcomm has on an annual basis in terms of smartphone SoCs. There shouldn't be any other semiconductor manufacturer with more.

Me;
True, but most of the competitors you're referring to use stock ARM Cortex designs, with the exception of Marvell and, as you state, Qualcomm.

Your reply;
Qualcomm licenses ARM IP but develops their own custom CPUs out of it.

Just two examples from the piece we were both referring to; in conclusion, you were saying the same thing I said, just worded differently.

What I was trying to point out with regard to your comments about Larrabee was that it is a poor comparison to make; Intel doesn't need MOBILE graphics IP, as they have IMG TECH... so why are you mentioning desktop graphics?

Better comparisons of Intel throwing huge resources into something it was clearly behind on can be found in other comments above.

The points I was making about ARM below 5W and Intel dominating above 15W were in the context of striking distance, and also where each makes the bulk of its revenue. In Intel's case, the bulk of their revenue is somewhere ARM/Qualcomm/whoever can't hope to touch, not next year, not in 2 years or more.

Whereas all of the ARM brigade make most of their revenue WITHIN STRIKING DISTANCE of Intel, as proved by the figures for this Medfield chip.
(Leaving out the embedded space; I'm talking about smartphones/tablets.)

I'm not making silly suggestions like Anand when he stated 'this proves Medfield would have dominated Android last year', but it would have taken away some revenue from the ARM brigade. Intel still earns billions at the high end regardless.

ARM is the king of mobile, and we will see more proof of that this year and in the first half of next, but with the arrival of Silvermont it will get interesting. I'll just leave it at that.
 
Not necessarily; as I said above, graphics will get far more aggressive with Ivy Bridge. The real point behind my whole reasoning is: experience in graphics efficiency means a correct philosophy for the field and leads to correct design decisions, visions, and strategies. All three were partially or entirely wrong for Larrabee, and so far for anything before Medfield for the smartphone market.

Further example: how much do you think Cedartrail would have ended up being a HUGE success even if they had released it on time with full DX10.1 graphics drivers?

Ah yes, the integrated graphics in IVB was fairly decent and points out what we've been saying: when Intel feels suitably threatened (in this case by AMD's Fusion initiative), they're extremely willing to invest significant resources into minimizing the impact of said threat. The same thing happened back when AMD was talking about moving to dual core ahead of Intel: Intel moved fast, first with dual P4s on a chip and then later with chips designed to be dual core.

I don't think they've stopped work on Larrabee either, although they aren't talking about it anymore. I have a feeling we may see it resurface in the future if Intel gets it to a point where they are happy with it, or if GPU compute adoption starts to take off rapidly. The relatively slow uptake of GPU compute hasn't made something like Larrabee a required priority.

Regards,
SB
 
Ah yes, the integrated graphics in IVB was fairly decent and points out what we've been saying: when Intel feels suitably threatened (in this case by AMD's Fusion initiative), they're extremely willing to invest significant resources into minimizing the impact of said threat. The same thing happened back when AMD was talking about moving to dual core ahead of Intel: Intel moved fast, first with dual P4s on a chip and then later with chips designed to be dual core.

My point, in this case, is that any "threat" in the small form factor market doesn't come from just one side but from multiple sides, which makes things way more complicated.

I don't think they've stopped work on Larrabee either, although they aren't talking about it anymore. I have a feeling we may see it resurface in the future if Intel gets it to a point where they are happy with it, or if GPU compute adoption starts to take off rapidly. The relatively slow uptake of GPU compute hasn't made something like Larrabee a required priority.

Regards,
SB
Since we're obviously talking about Atom SoCs here, even the SGX540 they've integrated in Medfield is, within its power envelope constraints, equally efficient for 3D as well as GPGPU. In the case of LRB or any LRB-like design, if you concentrate an architecture way too much on one side while neglecting the other, it's no wonder the result is an unbalanced design, good at one thing and terrible at the other. Even worse if said design burns an ungodly amount of power; while that might be relatively irrelevant for Knights Corner, since it ended up an exclusive HPC option, it wouldn't be for anything else, unless N times more power consumption were irrelevant.

LRB didn't have a resource problem IMHO; rather a bunch of wrong design decisions.

french toast,

I refuse to waste any more bytes trying to punch through your twisted reasoning. Either you can read and understand what I've written so far, or I'm raising the white flag.
 