Intel Loses Steam Thread

Need to clear out Haswell inventories, blame the delay on yields. They've done it in the past. Of course yields will surely get better with the extra time, but I doubt that's the main reason for the delay.

Intel turns over their entire inventory in 2-2.5 months. Haswell inventories have nothing to do with it.
This is about profit maximization.
Continuing with the present products means that Intel and their partners can amortize their design work, tool set, et cetera over a longer period and enjoy the cost benefits of production lines that are debugged and trim.
However, the tactic hinges on having a mostly captive audience that replaces its equipment on fixed time cycles. For part of the market, particularly corporate and administration, this is a reasonable description of the situation (though there is a strong movement towards longer replacement cycles, and some sites may even replace with non-x86 devices). For the private market the situation is clearly different: replacement devices are bought out of need or desire. And if the product isn't desirable now as a replacement for what an individual has, it isn't going to look any better in nine months.

So it is a tactic that tries to optimize profit at the cost of overall market size.
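To make the amortization point concrete, here is a minimal sketch with invented figures (the dollar amounts and volumes below are placeholders, not Intel's actual numbers); stretching a generation's production life keeps spreading the fixed outlay over more units:

# Toy model of the amortization argument above.
# All numbers are made up for illustration; only the shape of the curve matters.

fixed_costs = 5_000_000_000   # hypothetical design + tooling outlay for one generation ($)
marginal_cost = 40            # hypothetical incremental cost per chip ($)
monthly_volume = 10_000_000   # hypothetical chips shipped per month

def cost_per_chip(months_in_production: int) -> float:
    """Average cost per chip if the generation stays in production this long."""
    units = monthly_volume * months_in_production
    return marginal_cost + fixed_costs / units

for months in (12, 18, 24, 33):   # 24 vs. 33 is roughly adding nine months to the cycle
    print(f"{months:2d} months in production: ${cost_per_chip(months):6.2f} per chip")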

Their analysis in the past could be made with the assumption that all customers would eventually replace their computers, only at a later date, allowing them to milk the "institutional" customers while picking up the private sales somewhat later. This is no longer the case: not only are replacement cycles getting longer (fewer sales per year), but customers are actually deserting the platform. This decision won't exactly help the x86 market rebound in terms of volume. It will be interesting to see the numbers a year from now.
 
It doesn't need to decide the full chain. Manufacturing at new nodes leverages existing and older tech above the critical layers, so it comes down to a subset of the industry and a more rarefied stratum where moves need to be made.

The leading-edge stuff already has something of a special relationship with Intel, because they are the first adopters for a number of things.
It would take a heavy investment for Intel to replicate the expertise of a tool maker over a whole range of products, but that doesn't preclude Intel's paying for early or exclusive access to tools it contributes heavily to commercializing.
ASML took advantage of this a little when it goaded additional investment outside of Intel for its 450mm research, if only to prevent Intel from having dibs on it.
Thing is, Intel is already pushing ahead of the curve like you describe. I don't think it makes sense for them to push even higher up that rather steep cost curve. It isn't that it is impossible, it's just that they already do it as far as they can justify it, and I'm hard pressed to come up with arguments for going further ahead given their market situation. The fact of the matter is that they are slowing down their transitions, not speeding them up.
I read an estimate today that Intel was a year or so ahead of TSMC. I think that might be a reasonable (if arguable) snapshot of where we stand right now. (I don't think we should discuss such numbers here; I just toss it out to give innocent bystanders a ballpark figure.)
They need the volume.
There is physical silicon that needs to go *somewhere*.
Or they can reduce their production capacity and save a lot of money that way. This seems to be the way they are heading. They have had excess capacity for some time, so it is quite reasonable for them to do something about it. Lowering costs is a valid way of improving margins.
 
Thing is, Intel is already pushing ahead of the curve like you describe.
They are early adopters and take part in the pathfinding for new tools, but they've been content for a long time to let specialized third parties be the tool makers.
At some point, those third parties have to bow to their interest in serving a wider market stocked by Intel's competitors.

Having some set of critical tools in the sub-10nm regime that isn't available to them might be handy.

I don't think it makes sense for them to push even higher up that rather steep cost curve. It isn't that it is impossible, it's just that they already do it as far as they can justify it, and I'm hard pressed to come up with arguments for going further ahead given their market situation.
Intel's virtuous cycle of physical, design, and volume breaks down if the physical stuff is not compelling.
I don't see it having much to entice the platform and content providers, if the only thing it provides to them is even more of a commodity.

The fact of the matter is that they are slowing down their transitions, not speeding them up.
That may be so. The implications of it are more far-reaching than just Intel, though.
Unless they've gotten a shareholder-enforced lobotomy, one of the most experienced players easing off the curve when it's becoming so steep may hint at a re-evaluation of the fab process end-game.

That might mean more for the remaining fab players as an industry, if back pressure from other parts of the chain or other industries is overriding the original goal to corner the fabrication market at the end of fab scaling.

I read an estimate today that Intel was a year or so ahead of TSMC. I think that might be a reasonable (if arguable) snapshot of where we stand right now.
At some headline numbers, perhaps. In terms of process quality, experience with manufacturability at the next geometries, and time to market, that estimate would require TSMC and the like to actually deliver.


Or they can reduce their production capacity and save a lot of money that way. This seems to be the way they are heading. They have had excess capacity for some time, so it is quite reasonable for them to do something about it. Lowering costs is a valid way of improving margins.

Reducing capacity means not catering to the big verticalized players that have opted so far not to entertain relying on Intel's products.
It might save money now, but that means having even less leverage to divert where the flow of money is going, especially as more of it leaves not just Intel but Intel's industry.
 
I would put Intel at least 2 years ahead of everyone else in terms of process tech. They are sort of taking it easy right now and still laughing at the competition.
 
I'm not sure Intel can afford to be in front of everyone else while taking it easy.

Having such a great advantage in chip manufacturing probably comes at the cost of really hard work - and not just effective work.
But I'm sure they pay the right amount of money to the right people.
 
That brings up a good question though: Intel got this far ahead in lithography tech at least partially because they needed to be, especially around the times of the P4 and their hopes of ever-increasing clockrate. As the money kept pouring in, they were able to stay ahead with the enormous R&D budget that came from all their high dollar CPU sales.

As high dollar CPU sales decline, eventually the R&D budget will also decline. Slower sales with smaller margin will begin to take a toll on their ability to keep ahead on lithography, which will eventually challenge that lead. It becomes a self-fulfilling prophecy once the decline starts...
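As a toy illustration of that feedback loop (all figures invented, with process R&D modeled crudely as a fixed share of revenue):

# Toy illustration of the feedback loop described above: an R&D budget funded
# as a fixed slice of a revenue stream that shrinks a few percent per year.
# Every number here is invented; this is not a forecast.

revenue = 35_000_000_000   # hypothetical annual CPU revenue ($)
rnd_share = 0.20           # hypothetical fraction of revenue spent on process R&D
annual_decline = 0.05      # hypothetical 5% yearly revenue decline

for year in range(1, 6):
    revenue *= (1 - annual_decline)
    print(f"Year {year}: revenue ${revenue / 1e9:5.1f}B, process R&D ${revenue * rnd_share / 1e9:4.1f}B")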
 
Even my I-don't-know-how-many-years-old Core 2 Duo on 65 nm is pretty damn good enough for all the office things I do with it, even light gaming plus movie playback, etc...

So, what exactly are we talking about? Don't you think that Intel's main problem nowadays is that there is nothing to stimulate proper utilisation of all that high-performance CPU power?
They need to introduce software that more and more people can use in a desktop environment; they need to make people need and want to upgrade more often...

while they are doing exactly the opposite, and even the Broadwell thread proves it
 
That brings up a good question though: Intel got this far ahead in lithography tech at least partially because they needed to be, especially around the times of the P4 and their hopes of ever-increasing clockrate. As the money kept pouring in, they were able to stay ahead with the enormous R&D budget that came from all their high dollar CPU sales.

As high dollar CPU sales decline, eventually the R&D budget will also decline. Slower sales with smaller margin will begin to take a toll on their ability to keep ahead on lithography, which will eventually challenge that lead. It becomes a self-fulfilling prophecy once the decline starts...

That's why they're opening their fabs to third parties: it brings in the extra revenue they need. Will it be enough? Hard to say.
 
That's why they're opening their fabs to third parties: it brings in the extra revenue they need. Will it be enough? Hard to say.

At some point the physical challenges will be so rough that even for Intel it will be kind of impossible to go any further without a radical change in manufacturing, I mean a change in the type of processors, which I am not even sure they are ready for...

And even then - you can argue that nothing lasts forever, and neither will Intel's dominance.
 
That brings up a good question though: Intel got this far ahead in lithography tech at least partially because they needed to be, especially around the times of the P4 and their hopes of ever-increasing clockrate. As the money kept pouring in, they were able to stay ahead with the enormous R&D budget that came from all their high dollar CPU sales.

As high dollar CPU sales decline, eventually the R&D budget will also decline. Slower sales with smaller margin will begin to take a toll on their ability to keep ahead on lithography, which will eventually challenge that lead. It becomes a self-fulfilling prophecy once the decline starts...

That will undoubtedly happen if they are unable to break into the mobile market. Let's see how Cherry Trail does this year.
 
My guess is Intel has enough resources to sustain their current lead (for the next several years). But I will be curious to see what happens at 5nm and beyond, ~2019-2020.
 
Why so? What do you mean? Please clarify! Do you expect there will be no further progress and everything will just stop, or what?

There's a fair amount of scientific literature (which I haven't reviewed) that suggests sub-5nm transistors just aren't possible. Of course this doesn't mean we can't do other things, like 3D stacking, bigger wafers, or whatever power-efficiency improvements process people will be able to think of.
 
The lattice constant of a Si crystal is around 0.5nm (the spacing between neighbouring atoms is roughly half that). You need a number of those to create a transistor. You also need a way to connect wires to it. And you need a way to deal with the quantum tunneling effects that come into play at that scale.

My first chip was in 0.7um or 700nm. 20 (gasp) years later, we're at 20nm, or a 35x improvement. It's a little jarring to think that a whole industry that has always depended on Moore's law only has another 5x to go.
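Just to spell the arithmetic out, a back-of-the-envelope sketch (the ~4nm floor is my own rough assumption based on the lattice argument above, not an industry figure):

import math

# Back-of-the-envelope numbers from the posts above.
first_node_nm = 700      # my first chip: 0.7 um
current_node_nm = 20
lattice_nm = 0.5         # rough Si lattice constant quoted above
limit_node_nm = 4        # assumed floor: a handful of lattice constants per device

print(f"Shrink so far: {first_node_nm / current_node_nm:.0f}x")       # ~35x
print(f"Headroom left: {current_node_nm / limit_node_nm:.0f}x")       # ~5x
print(f"Lattice constants across a {limit_node_nm}nm feature: ~{limit_node_nm / lattice_nm:.0f}")

# At a ~0.7x linear shrink per node, how many full transitions fit in that headroom?
shrink_per_node = 0.7
nodes_left = math.log(limit_node_nm / current_node_nm) / math.log(shrink_per_node)
print(f"Roughly {nodes_left:.1f} more full node transitions")         # ~4.5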
 
There's also a fair bit of waffling as to which way to go to get the sub 10nm nodes, or whether the last few nm short of the physical limit are going to be worth it, especially with regards to Moore's Law.

I think it would be enlightening to know how much of the silicon being made these days is really keeping pace with that, even now.
Which devices actually have their design and market optimum at the highest transistors/$, at current nodes and below?
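One hedged way to frame that as numbers, with purely invented wafer costs and densities: cost per transistor is roughly wafer cost divided by transistors per wafer, and if wafer cost rises faster than density, the transistor/$ optimum sits behind the leading edge:

# Toy cost-per-transistor comparison across nodes. All figures are invented
# placeholders, only meant to show how the cheapest transistor can land on
# an older node when wafer cost grows faster than density.

nodes = {
    # node: (hypothetical wafer cost $, transistor density relative to 28nm)
    "28nm": (3000, 1.0),
    "20nm": (5000, 1.9),
    "14nm": (8500, 3.2),
}

base_transistors_per_wafer = 4e12   # hypothetical count at the 28nm baseline

for name, (wafer_cost, density) in nodes.items():
    cost_per_transistor = wafer_cost / (base_transistors_per_wafer * density)
    print(f"{name}: ~${cost_per_transistor:.2e} per transistor")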
 
What about the materials science required to move away from silicon as the building block for semiconductors? Surely, as we approach the physical limit for the current way of building circuits and wiring them up, new materials must come into play?
 
Heavy use of new materials is looking to be necessary even in the silicon photolithography endgame, though the fundamental problem, that features only single digits of atomic layers across can't be subdivided any further, isn't something a change of materials can fix.

Wildly different manufacturing methods that don't augment or build off of silicon lithography have a steep hill to climb, because silicon mass production is massively far ahead even with its problems and super top-heavy economics. Silicon photolithography VLSI may be hitting a dead end at some point, but that dead end does have the benefit of being a functional one.
 