Is Intel's dominance just dumb luck?

Without intense competition, the biggest player would just sit on the last available process for as long as possible. There were a total of 3 pitch refinements in processes in the 80's (not counting other refinements):

http://en.wikipedia.org/wiki/1.5_µm_process

versus 3 in this half decade alone. The effort put into the most recent three also utterly dwarfs the effort behind those from the 80's. The capital needs are now astronomical, but this is the result of competitive pressure, not some imperative that Intel, just because it has the most money to spend on advancing fab processes, would spend it anyway. (That's almost always the case; look at the progress your cable box has made in the last decade versus your cell phone.)

That defeats your whole point. Competition in the CPU space was much fiercer in the 1980's, especially the first half, than it is today. There was no dominant player, and Intel didn't command nearly as much of the market as it did at the beginning of the 2000's. And in the 70's the competition was even fiercer, albeit in a much smaller market.

70's - 3 node transitions.
80's - 3 node transitions.
90's - 4 node transitions.
00's - 4 node transitions.
10's - 3 node transitions thus far, although node transitions aren't as clearly delineated as they were in the past.

So, how do you explain the 90's and 00's having more node transitions than the 70's and 80's, despite significantly less competition and a dominant CPU player that did not exist in the 70's and 80's? It wasn't until the late 80's that Intel started to gain dominance, thanks to the plethora of IBM PC/XT clones, but that was absolutely nothing compared to the position they had in the late 90's and early 00's. Yet despite Intel's virtual stranglehold on CPUs in the 90's and 00's, we still increased the rate at which we transitioned to new nodes.

Here's a hint: it has to do with volume of production. The more volume produced, the more revenue generated for the semiconductor makers. The more revenue generated, the more profit (presumably, although that's not always the case). The more profit generated, the more money can be sunk into researching and implementing new node transitions.
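
To make that loop concrete, here's a toy model of it; every number in it (price, margin, node cost) is invented purely for illustration, not taken from any real balance sheet:

```python
# Toy model of the volume -> revenue -> profit -> R&D feedback loop.
# Every number here is invented for illustration only.

def node_transitions(units_per_year, years=10, price=100.0, margin=0.25,
                     rnd_share=0.5, node_cost=1e9, cost_growth=1.4):
    """Count how many node transitions the saved-up R&D budget can fund."""
    bank = 0.0
    transitions = 0
    for _ in range(years):
        bank += units_per_year * price * margin * rnd_share  # profit plowed into R&D
        if bank >= node_cost:          # enough saved to pay for the next shrink
            bank -= node_cost
            node_cost *= cost_growth   # each node costs more than the last
            transitions += 1
    return transitions

# A low-volume 80's-style market vs. a high-volume PC-era market.
print(node_transitions(units_per_year=5e6))    # -> 0
print(node_transitions(units_per_year=300e6))  # -> 8
```

Even with ever more expensive nodes, the high-volume market funds shrink after shrink while the low-volume one never saves up enough for the first.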

And this happens despite there being less competition AND it being significantly harder and more costly to transition nodes now than it was in the 70's, 80's or 90's. The volume of product shipped and sold is what allows progress to continue as node transitions become increasingly difficult.

Regards,
SB
 
You're right: once computer demand hit new highs, there was more money around that could be poured into R&D. But even though revenue is necessary for advancing process nodes, it isn't sufficient. There needs to be a legitimate challenge from rivals for a company like Intel to want to advance the state of the art once things are "good enough." Apple, anyone? Surely if it spent 1/10 of its revenue a year on new fabrication techniques we'd have seen more substantial advances by now, but just because the money is there doesn't mean it will be spent. Intel's hand, by comparison, was forced by competition; if they had successfully blocked competition in the x86 space, I think we would have been stuck with "good enough" instead of some of the stunning innovations we've seen.

I also wouldn't say that competitiveness (which isn't measured by the quantity of companies) decreases just because there are fewer players in a space (although it goes to zero, for sure, when there's only one); a small handful of competitors might actually be trading more blows and making bigger advances for bigger stakes than lots of competitors heading in many different directions, most of them software dead-ends. In terms of processor design, an important difference between the 70's and early 80's and the early 90's and 2000's was that in the latter, software written for a single, mature instruction set had become a well-established staple of business. Hardware manufacturers had a standing target to optimize for, instead of deciding what went into a new instruction set, working out how to implement it in hardware, and courting software developers.

It was the convergence of software demand on code written in this one instruction set, plus legitimate competition in this settled space, that mattered, not just the total number of pure hardware competitors. In the 70's and early 80's, each hardware company was hawking a different CPU whose software was incompatible with hardware from any other company, and software-oriented corporations weren't big business. Then software companies became extremely big, and the i386 provided a sufficiently advanced instruction set in which the biggest revenue-generating software was written for over two decades. Once software vendors rallied around i386-compatible code and there was a layer of abstraction that could be counted on not to change much, hardware manufacturers could pour resources into just designing hardware to run code written in those instructions fast, without running the risk of going down a software dead-end.

Intel also, fortunately, _didn't_ have a stranglehold on x86 in the 90's and early 2000's; the competition was fierce, the strides made huge, and the stakes big. They lost an important lawsuit to AMD in the early 90's and felt the heat from the resulting flood of x86-compatible clones with seriously competitive performance, starting with the K6 and 6x86. With the software winds aligned with their sails, both AMD and Intel had the money to pour back into cutting-edge fabs, and their competition caused the pace of process advances to pick up. AMD64 and the Opteron were a total upheaval of technical leadership, and Intel resorted to rebates when serious technical missteps like Itanium and the P4 threatened to hobble it.
 
Because they out-engineered all the alleged competition.

You said anyone with billions can compete because they have billions, and you made it sound like Intel inherited the money without actually earning it or having engineering talent. I'm simply pointing out that they got those billions through hard work, by earning them.
Even if that's the case, and I largely agree (except that for a big company like that, talent and making the right decisions are what make things work, not hard work itself), and while I like Intel, market dominance is not good for the consumer.

Say...if Sony or Microsoft had market dominance in the console realm, they could charge consumers for petty things. For instance, now that the PS4 enjoys a bigger market share than the Xbox, MS are offering big deals almost every week and working harder to make people happy with updates and such.
 
Intel probably won't spin off their fabs. This would hugely complicate a very lucrative line of business producing bespoke hardware for Government(s).
 
Intel's current node transitions have been extremely profitable for them. They sell their 177mm^2 dies for twice the money of AMD's 315mm^2 dies. Their investment into new nodes pays off in the long run even if competition isn't fierce. The lack of competition from AMD means Intel has been able to sit on 4c/8t in the consumer space for seven years now; there have been improvements in the cores, for sure, but the dies just get smaller and Intel's margins get bigger. Intel could, at the i7-920's original price and power budget, have made 8c/16t CPUs by now. They do have them in the Intel Extreme lineup, but I imagine they could offer this in the mainstream consumer segment too if they were pressured. Haswell-E is barely larger than Piledriver, and if Intel cut down that oversized cache, they could have an 8c/16t CPU under 300mm^2. If there really were competition, I imagine we'd see that instead of the 4c/8t i7s that sell for $350 today.
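
To put rough numbers on that margin gap, here's a back-of-envelope sketch using the standard dies-per-wafer approximation; the wafer cost, selling prices and perfect-yield assumption are my own illustrative guesses, not actual Intel or AMD figures:

```python
import math

# Back-of-envelope die economics using the standard dies-per-wafer
# approximation. Wafer cost, selling prices and the perfect-yield
# assumption are illustrative guesses, not real Intel/AMD figures.
def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    r = wafer_diameter_mm / 2
    return (math.pi * r**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

WAFER_COST = 5000.0  # assumed cost of one processed 300 mm wafer, USD
for name, area_mm2, price in [("177 mm^2 die", 177, 300.0),
                              ("315 mm^2 die", 315, 150.0)]:
    n = dies_per_wafer(area_mm2)
    silicon = WAFER_COST / n  # silicon cost per die at perfect yield
    print(f"{name}: ~{n:.0f} dies/wafer, ~${silicon:.0f} silicon, "
          f"~${price - silicon:.0f} gross margin")
```

And yield losses punish the bigger die harder still, so the real gap is wider than this perfect-yield estimate suggests.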

Just looking at how far AMD is from Haswell-E performance tells you just how little effort Intel has had to put in while still staying ahead. There is no drive to bring 8c/16t CPUs down to the price segment that dies of that size used to occupy. AMD can't even compete with Intel's second rung of CPUs. AMD will never be able to compete with a serious Intel, simply because of how much money Intel can put into R&D. And it has gotten to the point where AMD has lost so much money that it can't even compete with an Intel that is barely moving.

Intel makes more money -> Intel puts more money into R&D -> Intel pulls further ahead -> Intel makes more money.
AMD makes little money -> AMD puts little money into R&D -> AMD falls behind -> AMD loses money.
This cycle has gone on to the point where AMD might as well not exist in the x86 space.

Whatever got Intel to where it is, there has to be some luck involved. I'm not saying it was all dumb luck, nor am I saying that Intel lacks engineering talent. But at the end of the day, Intel got itself into a very secure place in a monopolistic market segment. Nobody is really allowed in, and their only other competition is a dying husk of a company that is losing hundreds of millions a quarter. This is all in relation to the x86 PC space. Intel has competition in other spaces, such as mobile, but those segments are distinct from the PC, at least for now. Intel will sell a lot more phone/tablet SoCs than anyone will sell ARM PCs. The competition is so indirect that it allows Intel to pretty much milk the PC market, which is still a multi-hundred-billion-dollar-a-year market.
 
The lack of competition from AMD means Intel has been able to sit on 4c/8t in the consumer space for seven years now
It's not that. A consumer does not want six cores; a consumer wants those 2 or 4 cores, just cheaper than before. Intel's main competition is itself.
There is a reason Core i3 dual cores do not have turbo and have lower top frequencies than quad cores: it's another way of motivating consumers to choose quad-core parts instead of duallies.
Besides which, Intel has been spending relatively more and more die space; it has just been spent on beefier GPUs, not more cores. Far more relevant in the consumer world.
Even in the (light) gaming case, 4 cores are more than enough. Intel won't be making a six-core consumer Skylake; Intel will be making a 72EU Skylake.
 
Even in the (light) gaming case, 4 cores are more than enough.
Arguably, four cores, as long as they're modern Intel ones at least, are enough for anyone. There isn't a game available where you can't hit 60+ fps with four CPU cores. Then add hyperthreading, and you're already in overkill territory...
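
The frame-budget arithmetic backs this up; the per-frame workload split below is made up for illustration, not measured from any real game:

```python
# Frame-time budget arithmetic; the per-frame workload split is hypothetical.
TARGET_FPS = 60
budget_ms = 1000 / TARGET_FPS        # ~16.7 ms of CPU time per frame
serial_ms, parallel_ms = 4.0, 28.0   # made-up game: 4 ms serial, 28 ms parallelizable

for cores in (2, 4, 8):
    frame_ms = serial_ms + parallel_ms / cores  # ideal split across cores
    verdict = "meets" if frame_ms <= budget_ms else "misses"
    print(f"{cores} cores: {frame_ms:.1f} ms/frame, {verdict} {TARGET_FPS} fps")
```

With numbers like these, two cores miss the budget, four meet it comfortably, and eight mostly buy diminishing returns.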
 
AMD was highly competitive in the time frame of the Athlon, Athlon 64 and Athlon 64 X2. It was the clear #1 choice for gaming PCs, and even companies bought AMD boxes in high volumes. If I remember correctly, AMD peaked at around 40% market share in desktop CPUs. They also had a solid presence in the server market, as their CPU-integrated memory controller was a big deal for those workloads. During this era, AMD CPUs were priced close to the competing Intel CPUs, meaning that AMD's profit margins should have been close as well.

The biggest mistake AMD made was the Bulldozer architecture; without it, the situation now would be quite different. I don't understand why AMD didn't see that Intel's (Pentium 4, Pentium D) and IBM's (POWER6, Cell/Xenon) speed-demon designs were failing to reach their goals (the 10 GHz clock targets were missed, and TDP was horrible). Both Intel and IBM then started to focus heavily on performance-per-watt designs. AMD continued to develop Bulldozer despite the delays and despite it being clear that speed-demon architectures had no future. AMD simultaneously maintained both the K10 (Stars) architecture and Bulldozer for a long time (APUs had Stars while desktop chips had Bulldozer), and also developed a completely new low-power core (Bobcat -> Jaguar) at the same time. Their resources were spread too thin. AMD should have scrapped the Bulldozer design once the problems became apparent and migrated the best new tech/ideas into their existing Stars core. At that point AMD should have implemented SMT ("Hyperthreading") like all the other CPU manufacturers of the time (Intel, IBM, Sun). That would have warranted a wider CPU core (high IPC + extra scaling for multithreaded software).

It would be nice to know what exactly happened to the K9, AMD's rumoured 8-wide high-IPC design (http://en.wikipedia.org/wiki/AMD_K9). Were there internal conflicts between the future paths: narrow and high-clocked (Bulldozer) vs. wide and low-clocked (K9)?

Conclusion: in my opinion, Intel's dominance now is not just dumb luck. AMD almost beat Intel, but they made a critical mistake, and as a challenger you cannot make mistakes that big. AMD did many things right: they started focusing on CPU+GPU fusion (integrated on the same die) sooner than Intel (but delivered their product later, because their focus was split across too many CPU architectures), they implemented 64-bit x86 sooner (Intel adopted AMD's instruction set), they implemented float vector instructions sooner (3DNow!), they had a true dual-core CPU sooner (Intel put two chips under the same heat spreader), etc. They could have implemented a wide CPU with SMT sooner (K9) than Intel (Nehalem), but they chose to chase Intel's and IBM's outdated speed demons and failed...
 
It's not that. A consumer does not want six cores; a consumer wants those 2 or 4 cores, just cheaper than before. Intel's main competition is itself.
Actually, people will pay for things they don't need if the price isn't a big deal. If you really want to argue this, then the C2Q and the i7-920 would never have sold as well as they did, since back then nothing used that many threads. If Intel sold Haswell-E at consumer prices, many people would still buy it, simply because it would be affordable enough and more future-proof.

Consumers also don't know what they want. Most people could get by with 10-year-old CPUs for everything they do, but they still go to Best Buy and buy i7s. It's all about marketing: people buy what they think they want, and marketing makes them think they want things they really don't. I have family who complain about how slow their computers are, but all they do is web-related stuff, and they have i7s in their systems. It doesn't matter what they buy; they will download a ton of bloatware and make their system slow. That is the average consumer, and that is why companies can sell new hardware year after year. These same people are now discovering that they can do everything they used to do on a PC on their new tablet or smartphone, which, without all the bloatware, feels faster even though it has maybe half the processing power of an old C2Q PC. Consumers don't know this and just think their PCs are old junk. The decline in the PC market is largely driven by consumers realizing they don't need a huge tower for the basic web-related activities they do.

If consumers really knew what they needed, AM1/Celeron systems with a decent SSD would be a lot more popular than they are now.

As for gamers: many of them buy i7s/i7-Es when they don't need them, and their justification is that it's worth it to them. You can't just say they wouldn't want more powerful parts if they could get them for the same money.
 
Conclusion: in my opinion, Intel's dominance now is not just dumb luck. AMD almost beat Intel, but they made a critical mistake, and as a challenger you cannot make mistakes that big.

Intel learned the end of silicon scaling the hard way with Prescott. Intel had a hugely successful contingency, though: the Pentium M, which hit just as the PC market transformed from primarily desktop to laptop.

Intel was quick to kill off NetBurst after Prescott, throwing resources behind Core 2 (memory disambiguation being a game changer), and only later merged the worthwhile P4 features into the new Core architecture with Sandy Bridge.

Better management, deeper pockets and CPU architects at least on par with AMD's are why Intel came back and beat AMD, not luck.

Betting the farm on a speed-demon design years after the P4 had demonstrated its futility doesn't reflect well on AMD management.

Cheers
 
Consumers also don't know what they want. Most people could get by with 10-year-old CPUs for everything they do, but they still go to Best Buy and buy i7s. It's all about marketing: people buy what they think they want, and marketing makes them think they want things they really don't. I have family who complain about how slow their computers are, but all they do is web-related stuff, and they have i7s in their systems. It doesn't matter what they buy; they will download a ton of bloatware and make their system slow. That is the average consumer, and that is why companies can sell new hardware year after year.

That's not even strictly the problem: web pages are mostly single-threaded, and sometimes almost the whole browser is single-threaded (Firefox), so the i7's threads are just wasted. If you're suffering through a very CPU-heavy web page, a Celeron at 3.7GHz would be about as good as an i7. Then it hits a storage bottleneck (writing/reading the web cache from the hard disk) or, perhaps more often, an internet bandwidth bottleneck. People don't seem to know that an i7 will download at 120KB/s max on semi-public wifi, while their hopelessly slow laptop would download at 10MB/s if plugged into 100BaseT on a network with WAN fiber.
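
Amdahl's law puts numbers on that; the 90% serial share assumed below for a page load is a guess for illustration, not a measured figure:

```python
# Amdahl's law: the speedup ceiling when most of a workload is serial.
# The 90% serial share for a page load is an assumed figure, for illustration.
def speedup(serial_fraction, n_threads):
    return 1 / (serial_fraction + (1 - serial_fraction) / n_threads)

for threads in (2, 4, 8):
    print(f"{threads} threads: {speedup(0.9, threads):.2f}x")  # ~1.05x .. ~1.10x

# And the network often dominates anyway: fetching a 5 MB page.
for label, kb_per_s in [("crowded wifi", 120), ("wired 100BaseT", 10_000)]:
    print(f"{label}: {5_000 / kb_per_s:.1f} s")
```

Eight threads buy barely 10% on a workload like that, while the connection can swing the page load by a factor of 80.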

Bloatware then comes on top of that, though it can be relatively lightweight (search hijackers, adware, etc.).

These same people are now discovering that they can do everything they used to do on a PC on their new tablet or smartphone, which, without all the bloatware, feels faster even though it has maybe half the processing power of an old C2Q PC. Consumers don't know this and just think their PCs are old junk. The decline in the PC market is largely driven by consumers realizing they don't need a huge tower for the basic web-related activities they do.

If consumers really knew what they needed, AM1/Celeron systems with a decent SSD would be a lot more popular than they are now.

As for gamers: many of them buy i7s/i7-Es when they don't need them, and their justification is that it's worth it to them. You can't just say they wouldn't want more powerful parts if they could get them for the same money.

I agree with a lot of that, though tablets have special circumstances: mobile websites instead of full websites, "apps" instead of mobile or full websites, everything "hardwired" to beam 2D surfaces to the GPU and video to the hardware codec.
I'd advise getting 8GB of RAM too, along with the low-end CPU (with either an HDD or SSD; it's still decent with the latest HDDs, imo). The overkill capacity is good because you can run a 64-bit browser without worrying too much, and there is usually a ton of transparent disk cache.
 