The AMD Execution Thread [2007 - 2017]

Well, that's not pretty; but at least AMD's share fell by less in the last quarter than in the previous ones.
 
I don't believe anything would happen to Intel at all.

The way I see it, AMD came to prominence because Intel once needed them to satisfy the second-source availability requirement mandated by Intel's contract with IBM. But that contract has long expired.

Being the only competitor in a given space is not necessarily a problem. What is troublesome is maintaining a monopoly through 'exclusionary conduct' or 'predatory acts' and/or some specific ways of leveraging an existing monopoly to gain an advantage when entering a related market.

In other words, Intel would be sitting pretty with their 96% server CPU market share for as long as they behave.

Pretty sure Intel would be forced to offer licenses for x86. Even if they behave, we would still have a situation where if you go out to buy a new PC, either for private use or for business, you're essentially left with only one option.

A situation where the whole pc using world has to rely on the whims of one manufacturer can never be good.
 

I don't think it is ideal either but someone would first have to bring successful antitrust action in court in some jurisdiction where Intel wants to continue to sell CPUs, or extract concessions by threat of such action.

It could actually be invigorating if a competitor other than AMD were to emerge out of this in the end.
 
Pretty sure Intel would be forced to offer licenses for x86.
Yes perhaps, but to WHOM would they offer it? Who in their right mind would try to compete with Intel on their own turf? None who tried ever succeeded very well; AMD is AFAIK the only one still left (and we know how well they're doing, heh), unless VIA is also still making x86es, although for years and years those have only been comparatively primitive chips for "low power applications" (which is mostly down to the simplicity of the architecture VIA is stuck with; with modern tech, full-blown x86 chips can hit the same power envelope and squish VIA's CPUs in performance.)

Possibly some state-backed actor in China might take up production, but considering the current political climate I somehow doubt the US gov't would compel Intel into helping someone like that... :D Of course, if China wanted home-grown x86 chips for themselves, they could just ignore intellectual property rules and simply go ahead and make the chips anyway. Like it would be the first time something like that happened! ;)
 

Apple might be interested, not to compete with Intel, but to easily replace Intel chips with homegrown ones in Macs. Samsung and Qualcomm might also give it a try. They wouldn't be competing against Intel's big cores anyway, which is where Intel is very, very hard to beat.
 
Intel could point to ARM as the new competitor. In various markets, this would be close to correct. Billions of people perform a significant portion of their personal computing on ARM devices. Desktop is a declining market, with the lower end potentially being nibbled away. Server is a clearer case where ARM does not yet compete in a serious fashion; however, if Intel just promised not to try too hard with Xeon D, it could probably get more competition there than AMD provides right now.

If AMD were to collapse, it would be a question of where the patents in their portion of the cross-licensing agreement go.
Regulators would not step in immediately in the event of dissolution, and would not necessarily dictate to whom the assets would be sold, although the prospect of a high-tech wedge to drive into a cross-license with Intel's portfolio might get attention if a group like Samsung, ATIC, or a party in a less-than-friendly country like Russia acquired it.
A patent troll group like the Rockstar patent litigation company?

There is no guarantee that whoever buys the IP from AMD would be interested in competing in x86. I don't know whether the current agreement could head off someone purchasing the patents and relicensing the AMD portion of x86-64 back to Intel for a fee.
That would likely be a better money-maker than AMD's x86 business itself, although the value diminishes as the years pass. It's not clear how much AMD has been contributing to the pool to keep it fresh as the years tick by on the big-ticket items Intel wants.
 
Apple might be interested, not to compete with Intel, but to easily replace Intel chips with homegrown ones in Macs. Samsung and Qualcomm might also give it a try.
Color me sceptical. The x86 arch is extremely dated and wonky; it's nothing short of a small miracle it's even viable today, much less performing at the level it is. The amount of innovation and engineering required to get this shitty ISA to perform decently is substantial, and frankly I don't see why anyone would bother. What would be the point? There's ARM when you need to do something Intel can't - or won't.

Samsung and Qualcomm (and one might argue, also Apple), have no need for backwards compatibility. Mobile devices in particular have no legacy apps from a decade or several that just NEED to work for some particular specific professional purpose, so why would anyone in their right mind want to use x86 in such a setting? The additional instruction translation required, register re-naming and so on is always going to bog down x86 compared to native RISC archs.
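The decode point above can be made concrete with a toy sketch (the "opcode" table and byte values here are invented for illustration, not real x86 encodings): with a variable-length ISA you must decode instruction N before you know where instruction N+1 starts, while a fixed-length ISA lets you find every instruction boundary independently and in parallel.

```python
# Toy illustration of why variable-length (x86-style) decode serializes,
# while fixed-length (RISC-style) decode parallelizes.
# The length table and byte values are made up, not real x86 opcodes.

VAR_LEN = {0x90: 1, 0xB8: 5, 0x0F: 2}  # hypothetical opcode -> total length

def boundaries_variable(code):
    """Walk a variable-length stream: each boundary depends on the previous one."""
    offsets, pc = [], 0
    while pc < len(code):
        offsets.append(pc)
        pc += VAR_LEN[code[pc]]  # must decode this opcode to find the next one
    return offsets

def boundaries_fixed(code, width=4):
    """Fixed-width stream: every boundary is known up front, independently."""
    return list(range(0, len(code), width))

stream = bytes([0x90, 0xB8, 1, 0, 0, 0, 0x0F, 0x05, 0x90])
print(boundaries_variable(stream))  # [0, 1, 6, 8] -- found one at a time
print(boundaries_fixed(bytes(8)))   # [0, 4] -- all known immediately
```

Real decoders mitigate this with predecode bits and uop caches, which is exactly the "extra engineering" being argued about; this sketch only shows the serial dependency itself.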
 
I've seen these x86 doomsday statements for what, 10 years now?
Yet there doesn't seem to be a single measurable metric that corroborates these assumptions.

Why does Intel keep selling server chips for UNIX?
How does Intel keep getting double-digit IPC boosts at least once every two years?
Why are the consoles using x86 chips?
Why did Mark Cerny claim that most people were wrong to assume that x86 was inefficient, after researching on the subject?
Why are PCs still using x86?
Why are apple PCs still using x86?
Why did Windows RT fail so miserably?
Why did the Zenfone 2 score so well in Anandtech in both performance and battery life, even though it's downright emulating some functions in Android?
How come there's not a single ARM core that competes in single-threaded performance with Bulldozer (which is supposed to be terrible by today's standards)?

And most of all, who the hell cares that x86 is apparently extremely dated and wonky, if the lowest level that 99.99% of coders will ever touch is C/C++, with the compiler told to do the rest?
 
Samsung and Qualcomm (and one might argue, also Apple), have no need for backwards compatibility. Mobile devices in particular have no legacy apps from a decade or several that just NEED to work for some particular specific professional purpose, so why would anyone in their right mind want to use x86 in such a setting? The additional instruction translation required, register re-naming and so on is always going to bog down x86 compared to native RISC archs.

And yet, despite Apple not needing backwards compatibility and being able to make their own ARM CPUs, they would rather use Intel x86 CPUs for their non-mobile parts. ARM is well and good, but it's optimized for low power computing. With regards to high performance personal computing, there is no competition to Intel's x86 architecture except for AMD's x86 products. Sure, AMD's x86 products can't match Intel's mid to high end CPUs, but they are still far more powerful than ARM products.

Perhaps Apple will eventually transition their personal computers to ARM someday. But why, when they can just get performant x86 CPUs from Intel?

For all the pooh-poohing of x86, there is still no competition to it for high performance personal computing, and very little competition in the server market. Even in HPC, where backwards compatibility is far less of a factor, x86 dominates for CPUs (yes, I know GPUs are quite prevalent there), and the same goes for the world's supercomputers.

Oh, and as to why someone would want to enter the x86 market? High margins. The ARM CPU market is in a race to low-margin commodity pricing, making x86 potentially quite a bit more profitable than trying to compete in the low power ARM CPU market. The flip side, of course, is the significantly higher R&D required to compete in the x86 market.

Regards,
SB
 
Totz,
The answer to most of your whys is: because SW.
And the answer to the remainder of your whys is: because Intel has an order of magnitude more engineers working on it than anyone else.
 
And yet, despite Apple not needing backwards compatibility and being able to make their own ARM CPUs, they would rather use Intel x86 CPUs for their non-mobile parts.
Yes of course, why bother inventing the wheel twice, expending vast amounts of money and time in R&D just to reach where Intel is now in performance, and much less surpassing them? I'm sure Apple's getting a pretty sweet deal on Intel chips anyway.

It just makes sense they're doing what they're doing, and you can't just replace Intel straight off anyway. Well, you could perhaps, if all you make is like, a Macbook Air or something. But Apple has a large product range, several performance tiers just for laptops, then desktops, and on top of that an up to 12-core workstation as well. You would need to re-invent the wheel not just once, but many times to replace Intel. This is basically unfeasible, even for a company as rich as Apple. They could do it I suppose, if they truly wanted to. With their deep pockets they could probably poach enough managers and engineers and so on to get the project rolling, but it would take friggin years and years to actually produce anything competitive with Intel's offerings at that point in time.

Shit, they could afford to buy Intel if they wanted to and have spare change left over, but again, why? They can get the chips they want in quantity (buying many millions of chips every year gives you first served priority, heh), and at a price they're obviously willing to pay, and there's nothing better out there either, so, like, meh...! ;)

ARM is well and good, but it's optimized for low power computing.
Well, you could design an ARM chip for any power envelope you like really. Not sure where ARM is on SIMD math, how competitive that aspect is, but such problems can always be solved if there's an actual need for it. The thing is you already have Wintel for high-power computing and plenty of OS and software support, so again, why actually bother going through the trouble designing an ARM chip with no enterprise software support to go with it? :)

Thing is though, eventually mobile ARM chips are going to become so powerful that they could reasonably replace any regular desktop computer's brains straight off. Even some workstations. It's probably not going to take all that long either. But software isn't there. Chromebook is...well, languishing is probably the nicest way to describe it, and Microsoft got burned bad with Win RT. I don't think anyone's stressing all that much trying again, even at the big corps.

Oh and as to why would someone want to enter the x86 market? High margins.
You can only enjoy the high margins if you somehow manage to develop a successful, sought-after product, and the bar to entry is incredibly high on that one, making it pretty much not worth even trying in the first place. Look at AMD's x86 operation and ask yourself how sure-fire those high margins are... :)
 
I think it's in Intel's best interest to keep AMD going. AFAIK the cross licensing deal between Intel and AMD is not transferable. Therefore, if AMD goes down and someone buys AMD or AMD's patent portfolio (instead of just pumping money into AMD), they'll have to renegotiate the licensing deal with Intel, and that'll complicate things for both Intel and the new buyer. Therefore, the potential risk of buying AMD is pretty high for anyone but Intel itself. However, it'd be unwise for Intel to buy AMD outright as that'll certainly be under heavy scrutiny from regulators.
 
It just makes sense they're doing what they're doing, and you can't just replace Intel straight off anyway. Well, you could perhaps, if all you make is like, a Macbook Air or something. But Apple has a large product range, several performance tiers just for laptops, then desktops, and on top of that an up to 12-core workstation as well. You would need to re-invent the wheel not just once, but many times to replace Intel. This is basically unfeasible, even for a company as rich as Apple. They could do it I suppose, if they truly wanted to. With their deep pockets they could probably poach enough managers and engineers and so on to get the project rolling, but it would take friggin years and years to actually produce anything competitive with Intel's offerings at that point in time.

Shit, they could afford to buy Intel if they wanted to and have spare change left over, but again, why? They can get the chips they want in quantity (buying many millions of chips every year gives you first served priority, heh), and at a price they're obviously willing to pay, and there's nothing better out there either, so, like, meh...! ;)


Well, you could design an ARM chip for any power envelope you like really. Not sure where ARM is on SIMD math, how competitive that aspect is, but such problems can always be solved if there's an actual need for it. The thing is you already have Wintel for high-power computing and plenty of OS and software support, so again, why actually bother going through the trouble designing an ARM chip with no enterprise software support to go with it? :)
While I agree completely, I'm going to play devil's advocate here and give at least a few reasons:
1. Apple wouldn't have to pay the Intel margins. If you look at what Apple charges for the new MacBook, deduct their typical margins, and then look at how large a portion of what is left goes toward that ULP Broadwell, you'll see that it could be one heck of a lot less expensive. Contrary to what you imply, Intel can't play favourites with pricing. They can provide other fringe benefits, but pricing is regulated. And Apple uses a lot of the upper tiers of Intel's consumer processors. Rolling their own processors, Apple could attract more price-sensitive customers, fatten their margins, or for that matter go up in die size and capabilities, providing more performance at lower cost. Or any combination.
2. Apple could keep their own schedule when it comes to product launches. The recently launched MacBook for instance was delayed horribly by the late arrival of Broadwell Y processors in the necessary Apple volumes.
3. They could tailor the processors to the devices they want to make and the market tiers they want to create. Now, it is well known that Intel effectively makes processors that are more or less bespoke for Apple, but still.
4. Intel plans ahead, and sometimes their carefully measured pace of introducing new features and capabilities may not fit Apple, who can be quite aggressive in how they push forward in some respects. For instance, even Skylake won't support higher than 4k resolutions. However Apple already offers their 5120x2880 iMac, so for at least another couple of years or so Apple will have no integrated graphics option that can be utilized for these classes of displays and up.
5. Apple sells a bit over 300 million of their own SoCs yearly, and rising. This is similar to Intel's volumes, and in devices that enjoy higher ASPs. They already have the volume to drive the development of very competitive processors without the development cost per chip getting too high. Also, by their very nature SoCs are highly integrated modular devices, so making versions that target other power envelopes is relatively straightforward, at least as long as you don't aim for 100W processors but stay in more typical ranges.
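The shape of the argument in point 1 can be sketched as back-of-the-envelope arithmetic. Every number below is HYPOTHETICAL, picked only to illustrate the mechanics; none of them are Apple's or Intel's actual prices, margins, or costs.

```python
# Back-of-the-envelope sketch of point 1 above: all figures are invented
# placeholders, NOT real Apple or Intel numbers.
retail_price = 1299.0   # assumed laptop retail price
apple_margin = 0.38     # assumed gross margin fraction
cpu_cost     = 280.0    # assumed price paid for a ULP Intel part
own_soc_cost = 60.0     # assumed per-unit cost of an in-house SoC at scale

bom_budget = retail_price * (1 - apple_margin)  # what's left after Apple's cut
cpu_share  = cpu_cost / bom_budget              # the CPU's slice of that budget
saving     = cpu_cost - own_soc_cost            # per-unit saving from rolling your own

print(f"BOM budget: ${bom_budget:.0f}, CPU share: {cpu_share:.0%}, "
      f"saving: ${saving:.0f}/unit")
```

With these made-up inputs the CPU eats roughly a third of the post-margin budget, which is the point being made: shaving even part of that off lets Apple cut price, fatten margin, or spend the silicon budget elsewhere.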

Anyone who takes an interest in processor technology would love to see what Apple could produce in a couple of years on 10nmFF and with a power envelope of 40-50W. Unfortunately, I don't think the arguments presented above are enough to provoke Apple to shift architecture. OSX Macintoshes are their legacy platform. Reasonably, they would much rather push their iOS products into new market niches than muck around with their relatively low volume OSX range, which does quite OK as it is. It's a boat that doesn't really need rocking.

Edit: For relevance to an AMD thread - anyone who takes an interest in processor technology is of course quite interested in how AMD Zen will shape up. Together with Qualcomm Kryo, it is the most interesting new core on the horizon.
 
AMD releases a new beta driver, for Witcher and Project Cars. Still no fucking release driver on the horizon, whereas Nvidia has dropped several WHQL certified drivers these past five months alone.

What the hell is going on, did they completely gut their driver dev team after they stopped the monthly release schedule or what? Since they abandoned monthly releases they've released what, three-four WHQLs? Last year alone saw a whopping TWO versions, AFAIR. This is pitiful, I can only assume they're conserving their resources for Windows 10 RTM version, and we'll finally see a non-beta release around that time, but that's just wishful speculation for me at this stage...
 
We're probably getting a major new release in June, together with Fiji and all the rebadges.

While I question the need for WHQLs for anyone who's actually serious about playing games, I do admit that the waiting time for supposedly stable releases is getting a bit out of hand.
 
Yes perhaps, but to WHO would they offer it? Who in their right mind would try and compete with Intel on their own turf? None who tried ever succeeded very well; AMD is AFAIK the only one still left (and we know how well they're doing, heh)
If Intel's x86 patents were licensed/invalidated by antitrust orders, perhaps the interesting competition that would open up wouldn't be a CPU manufacturer making a new x86 chip. It could be most interesting if new mobile CPUs were allowed to support or emulate the x86 ISA. I'm thinking especially of NVidia's existing Denver, which already has a hardware code translation design (similar in spirit to Transmeta's functional but ultimately unsuccessful x86 chip). Or a hypothetical A11 chip from Apple, which could allow their own CPU to run existing OSX applications natively and allow an easier hybrid iOS/OSX.
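The code-morphing idea behind Denver and Transmeta can be sketched in miniature (the tiny guest "ISA" and cache here are invented purely for illustration): guest instruction blocks are translated into host-native form once, stored in a translation cache, and every later execution runs straight from the cache.

```python
# Toy dynamic binary translation: compile a made-up guest "ISA" into
# host-native Python closures once, cache them, then reuse the cache.
# The guest instruction set is invented for illustration only.

translation_cache = {}

def translate(block):
    """Compile a tuple of guest ops into one host-native function."""
    ops = []
    for op, arg in block:
        if op == "ADD":
            ops.append(lambda acc, n=arg: acc + n)
        elif op == "MUL":
            ops.append(lambda acc, n=arg: acc * n)
    def host_code(acc):
        for fn in ops:
            acc = fn(acc)
        return acc
    return host_code

def execute(block, acc):
    """Look up the block in the cache; translate only on a miss."""
    if block not in translation_cache:
        translation_cache[block] = translate(block)  # pay translation cost once
    return translation_cache[block](acc)

hot_loop = (("ADD", 3), ("MUL", 2))
print(execute(hot_loop, 1))  # 8  -- translated on first run
print(execute(hot_loop, 5))  # 16 -- served from the translation cache
```

The real chips do this in firmware/hardware on actual machine code, with profiling to decide which blocks are hot; the sketch only shows the translate-once, run-many structure that makes the approach pay off on loops.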
 
AFAIK most of those x86 patents are related to implementations in hardware. Emulating x86 doesn't seem to require a license deal with Intel, as Transmeta didn't seem to have a license deal with Intel or AMD.
 