The AMD Execution Thread [2007 - 2017]

If AMD would otherwise go under, I don't think there's any chance Intel would NOT be there saving them, simply because the US monopoly-thingy (the trade commission, or whatever it is?) would be chopping Intel with antitrust claims and monopoly lawsuits before Intel could say x86.
 
Intel is close to, or effectively is, a monopoly in most markets it competes with AMD in, and the trend is not reversing. It's not clear how much interest the government has in maintaining this charade, nor what remedy could be instituted in the event AMD ceased to be.

Intel could probably point to ARM, or more likely to companies that license it, like Apple, Samsung, and Qualcomm, which are larger than Intel or have bigger market caps.
 
Intel will simply say that they own less than 5% of the global chip market so clearly aren't a monopoly. Really if anything is a monopoly it would be ARM.

Nothing like that is saving AMD.
 

MacOS, Linux, UNIX, mobile OSes, etc. didn't stop MS from being sued to hell and back over monopoly claims.
 
Samsung can just hire whatever it wants from AMD and wait for the carcass to hit the ground before picking up the patents it may crave...or wait for JP & friends to give Rory this really super duper mega consulting pro-tip (bro-tip?) to sell off the patent portfolio, so that they live for another year and get closer to the ultimate winning strategy of making nondescript ARM chips with super-optimal costs by outsourcing everything. IMHO anybody that's already involved in making MPUs / SoCs has no reason whatsoever to buy AMD or merge with them; the only ones that might care to put in the cash would be some investment fund that just wants to broaden its portfolio / put money into high-tech.
If someone like Samsung cares about getting some engineers and patents they can wait. If they want a complete engineering team they can't wait as they won't be the only company picking off engineers.
 

Considering the sort of engineering talent bleed occurring recently, with Samsung, NV and Qualcomm gleefully picking up people left and right, it is unclear to me what sort of complete engineering team AMD could put on its sales counter...moreover, it is debatable whether an established player like Samsung even wants a complete engineering team, with all the good and bad that entails, or would rather headhunt at will and then perhaps pick up the people the hunted heads want to bring along.

It is indeed harder to pick at the carcass once it has fallen, since more predators will be around...is the risk of that high enough to offset overspending now to get AMD? Debatable. And I say overspend because I have a sinking feeling that even if the BoD and Rory would like to cash in easily, ATIC may be resistant to that, and they hold a non-negligible share, as well as a voice on the board.
 
Is there anyone left who believes buying ATi instead of "merging" with NVidia was the smarter move?
Years after... NV still has BETTER GPGPU solutions threatened only by future Intel products
 
Sure, absorbing NV would've been even more complicated. Of course, an absolutely brilliant idea would've been to avoid buying either, and to figure out how to help the core business with the money thus saved, as opposed to entering extremely ill-structured agreements that chase pie-in-the-sky dreams.

ARM bought the Phalanx people (small team), Intel built up its GPU expertise through relatively small-scale acquisitions (I think even the Real3D/Lockheed Martin chaps were small and easy to gulp, etc.), Via got S3. Only AMD was "wise" enough to go shopping for an overvalued behemoth, so that they could have chipsets and IGPs...oh sure, and Fusion 1111oneoneone. It is difficult to argue that they wouldn't have been hugely better off either purchasing SiS + XGI (I think those were still disjointed), or S3 off of Via, or...anything else that was small, cheap, hugely easier to integrate and not bound to bring extra competitive pressure from a super-competitive rival such as NV.
 
AMD's graphics architecture had to make a much harder right-hand turn to get closer to how compute works with AMD's CPU floating point, and it's not coincidental that GCN has so many hallmarks of Nvidia's GPUs.
Even with the turn, there are elements that make GCN very disparate from the CPU SIMD it is theoretically meant to augment, and they have not been resolved. There's a massive batch size difference between CPU and GPU SIMD. Nvidia's would have been closer at the outset, although neither GPU's scalar capability nor completeness is where it should be.
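Just to put rough numbers on that batch-size gap (a toy sketch of my own, not anything out of AMD's or Nvidia's toolchains): a 256-bit AVX vector on the CPU side carries 8 single-precision lanes per instruction, while a GCN wavefront is 64 work-items wide, so the GPU's smallest natural batch is 8x larger before divergence, scheduling or memory behaviour even enter the picture.

#include <immintrin.h>
#include <stddef.h>

/* CPU-side SIMD: one AVX instruction works on a batch of 8 floats. */
void saxpy_avx(float a, const float *x, float *y, size_t n)
{
    __m256 va = _mm256_set1_ps(a);
    size_t i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256 vx = _mm256_loadu_ps(x + i);
        __m256 vy = _mm256_loadu_ps(y + i);
        _mm256_storeu_ps(y + i, _mm256_add_ps(_mm256_mul_ps(va, vx), vy));
    }
    for (; i < n; i++)              /* scalar tail for the leftovers */
        y[i] = a * x[i] + y[i];
}

/* GPU-side SIMD: a GCN wavefront executes 64 work-items in lockstep, so the
   minimum useful batch is 64 elements, eight times the AVX width above. */
#define GCN_WAVEFRONT_WIDTH 64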

The argument has been made that ATI wouldn't have been as bad a choice if AMD had run with the mobile group back when it still had it.
AMD's and ATI's common lack of software follow-through appears to have been a good match, if nothing else was. Nvidia's software and tools, even if not up to snuff for a lot of people, are at least mentioned in the same sentence as standard compute toolchains without some of the extreme scorn I've seen heaped on AMD.
What are the odds that if Nvidia's software engineers were acquired AMD would have fired them all anyway?
 
AMD had trouble absorbing / integrating ATI, and the companies were like long-lost siblings in many regards: from the rather significant disabilities in handling business processes and the disdain for most things software, to the engineering prowess and the "if you build it they will come, by magic" attitude. Also the constant whining about the evil competitors that always win through some dastardly scheme as opposed to on any merit. NVIDIA would've caused a huge indigestion, not to mention what would likely have constituted a bleed that would make Rory's clean-up seem like a bit of nose powdering, once Jen-Hsun started actually expecting results from old-school AMDers accustomed to the "slow and possibly stagnant wins the race...or at least minimises stress" attitude.

Conversely, NVIDIA integrating AMD would've hugely strained their resources, at a time when NV needed them. So I disagree with the notion that it was likely to have been a good deal; it would probably have been an equally bad one, but it is possible that JHH would have been better at not completely screwing things up, contrary to people like Ruiz or Meyer, who were quite obviously devoid of any strategic vision, tenacity or even intuition. Meyer's backing of BD at all costs was quite clearly highly detrimental to AMD. Considering the ludicrous amounts of sales it brought (and it was clear once they actually got silicon / advanced sim results back that it would suck, please no JohnFruehe mentions at this point), AMD would have been better served just selling K8L, possibly using the tweaked core from Llano and dragging on until they could release Trinity-based notebook SKUs, which is where that chip could have done serious damage. Instead they blew large amounts of money bringing up and marketing a POS - JHH, and possibly even RR, would've just said no; Meyer went for it.

AMD could've gotten S3 or XGI for pocket change, coupled with SiS's remaining chipset engineers, and thus they would've gotten their mini-me Intel platform. They also would have had a large chunk of money to invest in their core business and in process R&D (maybe 32nm wouldn't have sucked so epically as a consequence, since GF's current 32nm is pretty much inherited from AMD). Hell, they could have done what everybody else is currently doing with them: wait for ATI's carcass to hit the ground, then pick it up, as opposed to ridiculously overpaying for it as if it were a company in full stride. Which it definitely was not, since ATI had had a pretty FUBAR'd margin model and botched-up releases for quite a while.
 
NVIDIA would've caused a huge indigestion, not to mention what would likely have constituted a bleed that would make Rory's clean-up seem like a bit of nose powdering, once Jen-Hsun started actually expecting results from old-school AMDers accustomed to the "slow and possibly stagnant wins the race...or at least minimises stress" attitude.
It's not clear that those would have been the old-school AMD engineers. Some of the discussion of the Ruiz and Dirk eras concerned the shift from smaller teams and optimized physical design to large teams reliant on more automated processes. Many of the old-schoolers had left by that time, and the stagnation in design shows.

Meyer's backing of BD at all costs was quite clearly highly detrimental to AMD. Considering the ludicrous amounts of sales it brought (and it was clear once they actually got silicon / advanced sim results back that it would suck, please no JohnFruehe mentions at this point),
There were multiple attempts to replace K8. BD was just the last and they had to release something.

AMD would have been better served just selling K8L, possibly using the tweaked core from Llano
Llano had significant issues at 32nm. There are likely many reasons for that, but BD isn't the chip with terrible yields that had GF and AMD sniping at each other.
BD wasn't good, but that didn't make K8 viable; K8 was demonstrably worse.

They also would have had a large chunk of money to invest in their core business and in process R&D (maybe 32nm wouldn't have sucked so epically as a consequence, since GF's current 32nm is pretty much inherited from AMD).
That was set in stone when IBM went gate-first and AMD had to follow; that choice was why GF and Samsung nearly told them to go fly a kite when the question came up for 20nm.
AMD has lacked the capability to decide its own destiny in process tech since 130nm or 90nm, so ATI or no ATI, the fundamental problem AMD faced might not have changed.
 
There were multiple attempts to replace K8. BD was just the last and they had to release something.

I disagree with the last part. Given the non-impact that BD had in any market it shipped in, and the highly damaging impact it had WRT the perception of AMD products, not releasing it at all would likely have been more beneficial. Looking at what AMD's shipments have been comprised of, they still ship / shipped more K8Ls than anything BD-based (excluding Trinity SKUs, of course).

Having Llano out to cater to their incredibly weak notebook line-up and to the desktop bottom end, whilst shipping x4 and X6 K8Ls in consumer, possibly with a clock bump, would probably have had a net positive effect on AMD's standing, IMHO. On one hand they'd have saved bring-up and mkt costs, on the other they'd have paid less to GF (45nm probably much much cheaper than 32nm). Perhaps more importantly, there'd have been some hope left around the mystical BD that would save them, as opposed to having the world see it's a lemon. Bringing it out into the world in the form of Trinity (possibly sooner as a consequence of skipping BDver1) would have made it look more competitive, by association and by positioning. Having BDver1 SKUs did squat to bolster AMD's sales, hold off market-share erosion or help the financials...it probably made them worse, because at the price points where they had been selling cheaper Phenom IIs they now had a more expensive BD part, which was quite unappealing.

Of course, the whole argument about BD's raison d'être is that it was supposed to set the server world alight, but it was (IMHO) a very bad call to make once they had adequate characterizations in hand. It quite clearly had no chance of doing anything useful there either, and given that AMD was already in the noise in that particular space, bringing out another dud and putting money into it was nonsense. If anything, they could have tried for the FirePro APUs back then, if they needed people to remember that AMD exists. Probably the same negligible impact, but much cheaper, and also less disruptive to what might've been reasonably lucrative Magny-Cours based SKUs.
 
Remember that ATI released the lackluster R600 series after its acquisition as well. I totally agree that there was a lack of vision in focusing on the traditional AMD vs. Intel beefy-CPU battles rather than finding a small slice of the burgeoning mobile market, or improving their supply chain to deliver sufficient quantities of Llano or Brazos to big customers like Apple.
 
I disagree with the last part. Given the non-impact that BD had in any market it shipped in, and the highly damaging impact it had WRT the perception of AMD products, not releasing it at all would likely have been more beneficial.
Do you have a breakdown of the sales mix in the server market? I don't have them. I don't think BD is inferior to Magny Cours in the bulk of that market.
That aside, AMD was contractually obligated to release something, with Cray as the most notable example.
It also sounds like AMD had some volume requirements at 32nm, since so much had to be done to modify its agreement with GF. BD was low volume compared to Llano, and it wasn't BD that was financially wrecking AMD enough to force a highly unusual good-die agreement with GF.

Looking at what AMD's shipments have been comprised of, they still ship / shipped more K8Ls than anything BD-based (excluding Trinity SKUs, of course).
AMD took a $100 million writedown on Llano, so K8 at 32nm did more outright numerical damage to AMD than BD did at 32nm.

Having Llano out to cater to their incredibly weak notebook line-up and to the desktop bottom end, whilst shipping x4 and X6 K8Ls in consumer, possibly with a clock bump, would probably have had a net positive effect on AMD's standing, IMHO.
Llano was the chip that forced good-die, meaning AMD was severely pressed to make that chip worthwhile.

On one hand they'd have saved bring-up and mkt costs, on the other they'd have paid less to GF (45nm probably much much cheaper than 32nm).
This may not have happened. One of the unknowns is the minimum volumes required of AMD at the leading process node. It was one of the no-comment items from the modification.

Perhaps more importantly, there'd have been some hope left around the mystical BD that would save them, as opposed to having the world see it's a lemon.
Failure to launch a replacement design for two nodes would not have helped AMD's aura of competence, and it was coming out for servers regardless of the desktop.
 
Do you have a breakdown of the sales mix in the server market? I don't have them. I don't think BD is inferior to Magny Cours in the bulk of that market.
That aside, AMD was contractually obligated to release something, with Cray as the most notable example.
It also sounds like AMD had some volume requirements at 32nm, since so much had to be done to modify its agreement with GF. BD was low volume compared to Llano, and it wasn't BD that was financially wrecking AMD enough to force a highly unusual good-die agreement with GF.

Not in the public space, sadly, and not as exact as I'd want for any sort of detailed analysis. Given how their market share went, how their shipments have shrunk constantly (both can be cross-verified with public data), and how their margin structure eroded, it is very difficult to argue that BD was a good move even in server. Whilst it may not be worse than MC in that space strictly perf-wise, it was worse for AMD, having higher per-unit costs and putting the final nail in their credibility as a high-performance MPU maker. Failing to supply Cray in time also made it clear that they're unreliable. I'll grant you that the latter might've been one of the binding reasons for getting BD out.

Note that by K8L I meant the old 45nm Phenom IIs, not necessarily Llano. It is opaque how Llano without the GPU would have fared on 32nm, although rumblings indicated that the CPU bits might've been the problem. At any rate, whilst Llano was a product that had some potential to help AMD and was needed, given their pretty loathsome mobile lineup (I think it was probably the first compelling notebook CPU that AMD had had in many years), therefore making the risk and investment worthwhile, I find it difficult to make the same claim with respect to desktop BD or even server BD.

Moreover, we should also factor in that Llano could have netted AMD an Apple deal that would've been quite a breath of fresh air to a weary runner...they couldn't manufacture enough of it and with enough reliability and that was that. So I'd say that "AMD struggling to make that chip worthwhile" is a somewhat forced framing; it was probably more a case of them struggling (yet again) to supply the one thing in their lineup that was compelling outright (excluding BC). In effect, same thing appears to be happening with Trinity, where AMD appears utterly incapable of ensuring a proper supply of the mobile variants, and especially the ULVs, which are their most compelling SKUs ATM.
 
Note that by K8L I meant the old 45nm Phenom IIs, not necessarily Llano. It is opaque how Llano without the GPU would have fared on 32nm, although rumblings indicated that the CPU bits might've been the problem.
There are Llano SKUs with the GPU completely gated off that hit 100W TDP. Overall, Llano really spanned its power range with less sensitivity to GPU clocks and enabled SIMDs. A few speed grades or active CPU cores could shift power consumption wildly. Turbo was more primitive and it was barely active, which is one area for which BD was a significant improvement.

I cannot be certain that a Llano chip with no GPU might have done better if there was some kind of negative interaction with the rules or process tweaks for GPU logic, but at this point it seems that the CPU was never going to win that fight.

Moreover, we should also factor in that Llano could have netted AMD an Apple deal that would've been quite a breath of fresh air to a weary runner...they couldn't manufacture enough of it and with enough reliability and that was that.
Is that certain? Llano would have been a regression in SSE support versus the Intel chips that preceded it. Bulldozer was the first chip that caught up on that front.
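Purely as an illustration of what that regression would mean for software (a hypothetical dispatch check, not anything from an actual Apple or AMD codebase): code tuned for the Intel notebook chips of that era could assume SSSE3/SSE4.1, which Llano's Stars cores lack (they top out at SSE4a), while Bulldozer-class cores pass the same check.

#include <stdio.h>

/* Hypothetical runtime feature dispatch (GCC builtins): the contemporary
   Intel parts and Bulldozer take the fast path, Llano-era cores do not. */
int main(void)
{
    if (__builtin_cpu_supports("ssse3") && __builtin_cpu_supports("sse4.1"))
        puts("fast path: SSSE3/SSE4.1 kernels");
    else
        puts("fallback path: SSE2/SSE3 kernels");
    return 0;
}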

In effect, same thing appears to be happening with Trinity, where AMD appears utterly incapable of ensuring a proper supply of the mobile variants, and especially the ULVs, which are their most compelling SKUs ATM.
ULV implies it's going to be in limited quantity, since it has to be the chips that work best at low voltages with neither leakage nor clock speed degradation getting out of hand. Gate-first has a known problem with variability, and these problems get worse the nearer you get to the design corners.
Intel has the general volume to skim much more, and its 32nm process was second-gen gate-last.
22nm Tri-gate is simply better at ULV, so it's not like AMD has hope of improving using bulk planar until the node after the next. FD-SOI might help at the very low power range, but it doesn't seem as good once clocks climb back up to where CPUs like to play.
 