Apple should’ve just bought the company at whatever high price was being asked. Having all the top designers they could get, along with the refined IP already in hand, is invaluable.
PVR wasn't even particularly expensive, and certainly not by Apple's standards. Overall a rather large dropped ball there, seeing as some of the lost performance is almost certainly caused by the need to avoid stepping on PVR IP...
 
PVR wasn't even particularly expensive, and certainly not by Apple's standards. Overall a rather large dropped ball there, seeing as some of the lost performance is almost certainly caused by the need to avoid stepping on PVR IP...

It was a surprise to me as well, but maybe there's something happening behind the scenes that we don't know about...
Furthermore, it can be difficult to retain talent during an acquisition. I'm not sure what the general atmosphere in PVR was about (potentially) working for Apple at that time.
 
People should wait and see what the A12 GPU is before jumping to conclusions about the A11/PVR IP. Just look at the die shots, which show it looking very much like previous generations in terms of structure. Add in the fact that it's a TBDR and supports PVRTC. If it looks like a duck and it quacks like a duck... Architectural licenses are a thing.
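For what it's worth, the PVRTC part of the duck test is something you can poke at directly from code. A minimal Swift sketch (iOS only, since the PVRTC pixel formats aren't exposed on macOS; whether makeTexture returns nil for an unsupported format is my assumption, not a documented guarantee):

```swift
import Metal

// Hypothetical probe: if the GPU happily creates PVRTC textures, that's
// consistent with PowerVR lineage. Illustrative only.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal device available")
}
// PVRTC requires square, power-of-two texture dimensions.
let desc = MTLTextureDescriptor.texture2DDescriptor(
    pixelFormat: .pvrtc_rgba_4bpp,
    width: 8, height: 8,
    mipmapped: false)
// Assumed behavior: makeTexture fails (nil) if the format isn't supported.
let texture = device.makeTexture(descriptor: desc)
print("PVRTC texture created: \(texture != nil)")
```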

I'll remind people of the following from last year's announcement:
Apple is of a view that it will no longer use the Group’s intellectual property in its new products in 15 months to two years time, and as such will not be eligible for royalty payments under the current license and royalty agreement.
15 months' time is September 2018.
 
People should wait and see what the A12 GPU is before jumping to conclusions about the A11/PVR IP. Just look at the die shots, which show it looking very much like previous generations in terms of structure. Add in the fact that it's a TBDR and supports PVRTC. If it looks like a duck and it quacks like a duck...

I'll remind people of the following from last year's announcement:

15 months' time is September 2018.
True. And as you point out, it even says 15 months to two years, so we could be looking at pretty much one year on top of that.
Apple, by the way, already has a number of ex-IMG people working for them. I doubt we’ll ever know just how and why the dominoes fell like they did.
 
I doubt we’ll ever know just how and why the dominoes fell like they did.
The why is extremely clear in my view, and the writing was on the wall: Qualcomm. The way I see it, Qualcomm now has such a significant lead, and IMG/ARM were so complacent and unable to offer competitive IP, that Apple and Samsung put their custom GPU projects into action. As an IP company, you should never take it for granted that the customer is going to rely on you, or that you're only competing against the other IP vendor. HiSilicon is going to be the last high-end GPU IP customer, but we don't know how long that'll last, and after that there will be no market anymore. For ARM that's not too dramatic, as they have the CPU IP business to support the GPU business; but for IMG it's an existential threat, as that's revenue that's gone forever.
 
I'm not sure what the general atmosphere in PVR was about (potentially) working for Apple at that time.
Then... don't buy them, and just keep paying that $1.50 or whatever the license fee per hardware unit was...? Not a whole lot, that's for sure. (Or as Nvidia calls it: couch money... :p) Not rocking a very-much-not-sinking boat is also an option, of course.
 
The why is extremely clear in my view, and the writing was on the wall: Qualcomm. The way I see it, Qualcomm now has such a significant lead, and IMG/ARM were so complacent and unable to offer competitive IP, that Apple and Samsung put their custom GPU projects into action. As an IP company, you should never take it for granted that the customer is going to rely on you, or that you're only competing against the other IP vendor. HiSilicon is going to be the last high-end GPU IP customer, but we don't know how long that'll last, and after that there will be no market anymore. For ARM that's not too dramatic, as they have the CPU IP business to support the GPU business; but for IMG it's an existential threat, as that's revenue that's gone forever.
Oh, I can see a couple more good reasons besides performance/power trajectory.
Not being beholden to a small IP company for a critical part of their SoCs is generally wise. They bought a large stake in IMG (as did Intel) for strategic reasons. Samsung taking over IMG, for instance (or Qualcomm, or...), wouldn't have made them very happy. Also, they value their secrets, and the less anyone knows of their plans, the further out in time, the better. Would they want Intel to know if they were requesting IP features that would only really make sense in a higher power envelope, for instance?
Also (related to your point re: Qualcomm), they were aware of how IMG spent their resources and may have felt insecure about their long-term competitiveness.
Apart from corporate strategy, controlling CPU and GPU IP fully may make it easier to design in resource sharing and give higher priority to features that are particularly useful to (future versions of) iOS.
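To make that resource-sharing point concrete: on today's unified-memory SoCs, Metal already lets the CPU and GPU share a single allocation with no copies, and owning both designs makes that kind of guarantee easier to extend. A minimal Swift sketch (illustrative only):

```swift
import Metal

// Minimal sketch: CPU and GPU sharing one allocation on a unified-memory SoC.
let device = MTLCreateSystemDefaultDevice()!

var input: [Float] = [1, 2, 3, 4]
// .storageModeShared backs the buffer with memory visible to both
// the CPU and the GPU -- no staging copies needed.
let buffer = device.makeBuffer(bytes: &input,
                               length: input.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// A GPU kernel could write into `buffer` here; the CPU reads the same
// bytes back directly through contents().
let ptr = buffer.contents().bindMemory(to: Float.self, capacity: input.count)
print(ptr[0])  // 1.0
```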

Seeing what they have brought to the table in terms of CPU design, it’s difficult not to be curious about what their GPUs will look like in a few generations. Whatever we may be able to figure out about their internal workings, I doubt they’ll ever publicize much in terms of nitty gritty detail. You may have some detective work cut out for you. ;-)
 
From Bloomberg: "Apple Plans to Use Its Own Chips in Macs From 2020, Replacing Intel."

The long-speculated and rumored Intel → ARM switch may be happening soon.
Bloomberg said:
The initiative, code named Kalamata, is still in the early developmental stages, but comes as part of a larger strategy to make all of Apple’s devices -- including Macs, iPhones, and iPads -- work more similarly and seamlessly together, said the people, who asked not to be identified discussing private information. The project, which executives have approved, will likely result in a multi-step transition.
Bloomberg said:
While the transition to Apple chips in hardware is planned to begin as early as 2020, the changes to the software side will begin even before that.
[…]
As part of the larger initiative to make Macs work more like iPhones, Apple is working on a new software platform, internally dubbed Marzipan, for release as early as this year that would allow users to run iPhone and iPad apps on Macs, Bloomberg News reported last year.
 
From Bloomberg: "Apple Plans to Use Its Own Chips in Macs From 2020, Replacing Intel."

The long-speculated and rumored Intel → ARM switch may be happening soon.

Long speculated, and still hasn't happened.

I mean I know that Apple has more money than God. And that people love Apple. And that people hate Intel with a passion and would like to see it die in a fire and be replaced by ARM for a whole number of reasons.

Does any of this explain why Apple would burn hundreds of millions of dollars to develop a high-performance desktop-only ARM core to compete with Intel's offerings for what, at the end of the day, is a product which is a small fraction of their sales base?

Isn't this just wishful thinking?
 
What better way does Apple have of improving Mac sales as PC replacements than offering its many millions of iOS users the opportunity to use the same apps and experiences across devices? "Looking for a new computer? Got an iThing? Get a Mac and use all your apps."
 
Does any of this explain why Apple would burn hundreds of millions of dollars to develop a high-performance desktop-only ARM core to compete with Intel's offerings for what, at the end of the day, is a product which is a small fraction of their sales base?
In terms of unit sales, yes, but the Mac does bring in more revenue than the iPad.

This doesn't mean it automatically makes sense for Apple to switch the Mac to ARM (especially given that Apple would need multiple chips, or multi-chip configurations, for the entire Mac line, as opposed to at most one chip per generation for the iPhone and iPad), but the Mac isn't that small.
 
Would be an investment, not just of money but time and resources.

In crunch times, they've been known to pull engineers off Mac projects to work on iOS releases.

Seems like this transition would require allocating resources the other way.

Would it gain them big cost savings, though? There are times when they can't produce enough of the latest SoCs for iPhones, so other products get chips that are a generation or more older.

Maybe that will change. The Apple TV 4K got, I think, the same SoC as the latest iPad Pros at the time, which was a departure from previous Apple TV iterations, which tended to get older SoCs.
 
In terms of unit sales, yes, but the Mac does bring in more revenue than the iPad.

This doesn't mean it automatically makes sense for Apple to switch the Mac to ARM (especially given that Apple would need multiple chips, or multi-chip configurations, for the entire Mac line, as opposed to at most one chip per generation for the iPhone and iPad), but the Mac isn't that small.
Fully agree that being able to switch ISA doesn't automatically mean it makes sense. That said, there is a fair bit of overlap between the iPad chips and the low-power notebooks, with the A10X arguably outperforming the lower-end xxBookyy already. And if Apple can roll custom chips for its iPads, it could quite possibly do it for the higher-end Macs as well, as those are less price constrained. It's not as if Intel (and AMD) are giving their chips away, so the budgeted SoC cost per machine would be way higher than for the iDevices. As for yearly updates, Intel hasn't managed all that much on a yearly basis for a long time now. Obviously, for Apple, owning their own fate has both attraction and risk. However, so far there haven't been strong signs that they will do it, even though it has long been known that they can. Should keep Intel in line in negotiations, if nothing else.
 
Well Apple sells a fair number of laptops at $2000 and up. Most of their laptops are over $1000.

They need better than low-end performance to get those high ASPs.
 
Long speculated, and still hasn't happened.
Well, yesterday wasn't April 1st, so there's that at least...

Still, I don't know. Marzipan? OK, it's a codename for something they want to keep under wraps, so why not. If they're making their own GPUs, why not their own desktop-class CPUs as well (although they've been calling their ARM cores "desktop class" since the A7...)?
 
What better way does Apple have of improving Mac sales as PC replacements than offering its many millions of iOS users the opportunity to use the same apps and experiences across devices? "Looking for a new computer? Got an iThing? Get a Mac and use all your apps."

If they want to allow iOS apps to run on Macs, there are surely simpler ways to do it than designing their own desktop-class CPU and switching the ISA of Mac OS once again?

Distribute the apps from the store as ISA-neutral bytecode, then JIT them to the native ISA on install? Just as an example.
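A toy sketch of that idea in Swift (the three-op VM here is entirely made up; the real-world version of this is LLVM bitcode, which Apple's App Store already collects for some platforms):

```swift
// ISA-neutral "distribution format": a made-up three-op bytecode.
enum Op {
    case push(Int)
    case add
    case mul
}

// The "app" as shipped by the store: (2 + 3) * 4, tied to no CPU ISA.
let program: [Op] = [.push(2), .push(3), .add, .push(4), .mul]

// On-device step. Here we simply interpret; an installer would instead
// lower each op to native machine code once and cache the result.
func run(_ ops: [Op]) -> Int {
    var stack: [Int] = []
    for op in ops {
        switch op {
        case .push(let v):
            stack.append(v)
        case .add:
            let b = stack.removeLast(), a = stack.removeLast()
            stack.append(a + b)
        case .mul:
            let b = stack.removeLast(), a = stack.removeLast()
            stack.append(a * b)
        }
    }
    return stack.removeLast()
}

print(run(program))  // 20
```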
 
Every Apple A-Series SoC release, including the last several (A7, A8, A9, A10), has greatly raised the bar in mobile graphics performance, and Adreno, in its best years, manages to roughly match it in absolute performance about six months later (and with lower power efficiency). If Apple was seeing the writing on the wall for PowerVR because of Qualcomm, they must not know how to read.

Apple obviously wanted more control over its supply chain, and that’s a smart goal. Doesn’t mean they went about it as well as they should’ve, to the detriment of certain aspects of their products’ performance. In the end, they’ll obviously do just fine.

And, yes, there's still a decent amount of PowerVR in the A11 GPU; following the royalties was always how I knew Kanter's proclamations were premature. But the liberties Apple has taken with customizing the design this time around were fundamentally more significant, to the point that they feel comfortable calling this their own creation for the first time. And some of the uneven graphics performance scaling from the last generation bears witness to those differences in design balancing.
 
Every Apple A-Series SoC release, including the last several (A7, A8, A9, A10), has greatly raised the bar in mobile graphics performance, and Adreno, in its best years, manages to roughly match it in absolute performance about six months later (and with lower power efficiency). If Apple was seeing the writing on the wall for PowerVR because of Qualcomm, they must not know how to read.
You're suffering from the effects of the reality distortion field. For several generations, Adreno has been vastly leading in performance and efficiency, and let's not even talk about die area.

Apple has gone from this on the A8:

[image: A8-07.jpg]

To this today:

[image: 95164.png]
 
vastly...in one benchmark...
It's notable that Apple has never used any form of direct cooling on its SoCs, whereas a number of other phones have used one or more integrated heatpipes. Clearly it's cheaper not to include cooling; it's easier to build a very flat device, too, and still possible for Apple to reach very competitive performance in just about all situations without it.

Still, as time goes on, maybe we'll start seeing some cooling because we have to? After all, chips keep getting smaller; even though power consumption may go down with new manufacturing nodes, everything gets bunched up more tightly. Hotspots get smaller. It'll be harder to passively cool chips as they keep on shrinking.
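Back-of-the-envelope, with made-up but plausible numbers: even when a shrink lowers total power, power density (and thus the cooling problem) can get worse:

```swift
// Illustrative numbers only: suppose a node shrink cuts power by 30%
// but halves the die area occupied by the hot blocks.
let powerRatio = 0.70    // new power / old power (assumed)
let areaRatio  = 0.50    // new area  / old area  (assumed)

// Power density scales with power per unit area.
let densityRatio = powerRatio / areaRatio
print("Power density changes by \(densityRatio)x")  // 1.4x -- hotter hotspots
```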
 