Apple A11 SoC

Discussion in 'Mobile Devices and SoCs' started by iMacmatician, Aug 18, 2017.

  1. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,170
    Location:
    La-la land
    PVR wasn't even particularly expensive, and certainly not by Apple's standards. Overall a rather large dropped ball there, seeing as some of the lost performance is almost certainly caused by the need to avoid stepping on PVR IP...
     
    milk likes this.
  2. pcchen

    pcchen Moderator
    Moderator Veteran Subscriber

    Joined:
    Feb 6, 2002
    Messages:
    2,729
    Likes Received:
    99
    Location:
    Taiwan
    It was a surprise to me as well, but maybe there's something happening behind the scenes that we don't know about...
    Furthermore, it can be difficult to retain talent through an acquisition. I'm not sure what the general atmosphere at PVR was about (potentially) working for Apple at the time.
     
    pharma and milk like this.
  3. Nebuchadnezzar

    Legend

    Joined:
    Feb 10, 2002
    Messages:
    949
    Likes Received:
    98
    Location:
    Luxembourg
    People should wait and see what the A12 GPU is before coming to conclusions about the A11/PVR IP. Just look at the die shots, which show a structure very much like previous generations'. Add in the fact that it's a TBDR and supports PVRTC. If it looks like a duck and it quacks like a duck... Architectural licenses are a thing.

    I'll remind people of what was said during last year's announcement:
    15 months' time is September 2018.
     
  4. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    2,914
    Likes Received:
    774
    True. And as you point out it even says 15 months to two years, so we could be looking at pretty much one year on top of that.
    Apple, by the way, already has a number of ex-IMG people working for them. I doubt we’ll ever know just how and why the dominoes fell like they did.
     
  5. Nebuchadnezzar

    Legend

    Joined:
    Feb 10, 2002
    Messages:
    949
    Likes Received:
    98
    Location:
    Luxembourg
    The why is extremely clear in my view, and the writing was on the wall: Qualcomm. The way I see it, Qualcomm now has such a significant lead, and IMG/ARM were too complacent and unable to offer competitive IP, that Apple and Samsung put their custom GPU projects into action. As an IP company you should never take it for granted that the customer is going to rely on you, or that you're only competing against the other IP vendor. HiSilicon is going to be the last high-end GPU IP customer, but we don't know how long that'll last, and after that there will be no market anymore. For ARM that's not too dramatic, as they have the CPU IP business to support the GPU business, but for IMG it's an existential threat: that's revenue that's gone forever.
     
  6. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,170
    Location:
    La-la land
    Then...don't buy them, and just keep paying that $1.50 or whatever the license fee per hardware unit was...? Not a whole lot, that's for sure. (Or as Nvidia calls it: couch money... :p) Not rocking a very much not-sinking boat is also an option, of course.
     
  7. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    2,914
    Likes Received:
    774
    Oh, I can see a couple more good reasons besides performance/power trajectory.
    Not being beholden to a small IP company for a critical part of their SoCs is generally wise. They bought a large stake in IMG (as did Intel) for strategic reasons. Samsung taking over IMG, for instance (or Qualcomm, or...), wouldn't have made them very happy. Also, they value their secrets, and the less anyone knows of their plans, the further out in time, the better. Would they want Intel to know if they were requesting IP features that would only really make sense in a higher power envelope, for instance?
    Also (related to your point re:Qualcomm) they were aware of how IMG spent their resources and may have felt insecure about their long term competitiveness.
    Apart from corporate strategy, controlling CPU and GPU IP fully may make it easier to design in resource sharing and give higher priority to features that are particularly useful to (future versions of) iOS.

    Seeing what they have brought to the table in terms of CPU design, it’s difficult not to be curious about what their GPUs will look like in a few generations. Whatever we may be able to figure out about their internal workings, I doubt they’ll ever publicize much in terms of nitty gritty detail. You may have some detective work cut out for you. ;-)
     
  8. iMacmatician

    Regular

    Joined:
    Jul 24, 2010
    Messages:
    759
    Likes Received:
    198
    From Bloomberg: "Apple Plans to Use Its Own Chips in Macs From 2020, Replacing Intel."

    The long-speculated and rumored Intel → ARM switch may be happening soon.
     
    pharma likes this.
  9. nutball

    Veteran Subscriber

    Joined:
    Jan 10, 2003
    Messages:
    2,064
    Likes Received:
    374
    Location:
    en.gb.uk
    Long speculated, and still hasn't happened.

    I mean I know that Apple has more money than God. And that people love Apple. And that people hate Intel with a passion and would like to see it die in a fire and be replaced by ARM for a whole number of reasons.

    Does any of this explain why Apple would burn hundreds of millions of dollars to develop a high-performance desktop-only ARM core to compete with Intel's offerings for what, at the end of the day, is a product which is a small fraction of their sales base?

    Isn't this just wishful thinking?
     
  10. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    39,695
    Likes Received:
    9,744
    Location:
    Under my bridge
    What better way does Apple have of improving Mac sales as PC replacements than by offering its many millions of iOS users the opportunity to use the same apps and experiences across devices? "Looking for a new computer? Got an iThing? Get a Mac and use all your apps."
     
    Scott_Arm and AlBran like this.
  11. iMacmatician

    Regular

    Joined:
    Jul 24, 2010
    Messages:
    759
    Likes Received:
    198
    In terms of sales, yes, but the Mac does bring in more revenue than the iPad.

    This doesn't mean it automatically makes sense for Apple to switch the Mac to ARM (especially given that Apple will need multiple chips or multi-chip configurations for the entire Mac line as opposed to at most one chip per generation for the iPhone and iPad), but the Mac isn't that small.
     
  12. wco81

    Legend

    Joined:
    Mar 20, 2004
    Messages:
    6,144
    Likes Received:
    242
    Location:
    West Coast
    Would be an investment, not just of money but time and resources.

    In crunch times, they've been known to pull engineers off Mac projects to work on iOS releases.

    Seems like this transition would require allocating resources the other way.

    Would it gain them big cost savings, though? There are times when they can't produce enough of the latest SoCs for iPhones, so other products get chips which are a generation or more older.

    Maybe that will change. The Apple TV 4K got, I think, the same SoC as the latest iPad Pros at the time, which was a departure from previous Apple TV iterations, which tended to get older SoCs.
     
  13. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    2,914
    Likes Received:
    774
    Fully agree that being able to switch ISA doesn't automatically mean it makes sense. That said, there is a fair bit of overlap between the iPad chips and the low-power notebooks, with the A10X arguably outperforming the lower-end xxBookyy already. And if Apple can roll custom chips for its iPads, it could quite possibly do it for the higher-end Macs as well, as they are less price constrained. It's not as if Intel (and AMD) are giving their chips away, so the budgeted SoC cost per machine would be way higher than for the iDevices.

    As for yearly updates, Intel hasn't managed all that much on a yearly basis for a long time now. Obviously, for Apple, owning their own fate has both attraction and risk. However, so far there haven't been strong signs that they will do it, even though it has long been known that they can. Should keep Intel in line in negotiations, if nothing else.
     
  14. wco81

    Legend

    Joined:
    Mar 20, 2004
    Messages:
    6,144
    Likes Received:
    242
    Location:
    West Coast
    Well, Apple sells a fair number of laptops at $2000 and up. Most of their laptops are over $1000.

    They need better than low-end performance to get those high ASPs.
     
  15. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,170
    Location:
    La-la land
    Well, yesterday wasn't April 1st, so there's that at least...

    Still, I don't know. Marzipan? OK, it's a codename for something they want to keep under wraps, so why not. If they're making their own GPUs, why not their own desktop-class CPUs as well (although they've been calling their ARM cores "desktop class" since the A7...).
     
  16. nutball

    Veteran Subscriber

    Joined:
    Jan 10, 2003
    Messages:
    2,064
    Likes Received:
    374
    Location:
    en.gb.uk
    If they want to allow iOS apps to run on Macs, there are surely simpler ways to do it than designing their own desktop-class CPU and switching the ISA of Mac OS once again?

    Distribute the apps from the store as ISA-neutral bytecode, then JIT them to the native ISA on install? Just as an example.
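
    As a toy sketch of that second idea (everything below, opcodes and lowering templates alike, is invented for illustration; Apple's nearest real-world analogue, LLVM bitcode, is a far more involved beast):

    # Toy sketch of "ship ISA-neutral bytecode, compile to native on install".
    # All opcodes and templates are made up; this is not any real toolchain.

    BYTECODE = [("push", 2), ("push", 3), ("add", None)]  # portable program

    # Per-ISA lowering templates the on-device installer picks from.
    BACKENDS = {
        "x86_64": {
            "push": "pushq ${0}",
            "add": "popq %rax\naddq %rax, (%rsp)",
        },
        "arm64": {
            "push": "mov x9, #{0}\nstr x9, [sp, #-16]!",
            "add": "ldr x9, [sp], #16\nldr x10, [sp]\nadd x10, x10, x9\nstr x10, [sp]",
        },
    }

    def lower(bytecode, isa):
        """Translate the portable ops into native instructions for one ISA."""
        return "\n".join(BACKENDS[isa][op].format(arg) for op, arg in bytecode)

    # The store ships BYTECODE once; each machine lowers it locally at install.
    print(lower(BYTECODE, "arm64"))

    Doing the translation once at install time, rather than JITting at runtime, would also keep the overhead out of the hot path.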
     
  17. Lazy8s

    Veteran

    Joined:
    Oct 3, 2002
    Messages:
    3,100
    Likes Received:
    18
    Every Apple A-Series SoC release, including the last several (A7, A8, A9, A10), has greatly raised the bar in mobile graphics performance, and Adreno, in its best years, manages to roughly match it in absolute performance about six months later (and with lower power efficiency). If Apple was seeing the writing on the wall for PowerVR because of Qualcomm, they must not know how to read.

    Apple obviously wanted more control over its supply chain, and that’s a smart goal. Doesn’t mean they went about it as well as they should’ve, to the detriment of certain aspects of their products’ performance. In the end, they’ll obviously do just fine.

    And, yes, there's still a decent amount of PowerVR in the A11 GPU; following the royalties was always how I knew Kanter's proclamations were premature. But the liberties Apple has taken with customizing the design this time around were significant enough, fundamentally, that they feel comfortable calling this their own creation for the first time. And some of the uneven graphics performance scaling from the last generation bears witness to those differences in design balancing.
     
  18. Nebuchadnezzar

    Legend

    Joined:
    Feb 10, 2002
    Messages:
    949
    Likes Received:
    98
    Location:
    Luxembourg
    You're suffering from the effects of the reality distortion field. For several generations, Adreno has been vastly leading in performance and efficiency, and let's not even talk about die area.

    Apple has gone from this on the A8:

    [image]

    to this today:

    [image]
     
    BRiT likes this.
  19. willardjuice

    willardjuice super willyjuice
    Moderator Veteran Alpha Subscriber

    Joined:
    May 14, 2005
    Messages:
    1,366
    Likes Received:
    218
    Location:
    NY
    vastly...in one benchmark...

    [image]
    [image]
     
    Pressure likes this.
  20. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,170
    Location:
    La-la land
    It's notable that Apple has never used any form of direct cooling on its SoCs, whereas a number of other phones have used one or more integrated heatpipes. Clearly it's cheaper not to include cooling; it makes it easier to build a very flat device too, and Apple can still reach very competitive performance in just about all situations without it.

    Still, as time goes on, maybe we'll start seeing some cooling because we have to? After all, chips keep getting smaller; even though power consumption may go down with new manufacturing nodes, everything gets bunched up more tightly. Hotspots get smaller. It'll be harder to passively cool chips as they keep on shrinking.
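
    To put rough, entirely made-up numbers on the density point:

    # Back-of-the-envelope figures for the hotspot argument; these are
    # invented for illustration, not measurements of any real SoC.
    old_power_w, old_area_mm2 = 5.0, 100.0  # assumed current-gen chip
    power_scale, area_scale = 0.7, 0.5      # assumed shrink: -30% power, -50% area

    old_density = old_power_w / old_area_mm2
    new_density = (old_power_w * power_scale) / (old_area_mm2 * area_scale)
    print(f"{old_density:.3f} -> {new_density:.3f} W/mm^2 "
          f"({new_density / old_density:.1f}x denser despite lower total power)")
    # prints: 0.050 -> 0.070 W/mm^2 (1.4x denser despite lower total power)

    So even a node that cuts total power can concentrate more heat per square millimetre, which is exactly the scenario where passive cooling starts to struggle.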
     