Apple A11 SoC

Discussion in 'Mobile Devices and SoCs' started by iMacmatician, Aug 18, 2017.

  1. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    2,708
    Likes Received:
    528
    That's a weird question. What do you have in mind? Generally speaking, iOS uses extra resources quite well, leveraging the GPU, image processor, and now the AI coprocessor, for instance.
    What are you fishing for?
    There aren't any Apple employees here leaking company secrets, unfortunately. We only have access to the information that is in the public domain. The phone hasn't been available for independent experimentation or investigation. Since you have access to Apple developer material you should be well aware of this.
     
  2. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    2,708
    Likes Received:
    528
    Legit Geekbench result according to John Poole (of Primate Labs). Link
     
    Lodix likes this.
  3. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,470
    Likes Received:
    1,801
    Location:
    La-la land
    It can't be both PVR Furian and Apple-designed at the same time.
     
    Mize likes this.
  4. psurge

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    931
    Likes Received:
    28
    Location:
    LA, California
    Comparing closed-source benchmark results across different architectures, with no real idea of frequency or power consumption during the benchmark, or of working-set sizes, probably makes me a moron, but I feel like the integer score (~4700) especially is potentially impressive. It seems to have performance comparable to a 2017 MacBook Pro 13" on this benchmark. Does GB4 take any pains to measure steady-state performance, or is it just measuring max-turbo performance for everyone?
     
  5. Gubbi

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,343
    Likes Received:
    529
    The reason you stick more memory in a phone is to reduce app tombstoning and restarts; it saves power. It also improves the user experience.

    Cheers
     
    french toast and BRiT like this.
  6. Gubbi

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,343
    Likes Received:
    529
    They don't. To add insult to injury, the individual subtests are so short (<1 s) that processors, especially older ones, don't rev up until the subtest is over.

    With Speedstep and turbo enabled, my Westmere Xeon at home never gets above 2.8 GHz in a GB4 run, on a 3.46 GHz base / 3.73 GHz turbo processor!

    Cheers
     
    lanek, psurge and BRiT like this.
  7. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,470
    Likes Received:
    1,801
    Location:
    La-la land
    It'd have to save many millions of times the power of refreshing 8 GB of RAM to be worth it. How often do apps have to restart for average users, and is it meaningful in the big scheme of things? Doesn't the battery last all day anyway for most of us? (Not counting the incessantly scrolling corner-case maniacs who can't keep their hands off their phones no matter what they're doing or what situation they're in...)

    User experience is just fine on my 2GB iP7... :p
     
  8. Nebuchadnezzar

    Legend

    Joined:
    Feb 10, 2002
    Messages:
    920
    Likes Received:
    69
    Location:
    Luxembourg
    The working-set sizes are the same for all platforms and are detailed in the white paper. Also, companies licensing the benchmark have full source code access.

    That's an issue of Speedstep and turbo on your machine. The tests are plenty long enough to represent proper interactive workloads. Mobile devices ramp up with response times as low as 10 ms.
     
    Laurent06 likes this.
  9. Gubbi

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,343
    Likes Received:
    529
    LPDDR4 refresh power is between 0.6 and ~2 mW/GB, so the difference between 3 GB and 6 GB is roughly 2-6 mW.

    Your iPhone 7 has a 7.45 Wh battery (1960 mAh @ 3.8 V); going from 3 GB to 6 GB would eat 0.048 to 0.144 Wh per day, or 0.6 to 1.9% of your battery capacity, i.e. in the noise.
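    The arithmetic above can be sanity-checked with a few lines of Python. All figures come from the post itself (2-6 mW of extra refresh power for the 3 extra GB, a 1960 mAh / 3.8 V battery); this is just a check of the numbers, not a power-model:

    ```python
    # Sanity check of the LPDDR4 refresh-power estimate in the post above.
    # Figures from the post: 2-6 mW extra refresh power for 3 extra GB
    # (0.6-2 mW/GB), and an iPhone 7 battery of 1960 mAh at 3.8 V.

    battery_wh = 1.960 * 3.8                 # amp-hours * volts = watt-hours (~7.45)

    for extra_mw in (2.0, 6.0):              # low and high refresh-power estimates
        wh_per_day = extra_mw / 1000 * 24    # mW -> W, then over 24 hours
        pct = 100 * wh_per_day / battery_wh  # share of daily battery capacity
        print(f"{extra_mw} mW -> {wh_per_day:.3f} Wh/day ({pct:.1f}% of battery)")
    ```

    This reproduces the post's 0.048-0.144 Wh/day range, well under 2% of capacity per day.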

    Cheers
     
    iMacmatician, french toast and BRiT like this.
  10. Gubbi

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,343
    Likes Received:
    529
    But Geekbench isn't an interactivity benchmark; if it were, the choice of subtests would be downright bizarre (LLVM code generation and ray tracing). Geekbench is a single/multi-threaded CPU benchmark.

    The individual subtests are less than a second each, with a two-second pause between them "to avoid thermal issues". This skews results in two ways:
    1. It lowers benchmark scores for systems with a slow ramp of operating frequency (>1 s on Westmere, but still 110 ms on Haswell CPUs).
    2. It artificially inflates benchmark scores for systems with limited power-dissipation capability (every mobile platform out there). In effect, average power is one third of the actual power usage when doing stuff.
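    The one-third figure in point 2 is straight duty-cycle arithmetic: roughly 1 s of work followed by a 2 s pause means the chip is active a third of the time. A toy illustration, where the 1 s / 2 s timings are the post's and the 5 W burst power is a made-up illustrative number:

    ```python
    # Average power under Geekbench-style bursty execution vs. sustained load.
    # Assumptions from the post: ~1 s active subtest, 2 s idle pause between them.
    # The 5 W burst power is purely illustrative; idle power is ignored.

    active_s, pause_s = 1.0, 2.0
    p_burst_w = 5.0

    duty_cycle = active_s / (active_s + pause_s)   # fraction of time doing work
    p_avg_w = p_burst_w * duty_cycle               # time-averaged power

    print(f"duty cycle: {duty_cycle:.2f}")         # one third
    print(f"average power: {p_avg_w:.2f} W")       # a third of the burst power
    ```

    A phone that can only dissipate a third of its burst power indefinitely thus still scores as if it ran at full power throughout.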

    Cheers
     
  11. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    2,708
    Likes Received:
    528
    True, but then that is an issue with all benchmarking, isn't it?
    You always have to understand the tool.
    I wouldn't say that pausing between subtests inflates the scores of mobile devices; they really do perform at the level shown under these conditions, and those conditions (bursty workloads) are pretty relevant for many devices. If you make runtimes really long (say, half an hour or so, to ensure thermal equilibrium is reached) you end up with the same problem at the other end of the spectrum: does this really represent the workload you want to model? Add to that that running in a thermally throttled setting generally makes the results less repeatable. Devices don't reach thermal equilibrium equally fast, so just how long or short should a benchmark run be to be "fair"? Is the device held in hand? Is the weather warm?

    Having short runs helps repeatability, and simply makes the benchmarking more convenient. Plus, it is probably a better model for actual use than the other repeatable extreme, the thermal-equilibrium run. Not a good model for 24/7 server use, but then, what iPhone does that...

    Geekbench 4 is actually a pretty good benchmark for what it is. You just have to be aware of its limitations.
    (An issue cropped up with the earliest Geekbench runs, for instance, where a subtest was scheduled onto a slower core because the tasks run at normal rather than maximum priority. Results always have to be checked for sanity.)
     
  12. Nebuchadnezzar

    Legend

    Joined:
    Feb 10, 2002
    Messages:
    920
    Likes Received:
    69
    Location:
    Luxembourg
    On a Galaxy S8 / A73 the shortest subtest is 1.3 s and the longest one 8 s.

    Show me a real-world sustained workload scenario where loads are longer than that period. There is none. You don't do compiling or encoding on mobile devices. The tests selected in GB4 are a very fair selection of various workloads representative of the real world. The LLVM test, for example, might be taken as a representation of runtime compilation, which does happen.

    On mobile devices you don't care what the performance is for a constant 15-minute workload; you care about performance when you open/load an app or open a website. This is what 95% of use-cases are about. It's also the way to measure peak performance of the chip and architecture.

    If Westmere is that slow, then it's also utterly shit and also misrepresented in all the JS benchmarks out there, because that performance will never be reached in the real world. The benchmark also has to be realistically usable on lower-end devices, and the lowest common denominator here takes several minutes to go through a benchmark run. The workload complexity is absolutely fine and it's the best jack-of-all-trades benchmark out there. But of course there are people who'll request SPEC2006 on a phone without realizing that that takes hours to run.
     
  13. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    12,595
    Likes Received:
    2,556
    This new Apple GPU should be roughly in the same performance range as the Intel GPUs in the MacBook line, and at about 30-40% of the Intel GPUs in the MacBook Pro line. I wonder if we'll ever see Apple supplying their own GPUs across their own devices, except in the cases where people are using the AMD or Nvidia options.
     
    Kaarlisk likes this.
  14. Pottsey

    Newcomer

    Joined:
    May 26, 2002
    Messages:
    65
    Likes Received:
    2
    Location:
    Nottingham UK
    Wouldn't most 3D game sessions, or the augmented reality apps which are extremely popular at the moment, count as sustained workload scenarios well past 8 seconds? An 8-second test does not represent what a very large number of users do on phones. Those users would care about what performance is for a constant 15-minute workload.
     
    Mize likes this.
  15. french toast

    Veteran

    Joined:
    Jan 5, 2012
    Messages:
    1,667
    Likes Received:
    8
    Location:
    Leicestershire - England
    I'm not sure about that; Apple creates these legacy software issues in the first place by cheaping out on RAM in what is routinely the most expensive phone bar boutique brands.
    They offer years' worth of software updates, which is commendable, but on phones with 512 MB and 1 GB of RAM it has caused no end of issues.
    The idea that they would be averse to industry-standard amounts of RAM because they cheaped out on their old phones is daft.
    I think the whole "more RAM = less battery life" idea has been disproven; the OnePlus 5 has 8 GB, offers respectable battery life in line with phones with 3/4 GB of RAM, and has a moderately sized battery.
    Most of the RAM will go unused, and for the stuff you do use it's more efficient to have it in RAM than to keep loading the app up from cold. In no way do I think 8 GB is useful for a smartphone in 2017, but history tells us future software gobbles up RAM faster than forward-thinking projections. Plus, if a $450 smartphone has 8 GB, I want at least 6 GB in a $1000 one; Apple has been pulling these tricks for years, intentionally setting their phones up to run slow in the future = buy a new iPhone due to lock-in.
     
    Lodix likes this.
  16. french toast

    Veteran

    Joined:
    Jan 5, 2012
    Messages:
    1,667
    Likes Received:
    8
    Location:
    Leicestershire - England
    Good questions; not sure about that. The display is likely a Samsung AMOLED (or LG?) and supports HDR10; not sure if that's full 10-bit or a mobile version (8-bit?), but it would be some major kudos to Apple if it could record HDR. In any case, 4K 60 Hz is pretty mind-blowing in itself.
    Not sure AirPlay would have the bandwidth for that, but I don't know the specs in any case.
     
  17. wco81

    Legend

    Joined:
    Mar 20, 2004
    Messages:
    5,979
    Likes Received:
    204
    Location:
    West Coast
    I looked at the spec pages. It doesn't record video in HDR, but of course it has HDR photo modes.

    The iPhone X has an HDR display, the iPhone 8 does not, though there's no reason they can't have HDR on an LCD. It won't be as good as OLED, but it would be better than an LCD with no HDR.
     
  18. Nebuchadnezzar

    Legend

    Joined:
    Feb 10, 2002
    Messages:
    920
    Likes Received:
    69
    Location:
    Luxembourg
    3D games are not CPU-heavy, and there's no workload which maxes out even what are now mid-range devices. AR apps are hopefully mostly GPU-accelerated.
     
  19. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    2,708
    Likes Received:
    528
    HDR photo modes use multiple exposures. That is not possible when recording video, and the native dynamic range of the sensor is likely modest. That said, you can still map the tones you do capture to an HDR format if you want.

    Do you want reasons to buy, or to abstain? ;-)
     
  20. Nebuchadnezzar

    Legend

    Joined:
    Feb 10, 2002
    Messages:
    920
    Likes Received:
    69
    Location:
    Luxembourg


  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.