Apple A14 and A14X SoCs

Off topic
This is one thing I've noticed myself with Apple devices: charging to 100% actually charges to 110% or something (yeah, ok, I know), i.e. it takes waaaaaaay longer to go from 100% to 99% than from, e.g., 99% to 98% or 45% to 44%.
PSA: don't charge your devices to 100%
 
Yes, that’s way off topic. But no, Apple doesn’t charge to 110%; they just, very wisely, don’t charge at the same rate over the entire charging cycle. As a battery approaches "full", the increasing voltage and temperature of a charger that kept pushing current at a constant rate would increase the amount of unwanted side reactions, which in turn reduces capacity (and may even cause fundamental damage). Every remotely sensible Li-battery charger therefore reduces the current pushed into the battery as the voltage increases, but there is still room for differences in how well this is done, to what extent (if at all) battery temperature is taken into account, what cut-offs are employed to determine "empty" and "full", and so on.
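To make the shape of that concrete, here is a toy sketch of a constant-current/constant-voltage (CC/CV) taper. It is illustrative only: the constants are generic textbook values, not Apple's, and a real charger measures the cell rather than running a model like this.

    # Toy CC/CV charge simulation (illustrative only; all constants are generic).
    CAPACITY_AH = 3.0            # nominal cell capacity
    CC_CURRENT_A = 1.5           # constant-current phase at 0.5 C
    CUTOFF_A = CAPACITY_AH / 20  # declare "full" when the taper current drops to C/20
    CV_START_SOC = 0.8           # toy proxy for the cell reaching its voltage limit
    DT_H = 1.0 / 60              # one-minute time step

    def minutes_to_full(soc: float = 0.2) -> int:
        """Charge from the given state of charge; return total minutes."""
        minutes = 0
        current = CC_CURRENT_A
        while current > CUTOFF_A:
            if soc >= CV_START_SOC:
                # CV phase: the current tapers off as the cell fills, so the last
                # few percent take far longer than the same span in the CC phase.
                current = CC_CURRENT_A * (1.0 - soc) / (1.0 - CV_START_SOC)
            soc = min(1.0, soc + current * DT_H / CAPACITY_AH)
            minutes += 1
        return minutes

In this toy model the taper (roughly the last 20%) takes nearly as long as the first 60% of the charge, which is the disproportionate top end being described above.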

It should be pointed out that the tech press is no help at all. They emphasize maximum battery life, i.e. charge into the questionable region and run the battery down completely (with its own set of undesirable consequences on a chemical level). They also praise fast charging (the faster the better), which when pushed far always increases battery temperature and invites side reactions, lowering the capacity and service life of the cell. Thus, manufacturers that take a more balanced approach to battery usage are penalized, and the public is conditioned to value only aspects that are detrimental to cell health and longevity.
A good device manufacturer might decide that since human beings aren’t perfect, it might be wise to only let them charge a cell to 80% of nominal capacity, and discharge it to 20%. But such a manufacturer would get thrashed in any "battery life" test. And since "battery longevity" isn’t tested for, well...

I’m a chemist. Electronics engineers design these systems though, and they tend to have a rather "black box" approach to batteries. Understandable, particularly when you get into the outer edges of the cell’s operation, where odd shit (<= chemistry) starts to happen more. That’s the area where chemists bring out their shamanistic drums and crow’s feet, dance seven turns counter-clockwise dressed in lab coats, and chant to appease the atomic spirits.
Physicists get uncomfortable when we do that.


Edit: Regarding the percentages shown, the charging circuit uses a model for cell Voltage vs. Capacity. That model normally uses the behaviour of a new "typical" cell. It works reasonably well for most cells, subject to production variance. However, with usage the discharge curve will change, and it will not necessarily do so uniformly as different degradation mechanisms have different effects. Also, the ability of the cell to provide stable voltages under differing conditions such as discharge rate and temperature may be affected. (Different manufacturers have different policies - some shut down phone (battery) operation when it gets hot, some when it gets hotter, some not at all - which will obviously affect longevity too). Note that charging behaviour changes as well. And fast charging that may have been OK when the cell was new will get progressively more destructive as the cell ages. It gets messy when you try to really get a grip on it, so it should be no surprise that EEs throw their hands up in disgust and just use a standard curve. Reality doesn't really fit that curve, particularly not over time and differing conditions. Thus, percentages shown will not match usage experience.
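As a hypothetical illustration of the kind of voltage-vs-capacity model described above (not how any particular fuel gauge actually works), a gauge might interpolate a fixed "new cell" curve like this; the table values are made-up points on a generic Li-ion discharge curve.

    # Hypothetical fuel-gauge lookup (illustrative only). A real gauge also
    # corrects for load current, temperature and ageing, which is exactly where
    # the fixed model and an aged cell's reality drift apart.
    DISCHARGE_CURVE = [  # (open-circuit voltage, state of charge) for a generic new cell
        (3.00, 0.00), (3.45, 0.10), (3.60, 0.25), (3.70, 0.50),
        (3.85, 0.75), (4.00, 0.90), (4.20, 1.00),
    ]

    def estimate_soc(voltage: float) -> float:
        """Linearly interpolate state of charge from the fixed 'new cell' curve."""
        if voltage <= DISCHARGE_CURVE[0][0]:
            return 0.0
        if voltage >= DISCHARGE_CURVE[-1][0]:
            return 1.0
        for (v0, s0), (v1, s1) in zip(DISCHARGE_CURVE, DISCHARGE_CURVE[1:]):
            if v0 <= voltage <= v1:
                return s0 + (s1 - s0) * (voltage - v0) / (v1 - v0)
        return 0.0  # unreachable with a monotonic table

Once the real curve has shifted with age, a lookup like this keeps reporting percentages from the old curve, which is why the displayed number drifts away from experience.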
 
Is it really to do with the battery and not just how the values are displayed? They may cook the % value SHOWN on the UI just for better user readability. Maybe the battery only charges up to 80%, and only discharges down to 20%. But then they remap the % shown on screen to that usable range, so 20% shows as 1% and 80% shows as 100%. Also, since I know users find that 100% satisfying to look at, I'd clamp the range even harder towards that: anything from 75-80% would still show as 100%, just for the psychological effect. You may want to do the same on the lower bound a bit too, so the user is pleasantly surprised by how much those last 10% last them in an eventual emergency, rather than disappointed by how quickly it went. Games do the same kind of remapping of values in user-facing UI for similar psychological reasons.
 
Is it really to do with the battery and not just how the values are displayed?
Both, typically.
They may cook the % value SHOWN on the UI just for better user readability. Maybe the battery only charges up to 80%, and only discharges down to 20%. But then they remap the % shown on screen to that usable range, so 20% shows as 1% and 80% shows as 100%. Also, since I know users find that 100% satisfying to look at, I'd clamp the range even harder towards that: anything from 75-80% would still show as 100%, just for the psychological effect. You may want to do the same on the lower bound a bit too, so the user is pleasantly surprised by how much those last 10% last them in an eventual emergency, rather than disappointed by how quickly it went. Games do the same kind of remapping of values in user-facing UI for similar psychological reasons.
This is typically done, but note that it also leaves the voltage cut-offs for 0% and 100% to the discretion of the device manufacturers, and different manufacturers employ differing policies. Also, the voltage-vs-capacity (discharge) curve not only shortens with age but also changes shape, as different degradation mechanisms affect it in different ways. The manufacturer could try to take this into account to some extent, but may very well not, and you won't find a perfect match between reality and model even then. That can get really apparent if conditions change, for instance using a device in the cold - you might see very sudden drops in indicated remaining battery, going from 30-40% to shutting down in a matter of a few minutes.
I think the take-home message is that battery charge indications are approximations, and that we actually learn how they map to reality, and how that changes over time, with the devices we live with. Like spouses, really. ;-)
 
My experience is that when new, iPhones do really well on battery life relative to their capacities. But they wear out faster than, say, an Android phone with a larger battery. Probably has to do with smaller batteries having fewer active cells left after three to four years of use.
 
Is it really to do with the battery and not just how the values are displayed? They may cook the % value SHOWN on the UI just for better user readability. Maybe the battery only charges up to 80%, and only discharges down to 20%. But then they remap the % shown on screen to that usable range, so 20% shows as 1% and 80% shows as 100%. Also, since I know users find that 100% satisfying to look at, I'd clamp the range even harder towards that: anything from 75-80% would still show as 100%, just for the psychological effect. You may want to do the same on the lower bound a bit too, so the user is pleasantly surprised by how much those last 10% last them in an eventual emergency, rather than disappointed by how quickly it went. Games do the same kind of remapping of values in user-facing UI for similar psychological reasons.
Yes, that's what I think. I did post about this here 1-2 days ago (but I don't see the post here? strange)

they do something like

display_percent = clamp01(actual_battery_fraction * (100 / 90)) * 100
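A slightly fuller hypothetical version of that remap, combining the 20-80% usable window and the top clamp described above (all window and clamp values are made up for illustration, not anything any vendor has documented):

    # Hypothetical display remap (illustrative only).
    LOW_CUT = 0.20    # real state of charge that displays as 1%
    HIGH_CUT = 0.80   # real state of charge that displays as 100%
    TOP_CLAMP = 0.75  # anything above this already displays as 100%

    def displayed_percent(real_soc: float) -> int:
        """Map the real state of charge to the percentage shown in the UI."""
        if real_soc >= TOP_CLAMP:
            return 100
        # Linearly stretch the usable window onto 1..100.
        fraction = (real_soc - LOW_CUT) / (HIGH_CUT - LOW_CUT)
        return max(1, min(100, round(1 + fraction * 99)))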
 
Early results for the A14 in 3DMark Wild Life have been popping up around the net. Scores are slightly lower than the A13's, but the actual framerate timeline is weird, no? It's unreasonably even, almost as if there's a bottleneck somewhere. Driver bug?
[attached screenshots of the Wild Life results]

Three results, three different resolutions. For an apples-to-apples comparison you need Wild Life Unlimited results, and there the iPhone 12 is slightly faster than the iPhone 11/Pro/Max (8513 vs. 7982/7968).
 

Also, Geekbench 4 memory bandwidth numbers are in, and they are much better than the A13's. Inexplicable, as the iFixit teardown shows memory part numbers that indicate lpddr4-4266 (I checked Micron's website and didn't find an exact match, but close). If it really is 64-bit lpddr4, the bus utilization is fantastic. Too fantastic; the results are quite close to the A12X.

Something has changed, and quite substantially, but what? Can’t wait for Andrei at Anandtech to sink his teeth into this one.
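For reference, a quick back-of-the-envelope on the theoretical peak bandwidth involved (my arithmetic; the 128-bit line is there because the A12X used a 128-bit bus, and both widths are assumptions as far as the A14 is concerned):

    # Theoretical peak bandwidth = transfer rate (MT/s) x bus width (bytes).
    def peak_gbs(mega_transfers_per_s: float, bus_bits: int) -> float:
        return mega_transfers_per_s * (bus_bits // 8) / 1000.0

    print(peak_gbs(4266, 64))   # 64-bit LPDDR4X-4266  -> ~34.1 GB/s
    print(peak_gbs(4266, 128))  # 128-bit LPDDR4X-4266 -> ~68.3 GB/s

So if the measured numbers really land near A12X territory on a 64-bit bus, either the effective utilization jumped dramatically or the benchmark is measuring something other than raw DRAM bandwidth.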
 
Also, Geekbench 4 memory bandwidth numbers are in, and they are much better than the A13's. Inexplicable, as the iFixit teardown shows memory part numbers that indicate lpddr4-4266 (I checked Micron's website and didn't find an exact match, but close). If it really is 64-bit lpddr4, the bus utilization is fantastic. Too fantastic; the results are quite close to the A12X.

Something has changed, and quite substantially, but what? Can’t wait for Andrei at Anandtech to sink his teeth into this one.
Memory bandwidth didn't actually change, just some µarch changes on how things behave. This is why I'm glad GB5 dropped the memory tests.
 
Memory bandwidth didn't actually change, just some µarch changes on how things behave. This is why I'm glad GB5 dropped the memory tests.
Thanks.
It did seem odd.
Rather than dropping the memory test, they could simply have made it more robust. It doesn't really fit the generally core-architectural focus of the benchmark, but it is an important parameter, particularly for unified memory systems!
 
Three results, three different resolutions. For an apples-to-apples comparison you need Wild Life Unlimited results, and there the iPhone 12 is slightly faster than the iPhone 11/Pro/Max (8513 vs. 7982/7968).

The benchmark claims it renders at 2560x1440, but regardless, my 12 Pro arrived this morning.

Wild Life, normal vs Unlimited:
[attached screenshots: Wild Life normal and Unlimited scores]


Matches the results of some publications. The 12 also seems to be the only device in its class with such a large discrepancy between normal and Unlimited. The others in that link have a <5% gap vs. the 33% seen here, and it's not like the normal test is spending its entire run at the 60fps V-Sync limit or anything.

The first result looks like there's something capping it...
 
The results are consistent with a 4x4 configuration when extrapolating from the A14 and from A12 vs. A12X scores, so it would seem likely if legit.
 
A14X preliminary results?

https://appleinsider.com/articles/2...nchmarked-days-before-apple-silicon-mac-event

GB5 ST 1634
GB5 MT 7220

I wonder if it's 4+4 cores with some larger cache and/or faster RAM.
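A rough sanity check on the 4+4 guess using just the two leaked numbers above (my arithmetic, and only a very loose argument):

    # Ratio of the leaked multi-core to single-core Geekbench 5 scores.
    st, mt = 1634, 7220
    print(mt / st)  # ~4.42

A ratio of ~4.4x is hard to reach with only four big cores, since multi-core scaling is rarely perfect, so the extra throughput plausibly comes from a second cluster of small cores.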

While this is not bad performance (comparable to the single-thread performance of an AMD Ryzen 9 5950X running Geekbench 5) if true, I still hope it will be higher in a larger thermal envelope such as a MacBook Pro.
I do hope it'll have something like 16GB of RAM in a MacBook Pro though. 8GB is way too small for something meant to last at least 5 years.
 
While this is not bad performance (comparable to the single-thread performance of an AMD Ryzen 9 5950X running Geekbench 5) if true, I still hope it will be higher in a larger thermal envelope such as a MacBook Pro.
I do hope it'll have something like 16GB of RAM in a MacBook Pro though. 8GB is way too small for something meant to last at least 5 years.
Yup. I’m hesitant to write anything, because in a few days we’ll know much more. But one of the things that makes me wary as a consumer is the trend towards non-upgradeable memory and on-board storage. If you want longevity, you’ll have to pay current vendor-defined prices; there is no hope of upgrading cheaply, either now or at a later date. And vendors (not only Apple) always charge ridiculous premiums for RAM. If the computers have fast I/O ports, you have some hope of kludging a storage upgrade at a later date, but even that is impossible with RAM. Of course, if accurate, this 8GB leak might be from an iPad Pro.
 
Yup. I’m hesitant to write anything, because in a few days we’ll know much more. But one of the things that makes me wary as a consumer is the trend towards non-upgradeable memory and on-board storage. If you want longevity, you’ll have to pay current vendor-defined prices; there is no hope of upgrading cheaply, either now or at a later date. And vendors (not only Apple) always charge ridiculous premiums for RAM. If the computers have fast I/O ports, you have some hope of kludging a storage upgrade at a later date, but even that is impossible with RAM. Of course, if accurate, this 8GB leak might be from an iPad Pro.
Mark Gurman is guessing that the ARM Macs will have fewer upgrade options.
Mark Gurman said:
What I’m watching for with Apple Silicon Macs: if they will offer multiple processor speed, graphics chips, and RAM options like they do with Intel Macs — or if it will be all standardized like iPhones and iPads. I’d guess CPU and GPU won’t be customizable, but perhaps RAM will.

I wonder if Apple could have different RAM types and speeds for the same SoC. For example, a 13" MacBook Pro with the following options at time of order:
  • 8 GB LPDDR4X-4266
  • 16 GB LPDDR4X-4266
  • 32 GB LPDDR5-5500
Then, even if everything else is equal, one would need to get the largest RAM amount for the highest GPU performance.
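Using the same peak-bandwidth arithmetic as earlier, and assuming (my assumption, not anything stated) a 128-bit bus for all of those options:

    # Peak bandwidth = transfer rate (MT/s) x bus width (bytes); 128-bit is assumed.
    BUS_BITS = 128
    for name, mts in [("LPDDR4X-4266", 4266), ("LPDDR5-5500", 5500)]:
        print(name, mts * (BUS_BITS // 8) / 1000.0, "GB/s")
    # LPDDR4X-4266 -> ~68.3 GB/s, LPDDR5-5500 -> ~88.0 GB/s

That is the sense in which only the top RAM option would also deliver the most GPU bandwidth.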
 
Don’t jinx it!
Goddammit, now I’ve got to dig out my vial of holy water and sprinkle it on the screen while praying to Seymour Cray, patron saint of uncrippled computers.

That done, your suggestion is definitely possible. Other SoC vendors make lpddr memory controllers that support both lpddr4 and lpddr5, and lpddr5 is available in 16GB 64-bit-wide packages, so sure. But I’ll hold out hope for something more ambitious than a by-the-numbers iPad Pro SoC being presented on Tuesday.
 
I’m personally hoping for a single stack of HBM2E / HBM3 for low-end products like the MacBook, MacBook Air and Mac Mini, just to shake things up. Simpler PCB and power savings. That would cut costs as well, something the bean counters will like.
 