Yes, that’s way off topic. But no, Apple doesn’t charge to 110%, they just very wisely don’t charge at the same rate through the entire charging cycle. As a battery approaches "full", a charger that kept pushing current at the same rate would drive up voltage and temperature, which increases the amount of unwanted side-reactions; those in turn reduce capacity (and may even cause fundamental damage). Every remotely sensible Li-battery charger therefore reduces the current pushed into the battery as the voltage rises, but there is still room for differences in how well this is done, to what extent (if at all) it takes battery temperature into account, what cut-offs are employed to determine "empty" and "full", and so on.
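For the curious, the usual scheme is a constant-current/constant-voltage (CC/CV) profile. Here’s a toy sketch of that idea in Python; every number (capacity, rates, cut-offs, the crude internal-resistance model) is made up for illustration and is certainly not any particular vendor’s firmware:

    # Toy CC/CV charger model. Constant current until the terminal
    # voltage reaches the limit, then hold the voltage and let the
    # accepted current taper; call the cell "full" once that taper
    # current drops below a cutoff.

    CAP_AH   = 3.0            # assumed nominal capacity
    CV_LIMIT = 4.20           # typical Li-ion charge-voltage limit
    CC_RATE  = 0.5 * CAP_AH   # assumed 0.5C constant-current phase
    CUTOFF   = CAP_AH / 20    # assumed C/20 termination current
    R_INT    = 0.05           # crude internal resistance, ohms (assumed)

    def ocv(soc):
        # Very rough open-circuit-voltage curve for a new cell (made up).
        return 3.0 + 1.2 * soc

    def simulate_charge(soc=0.2, dt_h=0.01):
        while True:
            # Current is limited so the terminal voltage (OCV + I*R)
            # never exceeds CV_LIMIT; early on the CC_RATE cap dominates.
            i = min(CC_RATE, (CV_LIMIT - ocv(soc)) / R_INT)
            if i <= CUTOFF:
                return soc        # the charger calls this "full"
            soc = min(1.0, soc + i * dt_h / CAP_AH)

    print(f"charging terminates at ~{simulate_charge():.0%} of true capacity")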
It should be pointed out that the tech press is no help at all. They emphasize maximum battery life, i.e. charging into the questionable region and running the battery down completely (which has its own set of undesirable consequences at the chemical level). They also praise fast charging (faster is better!), which, when pushed far, always raises battery temperature and invites side-reactions, lowering the capacity and service life of the cell. Thus, manufacturers that take a more balanced approach to battery usage are penalized, and the public is conditioned to value only the aspects that are detrimental to cell health and longevity.
A good device manufacturer might decide that, since human beings aren’t perfect, it would be wise to only let them charge a cell to 80% of nominal capacity and discharge it to 20%. But such a manufacturer would get thrashed in any "battery life" test. And since "battery longevity" isn’t tested for, well...
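To make that trade-off concrete: with hypothetical 20%/80% cut-offs (my numbers, not any shipping product’s policy), the user only ever sees the middle 60% of the cell, and the firmware would simply remap the displayed percentage:

    LOWER, UPPER = 0.20, 0.80    # assumed cut-offs

    def displayed_percent(true_soc):
        # Clamp to the allowed window, then stretch it over 0-100%.
        clamped = min(max(true_soc, LOWER), UPPER)
        return round(100 * (clamped - LOWER) / (UPPER - LOWER))

    print(displayed_percent(0.80))   # -> 100: charging stops here
    print(displayed_percent(0.50))   # -> 50
    print(displayed_percent(0.20))   # -> 0: the device shuts down here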
I’m a chemist. Electronics engineers design these systems though, and they tend to have a rather "black box" approach to batteries. Understandable, particularly when you get into the outer edges of the cell’s operation, where odd shit (<= chemistry) starts to happen more. That’s the area where chemists bring out their shamanistic drums and crow’s feet, dance seven turns counter-clockwise dressed in lab coats while chanting to appease the atomic spirits.
Physicists get uncomfortable when we do that.
Edit: Regarding the percentages shown: the charging circuit uses a model of cell voltage vs. capacity, and that model is normally based on the behaviour of a new, "typical" cell. It works reasonably well for most cells, subject to production variance. With usage, however, the discharge curve will change, and it will not necessarily do so uniformly, because different degradation mechanisms have different effects. The ability of the cell to hold a stable voltage under differing conditions, such as discharge rate and temperature, may also be affected. (Manufacturers have different policies here - some shut down phone operation when the battery gets hot, some when it gets hotter, some not at all - which obviously affects longevity too.)

Note that charging behaviour changes as well, and fast charging that may have been fine when the cell was new will get progressively more destructive as the cell ages. It gets messy when you try to really get a grip on it, so it should be no surprise that EEs throw their hands up in disgust and just use a standard curve. Reality doesn't really fit that curve, particularly not over time and under differing conditions. Thus, the percentages shown will not match usage experience.
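A bare-bones illustration of that last point: the gauge maps a measured voltage onto a stored curve for a "typical" new cell (the numbers below are invented), so once the real cell stops following that curve, the reported percentage stops matching reality:

    # (open-circuit voltage, state of charge) pairs for a hypothetical new cell
    NEW_CELL_CURVE = [(3.0, 0.0), (3.5, 0.1), (3.7, 0.5), (3.9, 0.8), (4.2, 1.0)]

    def soc_from_voltage(v, curve=NEW_CELL_CURVE):
        # Linear interpolation against the stored new-cell table.
        if v <= curve[0][0]:
            return 0.0
        for (v0, s0), (v1, s1) in zip(curve, curve[1:]):
            if v <= v1:
                return s0 + (s1 - s0) * (v - v0) / (v1 - v0)
        return 1.0

    # An aged cell might sit at 3.7 V while holding far less than half of its
    # (already reduced) capacity; the gauge still reports the table value.
    print(f"gauge says {soc_from_voltage(3.7):.0%}")   # -> 50%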