3D games are not CPU heavy, and there's no workload which maxes out even what are now mid-range devices. AR apps are hopefully mostly GPU accelerated.

Are you saying 3D games are GPU-limited then? I assume this is a result of the high resolutions found in many phones.
Wouldn't a reasonable RTS blow a hole in that theory? I know there are Starcraft-like RTS games which I assume use considerable CPU power.
 
HDR photo modes use multiple exposures. That is not possible when recording video, and the native dynamic range of the sensor is likely modest. That said, you can still map the tones you do capture to an HDR format if you want.
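To make the multiple-exposure point concrete, here's a minimal sketch (my own toy illustration, not any camera vendor's pipeline) of a bracketed-exposure merge; the exposure times and the triangle weighting below are just assumptions for the example:

```swift
import Foundation

// Toy bracketed-exposure merge (illustration only). Each frame is normalised by
// its exposure time, and frames are combined with weights that discount clipped
// highlights and noisy shadows.
let exposureTimes = [1.0 / 500, 1.0 / 125, 1.0 / 30]   // short, normal, long (assumed)

/// Triangle weight over 0...1: mid-tones count most, clipped/near-black samples least.
func weight(_ v: Double) -> Double {
    return max(0, 1 - abs(2 * v - 1))
}

/// Merge one pixel's bracketed samples (already linearised, 0...1) into relative scene radiance.
func mergePixel(samples: [Double]) -> Double {
    var num = 0.0, den = 0.0
    for (sample, time) in zip(samples, exposureTimes) {
        let w = weight(sample)
        num += w * (sample / time)       // scale back to relative scene radiance
        den += w
    }
    return den > 0 ? num / den : 0
}

// A highlight that clips in the long exposure is still recovered from the short one.
print(mergePixel(samples: [0.12, 0.48, 1.0]))
```

The short exposure keeps the highlight usable even though the long exposure clipped it, which is exactly the extra range a single video frame can't give you.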

Do you want reasons to buy, or to abstain? ;-)

Oh, I'm antsy to buy. What do you mean by mapping the tones? Is there a consumer video editing app that can convert a standard video capture into an HDR video?
 
Apple has been pulling these tricks for years, intentionally setting their phones up to run slow in the future so you buy a new iPhone, thanks to lock-in.
Well, if they're as insidious as you say (and they probably are!), why should they stop when they clearly can get away with whatever it is they're doing? :p

Apple is the most profitable phone manufacturer by a vast margin. They literally eat everyone else's lunch except for Samsung, who gets to keep a starter course for all the efforts they put in. LG, Sony and just about everyone else pretty much subsidize their phone divisions with profits from elsewhere, or they did last time I checked anyway.

So skimping on RAM works, obviously. Not just from a profitability standpoint; people also seem to agree, since they're buying iPhones in droves. And I agree as well: I used my iPhone 5S up until the 7 launched last year, and even on the current release of iOS it was very snappy and responsive, and didn't suffer from any noticeable slowdown or bad performance. My iPhone 4 did, though. Upgrading it from iOS 6 to iOS 7 nearly killed it, even though it lacked almost all of the fancy-pants special graphics effects added in the new OS release. It was SO SLOW it frequently lagged even when keying in the unlock code, when the phone wasn't doing anything else at the time. Lots of lag elsewhere in the device too.

The 5S, however, wasn't like that, even though it only has 512 MB (IIRC) of RAM.
 
Oh, I'm antsy to buy. What do you mean by mapping the tones? Is there a consumer video editing app that can convert a standard video capture into an HDR video?
Now, I'm a stills guy; these newfangled "moving pictures" are the Devil's work, I tell ye!
However, I'd check out something like this and functional equivalents in Final Cut.
But if you want reasons to buy, it seems it's enough to just pick it up. Most people with hands-on experience have been lyrical.
 
High Dynamic Range refers to how many stops of light are usable before they are clipped, e.g. before there is too much noise in the shadows or detail is lost in blown highlights. It is preferable to have 14+ stops of usable light, depending on what you are aiming for artistically. In short, HDR is about retaining detail at both ends of the spectrum, shadows and highlights, without clipping the data.
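To illustrate the tone-mapping bit: a minimal sketch (my own illustration; the peak-luminance numbers are assumptions and the log curve is a stand-in for the real PQ/HLG transfer functions) of re-encoding an SDR capture into an HDR container. The container changes, but no new stops appear:

```swift
import Foundation

// Toy example of re-encoding SDR video into an HDR container. The peak values
// and the log curve below are made up for illustration; real pipelines use the
// PQ or HLG transfer functions.
let sdrPeakNits = 100.0     // assumed nominal peak of the SDR capture
let hdrPeakNits = 1000.0    // assumed peak of the HDR container

/// Gamma-encoded SDR sample (0...1) to approximate linear light in nits.
func sdrToLinearNits(_ v: Double) -> Double {
    return pow(v, 2.2) * sdrPeakNits    // simple 2.2 gamma as a stand-in for BT.1886
}

/// Linear nits to a 0...1 HDR signal via a toy log curve (0 nits -> 0, 1000 nits -> 1).
func hdrEncode(_ nits: Double) -> Double {
    return log10(1.0 + 9.0 * nits / hdrPeakNits)
}

// Re-encode some SDR samples: they change container, but everything still lands in
// the lower part of the HDR range, because the source never exceeded ~100 nits.
let sdrSamples = [0.05, 0.25, 0.5, 0.75, 1.0]
print(sdrSamples.map { hdrEncode(sdrToLinearNits($0)) })
```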

Wide Color Gamut is enabled by recording in the DCI-P3 or BT.2020 color space. Just because the target color space is BT.2020 does not mean the source has to use it, which is why some movies do not gain much over the old BT.709 color space.
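On the gamut side, a minimal sketch of moving linear BT.709 RGB into the BT.2020 container; the matrix coefficients are approximate figures quoted from memory (roughly the BT.2087 values), so treat them as illustrative:

```swift
import Foundation

// Toy conversion of linear BT.709 RGB into the BT.2020 container.
// The matrix is an approximation quoted from memory, not authoritative data.
let bt709ToBT2020: [[Double]] = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

/// Multiply a 3x3 matrix by an RGB triple.
func convert(_ rgb: [Double], with m: [[Double]]) -> [Double] {
    return m.map { row in zip(row, rgb).reduce(0) { $0 + $1.0 * $1.1 } }
}

// A saturated BT.709 red lands well inside the BT.2020 gamut boundary:
// the container got wider, but the source colours didn't.
print(convert([1.0, 0.0, 0.0], with: bt709ToBT2020))   // ≈ [0.63, 0.07, 0.02]
```

A fully saturated BT.709 primary still sits well inside BT.2020, which is why a BT.709-mastered source doesn't magically gain colour from the wider container.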
 
I think they're comparing two different scenarios?

+30% includes the clocks, whatever they may be.
2x rate improvement for the FP16 and tex filtering.
 
Apple starts building an internal GPU design team from engineers who had worked on immediate-mode renderers and would never have admitted the fundamental superiority of tile-based deferred rendering.

Meanwhile, Apple's close partnership with Imagination gives them access to the specifics of the PowerVR architecture. They work on their own implementation of a TBDR architecture, presumably careful to avoid infringing on PowerVR, and along the way even add to their team with engineers they hired away from Imagination.

The developer resources for the A11 GPU sing the praises of TBDR like gospel. That, from a design team whose core was mostly... former ATi engineers was it?

A good GPU is a good thing regardless of any hypocrisy in its origins, so I'm excited about the A11's design. A few of the advancements the developer materials promote over a traditional TBDR (if you want to call it that) appear to be things the PowerVR hardware could already have allowed, had they been exposed, though.

I also took note of Apple's mention of their Apple-designed video encoder this time around. As mentioned, they had been licensing PowerVR for the encoder and decoder. They told Imagination they wouldn't be paying them licensing and royalties at all within a fairly short time frame, so they've probably replaced the video decoder with their own tech by now, too.
 
Well, if they're as insidious as you say (and they probably are!), why should they stop when they clearly can get away with whatever it is they're doing? :p

Apple is the most profitable phone manufacturer by a vast margin. They literally eat everyone else's lunch except for Samsung, who gets to keep a starter course for all the efforts they put in. LG, Sony and just about everyone else pretty much subsidize their phone divisions with profits from elsewhere, or they did last time I checked anyway.

So skimping on RAM works, obviously. Not just from a profitability standpoint; people also seem to agree, since they're buying iPhones in droves. And I agree as well: I used my iPhone 5S up until the 7 launched last year, and even on the current release of iOS it was very snappy and responsive, and didn't suffer from any noticeable slowdown or bad performance. My iPhone 4 did, though. Upgrading it from iOS 6 to iOS 7 nearly killed it, even though it lacked almost all of the fancy-pants special graphics effects added in the new OS release. It was SO SLOW it frequently lagged even when keying in the unlock code, when the phone wasn't doing anything else at the time. Lots of lag elsewhere in the device too.

The 5S, however, wasn't like that, even though it only has 512 MB (IIRC) of RAM.
Well, I wouldn't go as far as to call them insidious :) they do deliver great products, hence the sales.
I'm just alluding to my own personal preference and perception; I haven't been able to talk myself into buying any Apple product so far, though I have tried, for the App Store and CPU tech alone.
There's always something that annoys me, though: the fact that they no longer use Gorilla Glass, and that every other iPhone I see has a cracked screen; it happens so often it can't be a coincidence.
The poor real-world battery life, the wide bezels with no front-facing stereo speakers taking up the space, the unnecessary skimping on hardware such as flash storage, RAM and display resolution, features that until recently were offered by much cheaper phones.
The unnecessary lock-in with proprietary technology. There's always something that grinds my gears and puts me off.

But hey, those are just my own bugbears. I can appreciate that they bring a lot to the table: they optimise extremely well, so day-one performance is first class, often outperforming phones with more hardware features, and they make the most profit by far, so they clearly know how to run a business.
I'm just saying that if I were to blow $1000 on a phone I would want high-end specs across the board, value for money and all that.
You wouldn't want a Ferrari with a plastic, carbon-fibre-looking interior just because it feels and looks the same, would you? Poor analogy, but you catch my drift.

Back on topic: I wonder if AT will do a deep dive on the GPU architecture?
 
I think they're comparing two different scenarios?

+30% includes the clocks, whatever they may be.
2x rate improvement for the FP16 and tex filtering.

After watching "Metal 2 on A11 - Tile Shading", I'd guess the different scenarios are caused by limited bandwidth (maybe the 30% figure comes from the GFXBench T-Rex Offscreen test). Tile shading must be implemented to leverage the high-bandwidth tile memory. I'm curious how much the A11 GPU will throttle; for comparison, the iPhone 7 Plus only maintains around 65% in the GFXBench Manhattan 3.1 long-term performance test.
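As a rough picture of what tile shading buys you, here's a conceptual sketch in plain Swift, emphatically not the real Metal 2 API: per-pixel intermediates live in a small on-chip tile buffer, and main memory only sees one resolved write per pixel:

```swift
// Conceptual sketch only -- plain Swift standing in for what tile memory does,
// not the actual Metal 2 tile-shading API. The point: per-pixel intermediates
// stay on-chip and DRAM is written once per pixel when the tile is resolved.
struct Fragment { let depth: Float; let color: Float }

let tileSize = 32   // e.g. a 32x32 pixel tile resident in on-chip memory
var tileMemory = [Fragment](repeating: Fragment(depth: .infinity, color: 0),
                            count: tileSize * tileSize)

// Phase 1: depth-test and shade every overlapping fragment against tile memory.
// (Hypothetical input; in a real TBDR this comes from the binned geometry.)
func shade(fragmentsForTile: [(index: Int, frag: Fragment)]) {
    for (i, f) in fragmentsForTile where f.depth < tileMemory[i].depth {
        tileMemory[i] = f               // test + write, all on-chip
    }
}

// Phase 2: resolve the finished tile with a single flush to main memory.
func resolveTile(to framebuffer: inout [Float], tileOrigin: Int) {
    for (i, f) in tileMemory.enumerated() {
        framebuffer[tileOrigin + i] = f.color   // one DRAM write per pixel
    }
}

// Usage: two fragments fight over the same pixel; the hidden one never leaves the chip.
var framebuffer = [Float](repeating: 0, count: 1920 * 1088)
shade(fragmentsForTile: [(index: 0, frag: Fragment(depth: 0.5, color: 1.0)),
                         (index: 0, frag: Fragment(depth: 0.9, color: 0.2))])
resolveTile(to: &framebuffer, tileOrigin: 0)
print(framebuffer[0])   // 1.0 -- only the visible fragment reached DRAM
```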
 
After watching "Metal 2 on A11 - Tile Shading", I'd guess the different scenarios are caused by limited bandwidth (maybe the 30% figure comes from the GFXBench T-Rex Offscreen test). Tile shading must be implemented to leverage the high-bandwidth tile memory. I'm curious how much the A11 GPU will throttle; for comparison, the iPhone 7 Plus only maintains around 65% in the GFXBench Manhattan 3.1 long-term performance test.
Apple claimed A10 performance at half the power. They also claimed 30% higher performance. So what will the power draw be at that level of performance? We'll see. Glass is a lousy heat conductor though.
The A10 did pretty well on the long-term performance test versus its peers, but less throttling would be better still. 3D graphics, while not likely to be a continuous 100% load, is still way less bursty than general smartphone use.
 
Apple claimed A10 performance at half the power. They also claimed 30% higher performance.

I wonder how much of that power and performance-per-watt improvement is attributable to going from 16nm to 10nm.

That 30% performance improvement sounds significantly smaller than previous A-chip increments, but I can't be arsed to check!
 
I wonder how much of that power and performance-per-watt improvement is attributable to going from 16nm to 10nm.

That 30% performance improvement sounds significantly smaller than previous A-chip increments, but I can't be arsed to check!
http://images.anandtech.com/doci/10329/3.PNG

(Btw HiSilicon more recently claims 0.8x power as a realistic figure)

Random performance-point comparisons don't mean anything. Shaving 30% off the performance on either curve roughly halves the power by itself. Shifting the curve back to nominal performance probably means they use the same amount of power for both SoCs (which is, by the way, not that great a sign). Overall it seems the architectural efficiency improvements would be in the 25-30% range, then.
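Back-of-envelope version of the "shave 30% perf, roughly halve the power" point, assuming dynamic power goes as f·V² and that voltage tracks frequency near the top of the curve (an assumption, but a common one):

```swift
import Foundation

// Rough illustration, assuming dynamic power ~ f * V^2 and that voltage scales
// roughly with frequency near the top of the DVFS curve (an approximation).
let perfScale = 0.7                     // give up 30% of peak performance
let powerScale = pow(perfScale, 3)      // f * V^2 with V proportional to f
print(powerScale)                       // ≈ 0.34, i.e. well under half the power
```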
 
I wonder how much of that power and performance-per-watt improvement is attributable to going from 16nm to 10nm.

That 30% performance improvement sounds significantly smaller than previous A-chip increments, but I can't be arsed to check!
I think it's a fair guess that Apple has reduced the allowed power draw of the SoC. They promise longer battery life than the iPhone 7, the body no longer has an aluminum back to help heat dissipation, and the screen will likely consume more power than the previous one.
Something I personally find interesting is that they dedicate silicon to their neural engine, rather than trying to do that processing with a "GPU" adapted for the purpose.
 
One of many speculated reasons for Apple to develop their own GPUs was that they wanted to target a wider range of applications, and would have an unknown (equally speculated) architecture suitable for Many Things.
No trace of that here though. Rather, they make a point of having dedicated hardware blocks. (It also makes me further question the wisdom of grafting such hardware blocks to dedicated GPUs, but that concern is not for this thread.)
 
That 30% performance improvement sounds significantly smaller than previous A-chip increments, but I can't be arsed to check!
It's the smallest generational GPU increase (using Apple's numbers) for non-X SoCs since the APL0298 → A4 if not earlier. The next smallest increase is +50% for both the A7 → A8 and the A9 → A10.

So skimping on RAM works, obviously. Not just from a profitability standpoint; people also seem to agree, since they're buying iPhones in droves. And I agree as well: I used my iPhone 5S up until the 7 launched last year, and even on the current release of iOS it was very snappy and responsive, and didn't suffer from any noticeable slowdown or bad performance. My iPhone 4 did, though. Upgrading it from iOS 6 to iOS 7 nearly killed it, even though it lacked almost all of the fancy-pants special graphics effects added in the new OS release. It was SO SLOW it frequently lagged even when keying in the unlock code, when the phone wasn't doing anything else at the time. Lots of lag elsewhere in the device too.

The 5S, however, wasn't like that, even though it only has 512 MB (IIRC) of RAM.
The 5s has 1 GB.

I think it's a fair guess that Apple has reduced the allowed power draw of the SoC. They promise longer battery life than the iPhone 7, the body no longer has an aluminum back to help heat dissipation, and the screen will likely consume more power than the previous one.
Something I personally find interesting is that they dedicate silicon to their neural engine, rather than trying to do that processing with a "GPU" adapted for the purpose.
Which iPhones are you comparing? The iPhone 8 and 8 Plus have the same rated battery lives as the 7 and 7 Plus, and while the iPhone X has longer battery life than the 7, it's rumored to have a much larger battery. I agree with your power draw guess though, especially since the 8 and 8 Plus have 7-8% smaller batteries than the 7 and 7 Plus.

Looking ahead to next year, according to AnandTech, "[i]t is expected that high-volume manufacturing (HVM) using the [TSMC] CLN7FF will commence in ~Q2 2018, so, the first "7-nm" ICs will show up in commercial products in the second half of next year." So is it reasonable to assume that the "A12," if it is released next September, will use TSMC 7 nm?
 
AnTuTu benchmark
[attached AnTuTu benchmark screenshots]


After looking at the AnTuTu 3D benchmark, I don't think the A11 is bandwidth limited (the A9X in the 9.7" iPad Pro was bandwidth limited in the AnTuTu 3D test because it only gets half the bandwidth of the A9X in the 12.9" iPad Pro). The improvement from A10 to A11 is about 71%, and the A11 scores about 89% of the A10X's 3D result. Maybe the 30% figure refers to long-term performance, but time will tell.

source:antutu.com
 