Apple A8 and A8X

The memory interface is going to be an insignificant fraction of that 1B increase...

True, but do we have any reason to believe anything else besides the GPU and perhaps the cache has changed? The GPU alone doesn't seem like it would account for even a third of the 1bn increase.
 
Well, they're not going to be transistors just sitting there. If you think nothing else should have changed besides the GPU and cache, then you'd probably find your culprits where you expect them. :)

For answers beyond speculation, however, we would have to wait until someone gets hold of one of these mothers and de-caps it...
 
In its description of the A8 graphics, Apple remained very faithful to IMG's assessment of the improvement. IMG said 6XT gave up to 50% over Series 6, and that is exactly how Apple described it, which is the reason I went with the GX6450 very early on.

Assuming no other bottlenecks, a GX6650 should give around 50% more than a GX6450 (six clusters versus four). Dialling in the "up to 50%" from Series 6 to 6XT, you effectively end up with an "up to 2.25x" over the A7's G6430. To get to 2.5x you need somewhere between a 10-20% clock increase on top of that.
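A quick sanity check on those numbers (assuming perfectly linear scaling with cluster count, which real workloads rarely achieve, so treat it as an upper bound):

\[ \underbrace{1.5}_{6 \text{ vs. } 4 \text{ clusters}} \times \underbrace{1.5}_{\text{Series6} \rightarrow \text{6XT}} = 2.25, \qquad \frac{2.5}{2.25} \approx 1.11 \]

So an ~11% clock increase closes the gap in the ideal case; any scaling losses push the required bump toward the upper end of that 10-20% range.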
 
Interestingly, the iPad Air 2 has a significantly smaller battery than the iPad Air, a reduction from 32.4 to 27.3 watt-hours, but Apple still claims the same 10-hour battery life. It also partly explains why there was no PPI increase: thinness was their obvious priority, which limited battery volume.
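A back-of-the-envelope check on what the unchanged 10-hour rating implies for average platform power (assuming both ratings come from comparable workloads, which Apple doesn't specify):

\[ \frac{32.4 \text{ Wh}}{10 \text{ h}} = 3.24 \text{ W} \qquad \text{vs.} \qquad \frac{27.3 \text{ Wh}}{10 \text{ h}} = 2.73 \text{ W} \]

That is roughly a 16% cut in average draw that the 20nm A8X and the rest of the platform have to deliver between them.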
 
It means that for the first time, Apple is playing catch-up... on a 3-billion-transistor SoC built on 20nm... a really crazy piece of silicon for such performance.

Apple is playing catch-up with whom exactly? Certainly not with any minority report.

And what is the chance that the next-gen A9 will stay competitive against Erista?
No idea. I don't have a green-painted crystal ball lying around.

If they use the GX6650 with the A8X, they've already maxed out GPU performance on the PVR architecture. When is the next IMG Tech µarch due?
Rogue does scale up to crazy double-digit cluster counts if needed. There's no necessity for it, hence they wisely announced up-to-6-cluster configs. I doubt anyone but Apple will use any 6-cluster config anytime soon anyway.

As for the former answers: any Series7 announcement does not exclude the possibility of an 8-or-more-cluster Series6XT config. If Series7 goes for =/>DX11, I'm not so sure Apple would want to jump on that ship so soon.
 
ams said:
Obviously they have their own reasons for doing that, and not everyone agrees with that approach (including Anandtech), but it is what it is.

What "it is", is business. If you think IMG forward plan their graphics IP in glorious ignorance of their prime customer who is crucial to IMG operating at the level they do, then you are naive. Other companies likely do similarly. For those that don't have a significant major customer or two, their decision making must be based on what they hope potential customers might want.

Apple are the single biggest payer for graphics IP in the mobile segment. You can be sure IMG is delivering to their requirements. What some website might like is as far away as it is possible from being relevant to that decision making.

Whatever the FP16/32 ALU ratio is in series 7, it'll be because that is what Apple wants. And if Apple wants every other pixel to be bright pink, IMG will design the IP to do that too.

And BTW, it is starting to get boring to see postings in the A8 thread that try at every opportunity to get Kepler/Tegra/K1/Nvidia mentioned. I see that this time G80 got called out.
 
As for the 2.5x performance increase claim for the A8X, I'm sure it's the result of Apple's own internal tests, which no one outside Apple knows the makeup of.
 
Apple is playing catch-up with whom exactly? Certainly not with any minority report.
I don't know in which world you are living, but Apple is not alone in the market. With the multi-platform benchmarks available (3DMark, browser JavaScript-based tests, Geekbench, etc.), it's easier than ever to compare istuff to other solutions. And like it or not, Apple must stay on top in all areas (including performance) if they want to continue charging their premium tax.
So yes, they can't ignore the competition, and their crazy performance increases over generations are the proof.

PS: note that I did not mention the isheep/fanboy factor to keep this discussion objective. oh wait... :LOL:
 
I don't know in which world you are living, but Apple is not alone in the market. With the multi-platform benchmarks available (3DMark, browser JavaScript-based tests, Geekbench, etc.), it's easier than ever to compare istuff to other solutions. And like it or not, Apple must stay on top in all areas (including performance) if they want to continue charging their premium tax.
So yes, they can't ignore the competition, and their crazy performance increases over generations are the proof.

Because Apple is not alone in the market, and because certain laws of physics for SoC development apply to them too (irrespective of how many resources they pour in), it takes years to design and develop a SoC, and it also takes at least a year from final tape-out to mass production. In other words, Apple didn't design the A8X in a vacuum and decide a couple of months ago to pump it up by, say, a billion transistors because of who knows what weird theory you're going to come up with.

Apple is an established player in the market, and while they obviously take into account what their competitors may have, they still don't care about minority reports.

IMG officially announced its first 6-cluster Rogue variant in November 2012: http://www.imgtec.com/news/detail.asp?ID=706, meaning it was soooo damn hard to guess that Apple would probably use a GPU with 192 FP32 SPs for its next tablet generation :rolleyes:
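For reference, the arithmetic behind that SP count (assuming the commonly quoted 32 FP32 ALU lanes per Rogue USC, i.e. 16 pipelines with 2 FP32 units each):

\[ 6 \text{ clusters} \times 32 \text{ FP32 SPs per cluster} = 192 \text{ FP32 SPs} \]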

PS: note that I did not mention the isheep/fanboy factor to keep this discussion objective. oh wait... :LOL:
There isn't a single i-device in my household, if that's what you're worried about. I consider their products overpriced for what they are, but this is coincidentally a highly technically oriented forum, and I'm obviously not going to bring my own preferences for a specific toothpaste or detergent into any debate. If I did, I'd vote for an Android tablet like the Nexus 9 without a second thought.
 
I've moved the recent FP16 discussion to a new thread in the core forums, and I'll also start a thread specifically for discussing Rogue architectural decisions to go along with that. Let's keep this thread about A8 and A8X in terms of their competitiveness and use in the products they're in.
 
I don't know in which world you are living, but Apple is not alone in the market. With the multi-platform benchmarks available (3DMark, browser JavaScript-based tests, Geekbench, etc.), it's easier than ever to compare istuff to other solutions. And like it or not, Apple must stay on top in all areas (including performance) if they want to continue charging their premium tax.
So yes, they can't ignore the competition, and their crazy performance increases over generations are the proof.

PS: note that I did not mention the isheep/fanboy factor to keep this discussion objective. oh wait... :LOL:

There are plenty of PCs that destroy MacBooks in performance every year. This must be hard for you to understand, but the person who buys Apple products doesn't bother looking at benchmarks or how many cores a device has, as long as it performs the way they expect.

The new Mini is nothing but a shameless "rebranding" rather than an upgrade, but it will still sell 10x more than any of the Tegra K1 products. Apple can lose the performance race for the next 5 years and still outsell Tegra products by a massive margin.

But I do find the isheep/fanboy comment funny coming from you :LOL:
 
Yeah really disappointing.

And they really should have enabled full Apple Pay on both. How much could an NFC chip cost? It would have helped the growth of the Apple Pay ecosystem. Taking an iPad to the store to make purchases is about as likely as taking a full-size iPad up to a mountain lake to take pictures.

Plus they have the virtualized SIM tech on the iPad Air 2 only, which they didn't even highlight in the keynote.
 
And they really should have enabled full Apple Pay on both. How much could an NFC chip cost? It would have helped the growth of the Apple Pay ecosystem. Taking an iPad to the store to make purchases is about as likely as taking a full-size iPad up to a mountain lake to take pictures.

I completely agree with the last sentence; however, because of that, I fail to see the purpose of an NFC chip on any iPad. What am I missing?
 
Well presumably, after they get the Apple Pay thing going, they could use the NFC for other things.

They could open it up to third-parties, the way they opened up TouchID after one year.

So it could have other uses.
 
Presumably the no-NFC phone + Watch combo for Apple Pay extends to the no-NFC iPads as well.
 
I just wouldn't be shocked if they added it next year. NFC must be a pretty cheap part by now.

Maybe it's a supply issue. The iPhone is the priority, so that may be why they kept the Mini 3 on the A7.

Last year, they didn't roll out TouchID to the iPads, because again, they were probably using every touch sensor they could make for the iPhone 5S.
 
I'm thinking the iPad Mini 3 will be like the iPad 3 and have a short lifespan, with a spring refresh adding the A8 and the other changes from the iPad Air 2, launching alongside the iPad Pro.

It's interesting that the iPad Mini 3 uses the A7 and supports in-app Apple Pay while the iPhone 5s doesn't. I wonder whether the secure element is a separate chip, meaning only the logic board needs to change, or part of the SoC, which would mean either that the iPad Mini 3's A7 has been revised or that the A7 had it all along and the lack of in-app Apple Pay support on the iPhone 5s is an artificial limitation.
 
Apple tends to reserve a lot of new functionality to new devices simply because they're new (as do many device/platform makers), so I'm leaning toward that explanation.
 