Next-Gen iPhone & iPhone Nano Speculation

Well, they still haven't transitioned the iPod Touch to the A5. Though sales are declining, I think they're still way higher than those of the AppleTV, which is a $99 product.

But I guess they just had to get the AppleTV to output 1080p, what with AirPlay from iPads and iPhones and, in the future, Macs being able to mirror to HDTVs with their next OS release.
 
Possibly they just didn't have enough supply of A5 chips, so they continued with the A4 for the iPod Touch. Now that the A5X is in production, we may see an out-of-cycle update (compared to the usual yearly update) for the iPod Touch.

Also a bit OT, but since Samsung's 32nm process is in full production, shouldn't we also start seeing Samsung's own 32nm SoCs sometime soon? (Exynos 4212 or 4412)
 
Couldn't the new Apple TV contain harvested A5 chips, in much the same way that high-end GPUs with hardware faults are often released as lower-end models with modules/cores disabled?

Well, they still haven't transitioned the iPod Touch to the A5. Though sales are declining, I think they're still way higher than those of the AppleTV, which is a $99 product.
Working backwards from 62 million iOS devices sold last quarter and the known sales figures for the iPhone, iPad, and AppleTV, the iOS breakdown is 37 million iPhones (60%), 15.4 million iPads (25%), 8.2 million iPod Touches (13%), and 1.4 million AppleTVs (2%). In the hypothetical worst-case scenario where every iOS device needs an Apple A5, for the iPod Touch and AppleTV to both use harvested single-core A5s, 15% of A5 production would have to be defective and available for harvesting, without resorting to disabling good parts to fill the quota. That percentage will only increase as the latest iPad and eventually the iPhone transition away from the A5. Are 15%+ defect rates typical for a mature SoC?

If there aren't going to be enough harvested chips, Apple might as well go with a full A5 with 512MB RAM for the next iPod Touch. It would save developers from dealing with another device/performance target. If Apple still plans to push the iPod Touch as a gaming device, though, they should put a bigger battery in it, even if that thickens the device slightly, like the new iPad.
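A quick back-of-the-envelope sketch of that arithmetic, treating the quoted unit figures as exact:

```python
# Quarterly unit sales quoted above, in millions.
sales = {"iPhone": 37.0, "iPad": 15.4, "iPod Touch": 8.2, "AppleTV": 1.4}
total = sum(sales.values())  # 62 million iOS devices

for device, units in sales.items():
    print(f"{device:10s} {units:5.1f}M  {units / total:5.1%}")

# Worst case: every iOS device needs an A5-class die. For the iPod Touch and
# AppleTV to live entirely off harvested (partially defective) A5s, the
# defect rate would have to be at least:
harvest_share = (sales["iPod Touch"] + sales["AppleTV"]) / total
print(f"Required defect rate: {harvest_share:.1%}")  # ~15%
```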
 
Haven't heard anything on process, but Apple's statements seem to imply similar clocks: 543MP4@~250MHz + dual A9@~1GHz, complemented by 1GB RAM to drive the 2048x1536 display.

A much higher-clocked G6400 core will be a great upgrade next year.

I just ran GLBenchmark Egypt Offscreen on my iPad 2 after the 5.1 update, and performance has improved somewhat, pushing the benchmark to 90 fps (the site should be updated in several days). The fill tests I tried came in a little lower than usual, probably just normal variance. I'm sure scores on those and the Egypt Offscreen test would go quite a bit higher if I had controlled the background processes better.

 
Since the A5X retained the dual A9 CPU, I honestly wonder why they went for an MP4@250MHz rather than an MP2@500MHz. Granted, a frequency increase is never free, but it could still be cheaper than an MP4@250MHz.
 
Were I an SoC designer, I'd go for the MP4 over doubled clocks for more efficiency and flexibility with power consumption and heat, assuming my company could afford the die area. PowerVR's scalability should also mean performance is only a little lower on the MP4, and it will be interesting to see how that plays out in the real world against all of the MP2@500+MHz parts coming down the pipe late this year and early next.
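For illustration, here's a first-order dynamic-power comparison (P ~ C·V²·f) of the two options at the same theoretical throughput. The voltages are made-up placeholders, not real A5X numbers; the only point is that a doubled-clock part typically needs a higher supply voltage to make timing, which is where the wider, slower design wins:

```python
# First-order CMOS dynamic power model: P ~ C * V^2 * f (arbitrary units).
# The 0.9V / 1.1V figures are illustrative assumptions only.
def dyn_power(cap_units, volts, freq_mhz):
    return cap_units * volts**2 * freq_mhz

mp4 = dyn_power(cap_units=4, volts=0.9, freq_mhz=250)  # 4 cores, low clock/voltage
mp2 = dyn_power(cap_units=2, volts=1.1, freq_mhz=500)  # 2 cores, doubled clock

print(f"MP4@250MHz: {mp4:.0f}")
print(f"MP2@500MHz: {mp2:.0f}")
print(f"MP2/MP4 power ratio: {mp2 / mp4:.2f}x at equal theoretical throughput")
```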
 
Still nothing on process or clocks?

As far as I know, Samsung hasn't shipped anything on its 32nm process yet. Perhaps it's all going to Apple, but I find it suspicious that the Galaxy S III hasn't been announced, and I don't think it's just because they wanted to skip MWC. There's nothing on a 32/28nm process shipping in the massive volumes Apple would require. The most I can think of is the new AMD 7000 series GPUs, but those are maybe a few hundred thousand at most, if that? Certainly not the 15-20 million Apple would need.

And Apple has always mentioned an increase in CPU performance during the keynote. I would expect that to remain unchanged. Besides, even during gaming, GPU performance is usually a greater limiting factor than CPU performance.
 
Hmm, they went from a 25-watt-hour battery on the iPad 2 to 42.5-watt-hour on the new iPad, which may account for the slightly thicker and heavier product.

Interestingly, they list 700 and 2100 MHz bands for AT&T LTE but only the 700 MHz band for Verizon LTE.

They both support the same set of UMTS (HSPA, HSPA+, and DC-HSDPA) bands. Both models would use a micro-SIM for both LTE and UMTS.

Now, why couldn't they support both networks with the same SKU? Would the main difference only have been different micro-SIMs bundled with the iPad? Or is the overlap in bands secondary to the power amplifiers and antennas? You would think they'd want to consolidate to as few SKUs as possible.
 
Haven't heard anything on process, but Apple's statements seem to imply similar clocks: 543MP4@~250MHz + dual A9@~1GHz, complemented by 1GB RAM to drive the 2048x1536 display.
And Apple has always mentioned an increase in CPU performance during the keynote. I would expect that to remain unchanged. Besides, even during gaming, GPU performance is usually a greater limiting factor than CPU performance.
Did Apple market improved performance when they announced the 2nd-gen iPod Touch? I seem to remember that was something developers figured out after the fact. And every new SoC has had improved CPU and GPU performance anyway. I'm thinking the CPU clock has improved modestly, probably 20% to 1.2GHz, which wouldn't be enough to mention, much like the 2nd-gen Touch. If we still think the GPU is clocked at 1/4 the CPU clock, then a 300MHz SGX543MP4 seems pretty reasonable. A small bump in GPU clock would also make the 4x GPU claim against Tegra 3 more credible, even if the 2x claim for the original A5 doesn't seem so (barring iOS 5.1 driver improvements).
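A rough sanity check of that guess, assuming the 1/4 CPU-to-GPU clock ratio holds and using naive cores-times-clock scaling (neither of which is confirmed):

```python
# Speculative clocks: 1GHz A5 CPU (known) vs. a guessed 1.2GHz A5X CPU,
# with the GPU clocked at 1/4 of the CPU in both cases.
a5_cpu_mhz, a5x_cpu_mhz = 1000, 1200
a5_gpu = {"cores": 2, "mhz": a5_cpu_mhz / 4}    # SGX543MP2 @ 250MHz
a5x_gpu = {"cores": 4, "mhz": a5x_cpu_mhz / 4}  # SGX543MP4 @ 300MHz (guess)

# Naive cores x clock scaling; real-world scaling would be somewhat lower.
speedup = (a5x_gpu["cores"] * a5x_gpu["mhz"]) / (a5_gpu["cores"] * a5_gpu["mhz"])
print(f"A5X vs A5 raw GPU throughput: {speedup:.1f}x")  # 2.4x
```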

http://www.zdnet.com/blog/btl/nvidia-on-apples-ipad-a5x-graphics-claims-show-us-the-benchmarks/71065

Interestingly, nVidia is annoyed about the Tegra 3 comparisons and is demanding detailed source data. They point out that Apple made a really generic statement, but isn't that kind of the whole point of marketing? It may not always be right, but you don't have enough details to prove it wrong. nVidia has no doubt done the same when marketing its own new products.
 
Hmmm... didn't realize the battery had increased so much in capacity.

While the competition wasn't forcing Apple to rush to market, which gave them the luxury of waiting for a sub-45nm process to come online, getting only roughly the same battery life out of that much larger battery makes me wonder whether the app and baseband processors are really on 32nm or below.
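Roughly what that implies, using Apple's ~10-hour rating just to turn watt-hours into average watts:

```python
# Battery capacities from the spec sheets; the 10-hour figure is Apple's
# rated battery life, used here only as a common denominator.
ipad2_wh, new_ipad_wh = 25.0, 42.5
rated_hours = 10.0

print(f"Capacity increase:  {new_ipad_wh / ipad2_wh - 1:.0%}")  # ~70%
print(f"iPad 2 avg draw:    {ipad2_wh / rated_hours:.2f} W")
print(f"new iPad avg draw:  {new_ipad_wh / rated_hours:.2f} W")
```

Same rated battery life from a ~70% bigger pack means average system power draw went up by roughly the same proportion.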
 
Apple could always point to GLBenchmark Pro Offscreen -- though not very representative of modern content, as mentioned -- if they would bother responding to nVidia.
 
Hmmm... didn't realize the battery had increased so much in capacity.

While the competition wasn't forcing Apple to rush to market, which gave them the luxury of waiting for a sub-45nm process to come online, getting only roughly the same battery life out of that much larger battery makes me wonder whether the app and baseband processors are really on 32nm or below.
http://www.anandtech.com/show/5661/the-new-ipad-4g-contains-qualcomms-mdm9600

The baseband is apparently the 45nm MDM9600.
 
Ah, I see. And while not directly related, I'll change my guess on the app processor: it should still be Sammy's 45nm, which is why they might not have bumped the GPU clocks much, if at all.
 
Ah, I see. And while not directly related, I'll change my guess on the app processor: it should still be Sammy's 45nm, which is why they might not have bumped the GPU clocks much, if at all.
http://pc.watch.impress.co.jp/img/pcw/docs/511/949/html/9.jpg.html

ARM did report in Q4 that they have received their first 32nm foundry royalty revenue. Other than Samsung, who else has a 32nm process making ARM SoCs? With Samsung's own 32nm SoCs not available yet, it does make sense that Apple is the 32nm lead customer. ARM isn't reporting royalties from 28nm foundries yet, so there does seem to be a few months' gap between 32nm ARM (Samsung and Apple) and 28nm ARM (TI and nVidia), presuming that Qualcomm's progress isn't reflected in ARM's foundry revenue reporting.
 
I guess the stars didn't align for 32nm. Or A15. Or Rogue.

But it must fit their overall product portfolio strategy, which is to launch the iPad in March, the iPhone in the middle of the year, and then the iPod Touch maybe before the holidays. This may not line up the best with SoC roadmaps, but maybe it's best for their supply chain when they have to produce tens of millions of devices.

Also the spacing out gives their customers a chance to recover from each purchase and buy the next thing on their calendar. :D
 
http://pc.watch.impress.co.jp/img/pcw/docs/511/949/html/9.jpg.html

ARM did report in Q4 that they have received their first 32nm foundry royalty revenue. Other than Samsung, who else has a 32nm process making ARM SoCs? With Samsung's own 32nm SoCs not available yet, it does make sense that Apple is the 32nm lead customer. ARM isn't reporting royalties from 28nm foundries yet, so there does seem to be a few months' gap between 32nm ARM (Samsung and Apple) and 28nm ARM (TI and nVidia), presuming that Qualcomm's progress isn't reflected in ARM's foundry revenue reporting.

The sad thing is that if the new iPad had launched 3 months later, we'd probably be getting a far more power-efficient SoC + baseband solution.
 
Since the A5X retained the dual A9 CPU, I honestly wonder why they went for an MP4@250MHz rather than an MP2@500MHz. Granted, a frequency increase is never free, but it could still be cheaper than an MP4@250MHz.

It's easier to hide memory latency at 250MHz than at 500MHz?
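A toy illustration of that point, with a made-up 200ns DRAM round-trip latency: at the higher clock the same miss costs twice as many GPU cycles, so there's twice as much independent work to find to cover it.

```python
# Convert a fixed DRAM round-trip latency into GPU cycles at each clock.
dram_latency_ns = 200  # placeholder figure for a mobile memory system
for gpu_mhz in (250, 500):
    cycles = dram_latency_ns * gpu_mhz / 1000  # ns * (cycles per ns)
    print(f"{gpu_mhz}MHz: {cycles:.0f} GPU cycles per memory access")
```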
 