iPad 2

Because Cortex A15 and Rogue implementations won't be ready for launch by early 2012.

Not only that. If Apple's next SoC is on 32nm, it has the possible advantage of reaching production readiness earlier than 28nm, and the disadvantage that you can't squeeze in as much as you theoretically could at 28nm.

The A5 is already huge at 45nm; simply increasing frequencies by a healthy notch at 32nm should, in theory, increase performance by a healthy notch while still probably taking up less die area than the A5.
 
I'm horribly late to reply to this but:
Just on this point: I don't use a USB connection to charge anything; it's too damn slow. Wifi avoids requiring a cable.
That's because the original USB2 spec was limited to 500mA; newer specs can go much higher, so there shouldn't be much of a difference anymore. However, when transferring at maximum speed, it's still limited to 900mA. And that's assuming PC chipsets support it properly, of course, and even then it might take years for the majority of consumers to have access to it on both the PC and device side... So yeah.
 
iPad wall chargers, including aftermarket ones, have USB ports. But I think they're rated at 2 amps.
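
For a rough sense of why charging over a plain USB port feels so slow, here's a back-of-the-envelope sketch. The ~25 Wh battery figure and the 5 V bus voltage are my own ballpark assumptions, and charging losses are ignored entirely:

```python
# Back-of-the-envelope charge-time comparison (assumed, rounded figures:
# ~25 Wh iPad 2 battery, 5 V bus voltage, charging losses ignored).
BATTERY_WH = 25.0   # assumed iPad 2 battery capacity, watt-hours
BUS_VOLTAGE = 5.0   # USB bus voltage, volts

sources_amps = {
    "USB 2.0 port (500 mA)": 0.5,
    "USB 3.0 port (900 mA)": 0.9,
    "iPad wall charger (2 A)": 2.0,
}

for name, amps in sources_amps.items():
    watts = BUS_VOLTAGE * amps
    hours = BATTERY_WH / watts
    print(f"{name}: {watts:.1f} W -> roughly {hours:.1f} h for a full charge")
```

Under those assumptions a 500mA port takes on the order of 10 hours versus about 2.5 hours for the 2A wall charger, which is why USB charging feels so slow.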

Now if Wi-Fi syncing is done over 802.11n, maybe it could give USB 2 syncing a run for its money, but I wouldn't count on it.

Of course Apple is also putting Thunderbolt ports on Macs. If they migrated it to iOS devices, it could be a big jump in speed.
 
Now if Wi-Fi syncing is done over 802.11n, maybe it could give USB 2 syncing a run for its money, but I wouldn't count on it.
Uhm... 802.11n SISO like in all handsets today is 72Mbps theoretical peak, and in the real world without interference it's about 30Mbps. On the other hand, USB2 is 30MBps - that's a big 'B' - so an 8x difference in speed. Even with 40MHz channels (coming to tablets within 9 months and phones within 18 months, but restricted to the 5GHz band) and MIMO (coming to tablets within 9-12 months; it will remain very niche in phones, maybe a few ultra-high-end ones), that's still 2-3x slower. As for Thunderbolt/USB3, they'll be limited by the speed of the NAND flash, but it should at least be somewhat faster.
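
To make the comparison concrete, here's a quick sketch using the figures quoted above; the 40MHz and MIMO real-world estimates (roughly half the theoretical link rate) are my own assumptions, not measurements:

```python
# Rough sync-speed comparison using the figures quoted above.
# USB 2 is quoted as 30 MB/s, i.e. 240 Mbps; the Wi-Fi numbers are
# real-world estimates of roughly half the theoretical link rate.
USB2_MBPS = 30 * 8  # 30 MB/s * 8 bits per byte

wifi_mbps = {
    "802.11n SISO, 20MHz channel (real-world)": 30,
    "802.11n SISO, 40MHz channel (~half of 150 link rate)": 75,
    "802.11n 2x2 MIMO, 40MHz channel (~half of 300 link rate)": 150,
}

for name, mbps in wifi_mbps.items():
    print(f"{name}: {mbps} Mbps, about {USB2_MBPS / mbps:.1f}x slower than USB 2")
```

That works out to roughly 8x, 3x and 1.6x slower respectively, which is where the "8x" and "2-3x" figures come from.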
 
I know it was discussed on another thread (or maybe higher up on this one), but regarding AirPlay, has Apple indicated that it's a lossless transfer for video mirroring? Also, does the iPad send out 1024x768 wirelessly and the Apple TV handles the upscale to HD, or does it ship out HD natively over the wireless, like it does over the HDMI connection? The other discussion indicated that there was an acceptably small delay in control input being reflected on screen, but I'm not sure whether that was more a commentary on the speed of sending the control info than on any delay in the video.
 
I know it was discussed on another thread (or maybe higher up on this one), but regarding AirPlay, has Apple indicated that it's a lossless transfer for video mirroring?
Of course it's lossy; with only 30Mbps of bandwidth (and often less in practice), uncompressed video could only barely do 320x240 16bpp at 24fps! I think someone would have noticed ;)
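
The arithmetic behind that claim, as a quick check (the 1024x768 @ 30fps, 24bpp line is just my own illustration of what mirroring the full iPad screen uncompressed would cost):

```python
# Quick check of the bandwidth math: what does raw, uncompressed video
# cost at the resolution mentioned above, and at the iPad's full screen?
def raw_mbps(width, height, bits_per_pixel, fps):
    """Uncompressed video bitrate in megabits per second."""
    return width * height * bits_per_pixel * fps / 1_000_000

print(f"320x240, 16bpp, 24fps : {raw_mbps(320, 240, 16, 24):.1f} Mbps")   # ~29.5 Mbps
print(f"1024x768, 24bpp, 30fps: {raw_mbps(1024, 768, 24, 30):.0f} Mbps")  # ~566 Mbps
```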

It's almost certainly H.264 Main Profile at the native resolution of the device, or theoretically any resolution up to 720p (since the Apple TV doesn't support 1080p yet). Apple claims the Apple TV can only decode streams with bitrates up to 5Mbps, but I think it's actually higher than that, so who knows what the bitrate actually is.

Also, does the iPad send out 1024x768 wirelessly and the Apple TV handles the upscale to HD, or does it ship out HD natively over the wireless, like it does over the HDMI connection?
Who knows - since Apple controls both ends and won't be bandwidth limited below 720p, it doesn't really matter.

The other discussion indicated that there was an acceptably small delay in control input being reflected on screen, but I'm not sure whether that was more a commentary on the speed of sending the control info than on any delay in the video.
Not sure what you mean there. There's a video on YouTube where someone does video mirroring of a video game with AirPlay. He obviously uses the tablet as the controller and both are visible at the same time in the video. The delay between the two is clearly quite small - not small enough for a multiplayer ranked FPS match obviously, but low enough to be usable.

As I said, the best comparison is probably OnLive.com - if someone finds that to be suitable, then so is this. It should be about as good in terms of bandwidth and latency today, and long-term it'll get much better than OnLive could ever be - with the obvious limitation that the GPU on the tablet is more limited and it'll drain battery life even more quickly than normal gaming.

On the other hand, we'll hopefully have at least 150Mbps Wi-Fi (SISO 40MHz channels) allowing for 50Mbps H.264 High Profile streams, so the quality loss on that front should be fairly low. Although if the tablet itself doesn't have two antennas, one very real concern is that placing your hand on the wrong spot will significantly reduce the WiFi bandwidth and result in missed frames and/or lower image quality. So it'd be nice if they went all the way to MIMO for that reason alone (300Mbps can't hurt either).
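
As a rough sanity check of how generous 50Mbps is, here's a small sketch computing bits per pixel at a few resolutions and frame rates I'm assuming for mirroring; for comparison, Blu-ray 1080p24 at around 40Mbps works out to roughly 0.8 bits/pixel:

```python
# Rough quality sanity check: bits per pixel of a 50 Mbps stream at a few
# plausible mirroring resolutions/frame rates (all of these are assumptions).
STREAM_MBPS = 50

modes = {
    "1024x768 @ 30fps": (1024, 768, 30),
    "1280x720 @ 30fps": (1280, 720, 30),
    "1280x720 @ 60fps": (1280, 720, 60),
}

for name, (w, h, fps) in modes.items():
    bits_per_pixel = STREAM_MBPS * 1_000_000 / (w * h * fps)
    print(f"{name}: {bits_per_pixel:.2f} bits/pixel")
```

Even the worst case there (720p60) lands around 0.9 bits/pixel, which is why the quality loss should be fairly low as long as the link holds up.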

I'm skeptical it will set the world on fire, but it certainly makes for some interesting comparisons with the Wii U. On one hand, the Wii U's controllers will have better battery life, and there's nothing else to use them for, so you have no reason to try to conserve battery. On the other hand, by late 2012 or early 2013, and assuming Apple comes up with a SoC similar to the A9600 on 28nm, the iPad will be at least half as fast as the Xbox 360. Of course, by then a 2048x1536 display seems inevitable, so even if the tablet itself is used for 2D information a la Wii U, developers will still be sending a 1080p signal to the TV. So in theory there's no way you'll get Wii U-level image quality.

Technically speaking, it would be more interesting to do the 3D rendering on a Mac than an iPad. And either way I'm still more interested in what Apple might do via Apple TV. Mind you, since I don't really care, it would have saved me a lot of time not to write any of the above...
 
If they can condition users to tether via HDMI, then all this AirPlay stuff would be moot.
For video, sure. For gaming, that doesn't make sense - can you imagine the reaction if Nintendo's Wii U controller wasn't wireless?
 
For video, sure. For gaming, that doesn't make sense - can you imagine the reaction if Nintendo's Wii U controller wasn't wireless?

Well, for gaming on tablets to be taken more seriously by core gamers, they have to support physical controllers.

Something like the Bluetooth controller that OnLive announced, except that it would have to support games other than OnLive content.

Then you'd have the tablet tethered via the HDMI cable and the power cable (the iPad 2 dongle has both power and HDMI), and paired with a Bluetooth controller.
 
I had been wondering that myself after seeing the TBS logo, and I see you're correct after searching.

Looks like I missed the commercial the first time it made the rounds.
 
http://www.reuters.com/article/2011...20110715?feedType=RSS&feedName=technologyNews

Reuters is reporting that TSMC has begun trials of Apple's A6. Supposedly, whether they get the actual production contract will depend on yield rates.

Does this make sense? I presume moving to a different process involves lots of design work on Apple's part and likely requires bringing TSMC in. Given Apple's secrecy, would they really reveal their whole design to a third party and let them make working silicon without already having a manufacturing contract in place? Yield rates seem like something that could be addressed in an existing contract through penalties, whether fines or guarantees of additional capacity to maintain shippable volumes, rather than determining whether they get the contract at all. Given TSMC's manufacturing experience, is there really a major chance that TSMC couldn't get good yields on an A6 in a timely manner when Samsung can?
 
From my very limited knowledge of the semiconductor industry, I don't think you can decide at the last minute whether or not you're going to go with a foundry; those kinds of decisions and the associated contracts are signed years in advance. Besides, given how capacity-constrained 28nm is going to be initially, TSMC won't be begging Apple to fab their chips for them; it's mostly going to be the other way around.
 
Supposedly, the business that Samsung gets from Apple for making its components brings in more money than Samsung's own smartphone and tablet business.

They're separate divisions, but you'd think the CEO or chairman would weigh the costs of antagonizing a big customer by competing with them.

ETA: Especially competing by copying the look of the UI. There are reports that in the countersuits, Samsung is going to demand licensing fees from Apple for telecom patents that it has. Hmm, and Apple just got access to Nortel patents for LTE.
 
Besides, given how capacity-constrained 28nm is going to be initially, TSMC won't be begging Apple to fab their chips for them; it's mostly going to be the other way around.

Maybe the carrot of Apple's business will make them ramp up 28nm harder?
 
From my very limited knowledge of the semiconductor industry, I don't think you can decide at the last minute whether or not you're going to go with a foundry; those kinds of decisions and the associated contracts are signed years in advance. Besides, given how capacity-constrained 28nm is going to be initially, TSMC won't be begging Apple to fab their chips for them; it's mostly going to be the other way around.

Gaining Apple's future business is worth literally billions in revenue, money denied to a direct competitor.
 
The next logical step for Apple would be to invest money into their own fabs... since they already purchased their own "chip designers", PA Semi and Intrinsity. Also remember the rumor that Apple will switch to ARM in the future and not use Intel processors...
 
I think designing a better process than what TSMC/Samsung can offer will be no small task, not to mention the insane amount of ongoing R&D needed. Besides, it's not obvious that they would be able to gain a process advantage in a reasonable time frame, without which this idea makes no sense.

Their best bet is buying some tiny foundry and scaling up from there. But I don't see it happening.

PS: It would be, simply put, breathtaking if it did happen.
 