Next-Gen iPhone & iPhone Nano Speculation

A rare mistake from Jobs then. My iPad is virtually always used in landscape. And given that the iPad magnetic cover only acts as a stand in landscape mode, it appears that they accept that's the way it's used.

Well, the reason I sometimes use the iPad in landscape is that text legibility improves. However, with twice the linear resolution, this will be a non-issue, and I will gravitate towards portrait for everything but video and some games. Ultimately, I believe it makes sense to optimize for reading over film viewing, but clearly individual usage patterns will tend to favor one over the other, and one of the nice things about increased resolution is that it makes the issue less critical. Reading will benefit greatly, obviously.
 
They can go for both video and text.

They're supposedly doing something big with textbooks this week.

With AirPlay, the HDMI dongle and the rumored TV initiative, outputting/streaming video to a big screen may become a bigger thing for iPad 3.

Video may become more important than 3D.
 
1) Windows drivers for PowerVR are likely much more mature than those for ARM's Mali-T604, considering the latter is not currently in any shipping products. These GPUs need to be ready for a Windows 8 launch in the fall.
2) Exynos 4210 sadly does not support 42Mbps HSPA+ or LTE. I'm hoping Samsung corrects that problem in the 4412 and beyond, as nearly every high-end smartphone will have LTE (heck, Verizon is flat-out requiring it on all new smartphones for this year).

1) While that is likely true, we can't say for certain. Besides, the only PowerVR GPU shipping with Windows is Cedar Trail, correct? And it only has working DX9 drivers as of now.

2) I don't think that's true. They likely just got a better deal from Qualcomm by buying the SoC + baseband rather than the baseband alone. It may also have been due to insufficient supply of Exynos. With regard to LTE, the Galaxy Tab 7.7 has the Exynos + LTE combination.
 
1) While that is likely true, we can't say for certain. Besides, the only PowerVR GPU shipping with Windows is Cedar Trail, correct? And it only has working DX9 drivers as of now.

Nah, they've had WinXP and Win7 drivers for ages on the Atom Z500 series and also the embedded variants in the Z600 series, both of which have the SGX 535 core.

I think it's a bit ironic that IMG keeps referring to their big advantage in DX driver experience, given that the SGX-based Intel SoCs have had no end of bad press precisely because of bad driver implementations. That has continued with Cedar Trail, which was initially supposed to have DX10.1 compliance and then launched much later than stated, apparently due to problems getting Windows certification for the downgraded DX9.x drivers.

To be fair, as I understand it, these issues have not been within the control of IMG, but they do result in the company getting a bad reputation nonetheless.
 
To be fair, as I understand it, these issues have not been within the control of IMG, but they do result in the company getting a bad reputation nonetheless.

I'd accept that if you meant the GMA 500, for which IMG didn't provide any drivers; Intel had ordered those GPU drivers from the late Tungsten Graphics.

In the case of Cedar Trail, though, both the D3D and the OGL drivers are, from what I've been told by a user, signed by IMG, which proves the drivers come from IMG and not some other third party. In that case, if a driver fails WHQL certification it is of course a problem for IMG, and within their control alone.
 
I'm a bit new to this discussion, but here are a few remarks:

* The Tegra 3 has finally caught up to the A5 at the GPU level, and doubled its CPU power compared to the A5.

* If we assume the iPad 2 -> iPad 3 will move from the current screen to a Retina screen, we are moving from 786,432 pixels (1024x768) to 3,145,728 (2048x1536). That's a 4x increase in pixel count.

I don't know about you, but anything that needs to run at this native resolution, especially graphics-heavy programs, will be taxed much more heavily.

It would be silly if Apple used only a small speed increase for the A6 on the GPU side, because there are (finally) competitors that already match the A5 (and to be honest, comparing the same games in their Tegra 3-optimized vs. iPad 2-optimized versions, the former clearly have more effects and details going on).

The latest guesswork places Rogue at roughly a 4x increase in speed.

Technically, they could use the SGX543MP4 as a replacement, giving roughly a 2x increase in speed.

But against the 4x increase in pixel count, using the SGX543MP4 is effectively a downgrade in per-pixel performance:

Screen resolution increase: 4x
GPU speed increase: 2x

Notice the problem...
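
To make the mismatch concrete, here is a quick back-of-the-envelope sketch (pure arithmetic; the 2x GPU figure is the assumption from above, not a measured number):

```python
# Pixel counts from the post above, plus the assumed MP2 -> MP4 speedup
ipad2_pixels = 1024 * 768     # 786,432
retina_pixels = 2048 * 1536   # 3,145,728

print(retina_pixels / ipad2_pixels)   # 4.0x the pixels to fill
gpu_speedup = 2.0                     # assumed MP2 -> MP4 scaling
print(gpu_speedup / (retina_pixels / ipad2_pixels))  # 0.5x performance per pixel
```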

It also raises the question of feeding any faster GPU. The memory controller needs to be upgraded to deal with more and more powerful GPUs in the system. You can add the most powerful GPU to a system, but if it's starving for bandwidth, it's crippled.
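
For a rough sense of scale, a naive sketch of the framebuffer traffic alone (assumed numbers; it ignores TBDR bandwidth savings, overdraw, compositing, and texture reads):

```python
# Naive final-color write traffic at the rumored Retina resolution
# (assumptions: 32-bit color, 60 fps, one full-screen write per frame)
bytes_per_frame = 2048 * 1536 * 4       # ~12.6 MB per frame
write_bw = bytes_per_frame * 60 / 1e9   # ~0.755 GB/s
print(write_bw)  # just the final color write; real traffic is far higher
```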

Now, even if they just upgrade to an SGX543MP4 with a quad-core CPU setup, that begs another question: those specs would be nearly identical to the PS Vita's. At that point it almost rivals the latest handheld console.

Side note: if only Apple added some real game controls to the iPad (hidden in the bezel, or positioned slightly lower?), they could literally challenge the more hardcore gaming/console market. That is another big market they will need in the future anyway, for even more potential profit.
 
* The Tegra 3 has finally caught up to the A5 at the GPU level, and doubled its CPU power compared to the A5.

Tegra 3 fares MUCH better against the PowerVR SGX543MP2 than Tegra 2 did, but it still gets beaten handily.

http://www.anandtech.com/show/5163/asus-eee-pad-transformer-prime-nvidia-tegra-3-review/3
 
Just an interesting note on Apple's iOS product mix from today's conference call: 62 million iOS devices were sold last quarter, broken down as 37 million iPhones (60%), 15.4 million iPads (25%), 8.2 million iPod touches (13%), and 1.4 million Apple TVs (2%). (The iPod touch number is calculated from the other devices and the explicitly stated total.) Given Apple's one-SoC-a-year strategy, the need to greatly increase processing power for a Retina iPad 3 is tempered by the need to scale down well to fit the iPhone, since the iPhone is still the primary iOS device by a large margin and can't be exposed to production problems from an overly large die or too exotic a design, or suffer battery-life regressions.
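
For reference, the arithmetic behind the derived iPod touch figure (unit numbers as stated in the call):

```python
# iOS unit sales for the quarter, in millions; iPod touch is the remainder
total, iphone, ipad, apple_tv = 62.0, 37.0, 15.4, 1.4
ipod_touch = total - iphone - ipad - apple_tv
print(ipod_touch)   # 8.2

for name, n in [("iPhone", iphone), ("iPad", ipad),
                ("iPod touch", ipod_touch), ("Apple TV", apple_tv)]:
    print(name, round(100 * n / total))   # 60 / 25 / 13 / 2 percent
```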
 
But against the 4x increase in pixel count, using the SGX543MP4 is effectively a downgrade in per-pixel performance:

Screen resolution increase: 4x
GPU speed increase: 2x

Notice the problem...

I'd say performance at native display resolution is irrelevant in a comparison with other solutions if they don't have a similar resolution/DPI. When Apple launches an iPad with a "Retina" display, people will want to buy it purely on the merits of that alone.

What matters is comparing raw performance (the 543MP2 is still faster than Tegra 3, as per dagamer above). If performance at native resolution isn't good enough for certain games, what do you do? You render at a resolution that is fast enough and upscale to native res. That's what the game consoles are doing right now, AFAIK.
 
Does the SGX have scaling hardware?

Yeah, it'll be interesting to see what game developers do with the higher resolution.

One wrinkle would be how good the HDMI output is. If the iPad 3 has greater-than-1080p resolution as anticipated, could HDMI out or even AirPlay benefit? Actually, the Apple TV isn't capable of 1080p output, so whatever is streamed over AirPlay would be output at 720p.
 
Apple TV can decode 1080p but downscales it to 720p. So there's definitely some kind of scaling hardware in Apple's SoCs. Don't they also scale apps built for pre-Retina iPhones when running on the iPhone 4 and later?
 
One wrinkle would be how good the HDMI output is. If the iPad 3 has greater-than-1080p resolution as anticipated, could HDMI out or even AirPlay benefit? Actually, the Apple TV isn't capable of 1080p output, so whatever is streamed over AirPlay would be output at 720p.

With AirPlay, why would you want HDMI? If anything, we should be talking Thunderbolt instead of antiquated connection outputs.

Current applications can already be streamed in 1080p over AirPlay; just look at Real Racing 2 HD.
 
The gap between the 3GS and the iPad (and to some extent the iPhone 4) in fill rate relative to display resolution was very large, and app developers have already shown they'll make the necessary optimizations for each device profile when Apple doesn't scale performance and resolution in exact proportion.

Apple doesn't need to directly challenge console manufacturers. They've cast a much wider demographic net, so they'll be happy to continue growing the way they've been doing it. The same goes for competing against "open source" business models like Android. Apple will never have the potential market share, but they'll continue to focus on making the best product for the average person and profit greatly along the way.

Android is just a way for Google to keep their services front and center to consumers, so their strategy is great for them, too. Just like iPhone, iPad will eventually lose its lead spot in market share, and that day won't be an indication of a win or loss for either company -- just a normal consequence of strategy.
 
Android is just a way for Google to keep their services front and center to consumers, so their strategy is great for them, too. Just like iPhone, iPad will eventually lose its lead spot in market share, and that day won't be an indication of a win or loss for either company -- just a normal consequence of strategy.
The definition of winning or losing varies anyway.

http://blog.flurry.com/bid/79061/App-Developers-Bet-on-iOS-over-Android-this-Holiday-Season

For cross-platform apps, developers have found that the same app only makes 25 cents on Android for every dollar that is made on iOS.

http://9to5mac.com/2011/09/21/google-23rds-of-our-mobile-search-comes-from-apples-ios/

Google themselves reported that two-thirds of their mobile search traffic comes from iOS, despite all those reports of Android device market share being larger than iOS's.

iOS users apparently spend more money and use more services than users of Android or other smartphone platforms. So Apple can do quite well even if its raw user numbers are lower, and Google can do well too as long as iOS users use Google services.
 
Does the SGX have scaling hardware?

Yes, it does. Obviously something like AFR for desktop multi-GPU would be utter nonsense for a TBDR multi-core. Instead, performance scaling happens with each frame divided into portions; scheduling hardware diverts these to different cores, and after that to the respective pipelines per core (and I'll most likely get corrected again for understanding it wrong... :LOL:)
 
Yes, it does. Obviously something like AFR for desktop multi-GPU would be utter nonsense for a TBDR multi-core. Instead, performance scaling happens with each frame divided into portions; scheduling hardware diverts these to different cores, and after that to the respective pipelines per core (and I'll most likely get corrected again for understanding it wrong... :LOL:)

I think he meant hardware for scaling the framebuffer to a different target resolution, not scheduling render load among multiple cores.

You can do this with render-to-texture and then drawing a textured quad, but you'd be limited to whatever relevant filtering is supported (probably just bilinear), unless you can cook up something else in shaders.

Most SoCs I've seen will have additional image/2D hardware, including the ability to scale the framebuffer. From the 3D GPU's point of view it's just rendering to some portion of memory and ultimately some display controller on the SoC has to manage it for it to become a framebuffer.
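
To illustrate what that upscale amounts to, here is a minimal sketch in Python/NumPy (purely illustrative CPU code; on the device, the GPU's bilinear filter would do this math while drawing the textured quad, or the display controller would handle it):

```python
import numpy as np

def bilinear_upscale(low, out_h, out_w):
    """Bilinearly resample a (H, W, C) framebuffer to (out_h, out_w, C)."""
    in_h, in_w, _ = low.shape
    ys = np.linspace(0, in_h - 1, out_h)   # source row for each dest row
    xs = np.linspace(0, in_w - 1, out_w)   # source col for each dest col
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0).astype(np.float32)[:, None, None]
    wx = (xs - x0).astype(np.float32)[None, :, None]
    # Blend the four neighbouring texels; this is the same math a GPU's
    # bilinear filter applies when the quad is drawn at native resolution
    top = low[y0][:, x0] * (1 - wx) + low[y0][:, x1] * wx
    bot = low[y1][:, x0] * (1 - wx) + low[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# "Render" at iPad 2 resolution, display at the rumored Retina resolution
low = np.random.rand(768, 1024, 4).astype(np.float32)
native = bilinear_upscale(low, 1536, 2048)
print(native.shape)   # (1536, 2048, 4)
```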
 
My bad then.

By the way:

* The Tegra 3 has finally caught up to the A5 at the GPU level, and doubled its CPU power compared to the A5.

That has been answered multiple times in earlier posts.

But against the 4x increase in pixel count, using the SGX543MP4 is effectively a downgrade in per-pixel performance:

Screen resolution increase: 4x
GPU speed increase: 2x

Notice the problem...
The only problem I notice is that you forgot to consider frequency. What if, purely hypothetically, you have an MP4 clocked at 500MHz vs. the current MP2 at 250MHz: is it still only a 2x increase, or could it coincidentally be 4x this time? Not that I expect anything like that, but the paragraph above is missing one rather important detail, namely frequency, and frequency does tend to increase at least slightly with smaller manufacturing processes.
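
In code, the hypothetical from above (these clocks are the post's made-up example, not real specs; assuming throughput scales roughly with cores times frequency):

```python
# Throughput proxy: core count * clock, per the hypothetical above
mp2 = 2 * 250e6   # SGX543MP2 at an assumed 250 MHz
mp4 = 4 * 500e6   # SGX543MP4 at a hypothetical 500 MHz
print(mp4 / mp2)  # 4.0; doubling cores AND clock gives 4x, not 2x
```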
 
4x screen resolution doesn't affect the cost of vertex shading and triangle setup at all (with unified shaders this also benefits the pixel shader). Higher resolution improves pixel quad efficiency, because the average pixel count of each triangle is quadrupled (triangles covering just one pixel are as heavy as triangles covering all four pixels in a quad, as both require four instances of the pixel shader to run; the rest are simply discarded after calculation). Higher resolution also improves texture cache hit rate (more locality, as continuous surfaces cover more pixels), so sampling BW will not increase by 4x. Also, if texture resolution is not increased, sampling BW will stay identical for surfaces that are already using the highest mip. Depending on rendering techniques and content (polygon counts), the 4x resolution boost likely requires somewhere between 2x-3.5x GPU performance to run at the same frame rate (assuming the software is, of course, fully GPU bound).

Rasterizing tiny triangles (just a few pixels each) is very inefficient (bad quad efficiency, bad cache efficiency, heavy triangle setup cost, etc.). If super-high-DPI resolutions get more popular, triangle rasterization is going to get some extra years before other techniques replace it (rasterization scales better than other techniques when resolution gets higher but geometry complexity stays the same).
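
A toy cost model for that 2x-3.5x estimate (illustrative assumptions only: vertex work is treated as resolution-independent and pixel work as scaling linearly with pixel count; the pixel-bound fractions are made-up examples):

```python
# Amdahl-style split of one frame's GPU cost into a resolution-independent
# part (vertex/setup) and a part that scales with pixel count (pixel shading)
def required_gpu_speedup(pixel_fraction, res_scale=4.0):
    """GPU performance multiplier needed to hold frame rate when only the
    pixel-bound fraction of the frame cost scales with resolution."""
    vertex_fraction = 1.0 - pixel_fraction
    return vertex_fraction + pixel_fraction * res_scale

for p in (0.5, 0.75, 0.9):   # assumed pixel-bound share of frame time
    print(p, required_gpu_speedup(p))   # ~2.5x, ~3.25x, ~3.7x
```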
 