Implications of SGX543 in iPhone/Pod/Pad?

Not that I know anything, but I can't get rid of the gut feeling that the rumors of a hypothetical 2048*1536 resolution for an "iPad 2" are way too much.

Too much for what though?
For what it is used for, or too much in terms of hardware requirements?

High resolution is wonderful for text, and one of the things increasing the resolution of the iPad2 can achieve is making it a much more viable platform for publishing "print" material. Kindle, Nook, and their ilk actually have decent sales volumes showing that there is a largely untapped market there. With a more compelling device this market could drive expansion. (Of course pretty much everything will look better at higher resolution even if it won't make that great a functional difference, just as print looked better from a laser-printer than from a dot-matrix.)

We tend to think in terms of the technology we have, and resolution-independent displays have been a pipe dream for well over a decade. The way has always been paved for them, but they've never been realized. Maybe the next decade will see that change.
 

The majority of the above is fine and dandy if you use such a device as a fancy e-book reader. It'll be a tough cookie, though, if you try to render something in 3D at such a resolution, unless of course 3D is running at 1024*768 or some other lower resolution.

The SGX535 in the A4 has 2 TMUs and an SGX543 2MP has 4. Unless the latter config is clocked at ≥400MHz I don't see it being all that easy to process 4x the pixel count going from 1024*768 to 2048*1536.
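To put rough numbers on that, here's a back-of-the-envelope sketch in Python; the ~200MHz SGX535 clock in the A4 is an assumption for illustration, not a confirmed figure:

```python
# Back-of-the-envelope: how much more pixel throughput does 2048x1536 need?
old_px = 1024 * 768        # iPad native resolution
new_px = 2048 * 1536       # rumored iPad 2 resolution
print(new_px / old_px)     # -> 4.0, four times the pixels

# Relative texel fill rate ~ TMUs * clock (crude; ignores architectural gains)
sgx535 = 2 * 200e6         # SGX535: 2 TMUs, assumed ~200MHz in the A4
sgx543mp2 = 4 * 400e6      # SGX543MP2: 4 TMUs at the 400MHz discussed above
print(sgx543mp2 / sgx535)  # -> 4.0: it only just keeps pace with the pixels
```

In other words, even at 400MHz the MP2 config only matches the resolution jump in raw texel rate, with nothing left over for better per-pixel quality.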

On a rather irrelevant side-note, I'm still at home using mostly a 21" CRT monitor capable of up to 2048*1536*32@75Hz and I know exactly what you mean; even more so since, if I go all the way up to the maximum available resolution, the mask override gives me 2x oversampling on one axis for free (when it comes down to 3D). However... ***

Well, that's the only higher resolution we have some sort of evidence for, in the form of higher-resolution graphics that at least hint that Apple was experimenting with a 2048x1536 display. And now we also have some sort of evidence that Apple's future SoC could support such a high resolution. Combine that with what Apple did with the iPhone 4, the emerging tablet market in 2011 etc., and a 2048x1536 resolution doesn't seem so far-fetched any more. IMHO the only thing that speaks against such a high-resolution display in the iPad 2 is the high cost of the panel. But if anyone can pull it off it's Apple, with its high volumes, connections, cash, long-term platform strategy/investments/contracts etc. (and you could argue that they also introduced other expensive display hardware such as capacitive touchscreens, IPS and very high "retina" resolution displays to the mobile mass market).

As said above, my primary consideration for such a resolution would be 3D; if however, as I said, 3D runs at a much lower resolution, it isn't an issue, even more so considering the size of the display (which I doubt will be twice as big as on the iPad).

*** The next best thing that pops into my mind then would be that I already have a hard time reading simple text on the aforementioned 21" CRT at 2048 (well, the actual viewable size is more like 20"; it ain't an LCD). I might not have eagle vision but it's still at a very high level for my age *cough*. I'm obviously missing something again here.
 
As with the iPhone 4, the higher resolution would not be used to display more information (like Windows with its mostly fixed ppi and smaller fonts on higher-resolution screens etc.), but to display it at a higher quality. I know a lot of people who have an iPhone 4 and an iPad and now read a lot less on the iPad than they used to and more on the iPhone 4 because of the much sharper screen, even though the iPad's screen size theoretically fits their reading habits a lot better. And that's the case for me too.
 

Ok the bell finally rang :) I still hope 3D won't run in its native resolution if 2048 should be true.
 
Not that I know anything, but I can't get rid of the gut feeling that the rumors of a hypothetical 2048*1536 resolution for an "iPad 2" are way too much.

I don't know - 960 × 640 x 4 would make sense for an iPad2, if they can do it. That said, it could also just be the supported external resolution. Also, the support could be increased for the Mac OS/X version of the App Store and the upcoming OS/X Lion.

It is clear (by their own website) that the next push will be that OS/X Lion will integrate the Mac OS/X user experience with the existing iOS product range. You can see this at work now already when you try to install apps on the App Store.
 
The majority of the above is fine and dandy if you use such a device as a fancy e-book reader. It'll be a tough cookie, though, if you try to render something in 3D at such a resolution, unless of course 3D is running at 1024*768 or some other lower resolution.

The SGX535 in the A4 has 2 TMUs and an SGX543 2MP has 4. Unless the latter config is clocked at ≥400MHz I don't see it being all that easy to process 4x the pixel count going from 1024*768 to 2048*1536.

How much power would an SGX543 (2 cores at 400MHz) consume in a 32 nm low-power CMOS process, or even in 28 nm (which may be available in the 2nd half of this year)? Thx.
 
I don't know - 960 × 640 x 4 would make sense for an iPad2, if they can do it.
IMHO it's highly unlikely that Apple will change the screen's aspect ratio and resolution in the same year. That said I always wondered why Apple didn't choose the iPhone's 3:2 aspect ratio for the iPad.
 
Ok the bell finally rang :) I still hope 3D won't run in its native resolution if 2048 should be true.
The thing is, I'm sure as soon as Apple splashes Retina Display advertisement over the iPad 2, consumers will demand that all games check that box. That seemed to be the case in comments on game reviews and announcements as soon as the iPhone 4 was released, as if developers can just add this in on a dime. It'll probably take some big developers like Epic/Chair or id to show demos of the choice between iPad 2 Retina Display and regular resolution plus AA, extra effects, textures, draw distance, etc.

Which brings to mind: what are games going to call the iPad 2's resolution? 960x640 and 1024x768 get labeled HD because HD is in the consumer lexicon, even though those resolutions aren't actually HD. The next step up should be UHD, but that doesn't have the same ring or recognition. I'm also guessing the iPad 2 will eliminate the hope of universal games in many cases, since having to bundle assets for 4 resolutions from 480x320 to 2048x1536 will be very unwieldy and very inconvenient in file size for 8GB iPhone 3GS users, and that model is still selling.
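To illustrate the asset-bundling problem, here are the relative pixel counts of those four tiers (assuming asset size scales roughly with pixel count, which is a simplification; texture compression and shared assets would change the real numbers):

```python
# Relative pixel counts of the resolutions a universal app might target.
# Treating asset footprint as proportional to pixels is an assumption.
tiers = {
    "iPhone 3GS": (480, 320),
    "iPhone 4": (960, 640),
    "iPad": (1024, 768),
    "iPad 2 (rumored)": (2048, 1536),
}
base = 480 * 320  # iPhone 3GS as the baseline
for name, (w, h) in tiers.items():
    print(f"{name}: {w * h / base:.1f}x the base pixel count")
```

The rumored top tier is over 20x the baseline pixel count, which is where the file-size worry for 8GB devices comes from.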

IMHO it's highly unlikely that Apple will change the screen's aspect ratio and resolution in the same year. That said I always wondered why Apple didn't choose the iPhone's 3:2 aspect ratio for the iPad.
I think Steve Jobs said that Apple spent years sweating over the details of the iPad, and given the long-standing rumours before release that's probably true. So they probably found the 4:3 aspect ratio optimal for that screen size and use case, or at least Steve Jobs preferred it.

Now that Steve Jobs is taking another medical leave it'll be very interesting to see how the iPad 2 is presented. Perhaps this aggressive feature list (Retina Display, dual-core CPU, dual-core GPU, double cameras, SD card slot, larger speakers, combined GSM and CDMA support, probably doubled flash capacity, etc.) is in part to make sure the iPad 2 can sell itself on features, since Steve Jobs won't be there to lend weight to its launch. Presumably, him taking leave now means he could be well in time for the iPhone 5 and iOS 5 launch at WWDC.
 
I highly doubt that Steve Jobs' leave of absence has any influence whatsoever on the features of the iPad 2 or iPhone 5, especially for the iPad 2, which must be practically done by now. IMHO even if Steve Jobs retired right now as CEO, resigned from the board of directors and moved to an isolated island, nothing would change for the 2011 product line.
 
(if rumours = true) My presumption is that 3D apps will run at a much lower resolution whereas 2D apps will use the native resolution. It is kind of like the Xbox 360 / PS3 dashboards running at 1080P @ 60FPS whereas the actual games run at a much lower resolution and frame-rate.

In any case, if they do increase the specifications as such, my expectation is that they'll segment the market between the old 'good enough' iPad and the newer iPad 2. Perhaps the reason is that they want to increase the average selling price of the iPad whilst broadening the market at the same time.

I can sort of see it going:

$400 iPad
$500 iPad 3G
$600 iPad 2
$700 iPad 3G/4G

That way they don't cede the lower end of the market whilst extracting more dollars from those who are willing to spend more. Effectively people who would have paid $600 at launch would pay $700.

Just speculating on the business side.
 
I don't know the power numbers, but 400 MHz sounds at least reasonable when advancing to 32/28nm in a mobile phone for a 543MP2.
 
We've always dismissed people who were speculating about multi-GPU cores for PCs, since GPUs were always highly parallel in the first place.

Why is this considered more plausible to be a reality for mobile chips? I don't see any performance or area advantages in doing so over increasing the number of shaders, as big GPUs do.

What am I missing?

(Can somebody point me to a comprehensive table with all the different PowerVR versions that doesn't require me to !#$!#$! create an account on their website?)
 
Dunno, I'm wondering about this myself. Possibly it's easier to design one common module that can be hooked up with identical modules, vs. designing multiple chips with different numbers of units on them (which needs internal buses, buffers, multiplexers, etc. scaled up/down).
 

Maybe it's the ability to turn them off as separate entities within the chip rather than simply clocking the whole thing down as needed? That way, in low-use scenarios you could dynamically turn off half the chip and downclock the other half as needed. So I say a power advantage.
 
I doubt it. It's usually more efficient to render with the full GPU (at reduced voltage) and then shut it down than to keep one GPU running longer. Or to shut down half of the shaders.
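A toy dynamic-power model (P = cores × C × V² × f, switching power only, with made-up numbers) shows why the voltage reduction is the key part of that argument:

```python
# Toy model: dynamic power P = cores * C * V^2 * f (leakage ignored).
# Compare one core at full clock/voltage with the whole GPU at half
# clock and reduced voltage delivering the same aggregate throughput.
# C, the voltages and the clocks are illustrative assumptions, not SGX data.

def energy(cores, cap, volt, freq, seconds):
    return cores * cap * volt**2 * freq * seconds

CAP = 1e-9  # effective switched capacitance per core (assumed)

# Option A: one core at 400 MHz, 1.0 V, busy for 1 s
e_single = energy(1, CAP, 1.0, 400e6, 1.0)

# Option B: two cores at 200 MHz, 0.8 V, doing the same work in the same 1 s
e_dual = energy(2, CAP, 0.8, 200e6, 1.0)

print(e_dual / e_single)  # ratio ~ (0.8/1.0)^2 = 0.64, i.e. ~36% less energy
```

At the same voltage, running wider and shorter is energy-neutral in this model; the saving comes entirely from the lower voltage the wider/slower configuration permits, which is why gating half the GPU to run the rest longer doesn't obviously win.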
 

Can you run a smaller GPU at lower voltages than a larger one? IIRC performance per watt goes down as your absolute die size increases.
 
It's really about the practicality of offering a graphics solution to fit the many possible classes of device with power budgets higher than that of their range of custom cores.

A core with more pipelines would save on some overhead/redundancy (see SGX554MP versus SGX544MP), yet PowerVR's new comprehensive near-linear scaling and workload distribution on top of tiling's already inherent advantage for multi-core scalability makes grouping several cores still quite reasonable.

Excluding cancelled and pre-SGX cores, an overview of the range is:

SGX520: 1 ALU, 1 TMU
SGX530: 2 ALU, 1 TMU
SGX531: 2 ALU, 1 TMU, 128-bit bus
SGX535: 2 ALU, 2 TMU, DirectX 9L3
SGX540: 4 ALU, 2 TMU
SGX543: 4 ALU, 2 TMU, USSE2
SGX544: 4 ALU, 2 TMU, USSE2, DirectX 9L3
SGX545: 4 ALU, 2 TMU, DirectX 10.1
SGX554: 8 ALU, ? TMU, USSE2, DirectX 9L3
 
Can you run a smaller GPU at lower voltages than a larger one? IIRC performance per watt goes down as your absolute die size increases.
It may be true for very large GPUs due to intra-die variation, but it's probably a second-order issue for an SoC where the GPU is maybe 20% of the total die size.

In general, the problem with more independent units doing the same thing is that it can really mess up the traffic to external memory because you have 2 incoherent streams instead of just 1. This makes it harder for the memory controller to schedule transactions efficiently, so expect some BW loss.
There's also efficiency issues wrt synchronization and load balancing.

In the case of GPUs, I assume this would be mainly the case for texture fetches and, to a lesser extent (at least for tilers?), ROP operations.
 
Perhaps it's simply down to overall strategy, i.e. quadruple the number of pixels but keep the performance per pixel identical, which keeps the old iPad's performance in line with the current generation. They could then make the lower-spec iPads exactly half the performance/specs of the high-end iPad whilst keeping overall performance relatively similar for development reasons in the generation after this one, and at the same time offer double the performance per pixel for both compared to the current offerings.
 
The multicore-capable GPU architectures from IMG and ARM are all binning tilers so I don't think memory efficiency is going to be affected much if at all. But sure, the only real justification for multicore is R&D efficiency (and so indirectly more flexibility for IP customers). I certainly don't expect NVIDIA to go multicore in the same sense with their next-gen architecture (and assuming it's an IMR that makes it even less likely).

As for the iPad 2: I've never heard a single indication ever that Samsung actually has a 40nm logic process. I've always assumed they did not, and the 32nm process is definitively not ready for mass production in such quantities right now. But on the other hand... if a 400MHz MP2 is realistic on 32nm power-wise (to take the numbers from Lazy8s), what might be realistic on 45nm? Surely power consumption didn't go down 50% in practice so it should be more than a 200MHz MP2 right? Of course there's also the issue of cost but I'm not sure Apple cares excessively about that if it's genuinely a selling point. Not that I'm convinced they would consider slightly higher 3D performance a selling point, mind you...
 