OMAP4 & SGX540

Boy, did TI announce OMAP4 ahead of time or what? The original posting here relates to their MWC '09 announcement of it.

Anyway, the 4430 is already sampling, and apparently the 4440 will sample in another few months.
In fact, the RIM PlayBook is going to have an OMAP4 in it (whenever it launches).

Found a link to the full 4430 datasheet via an elinux.org entry:
http://elinux.org/OMAP4430
http://focus.ti.com/pdfs/wtbu/OMAP4430_ES2.x_Public_TRM_vK.zip
(Dated Sept 2010)

I wasn't aware that, as well as having a dual-core Cortex-A9, it also has a dual-core Cortex-M3.

In terms of the SGX540, the docs state that the SGX clock maxes out at 307 MHz. The block diagram on page 259 would suggest that IMG are supplying some video codecs as well as 2D/3D graphics (even though TI have their own video core in there).

I know the recent 4440 press releases stated that the OMAP4440's graphics core runs at 1.25x the speed of the 4430's.
I don't know if that means the 4440 runs at 307 MHz and the 4430 at around 245 MHz, or if the 4430 runs at 307 MHz and the 4440 therefore runs at around 384 MHz.
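Quick back-of-the-envelope in Python (only the 307 MHz ceiling comes from the TRM; which way the 1.25x split goes is my guesswork):

sgx_max_mhz = 307                  # SGX540 ceiling quoted in the 4430 TRM
print(sgx_max_mhz / 1.25)          # reading 1: 4440 at 307 MHz -> 4430 at ~245.6 MHz
print(sgx_max_mhz * 1.25)          # reading 2: 4430 at 307 MHz -> 4440 at ~383.75 MHz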

Still, either way it's a useful speed increase over the Hummingbird, which clocks the 540 at 200 MHz.
 
In terms of the SGX540, the docs state that the SGX clock maxes out at 307 MHz. The block diagram on page 257 would suggest that IMG are supplying some video codecs as well as 2D/3D graphics (even though TI have their own video core in there).
I would guess you are referring to the diagram on P259.
I'll state for the record that I don't know whether TI have or have not licensed dedicated video hardware IP from IMG, but SGX is quite programmable and can accelerate video processing.
 
I would guess you are referring to the diagram on P259.

Indeed, I've corrected the original page number.

I've had another look at the datasheet, and page 264 states (in the SGX540 description):
"Programmable video encode and decode support for H.264, H.263, MPEG-4 (SP), WMV9, and JPEG"

Would it be fair to describe the above as part of the standard feature set of SGX? It sounds like it from your previous reply.
 
"Programmable video encode and decode support for H.264, H.263, MPEG-4 (SP), WMV9, and JPEG"

Would it be fair to describe the above as part of the standard feature set of SGX? It sounds like it from your previous reply.
Pass - I honestly don't know. I'm the wrong person to ask.
 
Indeed, I've corrected the original page number.

I've had another look at the datasheet, and page 264 states (in the SGX540 description):
"Programmable video encode and decode support for H.264, H.263, MPEG-4 (SP), WMV9, and JPEG"

Would it be fair to describe the above as part of the standard feature set of SGX? It sounds like it from your previous reply.

OMAP3 and several other TI SoCs have had their own hardware blocks for things like this, and every indication I've seen of OMAP4 suggests it still does (as well as a pared-down C64x DSP). I'd be pretty surprised if this was being offloaded to IMG video IP or SGX... maybe they're just grouping this information in the wrong location?
 
I'm 99% sure it's just TI copy-pasting a spec sheet without really thinking it through. The main video block was designed in-house AFAIK and is completely independent from everything else (even from the DSP and the CPU). In fact, I'm rather amused to discover in that diagram that it has two dedicated ARM9s, so that brings the number of ARM cores in OMAP4 (before we count TI-engineered peripheral chips) to 6!

BTW, that PDF is really great, thanks a lot for linking it, tangey! I'm particularly surprised by the architecture of the audio block: 88 KB of SRAM total, and that's before we consider that the Hot Chips presentation hints the DSP has 128 KB of L2 and that it also has to remain on standby in low-power audio mode (presumably to keep program instructions resident, among other things).

Can they really achieve such low total power consumption (100 hours on a 1000 mAh 3.7 V battery -> 37 mW for the full system) with more than 200 KB of SRAM in retention mode? Hmm. I wonder how much leakage that takes; maybe it takes a lot less than I thought (depends on the SRAM variant?) or maybe the 128 KB of L2 doesn't actually have to be on. Hmm!
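For reference, that 37 mW is just the quoted battery life worked backwards:

battery_mah, battery_v, hours = 1000, 3.7, 100
energy_mwh = battery_mah * battery_v     # 3700 mWh in the pack
print(energy_mwh / hours)                # -> 37.0 mW average full-system draw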

Another nice tidbit is that (unlike what the Hot Chips presentation implied) the C2C interface (which allows sharing the system's DRAM to support 3G/4G basebands without an extra memory chip) does support LPDDR1 and not just LPDDR2, so it could actually be used with today's basebands. I'll be curious to see if that gets used in real products to save some cost.
 
http://www.journaldugeek.com/2011/03/18/archos-gen9/

They claim a 1.6 GHz dual-A9 from TI in the next generation of Archos tablets (the current generation uses OMAP36xx variants, depending on the model).

The best guess would be an OMAP4440, but the white papers claim those are clocked at 1.5 GHz.
Not that the extra 100 MHz would be hard to achieve, given the OMAP3's clocking flexibility.

So these should be a lot faster than the iPad 2, CPU-wise (900 MHz vs. 1600 MHz), but slower GPU-wise (SGX543MP2 @ 200 MHz vs. SGX540 @ 380 MHz).



If priced as well as their predecessors, these tablets could be real winners. If they're not too late.
 
So these should be a lot faster than the iPad 2, CPU-wise (900 MHz vs. 1600 MHz), but slower GPU-wise (SGX543MP2 @ 200 MHz vs. SGX540 @ 380 MHz).

If priced as well as their predecessors, these tablets could be real winners. If they're not too late.
No tablet is going to be a winner based on CPU and GPU specs alone. Having played extensively with a Xoom, it's stunning how they manage to bungle the abundance of CPU and GPU power. Using the thing just feels wrong. Maybe that's because I'm too used to the Apple camp, but then maybe there's also a reason for that.

I think Andy Ihnatko described it best in his article:
No kidding. Go to a Best Buy and an Apple Store and watch how people respond to various devices. When they pick up an Android tablet, you can see their shoulders hunch and their brows tighten. Eyes dart. Fingers fidget. Certain other parts clench.

When they pick up an iPad, they visibly relax. I see this phenomenon all the time.
He might as well have been describing me: I really wanted to like the Xoom and buy one. And in benchmarks it's really fast and all that. But once again they manage to make it feel awkward. Not slow, but awkward.

As long as the Google boys don't get the emotional part, no amount of fast CPUs and GPUs is going to make a difference.

The OMAP4 may well be a winner compared to other non-Apple chips. But they'll keep on playing well-deserved second fiddle as long as these basics aren't fixed.
 
No tablet is going to be a winner based on CPU and GPU specs alone. Having played extensively with a Xoom, it's stunning how they manage to bungle the abundance of CPU and GPU power. Using the thing just feels wrong. Maybe that's because I'm too used to the Apple camp, but then maybe there's also a reason for that.

As someone else said, claiming that software performance doesn't matter because the processor will get faster next year is nonsense. In that case the heart of the problem doesn't get solved; rather, it's a clear attempt to run around it at a higher cost (more die area and/or higher frequencies).

And no, I'm not aiming at any specific products; IMHO it remains the case that a good SoC needs a fine balance between good hardware and good code. While there's obviously always room for criticism of Apple's solutions, their competition still doesn't seem to have reached a similar balance, and no, a faster CPU or GPU (or even both) doesn't obviously fix anything there.
 
The Hummingbird CPU might've grabbed the headlines, but OMAP's implementations of Cortex-family cores have always shown themselves to be exceedingly clockable, starting with ~500 MHz OMAP34xx parts being stable well over 1000 MHz. In fact, OMAP has always given power efficiency top priority.

Apple's semiconductor team were obviously going to have a top performer with their implementation of VXD/VXE and SGX, but the open market will have something to offer again when the A9600 lands.

The clock speeds ST-Ericsson are promising will make their competitors have to stretch a bit in order to compete.
 
I don't think I'd go as far as 'awkward' for the Xoom personally, and I'm not sure what could cause that emotional reaction except for a slightly longer learning curve. Which I don't think is a deal breaker; high-end Android phones have been selling quite well despite having a much worse learning curve. Overall however, I agree there's just not a very good reason to buy any Android tablet rather than an iPad 2.

I'm much more interested in the PlayBook personally. If the iPad's UI feels simple and smooth, the PlayBook's UI feels powerful and awesome. And honestly, I'm not convinced by Apple's 7-inch vs. 10-inch argument given the current pixel density. Maybe with a Retina Display (or maybe not), but certainly not at 1024x768.

And the PlayBook is an OMAP4 tablet, so I'm on-topic, amazingly enough! ;)
 
I don't think I'd go as far as 'awkward' for the Xoom personally, and I'm not sure what could cause that emotional reaction except for a slightly longer learning curve. Which I don't think is a deal breaker; high-end Android phones have been selling quite well despite having a much worse learning curve. Overall however, I agree there's just not a very good reason to buy any Android tablet rather than an iPad 2.

I'm much more interested in the PlayBook personally. If the iPad's UI feels simple and smooth, the PlayBook's UI feels powerful and awesome. And honestly, I'm not convinced by Apple's 7-inch vs. 10-inch argument given the current pixel density. Maybe with a Retina Display (or maybe not), but certainly not at 1024x768.

And the PlayBook is an OMAP4 tablet, so I'm on-topic, amazingly enough! ;)

Having played with a Xoom, what's "awkward" is what has been awkward on every Android device: that little intermittent bit of lag or inconsistency in UI effects. For example, scrolling through the home screens feels fairly fluid, with the exception of a small input delay from my finger moving. But if I have a stack of YouTube videos, flipping through them is choppy at best (no actual flip animation).

These are little things, of course, and don't detract from the functionality, but they're noticeable and make the device feel "slow".
 
But if I have a stack of YouTube videos, flipping through them is choppy at best (no actual flip animation).

These are little things, of course, and don't detract from the functionality, but they're noticeable and make the device feel "slow".
That looks to me like a lack of precaching combined with blocking I/O; Google apparently has/had a tendency to overuse blocking I/O, and it's only slowly improving. I agree it's a clear weakness; however, silent_guy explicitly said it wasn't 'slow', just awkward, so I still suspect that's not what he meant.
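Purely to illustrate the mechanism I mean (a toy Python sketch, nothing to do with Google's actual code): doing the load synchronously inside the flip handler blows well past a ~16 ms frame budget, whereas prefetching in the background returns immediately with whatever is already cached.

import time
from concurrent.futures import ThreadPoolExecutor

def load_thumbnail(index):
    time.sleep(0.05)                      # pretend: 50 ms of blocking disk/network I/O
    return "thumb-%d" % index

# Naive handler: the 50 ms load happens on the UI thread -> several dropped frames
def on_flip_blocking(index):
    return load_thumbnail(index)

# Prefetching handler: kick off upcoming loads in the background and return
# whatever is already cached (or a placeholder) without ever blocking
pool = ThreadPoolExecutor(max_workers=2)
cache = {}

def on_flip_prefetched(index):
    for i in (index, index + 1, index + 2):
        if i not in cache:
            cache[i] = pool.submit(load_thumbnail, i)
    fut = cache[index]
    return fut.result() if fut.done() else "placeholder"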
 
http://www.journaldugeek.com/2011/03/18/archos-gen9/

They claim a 1.6 GHz dual-A9 from TI in the next generation of Archos tablets (the current generation uses OMAP36xx variants, depending on the model).

The best guess would be an OMAP4440, but the white papers claim those are clocked at 1.5 GHz.
Not that the extra 100 MHz would be hard to achieve, given the OMAP3's clocking flexibility.

So these should be a lot faster than the iPad 2, CPU-wise (900 MHz vs. 1600 MHz), but slower GPU-wise (SGX543MP2 @ 200 MHz vs. SGX540 @ 380 MHz).



If priced as well as their predecessors, these tablets could be real winners. If they're not too late.

My guess is that it's NuSmart 2816 (http://www.nufront.com/en/cpzx/eed31e97-d916-4441-8aa1-6f6413a692f9155.html). I mean, I've got to eventually be right about that being in something.

I doubt they'd actually sell an overclocked part, and I doubt you'll see an OMAP4440 device in July.
 
Ohhh, that does make sense - for once I have to agree with you that's the most likely possibility :)
 
My guess is that it's NuSmart 2816 (http://www.nufront.com/en/cpzx/eed31e97-d916-4441-8aa1-6f6413a692f9155.html). I mean, I've got to eventually be right about that being in something.

I doubt they'd actually sell an overclocked part, and I doubt you'll see an OMAP4440 device in July.

I only assumed it was an OMAP4 because the news explicitly states it's a "TI A9".

Any idea what the GPU is on that NuSmart SoC? The fill rate and triangle rate numbers don't add up to any mGPU I know of.

EDIT: Looking again at the specs, a Mali-400 MP2 @ 400 MHz would fit the pixel and triangle rates.
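Back-of-the-envelope, assuming the usual one pixel per fragment core per clock (my assumption, not a figure from NuSmart's page):

pixel_cores, clock_mhz = 2, 400
print(pixel_cores * clock_mhz)    # -> 800 Mpixel/s theoretical peak fill rate

The triangle rate would come from the single geometry processor, so it scales with clock alone.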
 
That looks more like a desktop chip, but maybe you'll be right :)

Nufront themselves have demonstrated it in a netbook, a laptop, and a tablet. I'm sure if Ontario can find its way into tablets, so can this SoC. An ARM SoC intended only for desktops would be a real waste...
 
Nufront themselves have demonstrated it in a netbook, a laptop, and a tablet. I'm sure if Ontario can find its way into tablets, so can this SoC.
Oh, I didn't know; I only saw the small-box demo. I still find it odd to see a chip that supports DDR2/3 and has SATA2 interfaces being used in a tablet while claiming "Typical power consumption small than 2Watts", but that certainly doesn't rule it out for such uses. After all, we might see some Atoms in that space this year... or not :LOL:

It's known that Nufront is using the A9 hard macro, and given the quoted frequency it's the performance-optimized macro, so it's on the 40nm G process, not the LP one.

An ARM SoC intended only for desktops would be a real waste...
You should discuss that with the people who are doing desktop and server ARM SoCs :)

EDIT: I found some information about the Nufront demo tablet; it seems it was a 1.2 GHz part.
 