PowerVR GX5300 - new ultra-small core (0.55mm2)

GX5300, based on Series 5, is 0.55mm2 @ 28nm and is aimed specifically at wearables and other small-footprint applications.

IMG is really starting to go after the low end; it's OpenGL ES 2.0 and PVRTC compatible, which is a good basic conformance level for the areas they are targeting.

http://www.imgtec.com/news/detail.asp?ID=918

And a blog entry to match
http://blog.imgtec.com/powervr/powe...est-gpu-for-next-generation-wearables-and-iot

Interesting. Would have loved to see some power draw figures in the blog post, but alas.
 

The blog author responded to my question about how it compares to an SGX531 or SGX543 by saying it'll do "up to" 1 GFLOPS at the 280MHz frequency.

If Wikipedia is accurate, that puts it just about level with SGX520 (never productised), which is 2.6mm2 @ 65nm. In the early days, IMG did state typical die area for some of its cores. How that area extrapolates to TSMC's 28nm process, I don't know.

I wonder: is this a tweaked SGX520 by another name?
 
The IMG press release states 250MHz, not 280MHz. That's 4 FLOPs/clock, and the only thing I can think of would be 1 Vec2. Who cares anyway; with 1 TMU, and even a hypothetical 1 pixel per 2 clocks, fillrate is more than enough for wearables.
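
A quick back-of-the-envelope check of those figures (the Vec2 breakdown is only the speculation above, nothing IMG has confirmed):

```python
# Sanity-check the quoted numbers: "up to" 1 GFLOPS at the stated clock.
clock_hz = 250e6     # 250MHz per the IMG press release
peak_flops = 1e9     # ~1 GFLOPS per the blog author

print(peak_flops / clock_hz)   # 4.0 FLOPs/clock

# A single Vec2 ALU issuing a multiply-add per lane would account for it:
# 2 lanes * 2 ops (mul + add) = 4 FLOPs/clock.
```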
 
Using 0.57 as the area scaling factor per manufacturing node (more realistic than 0.5), going from 65nm to 28nm gives 0.57 * 0.57 * 0.76 (half node) -> ~0.25, which would project SGX520's 2.6mm2 to a ~0.65mm2 part at 28nm. I'd say that's pretty good scaling.
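
Spelling that estimate out (the per-node factors are assumptions, not foundry figures):

```python
# Project SGX520's quoted 2.6mm2 @ 65nm down to 28nm.
# 65nm -> 45nm -> 32nm are two full nodes; 32nm -> 28nm is a half node.
area_65nm_mm2 = 2.6
full_node, half_node = 0.57, 0.76    # assumed area scaling per shrink

scale = full_node * full_node * half_node    # ~0.25
print(round(area_65nm_mm2 * scale, 2))       # ~0.64mm2, vs 0.55mm2 quoted for GX5300
```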
 
I note this family brings full OpenCL 2.0 support. IMG has had GPGPU capabilities since the first Series 5 (albeit with support only for an embedded variant of OpenCL at that time). Speaking as a layman without great knowledge of these things, I suspect that the vast majority of Series 5 SoCs out there have run zero GPGPU workloads, and with the exception of Apple, the same might be said for Series 6 (one assumes Apple is doing some GPGPU compute). If it has been largely redundant, was it a poor design decision to require GPGPU compliance so early on, with what one assumes is a die-area penalty?


To be fair, all architectures on the market with unified ALUs have supported compute capabilities (AMD R600, Nvidia G80, all things Vivante, ARM Midgard, etc.). It turns out that if you have an ALU that can execute both pixel and vertex shaders, it must be versatile enough to be used for other compute workloads. My guess is that all you need from there is the ability to send the ALU results somewhere the CPU can read them (unified memory in SoCs, system RAM through PCI Express, etc.), instead of hardwiring the results towards an inaccessible framebuffer.

Series 5 was no exception.
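
To illustrate that last point, here's roughly what the round trip looks like from the host side. A minimal sketch using PyOpenCL (the kernel and buffer names are mine, and nothing here is Series 5 specific):

```python
import numpy as np
import pyopencl as cl

# Set up a context and command queue on whatever OpenCL device is available.
ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

# A trivial kernel: the same unified ALUs that run shaders execute this.
prg = cl.Program(ctx, """
__kernel void scale(__global const float *src, __global float *dst) {
    int i = get_global_id(0);
    dst[i] = src[i] * 2.0f;
}
""").build()

host_in = np.arange(16, dtype=np.float32)
mf = cl.mem_flags
dev_in = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=host_in)
dev_out = cl.Buffer(ctx, mf.WRITE_ONLY, host_in.nbytes)

prg.scale(queue, host_in.shape, None, dev_in, dev_out)

# The key step: results land in a buffer the CPU can read back,
# rather than being hardwired towards an inaccessible framebuffer.
host_out = np.empty_like(host_in)
cl.enqueue_copy(queue, host_out, dev_out)
print(host_out)
```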
 

OK, but neither AMD nor Nvidia is effectively competing in the same arena as IMG. I don't know much about Vivante, but as for ARM, the advent of Midgard hasn't stopped them continuing with their Mali 4xx series, which is not GPGPU compliant, and it continues to "win" sockets. Allowing for the fact that many of those design wins may be down to things unrelated to technical merit, the Mali 4xx series still gives ARM a set of IP that is neither GPGPU compliant nor, in fact, OpenGL ES 3.x compliant. Given the widely talked-up impending explosion of IoT and wearables, it would appear that this type of GPU has legs to run, pretty much unchallenged, for many years to come. It would appear to be a "miss" for IMG not to have something similarly targeted. All IMG Series 6 & 7 IP has GPGPU and OpenGL ES 3.x as a minimum spec; only the "ultra-small" GX5300 is an exception.

Has anyone categorically confirmed the GPU IP inside the iWatch? I know it has been reported to have an IMG driver, but I haven't seen anything that says which derivative it is.
 
I don't know much about Vivante, but as for ARM, the advent of Midgard hasn't stopped them continuing with their Mali 4xx series, which is not GPGPU compliant, and it continues to "win" sockets.

Yet the Mali 400MP4 in Exynos, or the ULP GeForce GPUs in Tegra 2/3/4, didn't show substantially superior performance-per-watt compared to Series 5 or Adreno 3xx back in 2011-2012.
ARM decided to iterate on the Utgard architecture probably because some SoC manufacturers asked for a GPU IP they could sell for nickels, or even bundle for free with a Cortex A7/A5 core license. Everything you see the Mali 4xx in is a bottom-of-the-barrel SoC.
 
There's a lot one could say in favor of one solution or the other. Honestly, though, folks: with the market penetration ARM has in the ULP SoC market with its CPU IP, I would have been VERY surprised if, either way, they hadn't gained some ground with their own GPU IP.
 
Yet the Mali 400MP4 in Exynos, or the ULP GeForce GPUs in Tegra 2/3/4, didn't show substantially superior performance-per-watt compared to Series 5 or Adreno 3xx back in 2011-2012.
ARM decided to iterate on the Utgard architecture probably because some SoC manufacturers asked for a GPU IP they could sell for nickels, or even bundle for free with a Cortex A7/A5 core license. Everything you see the Mali 4xx in is a bottom-of-the-barrel SoC.

Adreno and Tegra are used solely in-house, and so they are picked for entirely different reasons. I suspect that for a particular performance ceiling, Mali 4xx provides that same performance in a much smaller area than anything IMG has. Given that the "new" markets of IoT and wearables are, in theory, providing opportunities for big volumes of low-end/low-compliance GPUs, it would appear that IMG's IP portfolio remains uncompetitive for those design sockets.

But perhaps, having mostly lost that low end for quite some time now, in part for reasons unrelated to technical merit, they have decided that it is too difficult to get back in, and/or that there is no money in it.
 
What makes you think Mali-400 is much smaller than GX5300 for competitive perf and power, exactly?
 
I suspect that the vast majority of Series 5 SoCs out there have run zero GPGPU workloads, and with the exception of Apple, the same might be said for Series 6 (one assumes Apple is doing some GPGPU compute). If it has been largely redundant, was it a poor design decision to require GPGPU compliance so early on, with what one assumes is a die-area penalty?
I've heard Instagram uses the GPU for at least some of its filters. You could call this GPGPU as it's not using the pipeline to render triangles.
 
What makes you think Mali-400 is much smaller than GX5300 for competitive perf and power, exactly?

I can only go on the public info, of course, but when IMG's ultra-small GPU sector on the website consists of a single IP, quoted at 250MHz, I assume that's the kind of region it is targeting. Now you'll know (and I don't) whether, for the kind of performance that suggests, it is a competitive solution compared to a Mali targeting the same performance. I'm guessing from your response that it is :).

However, ARM has a range of Mali 4xx parts, and I think it's a fair assumption that this allows them to target systems with a relatively much higher performance requirement that still don't require much in terms of conformance. Once IMG gets to the ceiling of GX5300, the next IP up has OpenGL ES 3.0 and GPGPU conformance. It is this situation I was talking about when I said:

I suspect that for a particular performance ceiling, Mali 4xx provides that same performance in a much smaller area than anything IMG has

I didn't mean all situations, just ones that Mali 4xx can address but GX5300 cannot.

I could of course be talking out of my ass, adding 2+2 and not getting 4, but the evidence that GX5300 does not support ES 3.0 or GPGPU leads me to conclude that excluding those was part of reducing its footprint. If you are telling me that GX5300 can scale performance-wise to match any Mali 4xx implementation and still keep its size advantage, then... I'll just go and hide somewhere.
 
The market has generally moved on to OpenGL ES 3.0 as the baseline, dragged along mostly by Android. For markets where area and power are the paramount concerns, not the base feature level (or in many cases performance), GX5300 exists.

We don't really see much demand at all now for the middle ground, where ES 2.0 is fine but higher performance (at higher area) is needed and Mali-400 or Mali-450 in MP2-MP8 configurations are being designed in. As soon as you're considering an MP2+ Mali-4xx, we have ES 3.0 cores at competitive PPA to offer.

There isn't some magical middle ground, being designed into products today, that only Mali-4xx can serve and PowerVR can't address.
 
What makes you think Mali-400 is much smaller than GX5300 for competitive perf and power, exactly?

I for one wouldn't suggest so; I would however note that the idea for a GX5300 or anything similar came quite late compared to Mali-400. The same goes for the IMG XE line of cores.
 
XE had to come later; it's Rogue. And GX5300 is an SGX, just one specifically optimised to be as small and as low power as possible and aimed at IoT-esque things. Maybe we've done a bad job of explaining what GX5300 is and why it exists, but it is not a general-purpose Mali-400 competitor. The market has moved on; embedded GPU thinking needs to catch up a bit.
 
Good idea putting this in the appropriate thread; I forgot I'd started this one.

The market has generally moved on to OpenGL ES 3.0 as the baseline, dragged along mostly by Android.

It appears, rightly or wrongly, that ARM sees it differently, as they continue to develop Mali 4xx, having announced the Mali-470 only a few months ago.
 
Why do they see it differently? Mali-470 is for the same markets as GX5300.
 
I'm saying that Mali-470MP4 markets probably don't exist. Just because a config is possible doesn't mean you'll ever see it. T760MP16, T658, T678, T720MP8... you get the picture.
 