PowerVR Series 6 now official

http://www.thisislondon.co.uk/stand...ies-is-fired-by-growing-appetite-for-chips.do

Past projections/forecasts mentioned 500m units for the future; now all of a sudden they've upped the ante to 1B/year. Either they have some damn good deals up their sleeves that they aren't telling us about yet, or someone in their marketing department is smoking something hallucinogenic.
3 years ago, when they were shipping 10m a year, they forecast 200m+ for the financial year 2010-2011. Yesterday they confirmed that they did 245m in that year.
 
3 years ago, when they were shipping 10m a year, they forecast 200m+ for the financial year 2010-2011. Yesterday they confirmed that they did 245m in that year.
But if they did increase their forecast from 500m to 1B in a single year, that's not a small increase (any link to the old forecast?). So the question I would ask is this: are they simply forecasting higher volumes, or do they expect to penetrate higher-volume market segments with lower-end cores, thus also decreasing their ASPs? The latter would be more realistic, and may (for example) be based on Android penetrating emerging markets faster than previously thought.
 
But if they did increase their forecast from 500m to 1B in a single year, that's not a small increase (any link to the old forecast?)

I don't have a link, but the 500m figure often appeared on their financial graphs, albeit drawn such that the timeframe was not discernible. They did indicate a while back that after this year's results they would set a defined timeframe for 500m. They have now decided not to state a 500m target at all, but rather to go straight to 1B. One could read many things into that, but my reading is that 500m is only 2 years away (or just under it in 2 years and exceeding it in 3), and they felt it was not a big enough goal to aim for.

Clearly 1 billion requires much more than the phone segment, and indeed much more than just graphics/video IP. Note that they also said yesterday that they had signed their first licence for their Ensigma IP, a multi-format communications IP (both broadcast and two-way), with an unnamed tier-1 semi. Also note that in the last few months they have acquired other IP via Caustic and HelloSoft. I wonder what HelloSoft's volume is like, and whether they see a big ramp-up there?

Sorry, I have wandered badly off topic.
 
But if they did increase their forecast from 500m to 1B in a single year, that's not a small increase (any link to the old forecast?). So the question I would ask is this: are they simply forecasting higher volumes, or do they expect to penetrate higher-volume market segments with lower-end cores, thus also decreasing their ASPs? The latter would be more realistic, and may (for example) be based on Android penetrating emerging markets faster than previously thought.

Reading the preliminary results presentation from yesterday probably helps.

Here's the old forecast (slide 20):

http://www.imgtec.com/corporate/presentations/AGM10/index.asp?DisplayPage=20&#ViewTop

New (slide 37):

http://www.imgtec.com/corporate/presentations/prelims11/index.asp?DisplayPage=37&#ViewTop
 
IMG sees a future where just about every appliance will have its own IP address and be network-connected, so their programmable radio/communication (WiFi, etc.) cores factor into that sales projection.

Also, video decode/encode cores and power management "smart plug" cores make up some of the volume increase in the forecast.

IMG really needs a strategy for those secondary, low-power graphics/compositing cores that are turning up in future SoC designs. Mobile designers don't hesitate to trade area for power savings, and that approach might not stay limited to CPU and GPU cores.
 
Thanks to Tangey for identifying that graphics demo, a glimpse of which I had found earlier in this thread.

http://www.youtube.com/watch?v=YAIl08i0_-8

The character design looked like Digital Legends work; Xtreme Running is another in a line of demos that IMG had them make to showcase PowerVR.

Good complexity, and the animation is getting more fluid too. The aliasing on the self-shadowing stands out a bit. The puddle shader was a nice touch.
 
It was a follow-up to the question I had posed about it earlier in the thread after finding a screenshot.

I probably found the screenshot after linking to some trade show coverage where Series 6 got a mention.

But yeah, I've dragged it far enough off topic.
 
Missed that article before. RTX seems interesting. Thanks for the link.

I had a very hard time understanding the gibberish the online translator gave me. OT, but I'm too bored to look up the related thread: on the Investors' site they've announced that MediaTek has also licensed Series5XT.
 
IMG have given us some more info regarding Rogue:-

http://www.imgtec.com/News/Release/index.asp?NewsID=666

Initial cores are the G6200 and G6400.

"Based on a scalable number of compute clusters the PowerVR Rogue architecture is designed to target the requirements of a growing range of demanding.......

"The first PowerVR Series6 cores, the G6200 and G6400, have two and four compute clusters respectively."

"PowerVR Series6 GPU cores are designed to offer computing performance exceeding 100GFLOPS (gigaFLOPS) and reaching the TFLOPS (teraFLOPS) range enabling high-level graphics performance from mobile through to high-end compute and graphics solutions"

"....PowerVR Series6 GPUs can deliver 20x or more of the performance of current generation GPU cores targeting comparable markets. This is enabled by an architecture that is around 5x more efficient than previous generations."

All members of the Series6 family support all features of the latest graphics APIs including OpenGL ES ‘Halti’*, OpenGL 3.x/4.x, OpenCL 1.x and DirectX10 with certain family members extending their capabilities to full WHQL-compliant DirectX11.1 functionality.

If we take everything at face value, I feel it's likely that the A9600 is a dual-core Rogue, i.e. a G6200.
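Just to put those press-release numbers in rough perspective, here's a back-of-envelope sketch (the per-cluster figure and the linear-scaling-with-clusters assumption are mine, not IMG's):

```cpp
#include <cstdio>

// Back-of-envelope maths from the press release figures:
// ">100 GFLOPS" for the initial cores, "TFLOPS range" at the top end,
// and 2/4 compute clusters for the G6200/G6400 respectively.
int main() {
    const double entry_gflops   = 100.0; // stated floor for the first cores
    const int    g6200_clusters = 2;
    const int    g6400_clusters = 4;

    // Assumption: throughput scales roughly linearly with cluster count
    // at a fixed clock (pure speculation on my part).
    const double gflops_per_cluster = entry_gflops / g6200_clusters;

    printf("~%.0f GFLOPS per cluster (if the 100 GFLOPS maps to the G6200)\n",
           gflops_per_cluster);
    printf("G6400 estimate: ~%.0f GFLOPS\n",
           gflops_per_cluster * g6400_clusters);
    printf("Clusters needed for 1 TFLOPS at the same clock: ~%.0f\n",
           1000.0 / gflops_per_cluster);
    return 0;
}
```

Clock, process and per-cluster ALU width could obviously move all of these numbers around considerably; this is only meant to show the spread the PR is claiming.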
 
If we take everything at face value, I feel it's likely that the A9600 is a dual-core Rogue, i.e. a G6200.

Dual core would be grim news in terms of TMU counts per core. I have the feeling, though, that it'll take quite some time until we untangle the mystery behind "compute cluster" and, by extension, even more so "core" when it comes to Rogue.
 
All members of the Series6 family support all features of the latest graphics APIs including OpenGL ES ‘Halti’*, OpenGL 3.x/4.x, OpenCL 1.x and DirectX10 with certain family members extending their capabilities to full WHQL-compliant DirectX11.1 functionality.
There's a little bit of an inconsistency there. They say all Series6 will support OpenGL 4.x and DX10 while some will support DX11.1. What about vanilla DX11? Don't OpenGL 4.x and DX11 pretty much go hand in hand? If so, does that mean all Series 6 will in fact support vanilla DX11, with only DX11.1 being optional?
 
There's a little bit of an inconsistency there. They say all Series6 will support OpenGL 4.x and DX10 while some will support DX11.1. What about vanilla DX11? Don't OpenGL 4.x and DX11 pretty much go hand in hand?

Rogue cores will range, according to market demands, between DX10 and DX11.1. I've no idea if there are going to be DX11.0 Rogues, but considering that 11.1 is a relatively small update to the former and adds quite an important check for whether a TBDR is at work, it sounds a wee bit awkward to go all the way to 11 and then not include the 11.1 requirements as well.

It's my understanding that under DX11.1 the API can check whether there's a TBDR at work; if there is, it skips the unnecessary CPU-side geometry sorting for early-Z, which should give a nice performance improvement. I for one would want such a goody ;)

If so, does that mean all Series 6 will in fact support DX11?

Nope. Why waste a ton of transistors on capabilities for specific markets where you don't actually need more than DX10? I wouldn't think that an Android device needs anything like DX11 if it's going to use OGL_ES3.0/Halti for graphics. Au contraire, a Win8 device would be better off with DX11.

In cases where fewer capabilities are needed, and if there's headroom, licensees could invest that transistor difference in higher performance.
 
It's my understanding that under DX11.1 the API can check whether there's a TBDR at work; if there is, it skips the unnecessary CPU-side geometry sorting for early-Z, which should give a nice performance improvement. I for one would want such a goody ;)

http://msdn.microsoft.com/en-us/library/hh404455(v=VS.85).aspx

I am curious to know what the API will do when faced with Mali/Adreno. Or has MS also bought into the deferral fantasy of Mali/Adreno?
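For what it's worth, later D3D11 headers do expose exactly this sort of query. I'm not certain which 11.x point release introduced it, but a minimal sketch of how an app could ask the runtime whether it's running on a TBDR (assuming a Windows 8.1-era SDK and an already-created device) looks like this:

```cpp
#include <d3d11_2.h>  // Windows 8.1-era SDK header; pulls in the base d3d11.h

// Minimal sketch: query the driver's architecture info and read the
// tile-based deferred renderer flag. On runtimes that predate the
// ARCHITECTURE_INFO query the call fails, so treat that as "unknown".
bool IsTileBasedDeferredRenderer(ID3D11Device* device) {
    D3D11_FEATURE_DATA_ARCHITECTURE_INFO info = {};
    HRESULT hr = device->CheckFeatureSupport(
        D3D11_FEATURE_ARCHITECTURE_INFO, &info, sizeof(info));
    return SUCCEEDED(hr) && info.TileBasedDeferredRenderer;
}
```

Presumably a Mali or Adreno driver would simply report FALSE here and the app would keep doing its own front-to-back sorting, which is exactly the question.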
 
Rogue cores will range, according to market demands, between DX10 and DX11.1. I've no idea if there are going to be DX11.0 Rogues, but considering that 11.1 is a relatively small update to the former and adds quite an important check for whether a TBDR is at work, it sounds a wee bit awkward to go all the way to 11 and then not include the 11.1 requirements as well.

Nope. Why waste a ton of transistors on capabilities for specific markets where you don't actually need more than DX10? I wouldn't think that an Android device needs anything like DX11 if it's going to use OGL_ES3.0/Halti for graphics. Au contraire, a Win8 device would be better off with DX11.
http://www.imgtec.com/News/Release/index.asp?NewsID=666

All members of the Series6 family support all features of the latest graphics APIs including OpenGL ES ‘Halti’*, OpenGL 3.x/4.x, OpenCL 1.x and DirectX10 with certain family members extending their capabilities to full WHQL-compliant DirectX11.1 functionality.

But the press release said that all Series6 GPUs will support OpenGL 4.x, which includes tessellation and other capabilities that are also part of DX11 and which otherwise exceeds a basic DX10 design. As you point out, if they're supporting DX11, they might as well go DX11.1 for those GPUs that require it. But if all Series6 GPUs already support OpenGL 4.x, is there still a huge transistor investment to go DX11?
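On the tessellation point: from an application's side, the check for whether a driver really exposes GL 4.x-level tessellation is straightforward. A sketch, assuming a current GL 3.0+ context and a loader such as GLEW (both assumptions on my part):

```cpp
#include <cstring>
#include <GL/glew.h>  // any modern GL loader works; GLEW is just an example

// Sketch: tessellation shaders are core from OpenGL 4.0, and older
// contexts may still expose them via the ARB extension.
bool HasTessellation() {
    GLint major = 0;
    glGetIntegerv(GL_MAJOR_VERSION, &major);
    if (major >= 4) return true;  // core feature from GL 4.0 onwards

    GLint n = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &n);
    for (GLint i = 0; i < n; ++i) {
        const char* ext =
            reinterpret_cast<const char*>(glGetStringi(GL_EXTENSIONS, i));
        if (ext && std::strcmp(ext, "GL_ARB_tessellation_shader") == 0)
            return true;
    }
    return false;
}
```

So if every Series6 member really passes that check, the tessellation hardware is there regardless of which DX level the core is certified for, which is what makes the DX10 floor in the PR look odd.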
 