What are the Pros & Cons of having a PowerVR GPU in a Next Gen Console?

onQ

Veteran
I don't really have much input on this, but the things that I do know about the PowerVR Series 5 GPUs make me think that a PowerVR Series 6 with a high number of cores would be perfect for a next-gen console.

From the PlayStation Vita thread:

We don't have any real specs for the Series 6 yet, but going off the Series 5 specs & knowing that the Series 6 can scale to 32 cores or more, this could be a very powerful GPU for a console that might be cheaper & use less power.


I know the people here know a lot more than me when it comes to GPUs & console designs, so I would like to hear the pros & cons of using a PowerVR GPU in a console in this day & age.
 
For the upsides, ask SONY how "disappointed" they are with the SGX543MP4+; one major downside would be that console manufacturers like Microsoft and SONY might have been in a rush over which of the two will launch its next-generation console first, or at least not launch significantly later than the other. If anyone were willing to tell you when those two restarted evaluating various IP for upcoming consoles, the riddle would be much easier to solve.

As for Rogue/Series6 itself, it's still a major question mark as far as its detailed capabilities and efficiency go. It should score quite high on both counts, but without any details no one can guess how it compares to competing architectures from AMD or NVIDIA. And what exactly is the fact that the design can scale over N amount of cores supposed to prove? It's not as if AMD's Fusion GPUs aren't scalable either, to pick just one example; quite the contrary.

At this point both Microsoft and SONY have decided quite some time ago what their next generation consoles will contain: http://forum.beyond3d.com/showpost.php?p=1601559&postcount=675
 
I don't really have much input on this, but the things that I do know about the PowerVR Series 5 GPUs make me think that a PowerVR Series 6 with a high number of cores would be perfect for a next-gen console.



We don't have any real specs for the Series 6 yet, but going off the Series 5 specs & knowing that the Series 6 can scale to 32 cores or more, this could be a very powerful GPU for a console that might be cheaper & use less power.


I know the people here know a lot more than me when it comes to GPUs & console designs, so I would like to hear the pros & cons of using a PowerVR GPU in a console in this day & age.

I guarantee I don't know more about GPUs than you!

Deferred rendering is becoming popular again; perhaps the PowerVR designs with their embedded tile buffers could avoid having to make a straight choice between a fat bus (e.g. 256-bit) or dedicating a lot of die space to eDRAM video memory (like the 360). If they offer a sufficient cushion for local (tile-based) framebuffer operations while being able to dedicate more die area to shaders or CPU cores, perhaps they could offer a real alternative to Nvidia or AMD?
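The bandwidth argument can be put into rough numbers. A minimal sketch of the idea; all figures below (resolution, overdraw, bytes per access) are illustrative assumptions, not data for any real GPU:

```python
# Illustrative estimate of external framebuffer traffic that an on-chip
# tile buffer could keep off the memory bus. All figures are assumptions.

def framebuffer_traffic_gbs(width, height, fps, bytes_per_access, accesses):
    """External bandwidth in GB/s if framebuffer accesses go off-chip."""
    return width * height * bytes_per_access * accesses * fps / 1e9

# 1080p @ 60 fps, 8 bytes per access (color + depth), 3x overdraw
immediate_mode = framebuffer_traffic_gbs(1920, 1080, 60, 8, 3.0)

# A TBDR resolves each tile on-chip and writes the final color out once
tile_based = framebuffer_traffic_gbs(1920, 1080, 60, 4, 1.0)

print(f"immediate-mode estimate: {immediate_mode:.1f} GB/s")
print(f"tile-based estimate:     {tile_based:.1f} GB/s")
```

The gap grows with overdraw and with fat G-buffers, which is exactly why deferred shading makes the tile buffer interesting.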

I'd like to see a PowerVR equipped PS4.
 
Could PowerVR really build a GPU that is competitive with an NVIDIA or ATI chip? I know very little about the company or the hardware they make, but I do know they haven't produced a high end GPU in a very long time. Why do we think they could simply come out of nowhere with a better chip than the guys who have been doing it for years?

Rys, where art thou? And don't worry about NDAs, I promise we won't tell anybody :D
 
With the paradigm shifting more toward deferred shading, a PowerVR Series 6 MP32 could be one of the best answers: the chance to reach 2+ TFLOPs (144 flops per cycle per core?)* at 500MHz with low wattage may make it the ideal choice for closed-box consoles.


* http://forum.beyond3d.com/showpost.php?p=1557247&postcount=11


"Quote:
Originally Posted by mczak
I'm not quite sure what the chips are missing for DX10 compliance, and if that really would cost that much. Some though definitely don't have the required precision (for the ALUs for instance, and also z-buffer) which makes them not even really DX9 compliant.


Answer: All SGX parts meet shader precision requirements for Dx9 SM3.0

Quote:
210Gflops for Rogue? Where did you get that number? IIRC SGX543 has got something like 16 (fp32) flops / clock which at usual clocks is about 2 orders of magnitude lower, so that would be more than a drastic increase.

Answer: SGX543/544 are actually 36 flops/clock for a single core, 554 is 72 flops/clock also for a single core. So not sure how you get to two orders of magnitude.

John."
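For scale, the quoted flops/clock figures translate into GFLOPs straightforwardly; the core counts and clock speeds below are illustrative assumptions, not confirmed specs:

```python
# Sanity-checking the quoted flops/clock figures. The core counts and
# clock speeds below are illustrative assumptions, not confirmed specs.

def gflops(flops_per_clock_per_core, cores, clock_ghz):
    return flops_per_clock_per_core * cores * clock_ghz

sgx543mp4 = gflops(36, 4, 0.2)   # quad-core SGX543 at an assumed 200 MHz
sgx554mp4 = gflops(72, 4, 0.3)   # quad-core SGX554 at an assumed 300 MHz

print(f"SGX543MP4: {sgx543mp4:.1f} GFLOPs")
print(f"SGX554MP4: {sgx554mp4:.1f} GFLOPs")
```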
 
What is the largest PVR chip, in regards to area and TDP, in the last 4 years?
 
So PowerVR hardware is not even DX10 compliant? Well, the next gen consoles will be DX11+ compliant, so either PowerVR is working on something that is, or they aren't in the next gen consoles.
 
Chances are they aren't in the next gen consoles. They are in the Vita. They should have been in the 3DS. IMG is a big player in the mobile market, but again they do not manufacture the chips themselves and instead license their IP to other companies. I've no idea what the costs are, but it's got to be better than paying a company like Intel or Nvidia to make the chips for you.

The cons to using PowerVR in a console are numerous. They haven't been in the high end for years. We've little idea of the performance and how well the cores scale with say a 32 core GPU. Not sure if high geometry counts would be a chore on Rogue, I hope not! What about tessellation, does that go well with PowerVR?

Some pros I see are on the Sony side really, seeing as they have PowerVR in the Vita. They could have a 32-core Rogue in PS4 or whatever and perhaps make ports that much easier. They could tap into the knowledge they've gained about the architecture from Vita and apply that to PS4 titles.

I'd love to see IMG make a comeback into consoles. They would have stayed around if SEGA hadn't gotten out of the game so quickly, I am sure! Here's a question: where would IMG be now if SEGA had stuck it out with the Dreamcast and actually managed to remain in business long enough to do another console? Interesting question, but only as it pertains to IMG.
 
What about the fact that it might run cooler, use less power & be a lot cheaper?

That would save money & space because it wouldn't take as much to keep it cool, & the money & space saved by using the PowerVR could be used for other parts, like more / better RAM.

& since the PowerVR Series 6 will be used in so many other products, over the years the price will go even lower. & down the line, when the Series 7 comes out, if the console is still being sold the newer models could easily use the Series 7 without losing compatibility with the older software.

& consoles are for far more than gaming now, so it would be best for them to have a low-powered console that can stay on all the time for its media capabilities.
 
I would hope console makers have the option of building prototypes before signing for a chip.
The best way to see if it works well is the Vita (which might be too early in its life cycle at this stage), plus building a prototype to test.
 
What about the fact that it might run cooler, use less power & be a lot cheaper?

First two are unknowns due to lack of details.

GPU IP royalties so far are in the $4-5 ballpark per unit sold for either the PS3 or the XBox360. If memory serves well, depending on the age of IMG's GPU IP, royalties should range from something like a dozen cents to $1 for today's embedded designs. Granted, a console would be a high end design, but I wouldn't suggest that IMG or any other IP vendor could afford to sell a gazillion-core IP at a significantly lower price than the above.

& the fact that the PowerVR Series 6 will be used in so many other products over the years the price will go even lower, & down the line

I don't think royalties per chip change during the lifetime of a console. And what exactly does Series6 being employed in N amount of products have to do with any console manufacturer? It's N amount of products spread over X amount of SoC manufacturers and Y amount of OEMs. It's not like, if SONY had licensed Series6, they'd also have an exclusive deal with IMG to manufacture all of their products.

If it helps, AMD/NV power a "lot" of products too.

That said the older GPU IP gets, the lower the price. However in a licensing deal the licensee has to pay royalties per chip as the licensing contract foresees.

when the Series 7 come out & if the console is still being sold the new models of the console could easily use the Series 7 without losing compatibility with the older software.

Oh, I read yesterday the specifications of Series666; I can tell you they were hellishly mind-blowing :devilish:

& Consoles are for far more than gaming now so it would be best for them to have a low powered console that can stay on all the time for it's media capabilities.

It shouldn't take a wizard nowadays to see that both XBoxNext and PSx will most likely contain SoCs, exactly because they probably wanted to reduce manufacturing costs in the first place. IMG's Rogue would have been a fine choice for either one, but I'm as sure as I can be that you're still barking up the wrong tree. Alea iacta est.
 
Chances are they aren't in the next gen consoles. They are in the Vita. They should have been in the 3DS. IMG is a big player in the mobile market, but again they do not manufacture the chips themselves and instead license their IP to other companies. I've no idea what the costs are, but it's got to be better than paying a company like Intel or Nvidia to make the chips for you.

Both RSX/PS3 and Xenos/XBox360 are based on GPU IP and besides the original licensing fees, NVIDIA and AMD still receive royalties per unit sold.

The cons to using PowerVR in a console are numerous. They haven't been in the high end for years.

There's quite a difference between a single high end monster GPU core and a GPU cluster within a SoC consisting of a gazillion small cores. The only other headache is to get as close as possible to linear performance scaling, which according to IMG at least their multi-core configs do, thanks to hw assistance and not just some sw hack like AFR.

We've little idea of the performance and how well the cores scale with say a 32 core GPU. Not sure if high geometry counts would be a chore on Rogue, I hope not!

On Series5XT geometry scales at 95% with multiple cores.
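One plausible reading of that 95% figure (an assumption on my part, not IMG's definition) is that each core beyond the first contributes 95% of a single core's geometry rate:

```python
# Assumed interpretation of "geometry scales at 95%": each core beyond
# the first adds 95% of one core's throughput. Not IMG's definition.

def effective_cores(n, efficiency=0.95):
    return 1 + (n - 1) * efficiency

for n in (2, 4, 16, 32):
    print(f"{n:2d} cores -> ~{effective_cores(n):.2f}x single-core geometry rate")
```

Under that reading a 32-core config would still deliver roughly 30x a single core's geometry rate, which is close enough to linear for a console design.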

What about tessellation, does that go well with PowerVR?

Very good question. That's probably one of the biggest points of interest: how they solved programmable tessellation under DX11.

I'd love to see IMG make a comeback into consoles. They would have stayed around if SEGA hadn't gotten out of the game so quick I am sure! Here's a question. Where would IMG be now if SEGA stuck it out with the Dreamcast and actually manage to remain in business long enough to do another console? Interesting question, but only as it pertains to IMG.

IMG's best ever business decision was to concentrate on the embedded market and leave high end designs behind. No idea what the company was worth during the Dreamcast days, but it surely wasn't more than a couple hundred million $ at best. Nowadays, after tremendous growth, they're worth something in the $2 billion league. How many chips would IMG have sold with a console design? 70-80M at best over =/>5 years? For 2010 alone they reached 245M units with their IP.

IMG probably would be nowhere today if it hadn't changed strategy in the past.
 
What is the largest PVR chip, in regards to area and TDP, in the last 4 years?

We can only have an idea, but looking at the data for the SGX543/544 (8mm^2 per core at 65nm*) and the consumption of the present OMAP4430 (with one SGX543 at 304MHz on 45nm) being in the order of milliwatts (less than 200 milliwatts per core at 200MHz), we have some reasonable basis to speculate that Rogue has at least twice the capability per core of the SGX554 (which itself has twice the power of the SGX543... see my post above).

*
http://en.wikipedia.org/wiki/PowerVR

About OMAP 4430:

http://en.wikipedia.org/wiki/Texas_Instruments_OMAP

http://www.ti.com/general/docs/wtbu...ateId=6123&navigationId=12843&contentId=53243


http://www.ti.com/general/docs/wtbu...teId=6123&navigationId=12862&contentId=101230

Block diagram:
http://www.phytec.com/products/som/Cortex-A9/phyCORE-OMAP4430.html

Here they measure TDP of OMAP4430:
http://www.vectorfabrics.com/blog/item/power_consumption_omap4430_pandaboard


My 2 cents is a Rogue MP32 could fit in 250/260mm^2 at 28nm, reach 2+ TFLOPs at 500MHz, and draw less than 20 watts (whether it works as well as AMD and Nvidia... that's another story...).
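A back-of-envelope check of that guess, using the figures quoted earlier in the thread; the process shrink factor and the Rogue-vs-SGX543 size ratio are loose assumptions, not known figures:

```python
# Back-of-envelope check of the MP32 guess. The shrink factor and the
# Rogue-vs-SGX543 size ratio are loose assumptions, not known figures.

CORES = 32
SGX543_AREA_65NM = 8.0        # mm^2 per core at 65nm (figure quoted above)
SHRINK_65_TO_28 = 0.25        # assume ~4x density gain from 65nm to 28nm
ROGUE_SIZE_RATIO = 4.0        # assume a Rogue core is ~4x an SGX543 core

area_mm2 = CORES * SGX543_AREA_65NM * SHRINK_65_TO_28 * ROGUE_SIZE_RATIO
print(f"speculative MP32 die area: ~{area_mm2:.0f} mm^2")

MW_PER_CORE_200MHZ = 200      # OMAP4430-derived figure quoted above
clock_scale = 500 / 200       # crude linear scaling of power with clock
watts = CORES * MW_PER_CORE_200MHZ / 1000 * clock_scale
print(f"speculative MP32 power: ~{watts:.0f} W (very rough)")
```

Both numbers land in the same ballpark as the 250/260mm^2 and sub-20W guesses, though power in particular scales worse than linearly with clock in practice.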
 
If claims here in the handheld forum were correct the SGX543MP4+(@200MHz) in the PS Vita should be around 35mm2 at Samsung 45nm.

My 2 cents is a Rogue MP32 could fit in 250/260mm^2 at 28nm, reach 2+ TFLOPs at 500MHz, and draw less than 20 watts (whether it works as well as AMD and Nvidia... that's another story...).

No idea yet about die area and/or power consumption for Rogue. However, a 500MHz frequency is quite modest for a hypothetical high end console design at 28nm. 500MHz for GPU blocks will be commonplace for 28nm/2012 smartphone/tablet designs. Tegra 3, albeit manufactured on TSMC 40nm, should already have its GPU clocked at 500MHz in the tablet variant, and I'd be very surprised if upcoming SoCs with single or MP SGX544s aren't clocked at least at 500MHz.

Reverse speculative math based on the ST-Ericsson NovaThor A9600: they're claiming >210 GFLOPs, >5 GTexels fill-rate (without overdraw) and >350M Tris. Assuming it's a quad-core design, a possible per-core scenario would be 8 Vec5 ALUs and 2 TMUs, clocked at 667MHz.

8 * 10 FLOPs * 0.667GHz = 53.36 GFLOPs
2 * 667MHz = 1.33 GTexels

53.36 * 4 cores = 213.44 GFLOPs
1.33 * 4 cores = 5.32 GTexels

...and that's still high end smart-phone/tablet ballpark for the A9600. If true you'd need roughly 37 cores to reach the 2 TFLOPs mark at 667MHz, or as an alternative clock at 1GHz and get away with 25 cores.

***edit: if you now wanted to translate VecX ALUs into the typical SP (stream processor) parlance of the desktop GPU world, you'd have 1000 SPs for 25 cores and 1480 SPs for 37 cores. I'm fairly sure that I'll be proven wrong in the end, but that's what speculative (reverse) math is good for.
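The arithmetic above can be reproduced in a few lines; the per-core ALU/TMU layout is this thread's assumption, not a confirmed specification:

```python
# Reproducing the reverse-speculative A9600 math. The per-core ALU/TMU
# layout is the thread's assumption, not a confirmed specification.

FLOPS_PER_ALU = 5 * 2          # Vec5 ALU, multiply-add = 2 flops per lane
ALUS, TMUS = 8, 2              # assumed per-core configuration
CLOCK_GHZ = 0.667

gflops_per_core = ALUS * FLOPS_PER_ALU * CLOCK_GHZ
gtexels_per_core = TMUS * CLOCK_GHZ

print(f"per core : {gflops_per_core:.2f} GFLOPs, {gtexels_per_core:.2f} GTexels")
print(f"quad core: {gflops_per_core * 4:.2f} GFLOPs, {gtexels_per_core * 4:.2f} GTexels")

# Cores needed to hit the 2 TFLOPs mark
print(f"@667 MHz: {2000 / gflops_per_core:.1f} cores")
print(f"@1 GHz  : {2000 / (ALUS * FLOPS_PER_ALU * 1.0):.1f} cores")
```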
 
Cell + PVR would be an interesting (though ironic) combo ...

In such an arrangement, I'd expect a fairly beefy Cell SPU count to make up for the lack of dx11 features, but if the core count of both is sufficient, it should be rather potent, efficient, and flexible.
 
Thanks!
I thought I remembered seeing something along those lines about DX11 compatibility for new PowerVR chips ...

In that case, it would have to be an evaluation on Sony's part of whether the added silicon for DX11 is better spent on more SPU cores or left out, as it seems to be a modular add-on from PVR.
 
http://www.imgtec.com/News/Release/index.asp?NewsID=631
"Cores in the POWERVR Series6 family, which is codenamed ‘Rogue’, support from DirectX 10 up to DirectX 11."
...

Albeit just a relatively minor update, I'd be very surprised if it weren't also up to DX11.1. Isn't DX11.1 checking whether there's a TBDR at work in order to offload the CPU from geometry sorting with early Z?

In that case, it would have to be an evaluation on Sony's part of whether the added silicon for DX11 is better spent on more SPU cores or left out, as it seems to be a modular add-on from PVR.

DX10 "downgrades" (i.e. skipping some parts of the logic to save die area) for Rogue are mostly for cases where the feature overhead is redundant. A partner might want DX11 for a future Win8 device, yet not necessarily for an OGL_ES (Halti) based environment. SONY is still evaluating IP? I thought it had been a done deal for quite some time now.
 
SONY is still evaluating IP? I thought it's a done deal for quite some time now.

Indeed Sony should have had hardware design(s) set in stone by now.

Not sure if that is the case or if they have multiple designs they are toying with.

Hopefully Sony isn't planning to launch too late ...
 
The thing with that ChefO is, with the PS3 they had the GPU set pretty late in the overall development of the machine.

Thank you Ailuros for answering these things for me. So the geometry scales very well, and is assisted by logic in the chip itself? You may be right in terms of IMG being nowhere if it hadn't focused on the embedded market when it did. Were they involved at all with the development of combining the SH4 and CLX2 on one chip? It is quite interesting that, in a sense, they went from the very high end (at least in terms of arcade hardware) to the low end and have reaped the rewards. I for one am glad to see that they have not fizzled out and died, and instead have succeeded in a rather competitive marketplace. It is one amazing company!
 