PowerVR Series 6 now official

Can it be coincidental that iOS 7.0 is listed in the build?

iOS 7.0 has been listed in the build for some time now, and the FPS didn't alter; only in the last 24 hours did the performance improve, so it must be driver-related. Now, whether these drivers are unique to 7.0 or compatible with 6.0.x, I can't say, but given that the majority of iPad 4 owners will update to the latest version of iOS when it's available, it probably doesn't matter.
 
Can it be coincidental that iOS 7.0 is listed in the build?

No idea; however, keep in mind that, if memory serves, the iPad 4 started out in GLB2.7 with ~840 frames, meaning the score has since improved by about 28%.
 
While the 330 impressively tops the GeForce implementation in Shield (let alone how it'll compare to a direct Tegra 4i competitor when that eventually shows up), I still believe that Rogue in the next iPad, and probably even the next iPhone (if Apple doesn't try to slim the profile of its already-disadvantaged-by-size 4.0" phone too much), should easily top the more forward-looking/demanding graphics tests, like those in GFXBench.

However, with the higher variability being introduced into the power profiles of recent SoCs, I wonder if the advantage wouldn't start to be noticeable in a more relevant real-world context, like a high-end gaming session where the less efficient SoC/GPU might be power/thermally limited more frequently. Though I can feel the SoC heating up during an extended session of Infinity Blade II on my iPad, I've never noticed any performance inconsistency as the power/thermals are managed.
 
It's also improved relative to other competitors, though. It's now ahead of all Adreno 320 implementations, for instance.
 
Yeah, that was 2.5, Egypt HD.

All GPUs breathed at least a little sigh of relief with that workload optimization, though some definitely more than others.
 
From my (perhaps simplistic) view, I don't easily see how a next-gen iPhone with a G6200 will outperform the 330, going by the marketing data that is available on the IMG blog.

http://withimagination.imgtec.com/i...ervr-series6-gpus-to-a-mobile-device-near-you

The bar chart, which has been confirmed to be normalised for frequency, shows the 6200 giving around 4x the performance of a 544MP1. I understand, of course, that this is marketing, but it's all we have to go on at this time.

The iPhone 5 is running a 543MP3 @ 325MHz(?) and is getting 350-ish in GLB2.7. (I assume the 543 and 544 have the same graphics performance, with the 544 having better compliance?)

The bar chart would suggest that a G6200 provides about a 1.3x improvement over a 543MP3 at the same frequency.

A target of 1200 in GLB2.7 requires a 3.5x improvement.

That would need a G6200 running at around 750MHz, which I feel is not realistic.

Assuming a G6400 gives a 2x improvement over a G6200, a G6400@375MHz would get there, but will Apple be that aggressive? If so, you might well see a G6400 in both the iPhone and iPad, with the iPad clocking it higher?
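To make the steps explicit, here's the same back-of-the-envelope as a tiny Python sketch. Every input is one of the speculative figures above (the marketing ratio and guessed clocks), not a confirmed spec:

```python
# All inputs are the speculative figures from above, not confirmed specs.
base_frames = 350        # iPhone 5 (543MP3 @ ~325MHz) in GLB2.7, roughly
base_clock_mhz = 325     # assumed 543MP3 clock
per_clock_gain = 1.3     # G6200 vs. 543MP3 at equal frequency, per the IMG bar chart
target_frames = 1200     # the GLB2.7 target discussed above

g6200_at_base = base_frames * per_clock_gain                 # ~455 frames @ 325MHz
needed_mhz = base_clock_mhz * target_frames / g6200_at_base  # ~857MHz

print(f"G6200 @ {base_clock_mhz}MHz: ~{g6200_at_base:.0f} frames")
print(f"Clock needed for {target_frames} frames: ~{needed_mhz:.0f}MHz")
# Strictly linear scaling lands even above the ~750MHz ballpark quoted above;
# either way the conclusion (an unrealistically high clock) is the same.
```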
 
Maybe I'm mixing it up with the 2.5 HD test; they reduced the geometry load on one of them, but I can't remember clearly.

There definitely was a heavy reduction in geometry load in 2.5 (shortly after its introduction); I hadn't heard anything about 2.7, but I won't insist either way since I don't know.
 
From my (perhaps simplistic) view, I don't easily see how a next-gen iPhone with a G6200 will outperform the 330, going by the marketing data that is available on the IMG blog.

http://withimagination.imgtec.com/i...ervr-series6-gpus-to-a-mobile-device-near-you

The bar chart, which has been confirmed to be normalised for frequency, shows the 6200 giving around 4x the performance of a 544MP1. I understand, of course, that this is marketing, but it's all we have to go on at this time.

The iPhone 5 is running a 543MP3 @ 325MHz(?) and is getting 350-ish in GLB2.7. (I assume the 543 and 544 have the same graphics performance, with the 544 having better compliance?)

The bar chart would suggest that a G6200 provides about a 1.3x improvement over a 543MP3 at the same frequency.

A target of 1200 in GLB2.7 requires a 3.5x improvement.

That would need a G6200 running at around 750MHz, which I feel is not realistic.

Assuming a G6400 gives a 2x improvement over a G6200, a G6400@375MHz would get there, but will Apple be that aggressive? If so, you might well see a G6400 in both the iPhone and iPad, with the iPad clocking it higher?

As Apple can absorb costs better than a typical MediaTek customer, and seems content with a high-IPC dual-core even in a $1,000 MacBook Air, I wouldn't be surprised if they stick with a dual-core SoC. In that case, why not use a low-clocked G6630 in the iPad 5 and a G6400 in the iPhone? The total die area probably wouldn't be much greater than that of a quad-core Snapdragon 800 in the iPhone's case.
 
I wouldn't be surprised if they stick with a dual-core SoC. In that case, why not use a low-clocked G6630 in the iPad 5 and a G6400 in the iPhone? The total die area probably wouldn't be much greater than that of a quad-core Snapdragon 800 in the iPhone's case.

With pretty much everyone at the high end running quad-core, some running 4+4, and some indeed looking at eight cores, I think Apple would struggle from a marketing perspective with "only" having dual-core anything in the smartphone, not only now, but given that this phone has to last through to September 2014.

Whether or not having more CPU cores is actually a major benefit, unless they can come up with some marketing magic I'm guessing the core count will change, even if it's just the addition of some low-end cores.
 
From my (perhaps simplistic) view, I don't easily see how a next-gen iPhone with a G6200 will outperform the 330, going by the marketing data that is available on the IMG blog.

http://withimagination.imgtec.com/i...ervr-series6-gpus-to-a-mobile-device-near-you

The bar chart, which has been confirmed to be normalised for frequency, shows the 6200 giving around 4x the performance of a 544MP1. I understand, of course, that this is marketing, but it's all we have to go on at this time.

Assuming that MediaTek's claim of a 46 fps score in GLB2.5 is correct, that's 4.5x that of an SGX544MP1@286MHz in the MT6589: http://gfxbench.com/device.jsp?benchmark=gfx27&D=Zopo%20C2

And yes, I still believe that the G6200 is clocked around 286MHz, the same as the 544 in the 6589, because that actually matches the stated GFLOP differences. Assuming linear frequency scaling, a G6200@500MHz should be </= 80 fps in GLB2.5.
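The linear-scaling arithmetic behind that, for clarity (a claimed score and an assumed clock, nothing measured):

```python
claimed_fps = 46.0   # MediaTek's claimed MT8135/G6200 score, GLB2.5 offscreen 1080p
assumed_mhz = 286.0  # assumed G6200 clock, matching the 544 in the MT6589

print(claimed_fps * 500.0 / assumed_mhz)  # ~80.4 fps @ 500MHz, hence "</= 80 fps"
```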

No idea, though, about GLB2.7 performance; and yes, it would be a crappy idea to extrapolate 2.7 performance, even on a speculative basis, from 2.5 scores.

The iPhone 5 is running a 543MP3 @ 325MHz(?) and is getting 350-ish in GLB2.7. (I assume the 543 and 544 have the same graphics performance, with the 544 having better compliance?)

The bar chart would suggest that a G6200 provides about a 1.3x improvement over a 543MP3 at the same frequency.
If there's no mistake in my speculative math above, you'd get 52 fps in GLB2.5 for a G6200@325MHz, while the iPhone 5 gets almost 30 fps: http://gfxbench.com/device.jsp?benchmark=gfx27&D=Apple iPhone 5

In terms of theoretical GFLOPs, the 543MP3 in the A6 is at 35+ GFLOPs@325MHz, while the G6200@325MHz is at 41.6 GFLOPs.
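Those GFLOP figures follow directly from the per-clock FLOPs listed in the breakdown further down; a quick check:

```python
mhz = 325
flops_per_clock_543 = 36     # per SGX543/544 core (same ALU config assumed)
flops_per_clock_g6200 = 128  # 2x SIMD16, 2 FMACs per lane

print(3 * flops_per_clock_543 * mhz / 1000.0)  # 543MP3: ~35.1 GFLOPs
print(flops_per_clock_g6200 * mhz / 1000.0)    # G6200:   41.6 GFLOPs
```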

A target of 1200 in GLB2.7 requires a 3.5x improvement.

That would need a G6200 running at around 750MHz, which I feel is not realistic.
Ignoring all the above (since it's a very bad idea to extrapolate from a previous architecture or from different benchmark results), I'd be VERY surprised if Apple were to break its "tradition" and not deliver yesteryear's tablet GPU performance in a new iPhone. For the record, when the iPhone 5 shipped it was slightly faster than the iPad 3 in GLB2.5.

Assuming a G6400 gives a 2x improvement over a G6200, a G6400@375MHz would get there, but will Apple be that aggressive? If so, you might well see a G6400 in both the iPhone and iPad, with the iPad clocking it higher?
Rogue is a new architecture with completely different latency characteristics from Series5; it's my understanding that Rogue was designed for much higher frequencies, and something like 500MHz should be nothing but peanuts under 28nm. Besides, before Apple used the first 543MP2 in its SoCs, the general consensus was that Apple "never" uses high-end hw.

Besides, let me take your 375MHz scenario: assuming they keep the multipliers, you could have 5*375 = 1875MHz for the CPU; so why not 4*500 = 2000MHz instead? Would a slightly pumped-up Swift CPU really have a hard time at a 2GHz frequency? In the meantime, the T4i with A9r4 cores will be clocked up to 2.3GHz, and likewise Qualcomm's Krait in the S800.

***edit:

let's move back for a second to the MT8135 graph:

http://withimagination.imgtec.com/w...compute-performance-of-MediaTek-platforms.png

http://withimagination.imgtec.com/i...ervr-series6-gpus-to-a-mobile-device-near-you

SGX531 = 2 Vec2 ALUs = 8 FLOPs/clock
SGX544 = 4 Vec4+1 ALUs = 36 FLOPs/clock
G6200 = 2*SIMD16 (2 FMACs/SIMD lane) = 128 FLOPs/clock
---------------------------------------------------------------------------
128/36 = 3.55x
36/8 = 4.5x
128/8 = 16x

SGX531 = 2 TMUs
SGX544 = 2 TMUs
G6200 = 4 TMUs

At 286MHz =
SGX531 = 20M Tris/s
SGX544 = 50M Tris/s
G6200 = 75M Tris/s (?)

GLB2.5 offscreen 1080p =

MT8377/SGX531 = 292 frames (2.6 fps)
MT6589/SGX544 = 1151 frames (10.2 fps) (3.92x difference)
MT8135/G6200 = 5190 frames (46.0 fps) (4.5x difference)

46 / 2.6 = 17.7x

....damn I'm getting good at this :LOL: Does any marketing department out there want to hire me? I'm an awfully bad liar though, if that counts for anything :devilish:
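And for anyone who wants to replay or extend the numbers above, here are the same ratios as a throwaway script (remember that the MT8135 score is MediaTek's claim, not a measurement):

```python
flops_per_clock = {"SGX531": 8, "SGX544": 36, "G6200": 128}
glb25_fps = {"SGX531": 2.6, "SGX544": 10.2, "G6200": 46.0}  # offscreen 1080p

print(flops_per_clock["G6200"] / flops_per_clock["SGX544"])   # 3.55x per clock
print(flops_per_clock["SGX544"] / flops_per_clock["SGX531"])  # 4.5x per clock
print(glb25_fps["SGX544"] / glb25_fps["SGX531"])              # ~3.9x measured
print(glb25_fps["G6200"] / glb25_fps["SGX544"])               # ~4.5x claimed
print(glb25_fps["G6200"] / glb25_fps["SGX531"])               # ~17.7x
```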
 
What for? Is the actual problem of the Vita that its hw is too weak, or rather that there aren't many left who care much about handheld consoles, considering how capable smartphones and tablets have become for casual gaming?

Yet the 3DS is selling like hotcakes?
 
Yet the 3DS is selling like hotcakes?
It's selling far more than the Vita, but it's hardly selling well; e.g., the last NPD figures were only a measly 150k for the month (in its third year or so, i.e. when it should be peaking), coupled with, according to Nintendo, bad European numbers. Yes, it's selling well in Japan, but elsewhere?
 