BACON.
Oh dear, not this again.
PS - Brown sauce was applied.
WRONG
I just looked at the first post in this thread and it's 3 years old!
Woah, it looks like it'll take well over 3 years for IMG to get a GPU family from announcement to a shipping product.
I have no idea, but judging by the slow GUI on my TV, I'd guess a good GPU would help TVs not suck at user experience. But what else is it for? Fancy menus and overlays seem like overkill for things like Netflix.
Nobody knows, and if there are people who do know, you won't get an answer from them. Asking about it is pretty much pointless.
It's been previously reported that lossless compression was added to the G6x30 over the G6x00, and IMG seems to officially confirm this in a reply to someone's question in this article. They claim a 2:1 compression rate or better for render targets. Quoting their reply: "Here's one simple example that goes against your assumption: the PowerVR G6x30 GPUs add incremental area for features such as image lossless compression. For render targets, this provides a typical 2:1 compression rate, but it can be much higher, depending on the frame being compressed. The idea of adding more silicon in this case is to actually save on power consumption by reducing memory bandwidth."
Based on the info released, Series6XT doesn't appear to provide anywhere near the same boost in the upper performance ceiling over Series6 that 5XT provided over Series5.
Was a single-core SGX 5XT really more than 50% faster than an SGX5 in, for example, GLBenchmark?
To quote IMG: "50% faster clock for clock, cluster for cluster compared to their Series6 counterparts; these performance measurements are based on frames from industry standard benchmarks."