Qualcomm Roadmap (2011-2012)

When did this happen?

Ok, it seems my memory failed me. They were only talking about a 3G version of the tablet, but here you go:
http://www.slashgear.com/no-transformer-prime-3gumts-version-says-asus-germany-01199243/

But it's still highly unlikely that they would use the MSM8960 just to add LTE connectivity. Why wouldn't they just swap the UMTS modem for an LTE modem and stick with Tegra 3? It makes no sense from a marketing point of view (you know, quad core vs. dual core), and the higher-resolution screen still bugs me...

Unless it's only a prototype for a Transformer Prime v2.0.
 
Could it simply be the Transformer Prime available in other regions? Smartphones have used different SoCs for different regions under the same model. The Adreno 225 seems to perform worse than Tegra 3, and two cores aren't as fancy as four from a marketing perspective even if Krait is a new architecture, so perhaps this version of the Transformer Prime is intended for Eastern European or emerging markets?

Well, this is disappointing indeed if true. The hope was that with the supposedly untapped power of the Adreno 220 (according to Anand), a higher-clocked variant with much better drivers and higher bandwidth would let it compete against the iPad 2... there seems to be no hope of that.

Strange, as on the xda forums they have new drivers increasing performance by 2x on the Adreno 220 and 205.

Luckily Qualcomm has a PowerVR licence; I presume (hope) that's for Rogue and not for the XT cores...
 

Cost/power/performance of a 28nm LTE chip is a big reason; the first one available will be the 8960's. And really, it's not like the chip doesn't perform.

The article linked shows the German division not wanting a UMTS version; for Europe this makes a lot of sense.

The U.S. seems to love carrier-subsidized and carrier-sold tablets with bundled connectivity, however.
 

US carriers love it. US customers do not. Heck, the Motorola Xoom 2 sold for $599 ON CONTRACT. I don't know what the hell Verizon is smoking but I can't see too many people falling for that ruse.
 
Qualcomm's PowerVR license is for display IP, not graphics.

With the CPU clocked at 1.5 GHz on that Transformer TF202, the GPU clocks are probably at or close to their production level. Maybe the drivers will improve before it hits the market; the Tegra 3 TF201 upped its score a decent amount from its early appearance on the benchmark.
 
The GPU performance is about what you'd expect for a ~50% clock-speed increase, which is all the Adreno 225 is in the performance category; the major changes have to do with DirectX 9 support.

This brings it up to Mali-400MP4 level. But I agree: to be competitive, significant driver improvements are required.
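For a GPU-bound benchmark, the "Adreno 225 is just a higher-clocked 220" claim reduces to simple clock scaling. A back-of-the-envelope sketch (the clock figures below are assumptions for illustration — roughly the numbers reported for the two parts — not confirmed specs):

```python
# Back-of-the-envelope check of the "Adreno 225 ~= Adreno 220 + 50% clock" claim.
# Clock figures are assumptions for illustration (reported as ~266 MHz for the
# Adreno 220 and ~400 MHz for the Adreno 225), not confirmed specifications.
adreno_220_mhz = 266.0
adreno_225_mhz = 400.0

clock_ratio = adreno_225_mhz / adreno_220_mhz
print(f"clock ratio: {clock_ratio:.2f}x")  # ~1.50x

# If the shader core is otherwise unchanged, a purely GPU-bound score should
# scale roughly linearly with clock (ignoring memory-bandwidth limits):
def scaled_score(base_score: float, ratio: float) -> float:
    return base_score * ratio

# A 100-point Adreno 220 result would project to roughly 150 points.
print(scaled_score(100.0, clock_ratio))
```

In practice bandwidth and driver overhead eat into that linear projection, which is why the driver situation discussed above matters so much.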
 

Oh I see, that makes more sense now. I still don't know how they are going to compete with the Mali T604/T658 + Rogue, though...
Another thing to throw into the works that is more interesting than a threat is ZiiLabs' new ZMS-40 chipset: quad-core A9 + NEON, a 96-shader-core / 56-GFLOPS GPU, 2500x1600 display output. What do you guys think?
http://www.engadget.com/2012/01/05/...rocessor-optimized-for-android/#disqus_thread
...want one :D
 
Found some more interesting info... this chip really does some mind-blowing things. If it can live up to the hype then it could surprise many:
http://www.physorg.com/news/2012-01-ziilabs-unveils-core-zms-processor.html
Hmm, I must have missed the hype...
If I understand that right, they've got a nice array of FP ALUs for doing all the work, but no dedicated hardware for texture filtering, rasterization, etc. This provides maximum flexibility in what you can do, but there's a reason GPUs still include that stuff. So I would expect it to fare poorly in pure 3D graphics tasks (I only found some low-level GLES 1.1 benchmark numbers for the ZMS-08, and they are quite slow).

To get back on topic, I'm really wondering about the Krait CPU core; it looks good on paper. I was never that excited about the Scorpion core: it always looked to me like a bit better than a Cortex-A8 but appearing significantly later (it might have had better efficiency, but that's difficult to tell; it was also able to clock higher). It didn't really look all that different from a Cortex-A8 in the end. But Krait seems to be quite significantly different from a Cortex-A9.
 

Shouldn't we be comparing Krait against a Cortex-A15?
 

Yeah, I found a thread dedicated to ZiiLabs and that is pretty much what was said there.

The Scorpion core, if I remember correctly, had quite a few advantages over the Cortex-A8: it was slightly more powerful clock for clock, had a longer pipeline so it could scale to higher frequencies, and could do that while using less power.
It also debuted in smartphones first at 1 GHz (Toshiba TG01), going up against the iPhone 3GS's A8 at 600 MHz...
...so it was better, AND came out at the same time or earlier.

Yes, Krait should be compared to the Cortex-A15, although from what I've read, including from Arun, the A15 should be faster clock for clock and Krait should be more energy efficient.
...on that note, ST-Ericsson's NovaThor has managed 4.0 DMIPS/MHz...
 
Shouldn't we be comparing Krait against a Cortex-A15?
Well, that isn't here yet. Granted, Krait isn't in any devices quite yet either, but apparently soon (maybe half a year earlier than the A15?). It makes sense to compare it to both the A9 and A15 imho. Scorpion was somewhere between the A8 and A9 in performance and features (but closer to the A8), and appeared in devices after Cortex-A8 devices but before A9 devices. Same story here, though architecturally there are definitely more differences now between Krait and either the A9 or A15 as far as I can tell. Maybe it's closer to the A15 (in terms of performance and complexity); not sure.
 
Yeah, I found a thread dedicated to ZiiLabs and that is pretty much what was said there.

The Scorpion core, if I remember correctly, had quite a few advantages over the Cortex-A8: it was slightly more powerful clock for clock, had a longer pipeline so it could scale to higher frequencies, and could do that while using less power.
Actually, the A8 had the longer pipeline (13 vs. 10 stages). But imho, compared to the other Cortex chips (the A9 in particular, and most likely the A5 and A7), it just wasn't a very efficient design, so it's not surprising Qualcomm could do something more efficient.

It also debuted in smartphones first at 1 GHz (Toshiba TG01), going up against the iPhone 3GS's A8 at 600 MHz...
...so it was better, AND came out at the same time or earlier.
The Cortex-A8 was definitely appearing in smartphones earlier (by approximately half a year) than Scorpion (June vs. December 2009). If Scorpion had been earlier, I guess almost no one would have bothered with the Cortex-A8 :).

Yes, Krait should be compared to the Cortex-A15, although from what I've read, including from Arun, the A15 should be faster clock for clock and Krait should be more energy efficient.
...on that note, ST-Ericsson's NovaThor has managed 4.0 DMIPS/MHz...
Yeah, hmm, I don't put too much faith in that number. You could trivially build an 8-issue in-order chip rated at 8 DMIPS/MHz which with real code never achieves more than 1 DMIPS/MHz (granted, it would not make sense to build such a chip, but still: the Cortex-A8 and A9 have nearly the same DMIPS/MHz rating despite the A9 being much faster).
We'll see about the energy efficiency (I haven't seen many reviews measuring power draw while running benchmarks on smartphones yet...) though Arun is probably right. I think power efficiency while running light loads in particular might be interesting; there's a reason, after all, that the Cortex-A7 is supposed to also be used as a "companion" core alongside the A15 for power-saving purposes. Of course, the more complex Krait core (compared to the A9) might have problems against the A9 in that area too.
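For reference, the DMIPS/MHz figures being traded here come from the Dhrystone benchmark: 1 DMIPS corresponds to 1757 Dhrystone iterations per second (the VAX 11/780 baseline), and DMIPS/MHz just divides that by clock speed — which is why a wide design can post a high rating on this tiny loop without being fast on real code. A minimal sketch of the arithmetic (the iteration count below is a made-up illustrative number, not a measured result):

```python
# Dhrystone bookkeeping: 1 DMIPS = 1757 Dhrystone iterations/second,
# the score of the VAX 11/780 reference machine.
VAX_DHRYSTONES_PER_SEC = 1757.0

def dmips(iterations_per_sec: float) -> float:
    """Convert a raw Dhrystone rate into DMIPS."""
    return iterations_per_sec / VAX_DHRYSTONES_PER_SEC

def dmips_per_mhz(iterations_per_sec: float, clock_mhz: float) -> float:
    """The per-clock figure quoted in marketing material."""
    return dmips(iterations_per_sec) / clock_mhz

# Made-up example: a hypothetical 1000 MHz core running 7,028,000 Dhrystone
# iterations/second would be rated at exactly 4.0 DMIPS/MHz.
print(dmips_per_mhz(7_028_000.0, 1000.0))  # -> 4.0
```

Note that the rating says nothing about how the core behaves on branchy, cache-missing real workloads, which is the objection above.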
 
Well, the 1 GHz Snapdragon SoC was first introduced in the Toshiba TG01 smartphone, which was announced and demoed at CES 2009, and here is a full review of it from the 13th of July... not sure if it was released earlier:
http://www.techradar.com/reviews/phones/mobile-phones/toshiba-tg01-615602/review

The first Cortex-A8 solution came in the form of the 600 MHz iPhone 3GS; that was around June/July? So it was easily the most powerful and came out at least at the same time.
Cortex-A8s in general were not a very efficient architecture (the 3GS sucked juice like a toaster), but Scorpion and Samsung's Hummingbird were the most power efficient of that class IMHO.
 
big.LITTLE

Yeah, it's part of the big.LITTLE push from ARM, isn't it? I heard the A7 is nearly as fast as an A9 but a lot more power efficient.

Do you know if the A7s can be used in a quad-core setup, running light threads in the background with the A15s providing the muscle at the same time?
...Or can they only be used as companion cores, two at once, with the daddy cores taking over from them when things get dicey?
 
Do you know if the A7s can be used in a quad-core setup, running light threads in the background with the A15s providing the muscle at the same time?
...Or can they only be used as companion cores, two at once, with the daddy cores taking over from them when things get dicey?

You could do both, but ARM is already providing the groundwork for its big.LITTLE approach, and big.LITTLE is a little easier from a software standpoint.
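In the cluster-migration flavour ARM described, software sees one logical set of CPUs and a governor moves the workload between the A7 and A15 clusters based on load, rather than scheduling on both at once. A toy sketch of that decision logic (the thresholds, names, and hysteresis values are all invented for illustration; the real switcher lives in firmware/kernel code):

```python
# Toy model of a big.LITTLE cluster-migration governor.
# All thresholds and cluster parameters are invented for illustration only.

LITTLE, BIG = "A7-cluster", "A15-cluster"

def pick_cluster(load_pct: float, current: str,
                 up_threshold: float = 85.0,
                 down_threshold: float = 30.0) -> str:
    """Hysteresis (separate up/down thresholds) avoids ping-ponging."""
    if current == LITTLE and load_pct > up_threshold:
        return BIG        # demand spiked: migrate the workload to the A15s
    if current == BIG and load_pct < down_threshold:
        return LITTLE     # demand dropped: fall back to the low-power A7s
    return current        # otherwise stay where we are

# Light background load stays on the LITTLE cluster...
cluster = pick_cluster(20.0, LITTLE)
print(cluster)  # -> A7-cluster
# ...and a heavy burst migrates everything to the big cluster.
cluster = pick_cluster(95.0, cluster)
print(cluster)  # -> A15-cluster
```

The software appeal is exactly this: the OS scheduler never has to reason about two different core types running side by side.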
 
Well, the 1 GHz Snapdragon SoC was first introduced in the Toshiba TG01 smartphone, which was announced and demoed at CES 2009, and here is a full review of it from the 13th of July... not sure if it was released earlier:
http://www.techradar.com/reviews/phones/mobile-phones/toshiba-tg01-615602/review
Oh, that's interesting; it does indeed look like July for the TG01. I was going by Wikipedia, which quoted December 2009 for the "first US phone with Snapdragon": http://en.wikipedia.org/wiki/Snapdragon_(system_on_chip)
Maybe the CDMA version of the chip was available later (even if the quoted chip availability time is the same). The iPhone 3GS was June 2009, which would indeed put it at almost exactly the same release time (I'm not counting pre-release appearances for availability; that's unfair, as different companies have different policies - there were also Palm Pres with a Cortex-A8 at CES 2009, for instance).
That makes the Scorpion Snapdragon more amazing than I thought.

Yeah, it's part of the big.LITTLE push from ARM, isn't it? I heard the A7 is nearly as fast as an A9 but a lot more power efficient.
It won't be as fast as an A9. It can apparently exceed A8 performance, at least for some workloads, despite being significantly less complex (and thus has way better power efficiency, though I don't know exactly how it compares to the A9 there, which should also be significantly better than the A8 in that area).

Do you know if the A7s can be used in a quad-core setup, running light threads in the background with the A15s providing the muscle at the same time?
...Or can they only be used as companion cores, two at once, with the daddy cores taking over from them when things get dicey?
From a hardware perspective you should be able to build CPUs with different cores running at the same time, but from a software point of view that's probably too much trouble (CPUs with heterogeneous cores certainly exist, but they tend to run very specialized tasks on the "limited" cores). If you're suggesting two A7 cores and two A15 cores, I don't think that makes a lot of sense: all typical light work should easily run on one A7 just fine, and a second A7 in such a combo wouldn't really increase performance all that much (if an A15 is twice as fast per clock as the A7 and has a clock target somewhere around twice as high as well, that would make the A15 four times as fast per core). I would certainly expect SoC manufacturers to use big.LITTLE instead.
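The "four times as fast per core" figure is just the product of the per-clock and clock-speed ratios. A quick sketch of that arithmetic (the 2x IPC and 2x clock figures are the post's own rough assumptions, not measured numbers):

```python
# Relative per-core throughput ~= (perf per clock) x (clock speed).
# The 2x figures below are the post's rough assumptions, not specifications.

def relative_per_core_speed(ipc_ratio: float, clock_ratio: float) -> float:
    """How much faster core B is than core A, per core."""
    return ipc_ratio * clock_ratio

a15_vs_a7 = relative_per_core_speed(ipc_ratio=2.0, clock_ratio=2.0)
print(a15_vs_a7)  # -> 4.0

# Which is why a second A7 next to two A15s adds little: it contributes
# 1 "A7 unit" to a pool that already has 2 * 4 = 8 A7-equivalents.
total_with_extra_a7 = 2 * a15_vs_a7 + 2 * 1.0   # 10 A7-equivalents
total_without       = 2 * a15_vs_a7 + 1 * 1.0   #  9 A7-equivalents
print(total_with_extra_a7 / total_without)      # ~1.11x overall
```

Under these assumptions the second A7 buys barely 11% more peak throughput, which supports the "use big.LITTLE instead" conclusion.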
 
The first Cortex-A8 solution came in the form of the 600 MHz iPhone 3GS; that was around June/July? So it was easily the most powerful and came out at least at the same time.

You could buy Archos PMPs with OMAP3 in the fall of 2008; I saw them for sale in Circuit City. Not a phone, but they still hit the market much earlier than the summer of 2009.

Cortex-A8s in general were not a very efficient architecture (the 3GS sucked juice like a toaster), but Scorpion and Samsung's Hummingbird were the most power efficient of that class IMHO.

Hummingbird uses a Cortex-A8. It was also a 45nm part vs. the 3GS, which used a 65nm SoC... hard to really compare the two.
 
The first Cortex-A8 solution came in the form of the 600 MHz iPhone 3GS; that was around June/July? So it was easily the most powerful and came out at least at the same time.
Cortex-A8s in general were not a very efficient architecture (the 3GS sucked juice like a toaster), but Scorpion and Samsung's Hummingbird were the most power efficient of that class IMHO.
http://www.anandtech.com/show/2798/12

While not long compared to today's phones, was the iPhone 3GS's battery life really behind other smartphones available at launch? Compared to the previous ARM11 iPhone 3G, the Cortex-A8 iPhone 3GS actually had improved battery life in all areas except continuous gaming, which is probably due more to the move to the SGX535 than to the change in CPU. Of course, the iPhone 3GS was on 65nm compared to 90nm for the iPhone 3G, and improvements in support components probably helped too. The end result was still a smartphone that was faster than its predecessor without giving up battery life, so I don't think manufacturers viewed moving to the Cortex-A8 as a huge risk to battery life.
 