Qualcomm Krait & MSM8960 @ AnandTech

Here's the press release.

They're not being inconsistent. There's no "World's First 64-bit chipset with 4G LTE World Mode!" claim.
64-bit wasn't given much prominence in the announcement. It just happens to be part of ARMv8's feature checklist.
 


Yeah, but why aren't they using a Qualcomm architecture? Maybe I've overlooked something, but I thought every Snapdragon has used Qualcomm architectures for years.

Switching back to ARM makes the project seem rushed. And why would they rush if not because of 64-bit?
 

You've overlooked something: while the performance-oriented Qualcomm SoCs have been using their own Scorpion and then Krait cores, their budget SoCs have followed the ARM11 -> Cortex-A5 -> Cortex-A7 progression, of which the A53 is the next step.
http://en.wikipedia.org/wiki/Snapdragon_(system_on_chip)

Since these budget cores are already very small and very low power, there is less advantage to be had through custom design. The licensing costs from ARM for these cores are probably a whole lot less compared to A9, A15, etc. as well.
 


Didn't know that, thanks.
 
The Qualcomm SoC in the Moto G for example is a quad core Cortex-A7 design with Adreno 305 graphics IIRC. This chip is pretty much the successor of the exact chip used in the Moto G.
 

I actually picked up one of these recently and was fairly impressed. It's a lot faster and smoother than I expected given the puny cores. Battery life is very good too.

The only thing I really miss is a better camera and LTE.
 
Yeah, DirectX 12 will boost mobile gaming... for those who buy a Windows phone to play mobile games.

Which is no one.

No objection; however, if DX12 reduces driver overhead significantly, then it obviously will also reduce power consumption quite a bit. One question would be how things look with driver overhead in OGL_ES. Kishonti has a low-level test for it, but I've no idea how reliable it is.
 


Just out of curiosity, is anyone measuring battery life on smartphones using actual games?
 

I don't think so, but I'd dare to estimate that the average time you can play mobile games on smartphones is roughly 2 hours.
 
Maybe a little late to the game, but since nobody ever posted this, I went to research a bit for power numbers on the S800.

And guess what:

https://github.com/Wootever/kernel_samsung_hlte/blob/kitkat/arch/arm/mach-msm/acpuclock-8974.c#L2035

They conveniently list the µA draw for all voltage tables for the DCVS thermal driver. The second column is the frequency, the second-to-last is the voltage, and the last one is the µA draw.

For the pro_rev1_2p3g_pvs8 (8974AB mid-bin)

[Table image: per-frequency voltage and current draw for pro_rev1_2p3g_pvs8]


Keep in mind this doesn't include L2 or common blocks power.

*And obviously it's kHz, not MHz; my typo in the table.
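Since the table pairs each frequency with a voltage and a current draw, per-frequency power falls straight out of P = V × I. A minimal sketch of that conversion, using made-up rows in the three-column style described above (the real values live in acpuclock-8974.c, and the units there may differ):

```python
# Hypothetical rows (frequency in kHz, voltage in µV, current in µA);
# these numbers are illustrative, NOT taken from the kernel source.
table = [
    (300000,  775000,  25000),
    (800000,  815000,  90000),
    (1300000, 900000, 180000),
]

def power_mw(vdd_uv, i_ua):
    """P = V * I: convert µV and µA to volts and amps, return milliwatts."""
    return (vdd_uv / 1e6) * (i_ua / 1e6) * 1e3

for freq_khz, vdd_uv, i_ua in table:
    print(f"{freq_khz / 1000:6.0f} MHz: {power_mw(vdd_uv, i_ua):6.1f} mW")
```

Note this is only the dynamic core-rail figure; as said above, L2 and common-block power would come on top.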
 

So a Krait 400 at 800 MHz uses roughly the same power as an ARM Cortex-A7 at 1300 MHz. I'm assuming the Krait 400 would still be faster locked at 800 MHz than a 1300 MHz A7; if that's correct, then Qualcomm adopting big.LITTLE may even be a retrograde step, unless the A53 is a major improvement in perf/W, and/or TSMC and Qualcomm are confident of doing a better design/fabrication job than Samsung.
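The reasoning here is an energy-per-task comparison: at equal power, the faster core finishes the work sooner and therefore burns less energy per unit of work. A sketch with purely illustrative numbers (neither figure comes from the table above; the relative-performance value is an assumption):

```python
# Illustrative numbers only, NOT measurements.
krait_power_mw = 150.0   # assumed Krait 400 draw at 800 MHz
a7_power_mw    = 150.0   # assumed Cortex-A7 draw at 1300 MHz (equal, per the post)

krait_perf = 1.3         # assumed relative throughput on a fixed workload
a7_perf    = 1.0

# Energy per unit of work = power / performance; lower is better.
krait_energy = krait_power_mw / krait_perf
a7_energy    = a7_power_mw / a7_perf

print(f"Krait 400 @ 800 MHz:  {krait_energy:.1f} (relative energy per task)")
print(f"Cortex-A7 @ 1300 MHz: {a7_energy:.1f} (relative energy per task)")
```

Under these assumptions the Krait wins on energy efficiency, which is why pairing it with a LITTLE cluster would only pay off if the A53 shifts that ratio.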
 
Or Qualcomm simply has no 64-bit core.
 