Qualcomm Krait & MSM8960 @ AnandTech

AnandTech's story on the Xiaomi MI3 says the smartphone's SoC will be an MSM8974AB (Snapdragon 800). The only differences from the regular MSM8974 seem to be the clocks: the Adreno 330 goes from 450MHz to 550MHz, supported LPDDR3 speeds are higher, and the ISP is clocked higher (which I'd assume means higher frame-rate/resolution video, and photos with more megapixels at the same processing speed?).
 
I mentioned to @Nerdtalker that I suspected an MSM8974AB existed, mainly because of the unreleased Snapdragon 800 Galaxy S4 listed in GLBenchmark 2.7, which was a clear 10%+ faster than any other S800 device despite having been tested on Android 4.2.2, and therefore without the latest OpenGL ES 3.0 Qualcomm drivers that ship with 4.3. I doubt OpenGL ES 3.0 drivers would boost an OpenGL ES 2.0 benchmark, but they're probably the most up to date. That said, there was a big v14.0 driver update from Qualcomm recently made available for 4.2.2 devices, which already boosted results considerably.
OpenGL ES 3.0 is a superset of OpenGL ES 2.0. There should be only one driver.
 
Benchmarks would suggest it does.

The Gfxbench database now has an updated score for the Tegra 4 version of the Xiaomi MI3 superphone, and it edges out the iPad 4. The implementations keep steadily improving for Tegra 4 devices which is good to see.

Intel has a new device on the list, too, with some impressive sub-test scores like fill rates. Is that Bay Trail and its Intel GPU IP (I'm not too up-to-date on expected release schedule for Intel's next platforms)?
 
The huge battery is definitely a contributor, depending on the device to which you're comparing. The Note 2 has a similarly sized battery, though.
 
Of course, but as you said, the battery improvement is bigger than the battery capacity improvement, while offering a large performance boost. So general perf/W is way up.
 
Gobsmacked, though I expected GPU performance to be a little better.

Edit: here is GSMArena's review of the G2:
http://m.gsmarena.com/lg_g2-review-982p5.php

It confirms AnandTech's findings of brilliant battery life, with a rating of 62 hours in their endurance test.

More interesting is their claim that LG dedicates 50MB less RAM to graphics than the Xperia Z Ultra, which they say allows the Sony to post better graphics benchmarks... interesting.
 
Last edited by a moderator:
Yes, these battery life measurements are quite impressive. As noted earlier, the battery capacity is ~30% higher on the LG G2 vs. the HTC One. Normalizing for battery capacity, the G2 has 16.8% better 3G web browsing battery life, 5.5% better WiFi web browsing battery life, and 28.1% better cellular talk time battery life than the HTC One. So in addition to TSMC's 28nm HPM fabrication process, LG is using something called Graphics RAM (GRAM): when the display is static, "the G2 can run parts of the display subsystem and AP off and save power".

I can't help but think that reviewers are not able to capture true power efficiency when performing their battery life tests. To accurately measure power efficiency, one needs to measure both power consumption and performance. Measuring power consumption without measuring performance (and vice versa) doesn't give us the full story. Given the difficulty in easily measuring application-specific power consumption on handheld devices, there may be no good solution to this at this point in time.

On a side note, the G2's GPU performance is very good for a phone, but it is anywhere between ~15-25% lower than Qualcomm's MSM8974 MDP tablet reference platform, which just goes to show how these high-performance SoCs are not able to stretch their legs in this form factor.
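The capacity-normalized comparison above boils down to dividing runtime by battery capacity before comparing the two devices. A minimal sketch of that arithmetic, assuming hypothetical runtimes (only the ~3000mAh G2 vs. ~2300mAh HTC One capacities, a ~30% gap, reflect the actual devices):

```python
def normalized_gain(life_a_hours, cap_a_mah, life_b_hours, cap_b_mah):
    """Percent better battery life of device A over device B, per mAh of capacity."""
    eff_a = life_a_hours / cap_a_mah  # hours of runtime per mAh
    eff_b = life_b_hours / cap_b_mah
    return (eff_a / eff_b - 1) * 100

# Hypothetical example: 13 h vs. 9 h of 3G web browsing.
gain = normalized_gain(13.0, 3000, 9.0, 2300)
print(f"{gain:.1f}% better per mAh")  # → 10.7% better per mAh
```

A raw 13h-vs-9h result looks like a ~44% win, but once the 30% larger battery is factored out, the actual efficiency advantage is much smaller, which is the point of normalizing.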
 

Well, do we know whether Qualcomm's MDP reference device carries the same SoC? How do we know whether this MDP carries the top-bin 550MHz Adreno 330?

Also, there was a comment on GSMArena about LG setting aside less RAM (50MB less) for the GPU?

Edit: found some info on why the S800 is good on power efficiency; some of it has to do with an integrated LTE power technology called envelope tracking...
http://gigaom.com/2013/09/05/thanks...xy-note-3s-huge-screen-wont-kill-its-battery/

Not sure if this is an S800-specific thing or another Qualcomm chip that supplements the SoC; interesting nonetheless.

Edit 2: the Galaxy Note 3 contains the new Qualcomm QFE1100 chip, which includes the new envelope tracking technology, said to save around 20% in radio power consumption and 30% in heat.
The Galaxy Note 3 is said to be the first product to have this on board, so I'm guessing this must be another feature of the top-bin AB S800.
Further information is needed to confirm.
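The basic idea behind envelope tracking: with a fixed supply rail, the headroom between the rail voltage and the RF signal's instantaneous envelope is burned off as heat in the power amplifier; an ET supply lowers the rail to follow the envelope, shrinking that waste. A toy numeric model of this, with all voltages and the envelope shape made up for illustration (the real QFE1100 behavior is far more complex):

```python
import math

def pa_waste(envelope, supply_fn):
    """Average wasted power, modeled as (supply - envelope) * envelope per sample."""
    return sum((supply_fn(v) - v) * v for v in envelope) / len(envelope)

# Hypothetical LTE-like envelope: rectified sine, peaking at 3.3 V.
env = [abs(math.sin(0.1 * i)) * 3.3 for i in range(1000)]

fixed = pa_waste(env, lambda v: 3.6)        # fixed 3.6 V rail sized for the peak
tracked = pa_waste(env, lambda v: v + 0.3)  # ET rail: envelope plus 0.3 V headroom
print(f"ET cuts modeled waste by {(1 - tracked / fixed) * 100:.0f}%")
```

The toy model exaggerates the savings versus the ~20% figure quoted for the QFE1100, but it shows why the benefit grows with high peak-to-average signals like LTE: the fixed rail must sit at the peak, while the ET rail only pays a small constant headroom.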
 
I'm going to post something a bit OT because I don't think it deserves a new thread.
So, I just bought the G pro, which uses the S600 (it's less than 2 months old). Right now they're accepting pre-orders for the G2 (S800) at the same price as when I bought the G pro! Now I'm having a buyer's remorse moment.

Having said that, I probably wouldn't buy the G2, but I'm hoping that if I'd waited until after the G2 launched, I might have been able to buy the G pro at an even cheaper price.

As for why I don't want the G2: I like the bigger screen of the G pro. I wouldn't mind a touch bigger phone, but I certainly don't want to go smaller. And the G2 doesn't have a microSD slot and uses a non-removable battery. If I ever give up microSD and a removable battery, I'll buy whatever Sony is cooking, because I know I'd get a waterproof phone.

Again, sorry for the OT post. I just needed to vent my buyer's remorse.
 
The Sony Xperia Z1 benchmarks are impressive!

[Image: Xperia Z1 benchmark scores]
 
Not many numbers, but some interesting benchmark comparisons between the Asus PadFone (using the S800) and the actively cooled Nvidia Shield.

http://reviews.cnet.com/smartphones/asus-padfone-september-2013/4505-6452_7-35827814.html

Only 3dmark unfortunately.

Overall it's faster. In GPU test 1 (polygon heavy) the Shield wins; in GPU test 2 (effects heavy) the PadFone wins. The CPU scores are basically the same.

Not bad considering this is constrained to the limited cooling available in a phone form factor. If the phone weren't so damn big (a 4-inch screen is about my limit), I'd be tempted to pick one up.

Regards,
SB
 
Good pricing. That, combined with the most refined (in terms of carrier certification) baseband/modem solution, continues their dominance. Once a company like MediaTek gets their footing, though, they'll have to start sharing the market a lot more.

TechInsights claims it's the AB variant of the 8974.
 