Samsung Exynos 5250 - production starting in Q2 2012

Chromebook: 4800 mAh

Nexus 10: 9000 mAh

iPad 3: 11600 mAh

mAh ratings aren't very useful by themselves without knowing the battery's voltage, since energy per mAh scales with voltage.

The Chromebook isn't 4.8Ah either; it's 4.08Ah with a 30Wh rating (http://www.samsung.com/us/computer/chrome-os-devices/XE303C12-A01US-specs). That works out to 7.35V, and the specs indicate a 2-cell Li-Po pack, so it'd be two cells in series.

Nexus 10 is probably using a single-cell Li-Po, so probably a 33-34Wh rating; not a huge difference. iPad 3 is 42.5Wh.
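
A quick sanity check on those numbers in Python (the nominal cell voltages are assumptions where the spec sheets don't state them):

```python
# Pack energy: Wh = Ah * nominal voltage. The voltages below are
# assumed nominal values, not official figures from the vendors.
packs = {
    "Chromebook": (4.08, 7.35),  # 2s Li-Po, per the 30Wh / 4.08Ah spec
    "Nexus 10":   (9.0,  3.7),   # assuming a single-cell pack at 3.7V
    "iPad 3":     (11.6, 3.7),   # assuming ~3.7V nominal
}

for name, (amp_hours, volts) in packs.items():
    print(f"{name}: {amp_hours * volts:.1f} Wh")
# Chromebook: 30.0 Wh, Nexus 10: 33.3 Wh, iPad 3: 42.9 Wh (spec says 42.5)
```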

My guess is that the Chromebook is just not that aggressively power optimized, either in the OS, the component selection, or the overall design. Starting from a higher pack voltage means the low rails (CPU and RAM run at really low voltages these days) have to be stepped down further, reducing efficiency on what might be cheaper, less efficient regulators. The screen itself may not be as power efficient as some. And Samsung may be binning the less efficient Exynos 5s for the Chromebook while the more efficient ones go to the more lucrative Nexus 10, though I wouldn't expect that to make much of a difference.
 
While dev boards put less focus on power efficiency than consumer mobile devices, which can inflate expectations for the resulting products, the trend has typically been that dev systems understate the performance of the finished product, thanks to refinement in implementation, software, and drivers.

I expect subsequent GLBenchmark runs of the Arndale board to reach just over 4000 frames in the standard 2.5 Egypt HD Offscreen test, with consumer devices pushing a little beyond that, so I believe T-604 devices will outpace Adreno 320 devices by maybe 20%. It apparently will never make the same OpenGL ES 3.0 grade, though, which won't matter much for the time the T-604 is relevant before being replaced by devices with higher-compliance Malis.
 
Anandtech did some tests:
http://www.anandtech.com/show/6422/samsung-chromebook-xe303-review-testing-arms-cortex-a15/6
About 50% faster than the Atom N570, assuming Google put as much effort into the x86 JavaScript engine as the ARM one in Chrome OS.

The power tests aren't too promising, though: 6W idle, 10W load. I understand that 1.7GHz isn't the 5250's best perf/W operating point, but I expected better. Clovertrail tablets idle at 2W (both sets of figures are with the display on) and won't hit 4W under load.

I guess this explains why we haven't seen A15 in any phones despite everyone expecting it for a while now. Doesn't seem like there's much of an advantage over A9 in perf/W. Nexus 10 battery tests will be interesting.
 
No way those idle numbers are representative of the chip's capabilities. It already has clock gating on every function block of the SoC that isn't in use, and on top of that I know for a fact that Google implemented a coupled CPUIdle driver for full ARM Off Top Running (AFTR), i.e. full power collapse on both CPU cores while the system is running and idle. Idle power on the Exynos 4s goes below 15mW in these situations with the top off, and under 100mW with the top on.

I think this is just a case of the software power management not working correctly or not being fully implemented. I doubt the screen accounts for enough of the idle power to justify those values.
 
Those tests aren't deep/long idle; they're with the screen on and maybe the mouse moving. It's also Exynos 5, not 4. And it's not just the idle figures that are surprising; the load figure is really high. Intel got Core 2 Duo on 45nm into that thermal envelope.

I considered the software issue, but this is Google. They're not going to screw up power management.
 
AFTR has a target residency of 1ms, and I'm not talking about deep idle either. It's perfectly reasonable to reach those idle states even while moving the mouse with the screen on (that's the whole point of it). The SoC power management on Exynos 5 and 4 is identical, so the comparison is relevant; whether the cores are A15s or not only adds a base delta in power consumption.
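
For reference, the basic decision a cpuidle governor makes is roughly this (a conceptual Python sketch, not the actual kernel code; the state list and latency numbers are illustrative, with AFTR's 1ms target residency from above):

```python
# Conceptual sketch of cpuidle state selection, not actual kernel code.
# A state is only worth entering if the CPU is predicted to stay idle
# longer than the state's target residency (the time needed to amortize
# the cost of entering and exiting the state).
IDLE_STATES = [
    # (name, target_residency_us, exit_latency_us) -- illustrative numbers
    ("WFI",  1,    1),     # clock-gated wait-for-interrupt
    ("AFTR", 1000, 300),   # ARM Off Top Running: cores power-collapsed
]

def pick_idle_state(predicted_idle_us, latency_limit_us):
    """Pick the deepest state whose residency/latency constraints fit."""
    best = IDLE_STATES[0]
    for name, residency, exit_latency in IDLE_STATES:
        if predicted_idle_us >= residency and exit_latency <= latency_limit_us:
            best = (name, residency, exit_latency)
    return best[0]

# Even with the screen on, gaps between input events are usually much
# longer than 1ms, so AFTR can be entered while the mouse is moving.
print(pick_idle_state(predicted_idle_us=5000, latency_limit_us=1000))  # AFTR
```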
 
Anandtech did some tests:
http://www.anandtech.com/show/6422/samsung-chromebook-xe303-review-testing-arms-cortex-a15/6
About 50% faster than the Atom N570, assuming Google put as much effort into the x86 JavaScript engine as the ARM one in Chrome OS.

The power tests aren't too promising, though: 6W idle, 10W load. I understand that 1.7GHz isn't the 5250's best perf/W operating point, but I expected better. Clovertrail tablets idle at 2W (both sets of figures are with the display on) and won't hit 4W under load.

I guess this explains why we haven't seen A15 in any phones despite everyone expecting it for a while now. Doesn't seem like there's much of an advantage over A9 in perf/W. Nexus 10 battery tests will be interesting.
Don't forget the display in Nexus 10.

AFAIK, both Android and Chrome OS share the V8 JIT.
 
Clovertrail tablets idle at 2W (both sets of figures are with the display on) and won't hit 4W under load.

Want to bet? Medfield phones hit 3.25W under load (http://images.anandtech.com/reviews/smartphones/apple/iPhone5/review/krakenmedfieldsm.png); add a second core and higher base clocks, exercise HT, and throw in heavier base hardware like a bigger screen, and I'm sure you'll exceed 4W under heavy utilization.

Web browsing isn't heavy CPU load.

I guess this explains why we haven't seen A15 in any phones despite everyone expecting it for a while now. Doesn't seem like there's much of an advantage over A9 in perf/W. Nexus 10 battery tests will be interesting.

Most of us weren't expecting Cortex-A15 phones until 2013. I definitely didn't expect an Exynos 5 product release before November.

Something like the Chromebook is much more tenable for an early release than a phone. Tablets don't require as much validation as phones. The Chromebook probably isn't expected to be high volume, hence why it sold out so easily despite not exactly getting a huge amount of buzz; who wants to use Chrome OS anyway? I ordered one to get some sort of real GNU/Linux on it, but that's probably an even smaller minority position; I have special needs for an ARM laptop but would otherwise be served fine by my IB laptop. And even if Samsung hasn't really optimized the chip's power consumption on the process, it can still work for a device like this. I suspect that's the case, at least to some extent.

This is analogous to Tegra 2 releasing in a crappy laptop months before it released in anything else, or Tegra 3 releasing in tablets months before it released in phones. If you don't think Samsung can put Cortex-A15 in phones, what do you expect them to put in the Galaxy S4? They don't have anything else in the pipeline, and surely they're not giving up on launching their flagship phone with their flagship SoC in at least some regions.

However, I do think you may be right that at higher performance levels Cortex-A15 lags behind Cortex-A9 in perf/W. A 1.7GHz Cortex-A15 is a big leap ahead of the 1.4-1.6GHz Cortex-A9 cores we see in phones. It could probably run at 1.2GHz and still be competitive with the iPhone 5, though it may well still be the case that Apple managed a more power-efficient design. Nonetheless, if you normalize Cortex-A15's clocks so it performs the same as today's high-end Cortex-A9s, I suspect the power consumption won't be worse, since it should be able to operate at much lower voltages.
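
As a rough first-order illustration of that last point: dynamic power scales with roughly f·V², so dropping both frequency and voltage compounds. A sketch in Python, where the frequency/voltage pairs are made-up illustrative numbers, not measured Exynos values:

```python
# First-order CMOS dynamic power model: P ~ C * f * V^2.
# The (freq, voltage) operating points below are illustrative
# assumptions, not measured Exynos 5250 values.
def relative_dynamic_power(freq_mhz, volts, ref_freq_mhz, ref_volts):
    """Dynamic power relative to a reference operating point."""
    return (freq_mhz / ref_freq_mhz) * (volts / ref_volts) ** 2

# Hypothetical DVFS points:
full_speed = (1700, 1.20)   # 1.7GHz at an assumed 1.20V
downclocked = (1200, 0.95)  # 1.2GHz at an assumed 0.95V

ratio = relative_dynamic_power(*downclocked, *full_speed)
print(f"~{ratio:.0%} of full-speed dynamic power")  # roughly 44%
```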
 
I was positively surprised by how the 1.7GHz A15 performed against the similarly clocked dual-core Atom (1.66GHz). Assuming the JS engine is equally well optimized for both chips, the A15's single-core IPC is a nice deal ahead of Atom's. However, we need more benchmarks before drawing any conclusions about performance when all cores/threads are used. Atom badly needs software that uses all four threads, because its in-order architecture cannot fill the execution pipelines efficiently using instructions from just a single thread (HT is needed for efficient execution).

I tried to find official TDP figures for the (1.7GHz) A15 but couldn't. That 9.27W peak figure in Kraken (Anandtech) includes the whole platform (without the display). More interesting is the comparison between idle and peak: there's a 5.2W difference for the A15. That would indicate the chip has a pretty high TDP compared to simpler/slower/older ARM chips (maybe somewhere in the 6-8W range?). But when you compare it to the (40nm) Atom or existing (40nm) 9W Bobcats (dual core, 1GHz), it looks very good indeed. It would be interesting to see how a high-clocked quad-core A15 compares against AMD's forthcoming 28nm quad-core Jaguar (both should have comparable clock ceilings).
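
Spelling out that estimate (the idle figure is derived from the two numbers above; the non-SoC share of the load delta is purely an assumption for illustration):

```python
# Back-of-the-envelope SoC power estimate from platform measurements.
platform_peak_w = 9.27       # Kraken peak, whole platform, display off
idle_to_peak_delta_w = 5.2   # idle-to-peak difference quoted above
platform_idle_w = platform_peak_w - idle_to_peak_delta_w  # ~4.07W

# If some of the delta goes to regulators/RAM rather than the SoC itself
# (the 20% share here is an assumption, not a published split):
assumed_non_soc_share = 0.2
soc_load_estimate_w = idle_to_peak_delta_w * (1 - assumed_non_soc_share)
print(f"SoC load estimate: ~{soc_load_estimate_w:.1f}W")  # ~4.2W
```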
 
The "retina display" iPads rely on some resolution scaling, and, as expected, to a relatively higher degree in 3D games, but it really isn't too prevalent nor noticeable as an overall takeaway experience when using the platform.

I have to wonder, though, whether the issue will stand out significantly more on the Nexus 10 under varied usage, including gaming, with its extreme display being driven by a T-604 (especially compared with its late-2012 iPad counterpart and that device's expected comparatively high fill rate).
 
Want to bet? Medfield phones hit 3.25W under load (http://images.anandtech.com/reviews/smartphones/apple/iPhone5/review/krakenmedfieldsm.png); add a second core and higher base clocks, exercise HT, and throw in heavier base hardware like a bigger screen, and I'm sure you'll exceed 4W under heavy utilization.
I'll take that bet. Higher base clocks and lower peak clocks reduce the jump from idle to load, not increase it. That jump is 1.4W for Medfield in that graph, and given that Clovertrail has a sub-2W TDP, I seriously doubt that tablets based on it will jump from 2W idle to over 4W under load.

Most of us weren't expecting Cortex-A15 phones until 2013. I definitely didn't expect an Exynos 5 product release before November.
Given that it was announced back in 2010, and that mass production was expected in 2011, my expectations weren't as loose as yours.

If you don't think Samsung can put Cortex-A15 in phones, what do you expect them to put in the Galaxy S4? They don't have anything else in the pipeline, and surely they're not giving up on launching their flagship phone with their flagship SoC in at least some regions.
I never said that I don't expect it in phones. I just said that low perf/W may explain the slow adoption. It's possible that cheap Chromebook components brought up the platform's idle consumption, but the 4-5W jump under load is really big. 1.7GHz isn't that high either, considering that ARM was touting 2.5GHz, so the voltage probably isn't that high (and why would Google bother trying to squeeze 10% more performance out of it anyway, when battery life is its biggest weakness?).

At the moment, it looks like the 5250 will have to run at 1.0-1.3GHz to match Clovertrail's TDP, which would put their performance on par. We need more data to verify this, of course.
More interesting is the comparison between idle and peak: there's a 5.2W difference for the A15. That would indicate the chip has a pretty high TDP compared to simpler/slower/older ARM chips (maybe somewhere in the 6-8W range?).
Precisely, though I think your estimates are a bit high. I don't see the point in comparing to the 40nm Atom, though; Clovertrail is properly optimized for low power and has a 2W TDP.
 
I'll take that bet. Higher base clocks and lower peak clocks reduce the jump from idle to load, not increase it.

How did you arrive at that conclusion? Idle power consumption has nothing to do with base clock speed. A higher base clock means the chip can spend more of its TDP under full load.

Either way, you said that Clovertrail tablets won't use more than 4W under full load, after first saying they idle at 2W. You never said that the difference between idle and load consumption would be 4W; on the contrary, you said it'd be under 2W.

That jump is 1.4W for Medfield in that graph, and given that Clovertrail has a sub-2W TDP, I seriously doubt that tablets based on it will jump from 2W idle to over 4W under load.

Let's try this again: the Medfield phone presented uses 3.25W under full load. What you're basically saying is that you can take that and add:

1) A second core
2) A multicore GPU with a base core that's stronger
3) A higher base speed
4) A much bigger, and in many cases higher resolution screen

And you think all of that will take less than 750mW? The second core alone will take at least that; a rough budget is sketched below.
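
For the sake of argument, here's what that addition looks like (every delta below is an illustrative assumption, not a measurement):

```python
# Hypothetical power deltas going from a Medfield phone (3.25W load,
# per the Anandtech Kraken chart) to a Clovertrail tablet under load.
# All deltas are illustrative assumptions, not measurements.
medfield_phone_load_w = 3.25

assumed_deltas_w = {
    "second CPU core (HT exercised)": 0.75,
    "stronger multicore GPU":         0.30,
    "higher base clock":              0.20,
    "bigger/higher-res display":      0.80,
}

tablet_estimate_w = medfield_phone_load_w + sum(assumed_deltas_w.values())
print(f"estimated tablet load: ~{tablet_estimate_w:.2f}W")  # ~5.30W
```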

Given that it was announced back in 2010, and that mass production was expected in 2011, my expectations weren't as loose as yours.

My expectations were based on what ARM and third-party vendors were actually saying in late 2010, which was that it'd be in devices in 2013, later occasionally amended to late 2012. Given that release estimates are often very optimistic, I'd say that expecting them to even meet those dates was not the least bit conservative.

I have no idea what your expectations were based on. This timeline wasn't at all inconsistent with the lead times between ARM's announcements of the Cortex-A8 and A9 and when they first appeared in devices.

I never said that I don't expect it in phones. I just said that low perf/W may explain the slow adoption.

Well no, you said that perf/W has already been slowing adoption, which is almost definitely not the case.

It's possible that cheap Chromebook components brought up the platform's idle consumption, but the 4-5W jump under load is really big. 1.7GHz isn't that high either, considering that ARM was touting 2.5GHz, so the voltage probably isn't that high (and why would Google bother trying to squeeze 10% more performance out of it anyway, when battery life is its biggest weakness?).

ARM's 2.5GHz number was probably never something you could have expected to realize in a device with a tablet-sized battery. They announced 2GHz Cortex-A9 hard macros too, but I'm not aware of anyone having used them.

The load usage is too high, I agree, but it's not as if there aren't aspects of the design that can affect total power consumption and not just idle, and it's not as if the SoCs used here are necessarily the best of the lot.

At the moment, it looks like the 5250 will have to run at 1.0-1.3GHz to match Clovertrail's TDP, which would put their performance on par. We need more data to verify this, of course.

IMO you shouldn't compare values measured in a review with values that Intel claims. I wonder what the TDP on Medfield is... do you think it's enough to allow for using 750mW more under load than Swift uses (when the GPU is barely used on either, of course)? The core shouldn't be using more than 750mW just by itself, at least at 1.6GHz; if it's turboing up to 2GHz it might be a different story. But this is still pretty massive.

TDP is supposed to be an honest number, since the design relies on it, but the trick Intel pulls with its Atom SoCs is that they're allowed to throttle to stay within safe thermal ranges. The reason this is a trick is that the throttling is based on reading temperature, not power, meaning that if you design the cooling to a higher specification than the TDP requires, the chip will hit higher performance margins.
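
In pseudocode terms, the described behavior amounts to something like this (a conceptual Python sketch of temperature-triggered throttling, not Intel's actual control loop; the trip point and frequency steps are made up):

```python
import time

# Conceptual sketch of temperature-based throttling, not Intel's
# actual algorithm. The trip point and frequency steps are illustrative.
THROTTLE_TEMP_C = 90
FREQ_STEPS_MHZ = [1800, 1600, 1400, 1200, 1000]

def control_loop(read_temp_c, set_freq_mhz):
    """Throttle on temperature only; actual power draw is never checked."""
    step = 0
    while True:
        if read_temp_c() >= THROTTLE_TEMP_C:
            step = min(step + 1, len(FREQ_STEPS_MHZ) - 1)  # slow down
        elif step > 0:
            step -= 1                                      # recover speed
        set_freq_mhz(FREQ_STEPS_MHZ[step])
        time.sleep(0.05)

# Better-than-spec cooling keeps read_temp_c() below the trip point for
# longer, so the chip sustains its top bin even while drawing more than
# its nominal TDP.
```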

Compare the Cedar Trail N2600 and the Clover Trail Z2760: the Z2760 has a higher max clock speed (1.8GHz vs 1.6GHz), a faster GPU (533MHz vs 400MHz), and several more integrated peripherals. The only things that use more power on the N2600 are a memory controller supporting DDR3 instead of LPDDR2 and the DMI link to an NM10. Yet the Z2760 carries just over half the TDP. What does that tell you?
 
What makes you think that's Clover Trail and not Cedar Trail?

Because I always lose Trail with Intel's code names for some reason.

Doesn't count; my statement applies to hard macros on 45nm. Of course you can deliver higher clocks on better nodes.

Wait a minute: did ARM state that the A9 can go up to 2GHz on 45nm, or am I missing something? The A15 was the one announced to yield frequencies higher than the A9's, 2GHz and beyond.
 
I actually dusted off my old S2 measurements of the Exynos 4210: the idle-to-load power delta is 3W at 1200MHz and 4.1W at 1400MHz. So, 500mW idle power, and 3.5W and 4.5W on load respectively. I measured this at the time with a multimeter directly between the battery and the input pins, so it's a measurement of the whole system. Screen on, etc., in both cases.

Admittedly, this was on 45nm, which was more power-hungry than the quad A9s on 32nm.
 
I actually dusted off my old S2 measurements of the Exynos 4210: the idle-to-load power delta is 3W at 1200MHz and 4.1W at 1400MHz. So, 500mW idle power, and 3.5W and 4.5W on load respectively. I measured this at the time with a multimeter directly between the battery and the input pins, so it's a measurement of the whole system. Screen on, etc., in both cases.

Admittedly, this was on 45nm, which was more power-hungry than the quad A9s on 32nm.

It's becoming clearer why Apple capped their SoCs at a lowly 800MHz for phone usage (AFAIK the Cortex-A9 layout in the A5/A5X and the Exynos 42xx is identical); that's a pretty stiff price for that last 200MHz. This is with both cores running, right?

Would be curious to see what the numbers are like at some lower frequency.

Are SoC vendors paying a big premium to hit these frequencies on leakage-optimized processes? NVIDIA claims so, but Tegra 3 phones seem to use more power under load than anyone else's.
 
It's becoming clearer why Apple capped their SoCs at a lowly 800MHz for phone usage (AFAIK the Cortex-A9 layout in the A5/A5X and the Exynos 42xx is identical); that's a pretty stiff price for that last 200MHz. This is with both cores running, right?

Would be curious to see what the numbers are like at some lower frequency.

Are SoC vendors paying a big premium to hit these frequencies on leakage-optimized processes? NVIDIA claims so, but Tegra 3 phones seem to use more power under load than anyone else's.
Here's my measurement table from last year. Remember, this is the 45nm Exynos 4210, which had a reputation for getting ridiculously hot. The load values are actually with the screen off; with the screen on it went as high as 5.5W, if I remember correctly.

[Table image: per-frequency voltage, current, and calculated power measurements]


Off = CPU on with the top off (screen turned off and system wakelocked)
SetCPU = idling with the screen on
ST off single = stress test with a single core enabled, top off
ST off dual = stress test with both cores enabled, top off

Frequencies are in the leftmost column. The orange column is calculated mW, and the other two are the voltage and mA measurements. DVFS was turned off and the CPU was frequency-locked.
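
The calculated column is just P = V × I; with volts and milliamps, the product comes out directly in mW. For example (the sample readings below are placeholders, since the actual values are in the table image):

```python
# mW = V * mA, since the milli prefix carries through the multiplication.
# Placeholder readings, not the actual table values.
readings = [
    # (freq_MHz, volts, milliamps)
    (1200, 3.8, 920),
    (1400, 3.8, 1200),
]

for freq_mhz, volts, milliamps in readings:
    print(f"{freq_mhz}MHz: {volts * milliamps:.0f} mW")
```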
 