Samsung Galaxy S series rumours.....

Did Anand get that diagram from Qualcomm, and did Qualcomm re-work their CPU architecture when moving from dual-core Krait to quad-core Krait? According to NVIDIA, quad-core Krait has 512KB dedicated L2 cache for each CPU core, rather than a larger shared L2 cache. Xbit Labs makes the same claim too (albeit for the upcoming quad-core S800).
It's from Qualcomm, I think; iXBT has the same graph: http://ixbtlabs.com/articles3/mobile/snapdragon-s4-p1.html

But they even reiterate it in the article:

"The L2 cache is shared among all cores. In dual-core designs the L2 cache is sized at 1MB (up from 512KB in Scorpion), while quad-core Krait SoCs will have a 2MB L2. Krait's L2 cache is 8-way set associative."

I think the confusion may be in how the cores access the cache, rather than the cache being physically divided per core.

Edit:

http://www.anandtech.com/show/6536/arm-vs-x86-the-real-showdown/4

They specifically state that the L2 cache has its own power island (and is thus on a separate power domain from the cores):

"I suspect that missing the L2 cache power island here is lowering Qualcomm's power consumption by 100 - 200mW but overall CPU-only power consumption would still be lower. "
 
I think the confusion may be in how the cores access the cache, rather than the cache being physically divided per core.

Yeah, I suppose that makes sense. It appears that the aSMP Krait can only allocate up to 512KB of L2 cache to a single CPU core, even if there is 2MB of L2 cache available in total across all four CPU cores. The quad-core Cortex-A15 designs have much more flexibility in how much L2 cache can be allocated to each CPU core. The power consumption differences between the two approaches are still unclear though, especially with Krait having its own L2 cache power island as you pointed out. That said, I have little doubt that vSMP CPU architectures paired with battery-saver companion core(s) (as in big.LITTLE, 4+1, OMAP5, etc.) will result in improved talk time and improved video playback time vs. aSMP CPU architectures such as Krait, but I suppose we will find out soon enough.
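
For what it's worth, here's a minimal Python sketch of the allocation difference being argued about: the per-core 512KB behaviour is NVIDIA's claim and the shared 2MB pool is AnandTech's description, so treat the numbers as illustration rather than vendor data.

```python
# Minimal sketch (not vendor data): how much L2 one busy core can reach
# under the two topologies discussed above.

def max_l2_for_one_core(topology: str, total_l2_kb: int = 2048, cores: int = 4) -> int:
    """Return the most L2 cache (in KB) a single core could occupy."""
    if topology == "private":        # NVIDIA's claim: a fixed 512KB slice per core
        return total_l2_kb // cores  # the slice is capped even if other cores idle
    if topology == "shared":         # AnandTech's description: one shared pool
        return total_l2_kb           # a lone busy core can fill the whole 2MB
    raise ValueError(f"unknown topology: {topology}")

print(max_l2_for_one_core("private"))  # 512
print(max_l2_for_one_core("shared"))   # 2048
```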
 
Having L2 on a separate power rail doesn't mean that there aren't exclusive/separate L2 caches for each core. All of Intel's Atoms thus far have had separate L2 caches but that hasn't stopped them from using a separate L2 cache power domain for Medfield and the like. The motivation for this isn't to individually power gate the L2, it's to better optimize different components for different voltages.

Linley had this confusing thing to say about Scorpion's L2 cache:

The dual CPU models include a shared 512KB level two (L2) cache, twice the cache size of the single CPU versions. The L2 cache transfers data at the same clock speed as the CPUs and is directly connected to both CPUs.
Scorpion was asynchronously clocked like Krait is, so the two cores can run at different frequencies; an L2 cache that "transfers data at the same clock speed as the CPUs" would therefore have to run at two different clock speeds at once. That probably really means two exclusive L2 caches. I think Qualcomm is just using funny terminology when it talks about a shared L2.

It could be that Krait changed things and the shared L2 sits on a totally separate clock domain. That would of course be bad for power consumption and latency.

EDIT: Okay, apparently Qualcomm really is using an asynchronously clocked L2 for Krait AND Scorpion: http://www.qualcomm.com/media/documents/files/4g-world-2011-snapdragon-overview.pdf That means the L2 cache was never tightly coupled in the first place like Linley claimed. I hoped an analyst who charges thousands of dollars for technical reports would know what they're talking about, but I guess they don't. This sure casts a negative light on some of the other things they've said where I was skeptical but gave them the benefit of the doubt...
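
To make that contradiction concrete, here's a toy Python sketch; the per-core frequencies are made up for illustration.

```python
# Toy illustration: under aSMP each core picks its own frequency, so an
# L2 that "transfers data at the same clock speed as the CPUs" cannot
# match two differently-clocked cores at once. Frequencies are made up.

core_clocks_mhz = [1512, 918]  # hypothetical per-core DVFS states

def synchronous_shared_l2_possible(clocks_mhz: list[int]) -> bool:
    """One shared L2 clocked 'with the CPUs' needs every CPU at one clock."""
    return len(set(clocks_mhz)) == 1

print(synchronous_shared_l2_possible(core_clocks_mhz))  # False -> so either
# two exclusive L2s, or a shared L2 on its own (asynchronous) clock domain.
```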
 
Whatever the reason for the Snapdragon 600 being used, it's not because the octa-core version doesn't support LTE.

http://www.droid-life.com/2013/03/28/samsung-exynos-5-octa-works-on-all-20-lte-bands/

Seems like the article's assertion that it's a volume issue might well be correct.

Is this really surprising though? Not sure why all these websites are parroting this; Exynos 4412 supported LTE, so why wouldn't Exynos Octa do the same?

I would speculate the reasoning is a combination of money and production issues with the Octa. Looking at the IHS bill of materials, Samsung is saving a nice chunk of money by going with the SD600, considering the volumes we are talking about here.
 
I'd love to know how IHS comes up with their numbers for the cost of custom SoCs, especially when they're basically fabbed in-house. Do they even have die sizes?

It's not a question of whether or not you can pair an LTE chipset with Samsung's SoCs, of course you can. The question is whether or not it requires the same level of support hardware as SoCs Qualcomm offers due to whether or not the SoC integrates any baseband processors. Exynos 4412 certainly did not, nor does Exynos 5250; you can confirm that by looking at the publicly available user manuals. Exynos 5410 could change this but I doubt it.

Segmenting based on limited volume makes sense (and would explain why Samsung went with NVIDIA in some models in the past), but it's a big stretch to think that this just happened to lead to the Exynos Galaxy S4s ending up in all the countries with less developed networks.
 
Is this really surprising though? Not sure why all these websites are parroting this; Exynos 4412 supported LTE, so why wouldn't Exynos Octa do the same?

I would speculate the reasoning is a combination of money and production issues with the Octa. Looking at the IHS bill of materials, Samsung is saving a nice chunk of money by going with the SD600, considering the volumes we are talking about here.

I also find it hard to fathom where they get the pricing for the 5410. And let's not forget that whatever that money is, it all remains within the parent group, albeit in a different division. I imagine the actual cost to the parent group of the 5410 is a lot less than the cost of a Snapdragon S600.
 
I also find it hard to fathom where they get the pricing for the 5410. And let's not forget that whatever that money is, it all remains within the parent group, albeit in a different division. I imagine the actual cost to the parent group of the 5410 is a lot less than the cost of a Snapdragon S600.

You have to factor in the opportunity cost of producing chips for their sister company rather than selling Exynos to company X, assuming they do indeed sell their SoCs at cost to big sister.
 
Anybody know what Charlie's blabbering about behind the paywall of this article? http://semiaccurate.com/2013/03/22/analysis-backstory-samsung-chose-qualcomm-chips/#.UVckeF_bmco

Anyway, strange developments for this device. The Korean E300S was originally a Qualcomm device at the very beginning, as leaked by GLBenchmark results, but now reports are coming in that it's an Exynos + LTE device. The results page of GLB confirms this, with the system information now showing both device characteristics.

I'm curious about the heat dissipation; a Taiwanese website claimed that the Octa version stayed nice and cool even after 1.5h of use, and the battery temperature (which is actually the charger chip temperature) maxed out at 31°C. From the IT168 teardown we can see they use, in my opinion, an innovative design, in that they use the metal backplate of the AMOLED screen as a heat sink for the CPU, with a thermal pad between the two; the CPU sits basically in the upper middle of the screen. That's quite a large area for heat dissipation: basically the whole back of the screen is a heat sink.

In any case, early reports show excellent battery life. We also have reports of 28.3k scores on AnTuTu, which comes very clearly in line with the Tegra 4 scores if you consider that one is clocked at 1.6GHz and the other at 1.9GHz.
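
As a quick sanity check of that clock-normalisation argument, here is the arithmetic as a naive Python sketch; only the 28.3k score and the two clock speeds come from the reports above, and the linear scaling is an assumption (benchmark scores rarely scale perfectly with frequency).

```python
# Naive clock-normalisation of the reported AnTuTu score. Linear scaling
# with frequency is an assumption, not a measured result.

octa_score, octa_clock_ghz = 28_300, 1.6  # reported Exynos Octa result
tegra4_clock_ghz = 1.9                    # Tegra 4's quoted clock

per_ghz = octa_score / octa_clock_ghz   # ~17.7k points per GHz
projected = per_ghz * tegra4_clock_ghz  # ~33.6k points at 1.9GHz

print(f"{per_ghz:.0f} points/GHz -> ~{projected:.0f} projected at {tegra4_clock_ghz}GHz")
```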
 
I'm curious about the heat dissipation; a Taiwanese website claimed that the Octa version stayed nice and cool even after 1.5h of use, and the battery temperature (which is actually the charger chip temperature) maxed out at 31°C. From the IT168 teardown we can see they use, in my opinion, an innovative design, in that they use the metal backplate of the AMOLED screen as a heat sink for the CPU, with a thermal pad between the two; the CPU sits basically in the upper middle of the screen. That's quite a large area for heat dissipation: basically the whole back of the screen is a heat sink.

It is quite ridiculous when people equate Samsung = Plastic = Poor Build Quality; the IT168 teardown shows a device that has clearly been constructed to a high standard.
 
Going through the kernel sources, it's blatantly obvious that they have developed a phone with these specs.

The 9500 is the ja3g, and its variants are:

ja3gduos_chn_ctc / GT-I959 / China Telecommunications Corporation
ja3gduos_chn_cu / GT-I9502 / China Unicom.

That's fine and dandy, however the conspiracy theories begin with the LTE versions:

The 9505 also has an Exynos variant, named jalte, and its derivatives are:

jalteskt / SHV-E300S
jaltektt / SHV-E300K
jaltelgt / SHV-E300L

jaltedcm / ???

The first three are for the Korean market, and the last is for Japan's DoCoMo.

The interesting tidbit about the last four is that they're sourced / defined as derivatives of jalte / GT-I9505, which is defined as a TARGET_LOCALE_EUR device.

Now over in the Qualcomm universe of devices, we have the jf* boards:

jf_eur, jf_att, jf_can, jf_cmcc, jf_cri, jf_dcm, jf_ktt, jf_lgt, jf_skt, jf_spr, jf_tmo, jf_vzw.

Now it's pretty obvious that there are duplicate devices for the same markets: again, the European, Korean, and Japanese variants.

I'm not claiming anything here; the above may either indicate that Samsung will in the future have or replace the Qualcomm devices with Exynos devices again, or it may mean that the source code is older and reflects Samsung's device line-up from before they switched to Qualcomm. The question is whether they'll continue to source Qualcomm devices for the markets which got them from the beginning.
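
If anyone wants to repeat the exercise on their own copy of the sources, a rough Python sketch like the one below will pull the board names out. The kernel_source path and the name patterns are my assumptions, not Samsung's actual tree layout; point it at wherever the defconfigs and makefiles actually live.

```python
# Rough sketch: scan an unpacked source drop for the board identifiers
# discussed above. Path and patterns are assumptions, adjust as needed.
import re
from pathlib import Path

SRC = Path("kernel_source")  # hypothetical location of the source drop
BOARD = re.compile(r"\b(ja3g\w*|jalte\w*|jf_\w+)\b")

boards: set[str] = set()
for path in SRC.rglob("*"):
    if not path.is_file():
        continue
    try:
        text = path.read_text(errors="ignore")
    except OSError:
        continue  # skip unreadable files (symlinks, permissions, etc.)
    boards.update(BOARD.findall(text))

for name in sorted(boards):
    print(name)
```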

In either case, it just seems that the S4 is a massive failure for Samsung in terms of a product launch; they released it too early, and due to the apparent manufacturing issues, a big clusterfuçk has emerged in the device line-up at the last minute.
 
http://oled-a.org/news_details.cfm?ID=783

Apparently, Samsung will be using a plastic 'unbreakable' display for the upcoming Note 3.

According to the OLED Association, this will have the following benefits:

-The display will be even tougher than glass, as it will bend and absorb impacts better.
-The display will be substantially thinner than glass OLED, which is already thinner than LCD.
-The display will be half the weight of glass OLED at 50g, with glass OLED and LCD at 100 and 160 grams respectively.

Quoted downsides include likely poor yields due to the new technology, meaning fewer displays for phones and/or less profit from such displays. Samsung are also not likely to be able to produce these displays with a full RGB arrangement at 1080p.

A couple of questions: first, if the phone ends up being 6 inches as rumoured, why can't an S-Stripe pattern be used to achieve 1080p? And if not, why would Samsung go for a lower resolution just to get an RGB pattern (as the article suggests) when they could use diamond PenTile and achieve 1080p easily?
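
For context, here's the subpixel arithmetic behind that question as a small Python sketch, assuming the usual 3 subpixels per pixel for an RGB stripe and 2 per pixel on average for diamond PenTile (RGBG):

```python
# Subpixel counts for a 1080p panel under the two layouts discussed.
# 3 subpixels/pixel (RGB stripe) vs. 2/pixel average (diamond PenTile).

w, h = 1920, 1080
rgb_subpixels = w * h * 3      # 6,220,800
pentile_subpixels = w * h * 2  # 4,147,200

print(f"RGB stripe:      {rgb_subpixels:,} subpixels")
print(f"Diamond PenTile: {pentile_subpixels:,} subpixels")
print(f"PenTile needs {1 - pentile_subpixels / rgb_subpixels:.0%} fewer subpixels")
```

That third fewer subpixels to pattern is presumably why hitting 1080p with a full RGB stripe is the harder manufacturing target.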

I understand why plastic would be more impact resistant than glass, but what about scratches? Surely that comes down to hardness, which most folk would naturally think glass has more of, or are there harder plastics out there?

Also, the quoted 160 grams for the LCD display does seem a little steep, even for a 6-inch display.
 
I understand why plastic would be more impact resistant than glass, but what about scratches? Surely that comes down to hardness, which most folk would naturally think glass has more of, or are there harder plastics out there?

Why wouldn't there be a piece of Gorilla Glass on top of the flexible OLED screen?
 
I'm not saying there isn't, as I don't know; that's why I'm asking. However, nowhere have I seen it mentioned that there IS Gorilla Glass, as that wouldn't be flexible or any more 'unbreakable' than a regular AMOLED display, would it?
 
I'm not saying there isn't, as I don't know; that's why I'm asking. However, nowhere have I seen it mentioned that there IS Gorilla Glass, as that wouldn't be flexible or any more 'unbreakable' than a regular AMOLED display, would it?

When they state plastic vs. glass, aren't they just talking about the substrate that is part of the display, not the protective Gorilla Glass, as per the image below? This new plastic substrate may be more flexible (unbreakable), but it's not going to be scratch resistant.

[Diagram: layer stacks of glass LCD, glass OLED, and plastic OLED displays]
 
When they state plastic vs. glass, aren't they just talking about the substrate that is part of the display, not the protective Gorilla Glass, as per the image below? This new plastic substrate may be more flexible (unbreakable), but it's not going to be scratch resistant.

[Diagram: layer stacks of glass LCD, glass OLED, and plastic OLED displays]

Yes, but the article title clearly states 'flexible'. How is Gorilla Glass flexible? If we are talking about the same technology as YOUM, which I think we are, then I'm pretty sure glass is not used in any part of the display. How could it be?

Perhaps the Galaxy Note 3 will be a hybrid first-generation product, with real YOUM plastic displays used on later flexible new form factors.

Edit: if you look at the diagram in the article, the LCD and OLED glass displays clearly state 'glass' in their make-up; the plastic display just says thin-film encapsulation.
 
Would you want the Galaxy Note to be flexible? I wouldn't. It would hamper the pen experience. Not to mention the other design and engineering challenges.

As for that diagram, yes, it clearly states that the LCD and glass OLED displays have glass in their make-up, but those screens will still have a sheet of Gorilla Glass on top of that (and a touch-sensitive layer between the LCD and the Gorilla Glass). As for flexible glass, just Google Corning Willow Glass as an example.
 
Would you want the Galaxy Note to be flexible? I wouldn't. It would hamper the pen experience. Not to mention the other design and engineering challenges.
The phone is not going to actually flex like a piece of rubber. It just takes impacts better. If it falls and some part of the screen undergoes a deformation of 0.2%, it won't shatter like normal glass would.
 
The phone is not going to actually flex like a piece of rubber. It just takes impacts better. If it falls and some part of the screen undergoes a deformation of 0.2%, it won't shatter like normal glass would.

Oh, I know that's the main point behind flexible OLEDs, at least initially. I just said that because I thought french toast was implying that we would see a Note 3 that could actually flex. As Samsung has shown with their YOUM prototypes and concepts though, the potential for flexible screens could be about more than just a screen that's less prone to breakage.
 
Yeah, personally I wouldn't want a proper roll-out Note. :)

A first-generation scenario like you mention, with a plastic substrate and Gorilla Glass 3, would be preferable; however, I'm still not totally sure that is what the article is implying.
 