Samsung Galaxy S series rumours.....

If its clocks are high enough (~650MHz?), a Mali 450MP8 could be on par with most high-end SoCs.

Once you go wider with an implementation you're not necessarily targeting the same frequencies as with fewer units, unless you don't care about power consumption at all. 8 TMUs at 650MHz alone aren't going to do you any favours.
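A rough sketch of the concern, assuming dynamic power scales roughly with unit count * frequency * voltage^2; the configurations and voltages below are purely illustrative, not measured values:

[code]
# Crude dynamic-power scaling proxy: P ~ units * frequency * voltage^2.
# All figures here are illustrative assumptions, not measurements.

def relative_power(units, freq_mhz, volts):
    return units * freq_mhz * volts ** 2

narrow = relative_power(4, 440, 1.00)  # e.g. a 400MP4-class config at 440MHz
wide   = relative_power(8, 650, 1.10)  # hypothetical MP8 at 650MHz, slightly higher voltage

print(f"relative power: {wide / narrow:.1f}x")  # ~3.6x the narrower config
[/code]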
 
Yeah, it's an old uarch. I'm saving up for a Note 3, so I'm hoping for a T624 or something.

If the suggestion is true, then clearly they don't like the T604, or possibly another T-family member wasn't ready in time. The reason for not going with the SGX544 would be interesting to hear.
 
Once you go wider with an implementation you're not necessarily targeting the same frequencies as with fewer units, unless you don't care about power consumption at all. 8 TMUs at 650MHz alone aren't going to do you any favours.

All I thought was that if this Mali 450 GPU has 2.5-3x the performance of the Mali 400MP4 in the Exynos 4412, it'll be competitive with the fastest Adreno 320 and PowerVR Series 5XT implementations we've seen so far.
It may even best the 544MP3 in the current 5410.
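A quick back-of-envelope on where a ~3x figure could come from, assuming performance scales roughly linearly with fragment-core count times clock (the ~650MHz clock for the 450MP8 is only the rumoured number, not confirmed):

[code]
# Naive throughput scaling: fragment cores * clock, ignoring bandwidth and drivers.
mali_400mp4 = 4 * 440   # Exynos 4412 config: 4 fragment cores at 440MHz
mali_450mp8 = 8 * 650   # rumoured config: 8 fragment cores at ~650MHz (assumption)

print(f"ideal scaling: {mali_450mp8 / mali_400mp4:.2f}x")  # ~2.95x
[/code]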
 
GSMArena got this information from SamMobile. Given their Galaxy S4 track record, I wouldn't put too much stock in them anymore.

Going by previous releases, logically the Galaxy Note 3 should be using a better-binned and higher-clocked 5410.
 
GSMArena got this information from SamMobile. Given their Galaxy S4 track record, I wouldn't put too much stock in them anymore.

Going by previous releases, logically the Galaxy Note 3 should be using a better-binned and higher-clocked 5410.

The 5410 has been a slight disappointment, especially when compared to the Snapdragon 600.
It might be that Android's current kernel doesn't properly support big.LITTLE, or maybe it's being aggressively throttled for thermals and lower power consumption, but the fact is that it's not showing better results than the seemingly cheaper S600.

That said, it's possible that the 5410 won't be as widespread as some thought, like its 5210 predecessor.
 
GSMArena got this information from SamMobile. Given their Galaxy S4 track record, I wouldn't put too much stock in them anymore.

Going by previous releases, logically the Galaxy Note 3 should be using a better-binned and higher-clocked 5410.
SamMobile are a bunch of amateurs; they recently emailed me asking questions about the Exynos version for an article and they truly are clueless. I doubt ANY of their sources are legit beyond pulling firmwares from the Samsung Kies servers.

And I'm pretty sure the Note 3 will use the 5410; the source code has signs of a board named "Vienna" and also Wacom remnants which, if not for a possible Note 3, would have no reason at all to be in that device tree for a new major Linux kernel version.
 
If the suggestion is true, then clearly they don't like the T604, or possibly another T-family member wasn't ready in time. The reason for not going with the SGX544 would be interesting to hear.

I don't think the suggestion is true :) I'm just crossing all sorts of things that shouldn't be crossed, hoping they got mixed up and the real GPU is some kind of 8-core/cluster Midgard, something like a Mali T658 (canned?).
 
GSMArena got this information from SamMobile. Given their Galaxy S4 track record, I wouldn't put too much stock in them anymore.

Going by previous releases, logically the Galaxy Note 3 should be using a better-binned and higher-clocked 5410.

This.. (unfortunately)
 
The 5410 has been a slight disappointment, especially when compared to the Snapdragon 600.
It might be that Android's current kernel doesn't properly support big.LITTLE, or maybe it's being aggressively throttled for thermals and lower power consumption, but the fact is that it's not showing better results than the seemingly cheaper S600.

That said, it's possible that the 5410 won't be as widespread as some thought, like its 5210 predecessor.

I agree; considering the S600 is essentially Qualcomm's 2013 upper-midrange processor, it is somewhat disappointing.

But hey, it's still a bloody good piece of silicon to have in your mobile; it's the fastest chip on the market as of now and is competitive on power consumption.

Current games should actually run better on the SGX544MP3; future gaming doesn't look hot, however.

Personally, if Samsung keeps that GPU for the 5410 Prime (almost guaranteed) then I would want the S800 in my Note 3.
 
All I thought was that if this Mali 450 GPU has 2.5-3x the performance of the Mali 400MP4 in the Exynos 4412, it'll be competitive with the fastest Adreno 320 and PowerVR Series 5XT implementations we've seen so far.
It may even best the 544MP3 in the current 5410.

Between what you "think" based on any IHV's colourful marketing claims and reality there's also quite a distance. Do you think you'll get the 104M Tris out of a 450 clocked at 480MHz in real time? http://www.arm.com/products/multimedia/mali-graphics-hardware/mali-450-mp.php

The 450 might be attractive in terms of die area by itself compared to the 544MP3, but definitely not in terms of performance, otherwise Samsung would be mighty stupid not to use the 450 for the 5410 instead. Just for the record's sake, Samsung increased the peak frequency of the Adreno 320 for the Galaxy S4 [strike]while the MP3 came with a 10% lower frequency than originally projected[/strike]. IMHO a move to level the perf/W ratios between the S4/S600 and S4/5410 variants in order to have a fairly balanced offering and not end up with too big performance differences.

If the S4/S600 had the 320 clocked at the default 400MHz [strike]and the 544MP3 were clocked at the originally projected 533MHz[/strike], the picture wouldn't be fundamentally different from today, but definitely not as much in favour of the S600 as it stands today. By the time the 5410 shortages have been battled, Samsung will most likely use exclusively their own SoCs, and if future hw revisions allow it they can increase both CPU and GPU frequencies if they want, which wouldn't be a first for Samsung's usual policy either.

The 5410 has been a slight disappointment, especially when compared to the Snapdragon 600.
It might be that Android's current kernel doesn't properly support big.LITTLE, or maybe it's being aggressively throttled for thermals and lower power consumption, but the fact is that it's not showing better results than the seemingly cheaper S600.

That said, it's possible that the 5410 won't be as widespread as some thought, like its 5210 predecessor.

It's 5250 for hairsplitting's sake and that one didn't see much integration because it would be a mighty dumb idea to use a tablet >5W TDP SoC for a smartphone platform. We all know that the 5250 has ONLY big but no LITTLE CPU cores, besides the T604 which doesn't seem to do any favours in terms of power consumption either.

As for the S600, again, its default clocks are NOT 1890MHz for the CPU and NOT 450MHz for the GPU, so make of it what you want. As I said above, by the time Exynos production starts rolling the overclocked S600s will gradually disappear and Samsung will most likely, as up to now, increase the 5410's frequencies through future hw revisions.

***edit: parts struck out due to frequency misconceptions.
 
It's 5250 for hairsplitting's sake and that one didn't see much integration because it would be a mighty dumb idea to use a tablet >5W TDP SoC for a smartphone platform.

You say that but I'm sure 5410 has a higher TDP. Are you sure TDP is what you wanted to refer to?
 
Between what you "think" based on any IHV's colourful marketing claims and reality there's also quite a distance. Do you think you'll get the 104M Tris out of a 450 clocked at 480MHz in real time? http://www.arm.com/products/multimedia/mali-graphics-hardware/mali-450-mp.php

The 450 might be attractive in terms of die area by itself compared to the 544MP3, but definitely not in terms of performance, otherwise Samsung would be mighty stupid not to use the 450 for the 5410 instead. Just for the record's sake, Samsung increased the peak frequency of the Adreno 320 for the Galaxy S4 [strike]while the MP3 came with a 10% lower frequency than originally projected[/strike]. IMHO a move to level the perf/W ratios between the S4/S600 and S4/5410 variants in order to have a fairly balanced offering and not end up with too big performance differences.

If the S4/S600 had the 320 clocked at the default 400MHz [strike]and the 544MP3 were clocked at the originally projected 533MHz[/strike], the picture wouldn't be fundamentally different from today, but definitely not as much in favour of the S600 as it stands today. By the time the 5410 shortages have been battled, Samsung will most likely use exclusively their own SoCs, and if future hw revisions allow it they can increase both CPU and GPU frequencies if they want, which wouldn't be a first for Samsung's usual policy either.

I'm not looking at theoretical performance numbers, only benchmark results from the HTC One (4*1.7GHz Krait 300 + 400MHz Adreno 320 + 2*1066MT/s RAM) and the Galaxy Note II/8.0 (4*1.6GHz Cortex-A9 + 440MHz Mali 400MP4 + 2*800MT/s RAM):

[benchmark results chart comparing the two devices]

Bandwidth limitations aside, a Mali 450MP8 at ~650MHz would have about 3x the GPU performance of a 440MHz Mali 400MP4, so it would be competitive with the Snapdragon S600.
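And to put the "bandwidth limitations aside" caveat in numbers, here's a rough peak-bandwidth comparison from the memory speeds quoted above (the two 32-bit channels per SoC are my assumption):

[code]
# Peak memory bandwidth estimate; two 32-bit LPDDR channels is an assumption.
def peak_bandwidth_gb_s(channels, width_bits, mt_per_s):
    return channels * (width_bits / 8) * mt_per_s / 1000  # GB/s

htc_one = peak_bandwidth_gb_s(2, 32, 1066)  # ~8.5 GB/s
note_ii = peak_bandwidth_gb_s(2, 32, 800)   # ~6.4 GB/s

print(f"HTC One: {htc_one:.1f} GB/s vs Note II: {note_ii:.1f} GB/s")
[/code]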

It's 5250 for hairsplitting's sake and that one didn't see much integration because it would be a mighty dumb idea to use a tablet >5W TDP SoC for a smartphone platform. We all know that the 5250 has ONLY big but no LITTLE CPU cores, besides the T604 which doesn't seem to do any favours in terms of power consumption either.

The 5250 didn't see any integration at all in any Samsung tablet launched after the Nexus 10, not just smartphones. The Note 10.1 LTE, Note 8.0 and the recently announced Tab 3 models aren't coming with that SoC, so it's not just that it wouldn't fit in a smartphone. Samsung apparently didn't find the SoC to be very performance/cost effective anyway.

As for the S600, again, its default clocks are NOT 1890MHz for the CPU and NOT 450MHz for the GPU, so make of it what you want. As I said above, by the time Exynos production starts rolling the overclocked S600s will gradually disappear and Samsung will most likely, as up to now, increase the 5410's frequencies through future hw revisions.

***edit: parts struck out due to frequency misconceptions.

Don't put too much heart into the increased maximum clocks of the S600 implementation for the GS4 when compared to other smartphones with the "normal" S600.
Real world results are pointing to the GS4 having very similar performance to the HTC One, which has lower max clocks for the GPU and much lower memory bandwidth.
The S600 in the GS4 probably has more aggressive underclock throttling for thermals and power consumption, to compensate for the higher max clocks.

A little trip to GLBenchmark (Median scores, which will show stock clocks) or 3dmark online results will show that the HTC One is some 4% slower than the GS4 even though it uses older drivers. Then the Exynos 5410 trails behind both of those S600 solutions by more than that:
GS4 Exynos 5410: 699 Frames
GS4 S600: 859 Frames
HTC One S600: 825 Frames
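Quick check of the gaps from the frame counts above:

[code]
# Relative standing of the three results listed above.
gs4_exynos, gs4_s600, htc_one_s600 = 699, 859, 825

print(f"HTC One vs GS4 S600: {(1 - htc_one_s600 / gs4_s600) * 100:.1f}% slower")   # ~4.0%
print(f"GS4 Exynos vs GS4 S600: {(1 - gs4_exynos / gs4_s600) * 100:.1f}% slower")  # ~18.6%
[/code]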


So Samsung may start to ramp up the production of the GT9500 while building more Exynos 5410, but they will be releasing devices with slower GPUs in the process.
 
I'm not looking at theoretical performance numbers, only benchmark results from the HTC One (4*1.7GHz Krait 300 + 400MHz Adreno 320 + 2*1066MT/s RAM) and the Galaxy Note II/8.0 (4*1.6GHz Cortex-A9 + 440MHz Mali 400MP4 + 2*800MT/s RAM)

I don't think the HTC One clocks its GPU at "just" 400MHz but I can't swear on it either.

Bandwidth limitations aside, a Mali 450MP8 at ~650MHz would have about 3x the GPU performance of a 440MHz Mali 400MP4, so it would be competitive with the Snapdragon S600.

So where's my answer as to why Samsung didn't pick a 450 instead? Did you see that 650MHz frequency for the 450 listed or rumoured anywhere, or is it just a freely invented number? If that's just a theoretical exercise, I can also claim that a 544MP4 at 400MHz would still beat the 450@650MHz and we'd still be twisting in circles.

My original point was that when you increase the number of units, frequency doesn't scale that easily with it, and there's nothing that speaks against that.

The 5250 didn't see any integration at all in any Samsung tablet launched after the Nexus 10, not just smartphones. The Note 10.1 LTE, Note 8.0 and the recently announced Tab 3 models aren't coming with that SoC, so it's not just that it wouldn't fit in a smartphone. Samsung apparently didn't find the SoC to be very performance/cost effective anyway.

Where's the difference between those last two sentences exactly, if you really think about it?

Don't put too much heart into the increased maximum clocks of the S600 implementation for the GS4 when compared to other smartphones with the "normal" S600.
Real world results are pointing to the GS4 having very similar performance to the HTC One, which has lower max clocks for the GPU and much lower memory bandwidth.
The S600 in the GS4 probably has more aggressive underclock throttling for thermals and power consumption, to compensate for the higher max clocks.

A little trip to GLBenchmark (Median scores, which will show stock clocks) or 3dmark online results will show that the HTC One is some 4% slower than the GS4 even though it uses older drivers. Then the Exynos 5410 trails behind both of those S600 solutions by more than that:
GS4 Exynos 5410: 699 Frames
GS4 S600: 859 Frames
HTC One S600: 825 Frames


So Samsung may start to ramp up the production of the GT9500 while building more Exynos 5410, but they will be releasing devices with slower GPUs in the process.

Since we're at the benchmark-comparing orgy here, try a GLB2.5 comparison as well and then we can take an average across that.

The point about the HTC One's GPU frequency still stands, until I stand corrected on that one as well that is :LOL:
 
The Exynos 5410 has a higher TDP than the Exynos 5250?

First and foremost, I don't think either of them specify TDP on datasheets the way Intel does. Or at least not ones that are visible to us, so I don't really know. We do know that they'll throttle if thermals trip past some limit, same with Qualcomm chips, so the thermal capacity of the particular device (and software configuration for throttling) will play a role on max performance.

But I'm pretty sure that four Cortex-A15s @ 1.6GHz on the Exynos 5410 will use more power than two Cortex-A15s @ 1.7GHz on the Exynos 5250, unless some pretty drastic change happened somewhere. The small difference in clock speed and the benefit of the better process shouldn't make that much of a difference.
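As a very crude sketch, just scaling by core count times clock and ignoring voltage and process differences (so treat the ratio as hand-waving):

[code]
# Very rough peak CPU power proxy: cores * clock (GHz); voltage and process ignored.
exynos_5410_big = 4 * 1.6   # four Cortex-A15s at 1.6GHz
exynos_5250     = 2 * 1.7   # two Cortex-A15s at 1.7GHz

print(f"rough ratio: {exynos_5410_big / exynos_5250:.2f}x")  # ~1.88x
[/code]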

This doesn't consider power consumption from full CPU + full GPU load, but neither does your 5W figure so I didn't think you were talking about this.
 
First and foremost, I don't think either of them specify TDP on datasheets the way Intel does. Or at least not ones that are visible to us, so I don't really know. We do know that they'll throttle if thermals trip past some limit, same with Qualcomm chips, so the thermal capacity of the particular device (and software configuration for throttling) will play a role on max performance.

Then replace TDP with average power consumption.

But I'm pretty sure that four Cortex-A15s @ 1.6GHz on the Exynos 5410 will use more power than two Cortex-A15s @ 1.7GHz on the Exynos 5250, unless some pretty drastic change happened somewhere. The small difference in clock speed and the benefit of the better process shouldn't make that much of a difference.

No it shouldn't for peak power consumption.
 