Samsung Exynos 5250 - production starting in Q2 2012

  • Thread starter Deleted member 13524
Three SoC manufacturers with custom cores? I presume you're not including Apple there; if so, we have NVIDIA, Qualcomm and Marvell, right? Any others I'm missing?

Why would I exclude Apple? From what I can tell, Swift is a custom core based on the ARM instruction set.
 
> Why would I exclude Apple? From what I can tell, Swift is a custom core based on the ARM instruction set.

I only said that because Apple's Swift core isn't in any chip that's a direct competitor to Samsung's Exynos chips. Not unless Apple starts licensing or selling its own chips to other manufacturers. It is, of course, indirect competition.
 
Agreed. However, Apple counts among the big SoC manufacturers today and is in its own way a direct competitor to Samsung for tablets and smartphones.
 
It's amazing seeing how fast mobile graphics have grown in such a short time, but I have to say: what's the point?

It's not as if they have the controls to be true gaming machines; the most popular mobile games are 2D-based, like Angry Birds and Where's My Water.
 
> It's amazing seeing how fast mobile graphics have grown in such a short time, but I have to say: what's the point?
That's all about to change ;) I've just submitted my latest game (it should be in the App Store in a week or so):
http://www.zedzeek.com/rollybolly/ Sure, you could do the same basic thing in 2D, but it detracts from the immersion a bit.
Though you're right about touchscreens not being suited to most games (RollyBolly being an exception; in fact, tilting the screen works better than a keypad or stick would).

My latest iOS game, by the way, has the same issue: how do you control the ship? The player tilts the screen (you can also drag your finger across the screen, but that's not as good),
but both are nowhere near as good as proper input.

[Screenshot: the game running in the iOS Simulator, 16.01.2013]

Here I've got the camera at a ~30 degree tilt so you can see more of what's coming up; doing this in 2D is practically impossible. So the extra 3D power in phones can be put to practical use.
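The tilt-steering the posts above describe can be sketched in a few lines. This is a hypothetical illustration, not code from either game: the function name, dead zone, and sensitivity constant are all made up for the example.

```python
def tilt_to_steering(accel_x, dead_zone=0.05, sensitivity=2.0):
    """Map a raw accelerometer x reading (roughly -1..1 for device tilt)
    to a steering value in -1..1. A small dead zone keeps the ship
    steady when the phone is held roughly level; the result is clamped
    so extreme tilts don't over-steer."""
    if abs(accel_x) < dead_zone:
        return 0.0
    steering = accel_x * sensitivity
    return max(-1.0, min(1.0, steering))
```

The dead zone is the part that makes tilt feel usable in practice: without it, sensor noise near level constantly nudges the ship.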
 
Hey, been a while :) Er, right, disappointed with the recent revelations about the SGX544MP3, to be honest... I know it's very powerful for a phone and we are all spoiled these days, but I was really hoping for some more Midgard uarch... I can only hope that the SGX licence is for the Exynos 5 Quad already mentioned and likely used for Windows RT...
 
It's a build.prop device spoof, fake. The 9500 model number was also already used last year for their prototype Tizen phone. Funnily enough, it's probably a Note 2 overclocked using my kernel.

Yeah, the scores made me think it was fake. But I had also read they won't use 9400 because the number 4 is supposedly bad luck in Korea.
 
Questionable source, but rumors are always interesting:

http://www.brightsideofnews.com/new...let-at-mwc-2013-quad-core2c-gpgpu-inside.aspx

Sammobile also claims Samsung has a new high-end tablet in the pipeline with model number GT-P8200.

The Nexus 10 had model number GT-P8110.

If it should contain a T678, it will have 8 clusters, not any random number between 1 and 8 as BSN implies. The funny thing is that ARM itself lists the 678 for 2014 on its own website's roadmap, while the 604 is to be followed by the 624: http://www.arm.com/images/graphics-and-GPU-Compute-roadmap.jpg
 
> If it should contain a T678, it will have 8 clusters, not any random number between 1 and 8 as BSN implies. The funny thing is that ARM itself lists the 678 for 2014 on its own website's roadmap, while the 604 is to be followed by the 624: http://www.arm.com/images/graphics-and-GPU-Compute-roadmap.jpg

Where does the T628 fit in, though? It can be scaled to 8 cores, which according to ARM would be 2x the T624 in graphics and compute.

Since the T678 is 2014 according to the roadmap, I would assume the T628 was created to fill the vacuum left in 2013, performance-wise.
 
> Where does the T628 fit in, though? It can be scaled to 8 cores, which according to ARM would be 2x the T624 in graphics and compute.
>
> Since the T678 is 2014 according to the roadmap, I would assume the T628 was created to fill the vacuum left in 2013, performance-wise.

Good question; I don't even know what the difference between the 628 and the 678 actually is, as ARM shows the same 8 clusters for both in its pictures.

From ARM's website, for the T628:

> The Mali-T628 delivers up to 10x the graphics performance of the Mali-400-MP GPU, as well as an increase in GPU Compute performance when compared with the Mali-T604 GPU.

For the T678:

> The Mali-T678 delivers a 50% performance improvement compared to the Mali-T658.

Meanwhile, the T658 is no longer listed on their site. Call me confused.
 
Guys, what about the power consumption of that new Midgard uarch? The AnandTech article didn't put the T604 in a good light... not awful, but not as good as the SGX 5 series...
 
> Guys, what about the power consumption of that new Midgard uarch? The AnandTech article didn't put the T604 in a good light... not awful, but not as good as the SGX 5 series...

How can one conclude anything from that horrible GPU test? Without knowing specifics like frame rates, resolution, or even what game was used, it's nothing but an advertisement for Intel.

If you compared an HD 4000 vs. a GTX 670 at HD resolution for the former and full HD or more for the latter, ignored frame rates and any other specifics, and just looked at maximum/minimum mW, then the HD 4000 would look like a real winner, right?
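A toy calculation makes the objection concrete: raw average power flatters the slower GPU, while energy per frame tells the opposite story. The numbers below are made up purely for illustration, not from any benchmark.

```python
def energy_per_frame_mj(avg_power_mw, fps):
    """Average energy spent per rendered frame, in millijoules.
    1 mW sustained for 1 second = 1 mJ, so mW / fps = mJ per frame."""
    return avg_power_mw / fps

# A "slow" GPU drawing half the power still costs twice the energy per frame:
slow = energy_per_frame_mj(avg_power_mw=2000, fps=15)   # ~133 mJ/frame
fast = energy_per_frame_mj(avg_power_mw=4000, fps=60)   # ~67 mJ/frame
```

That is exactly why a mW-only comparison with unknown frame rates and resolutions says nothing about efficiency.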
 
> How can one conclude anything from that horrible GPU test? Without knowing specifics like frame rates, resolution, or even what game was used, it's nothing but an advertisement for Intel.
>
> If you compared an HD 4000 vs. a GTX 670 at HD resolution for the former and full HD or more for the latter, ignored frame rates and any other specifics, and just looked at maximum/minimum mW, then the HD 4000 would look like a real winner, right?

Well, as it comes from a Nexus device, yeah, you may have a point... but this Samsung SGX licence is still to show up... maybe it's for Windows 8 drivers... or maybe it's about T604 power consumption... just speculating here.
 
An interesting demonstration of how big.LITTLE works with simple tasks, in the context of power saving:

Note I don't think the above demo used a GPU for page rendering; wouldn't that also improve efficiency and power consumption?
http://blog.gsmarena.com/samsung-pr...big-little-arm-architecture-of-exynos-5-octa/

It's clearly mentioned at the beginning that the prototype SoC has no GPU acceleration whatsoever.

That sample had 5x Cortex-A7 plus 2x Cortex-A15. I didn't know they could match an odd number of Cortex-A7 cores to an even number of Cortex-A15 cores in big.LITTLE. The thing is a lot more flexible than I thought.
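The switching behavior the demo shows can be sketched as a simple cluster-migration policy: stay on the little cores while load is light, migrate to the big cores above a threshold. This is an assumed, simplified model for illustration only, not Samsung's or ARM's actual scheduler; the threshold values are invented.

```python
UP_THRESHOLD = 0.8    # migrate little -> big when load rises above this
DOWN_THRESHOLD = 0.3  # migrate big -> little when load falls below this

def next_cluster(current, load):
    """Pick the cluster for the next scheduling interval, given a
    normalized CPU load in 0..1. The gap between the two thresholds
    (hysteresis) prevents rapid bouncing between clusters when load
    hovers near a single cutoff."""
    if current == "little" and load > UP_THRESHOLD:
        return "big"
    if current == "big" and load < DOWN_THRESHOLD:
        return "little"
    return current
```

Light work like the page rendering in the demo would keep the load under the upper threshold, so everything runs on the A7s and the A15s stay power-gated.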
 