Samsung SoC & ARMv8 discussions

This is strange.
Samsung should be able to purchase, say, Vivante and enter the market much faster.
Why create a GPU from the ground up? And who is working on it?

I had heard some time ago that they HAD been working on their own GPU project but buried it, and that they had cancelled more than one SoC project on their past roadmap. No idea if that's what the Korea Times is referring to, and I don't have a clue about their credibility either. Remember there had also been repeated rumors about Apple supposedly developing its own GPU, which some were even sure would definitely arrive with the A8? And Fudo isn't exactly the brightest horse in the stable either *cough* ;) Jokes aside, I'm just asking in case someone has heard anything substantial in that regard.

The most important question is why, if Samsung bothered with a custom design, they would hypothetically start with a GPU instead of an ARM-ISA CPU like everyone else, since a GPU isn't exactly easier. To go even further and connect it with the recent patent debacles is of course even more priceless :D
 
It's probably just me, but that always sounded like a very unbalanced architecture.
 
They've been experimenting and disclosing research work on their own mobile GPU designs even back to around the days they licensed MBX Lite for their S3C2460, around the time they got the original iPhone SoC contract.

And even back then, they were characterizing their prototype designs as more programmable relative to the established designs on the market. My feeling is they haven't yet been able to refine the hardware and drivers to achieve a balance between their priority for high functionality with adequate performance and efficiency.
 
They announced the S3C6410 sometime in 2008. I believe that's the last iteration of their own graphics core that actually made it into a shipping product.
 
Hahahaha. The guy actually looked up the Exynos on Wikipedia, because I planted the 7410 info there, and the GFLOPs number is also bullshit, being off by 80% even under the most optimistic calculations.

All values are from wikipedia; further example: https://en.wikipedia.org/wiki/PowerVR

The 332.8 GFLOPs (at 650MHz) he quotes for the A8X GPU comes from the 6XT table; the only other catch is that the second set of values there are FP16 GFLOPs, but he doesn't need to know that either *cough*
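For anyone following along: peak GFLOPs figures like these are typically derived as ALU lanes × 2 FLOPs (one FMA = multiply + add) × clock. A minimal sketch, assuming the commonly cited 256 FP32 lanes for an 8-cluster 6XT configuration:

```python
def peak_gflops(alu_lanes, clock_ghz, flops_per_lane_per_clock=2):
    """Theoretical peak throughput in GFLOPs.

    flops_per_lane_per_clock=2 assumes each lane retires one
    fused multiply-add (2 FLOPs) per clock.
    """
    return alu_lanes * flops_per_lane_per_clock * clock_ghz

# 256 FP32 lanes at 650MHz -> the 332.8 figure from the 6XT table
print(peak_gflops(256, 0.65))   # 332.8
# FP16 rates are typically quoted at double the FP32 lane count
print(peak_gflops(512, 0.65))   # 665.6
```

Which is why quoting a table's FP16 column as FP32 throughput silently doubles the number.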
 
I've actually had info and specs that I listed on a Wikipedia entry, derived purely from my own educated speculation, turn up in official marketing materials from a major cellphone carrier.

Fact checking doesn't go so far within the marketing and research practices of some organizations.
 
Not sure this is the right thread to post this, but well, since I didn't see any leaks over the weekend..

A good source in Italy has got some photos of what are allegedly Samsung S6 unibody cases currently in production..

I can't confirm this will be the final design, but it is part of the project.
 
Looks super sweet with those tiny bezels. I wish Apple would do something about the massive ones on their design, it's really getting long in the tooth now, even though it's iconic.

Sticking the hardware button on the reverse side is a bold move. It would take a bit of getting used to, but overall I think it's a good solution. It frees up room to shrink the bezels.
 
I don't think many know how many divisions Samsung really has or what each of them manufactures, but I don't see what that has to do with the topic. As for the 2nd pic, if that's a somewhat curved screen at the left and right edges, I'm not convinced at all that I would actually like the ergonomics of it.
 

There's no way that last render is part of an official internal project at Samsung's design studio.
Portrait icon on the left, landscape icons on the right (notice also the inconsistently bent Maps icon), and a stock*ish* Android UI?

The machining of the interior also looks suspiciously rough, even for a side not meant to be seen by end buyers...
 

The photos were taken at different stages of the fabrication process.. We know that Samsung has worked on a lot of different design approaches. Don't forget it's always impossible to confirm this type of leak or rumor..

One last one, which this time could well be the final design.

[Attached images: r1a7w9.jpg, 15wijjk.jpg, 2ez4prs.jpg]

Should move this to a Samsung rumor thread.
 
Snapdragon 810 isn't ready in time for the Galaxy S6 so Samsung will roll their own:

Samsung won't be using Qualcomm's Snapdragon 810 in the next Galaxy S smartphone because of overheating issues, according to a Bloomberg report published this morning. The phone will instead use "Samsung's most advanced chips," according to Bloomberg's sources—this suggests a 64-bit Exynos 7 Octa or a closely related chip.

http://arstechnica.com/gadgets/2015...gon-810-wont-make-it-into-samsungs-galaxy-s6/
 

The single-core scores are IMHO rather high. Could that be true? Was the CPU really clocked at only 1.5GHz?

If I compare the results with the Nexus 9 and the iPhone 6's Apple SoC, ARM seems to have improved single-core performance quite a lot with the A57. Link: http://www.phonearena.com/news/Nexu...gra-K1-outperforms-Apple-iPhone-6s-A8_id61825

If true, then the single-core score of the Samsung 7420 is, clock for clock, higher than that of Nvidia's Denver core, which could be the reason Nvidia won't use Denver cores for the next generation.
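The "clock for clock" comparison above just means dividing each score by the core's clock to get a rough per-GHz figure. A minimal sketch, with the scores below being hypothetical placeholder numbers (not measured values from the linked article):

```python
def score_per_ghz(score, clock_ghz):
    # Normalize a single-core benchmark score by clock frequency,
    # giving a rough proxy for per-clock efficiency (IPC-like)
    return score / clock_ghz

# Placeholder numbers purely for illustration:
a57_norm = score_per_ghz(1500, 1.5)      # 1000.0 points per GHz
denver_norm = score_per_ghz(2100, 2.3)   # ~913.0 points per GHz

# A higher raw score can still mean a lower clock-for-clock result
print(a57_norm > denver_norm)            # True with these assumed inputs
```

Note that this is only a crude proxy: memory subsystems and boost behavior don't scale linearly with clock, so per-GHz numbers should be read with a large grain of salt.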
 