Cell's dead baby, Cell's dead. Even in its final form. *spawn*

Shifty Geezer

This is where Cell would have been cool. Same processor in consoles and TVs. Same apps. Same security in TV as consoles. Cross device games (TV could be console lite). *sigh*
 
This is where Cell would have been cool. Same processor in consoles and TVs. Same apps. Same security in TV as consoles. Cross device games (TV could be console lite). *sigh*

Let's not continue to romanticize a disaster. It's not like Sony couldn't have kept going with the Cell in their TVs if they thought it would be as great as you make it out to be.

App developers are clearly fine with having apps on multiple devices, and Cell would never have become any sort of standard anyhow. It's a dead end.
 
This is where Cell would have been cool. Same processor in consoles and TVs. Same apps. Same security in TV as consoles. Cross device games (TV could be console lite). *sigh*

Don't make us smite you.
 
This is where Cell would have been cool. Same processor in consoles and TVs. Same apps. Same security in TV as consoles. Cross device games (TV could be console lite). *sigh*

Why piggyback off consoles when you can piggyback off a way larger market, smartphones? Cheaper hardware, with a way larger library of potential apps to migrate over to the TV space.
 
What a sad outcome. Years of development, R&D, a powerful CPU, only to fall flat and disappear.

What went wrong
 
What went wrong
Speaking entirely as a layperson, it would seem they simply R&D'd themselves down the wrong road. Cell's subprocessors had no way to address main memory other than indirectly, via DMAing chunks of data in and out of their private (and comparatively small) chunk of SRAM, and latency across the shared ring bus was horrific. Maybe the instruction set was weird too; I don't recall too well after all this time. :p

Basically everything I remember reading over the years of PS3's lifespan said Cell was difficult to program in general, and that it was a bitch to adapt common algorithms to run efficiently on its wonky architecture. It had certain areas of expertise where it was fast or even extremely fast, then a lot of common cases where it was not, and then many other cases where getting it to run at a performance level comparable to traditional processors required a disproportionate amount of software development effort.
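Roughly, every SPE program had to do the same dance: DMA a chunk of data into the small local store, compute on it, DMA the results back out. Here's a minimal sketch of that pattern in portable C, simulating the idea rather than using the real Cell SDK (which did this with MFC intrinsics like mfc_get/mfc_put); the chunk size and kernel are made up for illustration:

```c
/* Simulation of the SPE "local store" pattern in portable C.
 * A real SPE had ~256 KB of private SRAM and no direct access to
 * main memory; all data moved via explicit DMA. memcpy() stands
 * in for the DMA engine here. */
#include <stddef.h>
#include <string.h>

#define LS_CHUNK 4096                 /* must fit in the tiny local SRAM */

static void process_chunk(float *buf, size_t n)
{
    for (size_t i = 0; i < n; i++)
        buf[i] *= 2.0f;               /* stand-in for a SIMD-friendly kernel */
}

/* Stream 'total' floats from main memory through the local store. */
void stream_through_local_store(float *main_mem, size_t total)
{
    float local[LS_CHUNK];            /* the "local store" */
    for (size_t off = 0; off < total; off += LS_CHUNK) {
        size_t n = total - off;
        if (n > LS_CHUNK) n = LS_CHUNK;
        memcpy(local, main_mem + off, n * sizeof *local);   /* "DMA in"  */
        process_chunk(local, n);
        memcpy(main_mem + off, local, n * sizeof *local);   /* "DMA out" */
    }
}
```

Real SPE code also had to double-buffer, kicking off the next transfer while computing on the current chunk to hide the long trip to main RAM, and that bookkeeping is exactly what made porting ordinary pointer-chasing algorithms such a pain.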
 
What went wrong

Everyone woke up from the dream [nightmare]. ;)

@Grall sums up my thoughts fairly well. Developers had a choice: live in the custom, to-the-metal world of mysterious arts and magic, or progress to an age of simplified development through mainstream processors and better software development tools.
 
Speaking entirely as a layperson, it would seem they simply R&D'd themselves down the wrong road. Cell's subprocessors had no way to address main memory other than indirectly, via DMAing chunks of data in and out of their private (and comparatively small) chunk of SRAM, and latency across the shared ring bus was horrific. Maybe the instruction set was weird too; I don't recall too well after all this time. :p

Basically everything I remember reading over the years of PS3's lifespan said Cell was difficult to program in general, and that it was a bitch to adapt common algorithms to run efficiently on its wonky architecture. It had certain areas of expertise where it was fast or even extremely fast, then a lot of common cases where it was not, and then many other cases where getting it to run at a performance level comparable to traditional processors required a disproportionate amount of software development effort.
This is what I can't comprehend. The design seems flawed and impractical for something they were planning to include in pretty much any device imaginable. The Cell even found its way into VAIOs, if I recall. These guys wanted to introduce the consumer market to parallel processing, which means the design should have made choices that would make its implementation easy for a wide market (from the simplest to the most advanced uses). For something they spent so much money and so many years R&D'ing, they missed even the most self-evident design and engineering choices that would have made the chip useful and meaningful for everyone. And they weren't alone in the whole design process. We are talking about a joint investment of Sony, Toshiba and... IBM, which has an even better understanding of computers and software. They all missed a huge opportunity with a chip that doesn't make much sense in retrospect. The design self-sabotaged its practical use.

Considering that the chip was planned for the PS3 from the start, they didn't seem to have taken into consideration even how programmers and game developers would benefit the most from it, during all the years it was in the R&D stage.

It's like they made the chip, drafted a theoretical scenario for how it would work, then threw it into the market and expected the market to guess how to make it useful.
 
1) If I remember correctly, they were kinda blindsided by the introduction of and/or the move to programmable shaders in the GPU. They were right about a multi-threaded future, but shaders won out.

2) I think they were also looking for an architecture that didn’t have to rely on the GPU as they’re expensive and make it difficult to cost reduce consoles over their life.

They took a gamble and lost but that doesn’t necessarily mean it was a bad idea.
 
It's like they made the chip, drafted a theoretical scenario for how it would work, then threw it into the market and expected the market to guess how to make it useful.
The strategy of "build it, and they shall come." ...Only they didn't; only those who had to did, because they wanted to put their software on the PS3. :p

I can only assume this all happened because multicore processors were not in widespread use at the time of Cell's architecting, and writing efficient software for such processors was poorly understood by mainstream programmers. Maybe Sony, Toshy and IBM felt they could opportunistically seize the market with their own design? *shrug* Only supercomputers and mainframes were multi-cored to a large extent back then, and only a small group of people worked on such beasts; few if any of them were into games programming... Most games programmers seemed to view multicore CPUs as nothing but a source of annoyance, John Carmack for example famously exclaiming he'd rather have a 10GHz monolithic CPU than a multicore chip.
 
The developers didn't seem to have as many issues with the multicore CPU in the X360. So the troubles with Cell were something else.

I think the Cell Alliance had dreams of grandeur and taking over the entire market by going their own way with the architecture.
 
So the troubles with Cell were something else.
360's Xenon (or was it Xenos? I could never tell those names apart! :D) was a traditional CPU in triplicate, albeit a really, really bad CPU with horrible IPC. Still, it behaved within the parameters of established paradigms; Cell's SPEs (or APUs, or whatever the fuck they were called) were so different in comparison. There were more of them, and they each had their own small dedicated memory pool that had to fit all the stuff you needed: program code, data, DMA buffers for I/O and whatnot. And it was a long way from an SPE to main RAM. All of it working in concert to trip up programmers, from what I seem to recall of what developers posted here and elsewhere on the webs. Unless maybe they were Sony first-party, in which case Cell was exciting to work with, downright fucking awesome in fact, super fast and all-powerful. :LOL:
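For contrast, here's the model Xenon's three cores offered, as a hypothetical pthreads sketch (illustrative only, not actual 360 code): every thread sees the same coherent RAM, so splitting work across cores is just index math, with no staging buffers or DMA tags.

```c
/* Conventional SMP threading, the model Xenon's three cores offered:
 * all threads share one coherent memory, so the same kernel runs
 * over slices of the array with no explicit data movement. */
#include <pthread.h>
#include <stddef.h>

#define NUM_CORES 3                   /* Xenon had three cores */

struct slice { float *data; size_t begin, end; };

static void *worker(void *arg)
{
    struct slice *s = arg;
    for (size_t i = s->begin; i < s->end; i++)
        s->data[i] *= 2.0f;           /* same kernel, no DMA staging */
    return NULL;
}

void parallel_scale(float *data, size_t n)
{
    pthread_t tid[NUM_CORES];
    struct slice s[NUM_CORES];
    for (int t = 0; t < NUM_CORES; t++) {
        s[t].data  = data;
        s[t].begin = n * (size_t)t / NUM_CORES;
        s[t].end   = n * (size_t)(t + 1) / NUM_CORES;
        pthread_create(&tid[t], NULL, worker, &s[t]);
    }
    for (int t = 0; t < NUM_CORES; t++)
        pthread_join(tid[t], NULL);
}
```

Same algorithm as the usual Cell examples, but the whole local-store choreography disappears, which is presumably why developers found Xenon's otherwise mediocre cores so much less alien.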
 
From what I remember from random sources.... possibly tainted by 10 years of not caring enough to look it up...

Toshiba and Sony wanted a stream processor; IBM wanted a massive compute platform. On the surface it looked like a good way to save money together. And they did: $400 million is nothing compared to redesigning or sourcing an ASIC every time there is a new codec (the rate of which was accelerating in the mid-2000s), or using FPGAs, or using large CPUs not designed for massive compute.

Toshiba continued trying to use Cell as a universal platform for consumer media electronics, but never really succeeded before they stopped making TVs and HD DVD players. The idea was that it was much less expensive than an FPGA (see Denon's receivers in the early 2000s: they had Analog Devices DSPs for audio processing and large FPGAs for video and HDMI). Any CPU solution was insufficient or expensive, and at the other end of the spectrum, custom ASICs take years to develop, with expensive R&D and massive risk because they cannot add new codecs (see the time it took Sigma Designs to get their ASIC ready for Blu-ray/HD DVD, or the DTS clusterfuck for DVD).

Cell needed predictable instruction cycles to simplify real-time stuff. Codecs are often designed with a worst-case number of instructions required to decode a block, but that is an algorithmic count, so the dependency on cache makes the real cost unpredictable, especially in a multiprocessing environment. A dev in Montreal tried to explain this to me; maybe I'm understanding it wrong.
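As a back-of-the-envelope illustration of why that determinism matters (all numbers here are invented for the example, not from any Cell or codec spec): if the per-block worst case is a hard bound, checking a real-time deadline is trivial arithmetic.

```c
/* Worst-case real-time budget check for a block-based codec.
 * All figures are hypothetical. On a deterministic core with no
 * cache, worst_cycles_per_block is a hard bound, so this check is
 * trustworthy; with caches, the effective per-block cost varies
 * and the same arithmetic degrades into a guess. */
#include <stdio.h>

int main(void)
{
    const double clock_hz               = 3.2e9;   /* e.g. a 3.2 GHz SPE    */
    const double worst_cycles_per_block = 2000.0;  /* codec's spec bound    */
    const double blocks_per_frame       = 8160.0;  /* 1080p in 16x16 blocks */
    const double frames_per_second      = 30.0;

    double cycles_per_frame = worst_cycles_per_block * blocks_per_frame;
    double budget_per_frame = clock_hz / frames_per_second;

    printf("worst case: %.0f cycles/frame, budget: %.0f cycles/frame\n",
           cycles_per_frame, budget_per_frame);
    printf("real-time decode %s\n",
           cycles_per_frame <= budget_per_frame ? "fits" : "does not fit");
    return 0;
}
```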

Sony continued to develop video-processing ASICs for their TVs. To this day nobody has succeeded in making the equivalent of ARM for massive compute... We ended up years later with GPU manufacturers retrofitting graphics processors for the purpose. An open platform or industry consortium for compute hardware is still a problem.

TVs and media players continued with ASICs and hybrid solutions because it remained less expensive per unit, despite requiring massive investment for each ASIC. The rate of codec introduction is less of a problem today, and solutions are ARM cores with codec blocks instead of fully flexible compute. There are now third-party solutions using ARM for consumer electronics; few companies roll their own anymore.
 
Sony also didn't seem too interested in selling them.
The company I work for wanted to evaluate their number-crunching abilities when built into our custom hardware, but Sony would only sell them as part of a blade, not as individual chips. So that was the end of that.
I expect other companies would have run into the same problem, so it's no wonder it never spread to other products.
 
Sony also didn't seem too interested in selling them.
The company I work for wanted to evaluate their number-crunching abilities when built into our custom hardware, but Sony would only sell them as part of a blade, not as individual chips. So that was the end of that.
I expect other companies would have run into the same problem, so it's no wonder it never spread to other products.
IBM was selling blades, and Mercury Systems was selling Cells on PCIe cards. Sony and Toshiba were only making their own custom chips, so those couldn't be sold. Are you sure it was Sony offering blades?

You would have needed to license the IP to design your own chip, the same way ARM doesn't make CPUs, they just manage the IP. Or ask Mercury for the chips only, not the PCIe card. Or IBM.
 