We could have had far better results if devs had been able to develop Cell-ideal games and the software had actually been mapped well to the hardware.
Basically, Cell represents a return to computing's roots, and as such is intrinsically 'superior' at getting work done, but at the cost of software requirements. Computers work in a particular way that is completely alien to human thinking. In computing's infancy, people had to learn to think like a machine. They hand-crafted code to make the machines do things the way the machines liked, to get them working at their peak. Over time, the exponential increase in processing performance, coupled with a need for more complex software, saw processors designed to accommodate a human way of doing things. Eventually, code development became human-led, based on being easy for developers, rather than led by the machine's requirements. But this is wasteful in both processing time and silicon budget. Now your CPU is hopping around random memory and shuffling which instructions to execute when, just to get the developer's code to run effectively. The result is your chunky processor attaining a tenth of what it's actually capable of. The catch is that you then have to write code the old-fashioned way, fitting the machine rather than having the machine work with your human-level thinking: 'difficult to code for'.
Many of the criticisms levelled at Cell are only of their time. It was a processor in its infancy, with tools to match. Take that video about Time To Triangle, where the massive program needed on Cell supposedly proved how complicated and 'crap' the SPE was: the simple five-line 'printf' version is hiding a metric fuck-ton of code in the OS, firmware, tools, libraries, etc. The actual amount of code needed for the machine to put that "Hello World" on screen is possibly millions of instructions. That's just been hidden from the dev thanks to tools developed over decades. If you compared the raw machine language needed to get the CPU to put something on screen against the machine language needed for the SPE, the difference would be a minuscule percentage more effort.
Now imagine Cell had been around for 15 years. You'd be able to go into VS and include a bunch of standard libraries that do the base-work and boilerplate for you, just as happens with the x64 CPU you write for. Not only that, Cell would have received numerous updates to improve its usability. Now you have a 128-core processor that's faster than Threadripper but far smaller, because it doesn't carry all the legacy hardware designed to make 'human-centric' code run at a decent speed: all the software on Cell is 'machine-centric'. There's no reason to doubt this, as it's exactly what happened in the GPU space. GPU workloads were designed around streamed data keeping the ALUs occupied, and the software was machine-friendly, needing educated engineers to write bespoke shaders. GPU work hasn't accumulated decades of clumsy, branchy code that modern GPUs have to try to run quickly. As such, you can stuff multiple teraflops of ALU into a GPU's silicon budget and get it all doing useful work, thanks to not hopping around randomly in memory and executing instructions in arbitrary orders. Just as GPUs are faster than CPUs at data-driven workloads (and CPUs are better at machine-centric work than human-centric work), so too is Cell in principle.
Now imagine a modern data-driven engine on PS3, maybe using something like SDFs (signed distance fields): new concepts that could have been supported on PS3 but weren't in the developer toolset or mindset at that point. The results would have been mind-blowing, and likely not possible on other platforms that lacked the raw power to execute them.