Cell's dead baby, Cell's dead. Even in its final form. *spawn*

Ah, transputers. If you've ever used the programming language Occam, there's an updated compiler and virtual machine called Occam Pi that you can experiment with.
 
All I see is a visionary who isn't very good at explaining himself meeting people who can't adapt to his ideas. That specific example your video ends on, about trees, is the one I remembered and mentioned here:


Basically the dev is thinking about fitting existing solutions to the evaluation of actions on data, whereas the whole point is considering alternative methods and structures. Acton has spent an hour trying to get people to think about data differently in relation to the real costs in processing, which is the memory accessing, not the maths transformations, and this person has stood up and just suggested using existing structures, completely missing the point.

Ultimately you need to go back to why you're using a BSP. If it's the best way, it's the best way, but you still need to do the analysis and consideration rather than just using a tree because that's the current template solution.
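To make the trade-off concrete, here's a toy sketch (my own example, not one from the talk) of the same range query written both ways in C++. The tree does a dependent pointer load per node, so every hop is a potential cache miss; the flat sorted array is two binary searches plus one contiguous walk the prefetcher can keep fed:

```cpp
#include <algorithm>
#include <vector>

struct Node { float key; Node* left; Node* right; };

// Tree walk: O(log n + k) on paper, but every child pointer is a dependent
// load, so the cost is dominated by memory latency, not the comparisons.
void collect_in_range(const Node* n, float lo, float hi, std::vector<float>& out) {
    if (!n) return;
    if (lo < n->key) collect_in_range(n->left, lo, hi, out);
    if (lo <= n->key && n->key <= hi) out.push_back(n->key);
    if (n->key < hi) collect_in_range(n->right, lo, hi, out);
}

// Data-oriented alternative: keep the keys sorted in one contiguous buffer.
// Two binary searches, then a linear, prefetch-friendly copy of the hits.
std::vector<float> collect_in_range(const std::vector<float>& sorted_keys,
                                    float lo, float hi) {
    auto first = std::lower_bound(sorted_keys.begin(), sorted_keys.end(), lo);
    auto last  = std::upper_bound(first, sorted_keys.end(), hi);
    return {first, last};
}
```

Whether the flat version actually wins depends on your sizes and update patterns, which is exactly the analysis the talk is asking you to do.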
 
I really want a PS3 game to be announced as in development; I would love to see what developers could do with the SPUs and what modern rendering techniques they could get running on it.

Could it do ML-based upscaling? Voxel cone tracing? :unsure:
 
The Cell design was extremely powerful for its time. It may have been difficult to extract that power from Cell, but that was a learning process; the hardware is capable of what the hardware is capable of. I think if people had known how to maximize PS3 from day 1, it would have been a pretty solid case vs the Xbox 360. It's just one of those things attributable to the fact that developers were still largely of a single-threaded mindset back then. The fact that we have to revisit those types of software designs again is sort of a testament that we got back there anyway; the only difference is that we do it through cores instead of SPEs.

The reality is that, to really maximize hardware, developers need to code software specifically designed and tailored to it, which will mean moving away from object-oriented code and code that is written for human logic. But in this day and age, performance seems to matter less than iteration speed. The reality is, the more you want to scale up, the more likely you'll end up adopting some form of coding in which everything is done in parallel. We can't just slowly iterate through each object and apply logic to it. Single-threaded game code is really about simplification for our minds to handle, but as you can see with DoD and where Unity is headed, there are ways to program in parallel that take significant advantage of the hardware.
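As a rough illustration of what that looks like (a minimal sketch of the struct-of-arrays idea, not Unity's actual API), compare this to looping over Entity objects one at a time:

```cpp
#include <algorithm>
#include <execution>
#include <vector>

// Struct-of-arrays: each hot field lives in its own contiguous buffer,
// instead of one heap-allocated Entity object per thing in the world.
struct Positions {
    std::vector<float> x;
    std::vector<float> vx;
};

void integrate(Positions& p, float dt) {
    // One tight pass over contiguous floats: easy for the compiler to
    // vectorise, and par_unseq lets the runtime spread it across cores.
    std::transform(std::execution::par_unseq,
                   p.x.begin(), p.x.end(), p.vx.begin(), p.x.begin(),
                   [dt](float x, float vx) { return x + vx * dt; });
}
```

Same logic as the per-object loop; the data layout is what makes the vectorisation and the spread across cores essentially free.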

There are issues with Cell, but the parallel programming paradigm wasn't one of them; that was a developer problem, and only now, nearly 20 years later, are we starting to see a push back in that direction. It is sad to see UE5 not able to move in this direction just yet, but honestly, so many games are still stuck in that single-threaded loop.

Honestly, Tim Sweeney has nobody to blame but himself for Unreal Engine's current slow gameplay framework. Sweeney swears by the OO paradigm; he thought garbage collection was a good idea and that transactional memory was the future of parallelism. Their investment in implementing a garbage collector could arguably have been effort better spent elsewhere in the engine. They've started working on a new data-oriented game logic framework, Mass, so hopefully they've seen the light and it gets seen through to completion ...
 
Honestly, Tim Sweeney has nobody to blame but himself for Unreal Engine's current slow gameplay framework. Sweeney swears by the OO paradigm; he thought garbage collection was a good idea and that transactional memory was the future of parallelism. Their investment in implementing a garbage collector could arguably have been effort better spent elsewhere in the engine. They've started working on a new data-oriented game logic framework, Mass, so hopefully they've seen the light and it gets seen through to completion ...
Yea, I guess the OO paradigm is what allows for Blueprints in UE. Easily accessible, but at the cost of being very slow.

It’s pretty insane how quickly you can go and bust out a prototype.
 
They could have gone with a Core 2 Quad at the time. It would have been very expensive to put one in every PS3, though it would also have been much faster; then again, Cell development was itself very expensive, probably more so across the board.
 
All I see is a visionary who isn't very good at explaining himself meeting people who can't adapt to his ideas. That specific example your video ends on, about trees, is the one I remembered and mentioned here:

I think we have to disagree here. What it basically boils down to is different data -> different algorithms, which is not very novel.
 
I think we have to disagree here. What it basically boils down to is different data -> different algorithms, which is not very novel.
Those different algorithms have different levels of hardware utilisation. The point of the talk was to make devs aware of the other choices they are ignoring by going with the status quo, highlighting that the consideration of algorithms needs to look at the weak link in the processing, which is largely data transactions rather than maths complexity, and then change the data to allow different algorithms that run faster. If different data can't run faster than a tree, use a tree, but why are you choosing a tree, what are the alternatives, and how do they perform by comparison? Was the tree chosen simply because that's standard practice and no-one questions it, as it makes sense to the developer?
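And the only honest way to answer those questions is to measure. A minimal sketch of that analysis (a hypothetical harness, nothing more):

```cpp
#include <chrono>

// Time an arbitrary candidate query; 'query' stands in for whichever
// structure is under evaluation (tree, sorted array, grid, ...).
template <typename F>
double average_ms(F&& query, int iterations = 1000) {
    const auto start = std::chrono::steady_clock::now();
    for (int i = 0; i < iterations; ++i) query();
    const std::chrono::duration<double, std::milli> elapsed =
        std::chrono::steady_clock::now() - start;
    return elapsed.count() / iterations;
}
```

Feed both candidates representative data on the target hardware and let the numbers choose the structure.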
 
Those different algorithms have different levels of hardware utilisation. The point of the talk was to make devs aware of the other choices they are ignoring by going with the status quo, highlighting that the consideration of algorithms needs to look at the weak link in the processing, which is largely data transactions rather than maths complexity, and then change the data to allow different algorithms that run faster.

I think most devs at CppCon are aware of this. But of course there was one person at the talk who wasn't, so maybe I am wrong.
 
Cell was ready, but the dev tools and documentation were not. I've programmed Cell in a large-scale server environment, and like a lot of complex architectures - and what isn't these days - everything hinges on good documentation and good tools. Sony eventually realised this, which drove them to acquire SN Systems in 2005, but it was too late for games launching in the first couple of years.

Cell didn't work like any other processor design, and initially there were no realtime debugging tools at all. What you had was IBM's Cell emulator, which ran like a dog on every platform at any cost, because accurate Cell emulation means emulating the PPU, the individual SPEs, the interactions across the internal Cell bus, and the two external memory buses in the PS3 - which is why Cell remains a challenge to emulate some sixteen years later.

I spent a lot of time writing code for Cell in a massive server environment and we had pretty good tools, but we had to rewrite all of them for Cell development from the ground up. It was almost like alien tech arriving and having to solve the mystery of how it worked. Getting Cell to run some calculations 20,000x faster than x86 servers was joyous, but man, it took so much damn effort.

Any theoretical reason you couldn't write a demo to run on RPCS3 before maybe testing on a PS3 debug kit? Speed shouldn't be an issue with a modern 8- to 16-core CPU either.
 
Any theoretical reason you couldn't write a demo to run on RPCS3 before maybe testing on a PS3 debug kit? Speed shouldn't be an issue with a modern 8- to 16-core CPU either.

The demo scene generally wants to squeeze as much performance out of the hardware as possible, so the emulator would have to simulate in a cycle-accurate manner (or at least expose some sort of accurate performance counters) to be very useful in this scenario. Otherwise, you'll probably end up with something that runs fast on the emulator but not all that well on the actual hardware.
 

In recent years we've seen a relative growth of homebrew dev on complicated platforms such as the N64, PS2, and 3DO, when it was close to non-existent a decade ago.

You are comparing the demo-scene community of the PS3 with that of machines 30 years older. Let's wait until the PS3 has been around for that long before we decide whether it received any cool experiments or not.

By the way, weren't users given free rein over the Cell through Linux on PS3 (just no access to the GPU)? Regardless of whether that was the case, people always find exploits and hacks to circumvent security and open up the black boxes. It becomes a slower process with every gen, but eventually the time always comes.

I believe there will yet come a time when retro PS3 games are a thing.
 