Be thankful for Cell ... *mitosis*

I hope you're referring to async GPU compute here as the "modern system"?
A fraction of a class of systems, yes. Even among those, the Onion bus interacts with CPU caches and can exhibit poor behaviors if accesses between the two sides of the APU hit different locations on the same cache line.
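For illustration, the usual mitigation on the software side is simply to keep data written by the two clients on separate cache lines. A minimal C11 sketch, assuming 64-byte cache lines; the struct and field names are made up, and two CPU threads stand in for the two sides of the APU:

Code:
/* Sketch: keep fields written by different clients on separate cache
 * lines so coherent traffic over a snooping bus does not ping-pong a
 * single line back and forth. Assumes 64-byte cache lines. */
#include <stdalign.h>
#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

#define CACHE_LINE 64

/* Bad: both indices share one cache line. */
struct shared_bad {
    uint32_t cpu_write_index;
    uint32_t gpu_read_index;
};

/* Better: each frequently written field gets its own line. */
struct shared_good {
    alignas(CACHE_LINE) uint32_t cpu_write_index;
    alignas(CACHE_LINE) uint32_t gpu_read_index;
};

int main(void) {
    printf("bad : gpu_read_index at offset %zu\n",
           offsetof(struct shared_bad, gpu_read_index));
    printf("good: gpu_read_index at offset %zu\n",
           offsetof(struct shared_good, gpu_read_index));
    return 0;
}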

It was partly because drivers/APIs were strictly single-threaded, but I would argue that Cell created the needed paradigm shift here.
It was because the era of exponential single-threaded improvements hadn't yet ended, and until it did, the economics of the platform meant that single-processor gains could eclipse the gains available through greater concurrency.
Then the curve flattened while transistor budgets kept scaling, which made affordable single-socket multicore possible; after that, the greater inertia of the PC market meant the transition would still take time.
Concurrent systems already existed for a long time in non-client realms.

Before Cell, there was the Sega Saturn with its two processors.
There were PC systems that could support SMP, as small a business case as that was outside the workstation and server markets at the time.

The biggest Cell difference was: you need to know how to write multi-threaded code, and how to manually manage its execution.
It was possible to write single-threaded code for Cell. Some early titles defaulted to that.
Manually managed execution is not something prioritized by GCN. That portion is actually significantly more autonomous for AMD.

All other things are irrelevant small nuances.
I guess I don't operate at a level where everything but the number of threads is miscellany.

I hope not. Tools need to be written by game developers, for their specific development needs.
Maybe you can make AMD see the light.

I think they are. Current hardware knows how to draw triangles, one by one. That's it.
There might be a smidge more than just a triangle being managed in the GPU pipeline.

They are impractical crap. It was clear in 2007, and it's even more clear now. There is a huge demand for very low-level, thin APIs: DX12/Vulkan/Mantle/Swift. But the main player, MSFT, chickened out at the last moment and still uses too much CPU for render-related tasks.
Impractical means they cannot reasonably be used, yet they are being used by pretty much all the software there is that might touch a graphical interface. That's a lot of impractical billions of people and programs.

Iterations are cheap now.
Cell is discussed in the past tense, now. Architectures like Xenon and the desktop dual cores, and systems like the PCs of the time are present and future tense.

Software development is much easier than it was in the PS2 era. There are huge problems with asset development and art, but software is just incredibly easy to write.
Okeedokie.
 
Dunno where I read it, but when Sony & IBM worked on CELL they had diverging ideas about what it should be: CELL ended up being Sony's vision, whereas Xenon was more like what IBM envisioned for CELL.
Can't remember the source.

As for CELL doing any good... probably not. Xenon would have had the same effect, and a modern CPU too, so no, CELL didn't do anything good; it wasted a lot of time and effort on a technical dead end.
 
So parallel CPU processing and today's massively parallel GPU architectures wouldn't exist without Cell? Even though they had been around for quite a while by the time Sony managed to release the PS3? Mmmmkay.
Although multicore has been around a long while, it wasn't until Intel launched the Core 2 Duo in 2006 that multicore really began to penetrate the consumer market. PC gamers were mostly still overclocking their Pentium 4s, and you could argue (and I would) that 360 and PS3 partly forced the hands of games developers because they now had no alternative but to face multicore head on.
 
I seem to remember Mike Acton even approached SPU programming in a way very similar to shader fragments ...

Still, there were definitely some advancements in programming that were partly forced, partly just enabled, by Cell. At the time, even if it was hard, I think it was actually comparatively easier to write SPE jobs than to write GPU jobs. There is an interesting little list of innovations that I am not sure would have been as practical to pull off in the PC space, but I don't know enough about it to be sure. To list a few of the more prominent ones:

- animation blending. Cell was used in Uncharted to blend several animations in real time: for instance a running animation, a looking-in-a-certain-direction animation, a shooting animation, and a ducking-when-a-bullet-almost-hits-Drake animation, all blended at very low latencies (a rough sketch of the blend itself follows this list). Correct me if I'm wrong, but how well could you do this on, say, an 8800 (two years newer than RSX)? How would you do it now? I'm still not seeing a lot of evidence of this happening, but I'm not sure I've looked well enough.
- audio tracing. Trace (and reflect?) sounds to the player and hit-detect to change the audio. While this existed on PC before, there a game would have to feed a small version of the geometry into the dedicated audio processor (AudigyFX? - Davros will know). Could you use the exact same ray-casting logic on the GPU back then? Can you do it now, and if so, what enabled that?
- MLAA. This little post-AA solution started on Cell and was very successful. It got a GPU shader implementation much later, and I think it was much harder to program on the GPU back then (how is that now?), but it turned out to be possible and feasible on the GPU even then, as it eventually came to PC and 360, and later was even used on the RSX itself. Uncharted had a lot of (optional, bonus) post-FX options as well, and I remember papers detailing how such post-processing of the resulting image was much more feasible thanks to Cell being able to both feed and post-process the results from RSX.
- Water simulation and various other physics calculations. SPEs powered these more successfully than most CPUs at the time, because even if they were harder to program than a CPU, they were easier to program than GPUs for this purpose.
- Motion Blur and Depth of Field could be done by the SPEs very efficiently.
- Remember when the PS3 dominated Folding@Home? GPUs took over eventually, but I think it was actually a nice indication of how boundaries were being pushed.
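To make the animation blending point above concrete: at its core it is a weighted combination of joint transforms per frame, which maps well to an SPE-style job over a contiguous array of joints. A minimal sketch in plain C, with made-up types, using a cheap normalized weighted sum ("nlerp") for rotations rather than anything a real engine would necessarily ship:

Code:
/* Sketch of blending several animation poses into one, per joint.
 * Types and the nlerp shortcut are illustrative, not from any engine. */
#include <math.h>
#include <stddef.h>

typedef struct { float x, y, z; }      Vec3;
typedef struct { float x, y, z, w; }   Quat;
typedef struct { Vec3 pos; Quat rot; } JointPose;

/* Blend 'num_poses' whole-skeleton poses with per-pose weights
 * (weights expected to sum to 1). */
void blend_poses(const JointPose *const *poses, const float *weights,
                 size_t num_poses, size_t num_joints, JointPose *out)
{
    for (size_t j = 0; j < num_joints; ++j) {
        Vec3 p = {0, 0, 0};
        Quat q = {0, 0, 0, 0};
        for (size_t i = 0; i < num_poses; ++i) {
            const JointPose *src = &poses[i][j];
            float w = weights[i];
            /* Flip each quaternion into the same hemisphere as the first
             * pose so the weighted sum does not cancel itself out. */
            float sign = (src->rot.x * poses[0][j].rot.x +
                          src->rot.y * poses[0][j].rot.y +
                          src->rot.z * poses[0][j].rot.z +
                          src->rot.w * poses[0][j].rot.w) < 0.0f ? -1.0f : 1.0f;
            p.x += w * src->pos.x; p.y += w * src->pos.y; p.z += w * src->pos.z;
            q.x += w * sign * src->rot.x; q.y += w * sign * src->rot.y;
            q.z += w * sign * src->rot.z; q.w += w * sign * src->rot.w;
        }
        /* Renormalize the accumulated rotation. */
        float len = sqrtf(q.x*q.x + q.y*q.y + q.z*q.z + q.w*q.w);
        if (len > 1e-6f) { q.x /= len; q.y /= len; q.z /= len; q.w /= len; }
        out[j].pos = p;
        out[j].rot = q;
    }
}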

When psorcerer says that Cell forced developers to really learn how to program multi-threaded, he is of course partly wrong, but also partly right. On 360, you could easily get away with basically running the same code as on PC, just putting the audio on a separate core and not much else, and that wouldn't really harm code that wasn't inherently multi-threaded before it was ported. The 360 would be far better utilised with a proper ground-up design for multi-threading, but you had an easy way out. On PS3 you'd run into the limitations of that single PPU much sooner, and moving anything to the SPEs was much harder in the beginning. That barrier didn't hold up later either, though, as the SPEs could run the same C code, and DMA isn't that foreign a way to send data to them either.
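On the DMA point: the canonical SPU job pattern was double buffering, pulling the next chunk of input into local store while the current chunk is being processed. A rough sketch of that shape, written from memory of the Cell SDK's spu_mfcio.h intrinsics (mfc_get, mfc_write_tag_mask, mfc_read_tag_status_all); treat the exact names and arguments as approximate, and process_chunk() is made up:

Code:
/* Double-buffered SPU job sketch: fetch chunk N+1 while working on
 * chunk N. Assumes total_bytes is a multiple of CHUNK. Not copy-paste
 * ready; based on my recollection of the Cell SDK. */
#include <spu_mfcio.h>
#include <stdint.h>

#define CHUNK 16384  /* bytes per DMA transfer, multiple of 16 */

static char buf[2][CHUNK] __attribute__((aligned(128)));

extern void process_chunk(char *data, unsigned size);  /* your job code */

void run_job(uint64_t ea_in, unsigned total_bytes)
{
    unsigned cur = 0;

    /* Kick off the first transfer on tag 0. */
    mfc_get(buf[0], ea_in, CHUNK, 0, 0, 0);

    for (unsigned offset = 0; offset < total_bytes; offset += CHUNK) {
        unsigned next = cur ^ 1;

        /* Start fetching the following chunk (if any) on the other tag. */
        if (offset + CHUNK < total_bytes)
            mfc_get(buf[next], ea_in + offset + CHUNK, CHUNK, next, 0, 0);

        /* Wait only for the current buffer's tag, then work on it. */
        mfc_write_tag_mask(1 << cur);
        mfc_read_tag_status_all();
        process_chunk(buf[cur], CHUNK);

        cur = next;
    }
}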

Just came across this paper, which is a nice summary of what was going on http://cryze.bplaced.net/papers/Killzone 2 SPU and AI.pdf

To quote from it (and remember, the paper was published six years ago and was already a retrospective):

INVEST IN THE FUTURE
• Abundant CPU power has one flavor: Massively Parallel
• SPUs are here right now
• GPGPU is maturing, soon mainstream
• Expect more massively parallel designs

• Plan for all of your code to become just Parallel
• And your engine code to become Massively Parallel
• It’s not a “platform optimization” - It will stay
 
Although multicore has been around a long while, it wasn't until Intel launched the Core 2 Duo in 2006 that multicore really began to penetrate the consumer market. PC gamers were mostly still overclocking their Pentium 4s, and you could argue (and I would) that 360 and PS3 partly forced the hands of games developers because they now had no alternative but to face multicore head on.
You could argue that.

But a more interesting thing happened in the last few years. Mobile phones, tablets, and all other low-power electronics made it clear that we needed lower-clocked processors with several cores, all working in parallel. It was cheaper and not as power hungry.

Cell did not push everyone to go parallel. Cost, heat and power limitations did.

Without Cell, we certainly would not be stuck on single threads, and we would not have current consoles with, say, a single-threaded 4 GHz CPU, simply because that would make no sense in the current universe we live in, where even a low-end smartphone has 2-4 low-power cores to play with.
 
Cell did not push everyone to go parallel. Cost, heat and power limitations did.

Neither Cell nor Xenon made things go parallel; that path was set from the late 1990s in the server field. POWER and PowerPC were conceived as wide processor architectures from inception, which is why you could get a dual-core Mac years before Intel made a dual-core Xeon, let alone consumer-grade multicore processors.

But on PC, even after Intel introduced the Core 2 Duo, many games and engines were lagging behind in even supporting multithreading, which the Pentium 4 had long supported, let alone multicore. Xenon in the 360 and then Cell in the PS3 forced the issue; devs had to design game loops differently, but it still took a while (a few years) before you could bank on multithreaded/multicore support in PC games.
 
This is about ethics in Kryptonian Darwinism.

"The fact that you possess a sense of laziness, and we do not, gives PS3 programmers an evolutionary advantage"
 
This is about ethics in Kryptonian Darwinism.

"The fact that you possess a sense of laziness, and we do not, gives PS3 programmers an evolutionary advantage"

I've been trying to come up with a clever Ayn Rand reference, like "Atlas Shrugged ... the API off his shoulders," but I can't come up with anything good.
 
I've been trying to come up with a clever Ayn Rand reference, like "Atlas Shrugged ... the API off his shoulders," but I can't come up with anything good.


“A man's programming choice is the result and the sum of his fundamental convictions.... He will always be attracted to the console who reflects his deepest vision of gaming, the console whose surrender permits him to experience a sense of Greatness. The man who is proudly certain of his own value, will want the highest type of processing he can find, the console he admires, the strongest, the hardest to conquer--because only the possession of the Cell will give him the sense of a Platinum Trophy.”

ಠ_ಠ

“If you saw Kutaragi, the giant who holds the Cell on his shoulders, if you saw that he stood, code running down his chest, his BD-ROM buckling, his arms trembling but still trying to hold the Cell aloft with the last of his strength, and the greater his effort the heavier the Programmers bore down upon his shoulders - what would you tell him?

I…don't know. What…could he do? What would you tell him?

You're fired.”

ಠ_ಠ

"“There is no such thing as a lousy SPE job - only lousy programmers who don't care to do it.”"
“A viler evil than to murder a programmer, is to sell him The Cell as an act of virtue.”


ಠ_ಠ

ಠ_ಠ

ಠ_ಠ

ಠ_ಠ
 
This is about ethics in Kryptonian Darwinism.

"The fact that you possess a sense of laziness, and we do not, gives PS3 programmers an evolutionary advantage"

Strange logic indeed. I have always been taught that good programmers are inherently lazy. ;) Though the better programmers are probably those who can predict avoidable problems, and the payoff for effort spent, further into the future ...
 
the Onion bus interacts with CPU caches and can exhibit poor behaviors if accesses between the two sides of the APU hit different locations on the same cache line

The Onion bus is a workaround for an inherent GPU/CPU synchronization problem. In fact, any modern game needs the CPU for just one thing: reading the inputs. Everything else could and should be done on the GPU.
And this is the only way forward for latency-sensitive stuff (think VR). Also: http://timothylottes.blogspot.com/2014/08/front-buffer-rendering.html

Before Cell, there was the Sega Saturn with its two processors.

The question here is not "which hardware has multithreading", but "which hardware forces you to utilize it".

That portion is actually significantly more autonomous for AMD.

But you can do that, and it will be done, in future titles.

There might be a smidge more than just a triangle being managed in the GPU pipeline.

At the low level, the fixed-function pipeline can only do that. Everything else is done by the GPGPU part, which is in fact a "modern CPU" (without all the x86 baggage).
So, in some sense, the current PS4 hardware is closer to the PS2 than to the PS3.

That's a lot of impractical billions of people and programs.

So what? There is no need for millions of low level game programmers, but without them nothing would happen.

Architectures like Xenon and the desktop dual cores, and systems like the PCs of the time are present and future tense.

So what? The PDP-11 is also a thing of the past; why do universities teach it to CS students?
 
So what? The PDP-11 is also a thing of the past; why do universities teach it to CS students?

I don't know what ancient rock you're living under, but universities do not teach CS students how to program the PDP-11. They stopped doing that before the 1990s.
 
Although multicore has been around a long while, it wasn't until Intel launched the Core 2 Duo in 2006 that multicore really began to penetrate the consumer market. PC gamers were mostly still overclocking their Pentium 4s, and you could argue (and I would) that 360 and PS3 partly forced the hands of games developers because they now had no alternative but to face multicore head on.

Slight correction: AMD actually brought consumer-level multicore to the market with the X2 in 2005. I had one of those, and it was awesome at the time, and the multiple cores were actually useful even then.

Also, Cell = the Neanderthal: an evolutionary fork that led to a dead end. The other, better approaches to the problems Cell was intended to solve were what killed it off, not lazy programmers.
 