Could PS3/Cell be another Itanium or Talisman?

FatherJohn

Newcomer
For the PS3 Sony seems to have chosen the risky approach of a revolutionary new architecture. (From what I can tell it's 4 CPUs-with-embedded-RAM on one chip, plus 4 GPUs-with-embedded-RAM on a second chip, connected by a high-speed bus.)

The PS3 Cell architecture seems to be using "underpants gnome" logic:

1) Put a lot of parallel hardware into a box.
2) ???
3) Profit!

If we look at other companies that have attempted revolution in the past, we see that revolutionary architectures often fail to live up to their initial expectations. For example:

- Intel has not had much success with their revolutionary IA-64 architecture. AMD's much more conservative and incremental x86-64 architecture seems to be much more successful.

- Microsoft's revolutionary Talisman approach to sprite-compositing 2.5D graphics was trounced by other companies' traditional SGI-style z-buffers.

Bob Colwell, one of Intel's CPU architects, gave a talk at Stanford last year.

http://stanford-online.stanford.edu/courses/ee380/040218-ee380-100.asx

In part of his talk he discussed how the Itanium architects were able to convince Intel's managers to go ahead with the Itanium project: they had an example where one hand-optimized loop of 32 instructions ran much faster on the Itanium architecture than on more traditional architectures. They claimed that this speedup was typical of what the Itanium architecture would deliver. (And also claimed that compilers would be able to generate code as efficient as the hand-optimized code.)
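To make the compiler gamble concrete, here is the kind of loop that shows EPIC/VLIW designs at their best (my own hypothetical illustration in C, not the actual 32-instruction example from the talk). Every iteration is independent, so a compiler can software-pipeline it and issue several operations per cycle -- provided it can prove the pointers never overlap, which is exactly what a hand-optimizer guarantees by inspection:

/* VLIW/EPIC-friendly code: independent iterations, no branches,
   and 'restrict' promises the compiler there is no aliasing. */
void saxpy(float * restrict y, const float * restrict x, float a, int n)
{
    for (int i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];   /* multiply-add, no loop-carried deps */
}

Typical integer code is full of branches and aliased pointers, which is exactly where static scheduling falls down -- so a loop like this makes for a flattering but unrepresentative benchmark.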

In reality, billions of dollars later, it seems Intel has not been able to capitalize on the theoretical performance potential of the Itanium architecture. They seem to be gradually phasing out Itanium in favor of a clone of AMD's x86-64 architecture (which is a much more traditional, incremental instruction set).

What if something similar to this happened at Sony? After all, they had to decide on Cell several years ago. They basically had to guess, based on small benchmarks and gut feeling, what would be the best way to proceed.

To give another example, the Microsoft Talisman project was an attempt to get around the slow 3D rendering and limited bus speeds of the day by exploiting frame-to-frame coherence of 3D scenes. They would render each 3D object (tree, character, wall, etc.) into a 2D sprite, and then composite the sprites together to form the full scene. The idea was that you wouldn't have to render the 3D all that often (only when the character turned around). They even had features for warping 2D sprites to get pseudo-3D transformations, so you didn't have to render the full 3D as often. (Actually, I'm not sure if they had any 3D rendering hardware at all -- I guess you were supposed to pre-render your 3D objects from different angles.)
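If I understand the scheme right, the per-frame work boils down to warping and compositing those pre-rendered sprites. A rough C sketch of that inner step (my own guess at the concept, with made-up types -- not Microsoft's actual pipeline):

#include <stdint.h>

typedef struct {
    const uint32_t *texels;  /* pre-rendered RGBA image of one object */
    int w, h;                /* sprite dimensions */
    float m[6];              /* affine warp: screen coords -> sprite coords */
} Sprite;

/* Composite one sprite into the frame via an inverse affine mapping.
   The caller sorts sprites back-to-front and composites each in turn,
   re-rendering a sprite's source image only when its object has moved
   enough that warping no longer hides the error. */
static void composite(uint32_t *frame, int fw, int fh, const Sprite *s)
{
    for (int y = 0; y < fh; y++) {
        for (int x = 0; x < fw; x++) {
            float u = s->m[0]*x + s->m[1]*y + s->m[2];
            float v = s->m[3]*x + s->m[4]*y + s->m[5];
            int iu = (int)u, iv = (int)v;
            if (iu < 0 || iu >= s->w || iv < 0 || iv >= s->h)
                continue;                   /* pixel outside this sprite */
            uint32_t texel = s->texels[iv * s->w + iu];
            if (texel >> 24)                /* skip fully transparent texels */
                frame[y * fw + x] = texel;  /* nearest-neighbor, no blending */
        }
    }
}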

They developed this idea, produced some demo films, and even got some company to make a chip and a board for them. But in the meantime, it turned out that you could put a full 3D renderer into a chip, and put a full frame buffer on the same board, and voila, 3D graphics cards were born.

(It also helped that 3D was much simpler to use, in practice, than this elaborate sliding-and-scaling 2.5D sprite architecture.)

Anyway, I am worried that the Cell architecture may suffer a fate similar to these other revolutionary approaches.

For what it's worth, I can think of some revolutionary approaches that have worked out well in practice: the personal computer and the RISC processor. But notice that in both these cases the revolution meant simplifying things, or starting out small and growing from there. It's the big-bang revolutions (like Cell) that are the riskiest.

I can also think of many evolutionary approaches that have been wildly successful: MS-DOS to Windows to Windows XP, Unix to Linux, x86, NVIDIA and ATI GPUs, and so on.

If I had been Sony, I think I would have chosen a more evolutionary approach than they seem to have chosen. Perhaps the simplest way to do that would have been to wait a few years to start the PS3 design, rather than starting it right after PS2 was complete. By starting so early Sony ran into two problems: 1) They had no idea how technology was going to evolve over the next 5 years, and 2) the only conceivable way of achieving their performance goals was to adopt a revolutionary strategy.

If they waited, they would have had a much better idea of which technologies were actually going to work, and they would have been able to consider incremental improvements to existing technologies in addition to revolutionary approaches.

Now, from a business standpoint there are several reasons for Sony to choose revolution, even at the risk of producing an inferior product:

1) Since PS2 is the market leader, developers can't ignore the PS3, no matter how hard it is to program, and no matter how poor the performance is. They will have to give it their full support, simply because it's guaranteed to have at least 40% market share next generation.

2) If it's really hard to make a game run well on both PS3 and the other consoles, developers may scale back their non-PS3 efforts. ("Put the A team on the hard-to-develop-for PS3, put the B team on the easy-to-develop-for consoles.")

One way this strategy can backfire for Sony is if developers figure out an easy way of using (say) 33% of the PS3's potential performance, and don't bother to invest in using the rest. (For example, if they just use one of the Cell cores to run their game, ignoring the other three -- as in the sketch below.) Then we'll end up with a situation similar to today's Xbox, where some console-specific titles look really good, but the bulk of the titles are cross-platform, and since they use the same art, they look only slightly better than their PS2 equivalents.
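To illustrate that worry (core counts and partitioning here are placeholders -- nobody outside Sony knows the real configuration), the lazy port and the full-effort port of a frame update might differ only like this:

#include <pthread.h>
#include <stddef.h>

#define NUM_EXTRA_CORES 3   /* hypothetical count */

typedef struct { float *bodies; size_t n; } Chunk;

static void update_physics(Chunk *c)
{
    for (size_t i = 0; i < c->n; i++)
        c->bodies[i] += 0.016f;             /* stand-in for real work */
}

static void *worker(void *arg) { update_physics(arg); return NULL; }

/* Lazy port: run everything on one core; the rest sit idle. */
void game_update_serial(Chunk *world) { update_physics(world); }

/* Full effort: partition the world and farm the chunks out. */
void game_update_parallel(Chunk chunks[NUM_EXTRA_CORES])
{
    pthread_t t[NUM_EXTRA_CORES];
    for (int i = 0; i < NUM_EXTRA_CORES; i++)
        pthread_create(&t[i], NULL, worker, &chunks[i]);
    for (int i = 0; i < NUM_EXTRA_CORES; i++)
        pthread_join(t[i], NULL);
}

The serial version ships first and runs everywhere; the parallel version only pays off if the workload actually partitions cleanly, which is the hard part.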

Ah well, interesting times.

I'm really looking forward to seeing the PS3 technical demos. I expect them to be jewels of hand-coded graphical goodness. The PS2 technical demos were one of my favorite things about last generation.
 
if anything, PS3 is likely to be another PS2. that is, complex, very powerful, and very successful.

that said, PS3 will be its own beast, even compared to PS2.


edit: sorry, i didn't mean to dismiss all of your points. very interesting post btw
 
PS3 and Itanium comparison

While I do not know the details of the PS3's design, I do think that not too much should be read into parallels with Itanium, as Itanium, last I heard, ran legacy code about as fast as a tortoise tied to a house. So this alone put potential upgraders off Itanium, I would imagine. It should also be noted that given Itanium's clock speed it has a rather competitive SPEC FP score.
I think Intel primarily went wrong with trying to force everyone onto their new ISA and not taking enough care as to how well their processor ran legacy code.
After all of the complaints by people about the difficulty of PS2 development, I would think it unlikely that Sony will not pay substantial attention to ease of development for the PS3, so I expect they have a few tricks up their sleeve in that department.
 
To a certain extent I share your views FatherJohn, but I think that the PS3 won't be as revolutionary and hard to program as the hype would suggest. I'm also not convinced that hardware power is going to be all that relevant due to multi-platform considerations. Exclusive 2nd and 3rd generation games should look fantastic on PS3.

I'm sure that Sony has invested substantially in good development tools this time around. Their revolutionary approach isn't without risks though, and unlike MS they don't have the cash reserves to absorb a big screw-up. Ironically, it's MS that is being conservative next generation, at least as far as hardware is concerned.

MS seems to be trying to take Nintendo's approach to hardware, while maintaining their strength in marketing and branding amongst their target demographic. It could be a winning combo, or the marketplace could simply shrug its collective shoulders. It's difficult to say right now.
 
1) PS3 is dual core, so it should be very fast at 0.07 micron.
2) Integrated memory controller: take a look at the Athlon FX architecture and see how performance scales awesomely with an integrated memory controller on die (see the sketch below).
3) Wasn't the RAM supposed to be embedded????
4) IBM + Toshiba + Sony = 3 companies' intellectual property.
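On point 2, for anyone curious why an on-die memory controller matters: lots of game code is chains of dependent loads, where throughput is set almost entirely by memory latency. A crude, generic C illustration -- not an Athlon FX measurement:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    enum { N = 1 << 22 };                  /* ~32 MB of pointers */
    size_t *next = malloc(N * sizeof *next);
    for (size_t i = 0; i < N; i++)
        next[i] = (i + 9973) % N;          /* large-stride cycle, cache-hostile */
    size_t p = 0;
    clock_t t0 = clock();
    for (long i = 0; i < 50000000L; i++)
        p = next[p];                       /* each load waits on the last */
    printf("%zu: %.1f s\n", p, (double)(clock() - t0) / CLOCKS_PER_SEC);
    free(next);
    return 0;
}

Shaving even a few dozen nanoseconds off each miss shows up directly in a loop like that.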


The performance should be remarkable. How can u say it's going to flop????
 
PS3 seems to promise to at least be a good calculator. An extreme focus on general number crunching won't necessarily be enough for good visuals, as PS2 showed. If an adequate level of sampling or filtering for some useful function is missing because too much of the die was dedicated to processing units, the system could come up short -- but then again, it might also open the door to alternative solutions.
 
I think the PS3 architecture is not new at all, and is already a proven concept. I think the only problem is the implementation: will they be able to manufacture the device at a high enough clock rate, and with good enough yields, to make it more cost-effective and of higher performance than a traditional CPU?

The reason why I say it's a proven concept is that CELL is based on distributed computing, which is taking the supercomputer world by storm. The most recent list of supercomputers is dominated by distributed computer systems.

Of course, I think the critical component after the hardware implementation is the software: will the development environment be complete enough to allow traditional console developers to get the most out of its distributed framework? We shall see.
 
I know there are a large number of people who desperately need to cling to the "hard to program" garbage, but don't be surprised when your predictions based on that silly line of reasoning lead you time and time again to a failure to understand the console market's future.
 
Sony are fairly open on the hard or easy to develop for question...
 
Hey deano,

yes I can certainly agree. I know some programmers that love the PS2 because it's more exotic and interesting to code on. However, if you say "we have a few days before we need to show this demo, can we add in mip mapping?" you'll see the ugliest face ever.

Anyway, how are things going?
 
Good! I was reading about that and wondered if you survived. Btw, are you going to name the game Ninja Theory as well? Kinda cool name for a game.

btw, can't wait to see more on your game.
 
DeanoC said:
Sony are fairly open on the hard or easy to develop for question...
To be perfectly honest - while I admire how much things have improved with the PSP development environment and support, I am not very happy when I have to work completely under the constraints of an API.
Maybe I got spoiled by the PS2, or have been away from real PC development for too long, but it just irritates me in a number of ways.

Quincy said:
yes I can certainly agree. I know some programmers that love the PS2 because it's more exotic and interesting to code on. However, if you say "we have a few days before we need to show this demo, can we add in mip mapping?" you'll see the ugliest face ever.
Straying off a bit here - even though it takes more work than just a couple of render flags, mipmapping is still easy and short to implement (unless you demand per-polygon inclination correction, which a total of like 2 (3?) titles on PS2 ever did).
Writing a clip routine in raw VU assembly, on the other hand, isn't - but this was a clear case of a deficient toolset, not exotic hw (the first chip to support hw clipping came out just months before PS2, so you can hardly argue that was a standard feature back then). Once I got the VCL compiler, I re-wrote in a couple of hours what took me days with raw asm.
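For reference, here's the shape of a single clip pass in plain C -- a generic Sutherland-Hodgman step against one frustum plane, just to show the structure that had to be hand-scheduled in VU assembly (not my actual routine):

typedef struct { float x, y, z, w; } Vec4;

/* Clip a convex polygon against one clip-space plane (here w + x >= 0,
   the left plane). Run once per frustum plane: inside vertices are
   kept, and each edge crossing the plane adds an intersection vertex. */
static int clip_left(const Vec4 *in, int n, Vec4 *out)
{
    int m = 0;
    for (int i = 0; i < n; i++) {
        Vec4 a = in[i], b = in[(i + 1) % n];
        float da = a.w + a.x;             /* signed distance to plane */
        float db = b.w + b.x;
        if (da >= 0.0f)
            out[m++] = a;                 /* inside: keep the vertex */
        if ((da >= 0.0f) != (db >= 0.0f)) {
            float t = da / (da - db);     /* edge/plane intersection */
            out[m].x = a.x + t * (b.x - a.x);
            out[m].y = a.y + t * (b.y - a.y);
            out[m].z = a.z + t * (b.z - a.z);
            out[m].w = a.w + t * (b.w - a.w);
            m++;
        }
    }
    return m;                             /* clipped vertex count */
}

Trivial in C, with its branches and divide -- precisely the things that were miserable to schedule by hand across the VU's pipelines.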
 
I agree that PS3 architecture is not a new concept.

as far as distributed computing goes, this is probably for inside a single PS3, across probably no more than 2 major dies. possibly as many as 4, but more likely just 2.
 
FatherJohn said:
If I had been Sony, I think I would have chosen a more evolutionary approach than they seem to have chosen. Perhaps the simplest way to do that would have been to wait a few years to start the PS3 design

I think the keyword in your reasoning here is "seem". You don't actually know the architecture you're talking about. Besides, microcircuit tech is always a moving target; waiting is only going to waste time that could have been put to better use.

By starting so early Sony ran into two problems

You don't even know the architecture PS3 is based on, yet you have already decided Sony's run into two specific problems. Brilliant deduction, man. Brilliant!

1) They had no idea how technology was going to evolve over the next 5 years

Um, and this is different from everybody else in the biz how? Do Nintendo and MS have crystal balls that allow them to see into the future, perhaps?

2) the only conceivable way of achieving their performance goals was to adopt a revolutionary strategy.

Why is this a "problem"? And is it really revolutionary? Nobody outside STI and their closest partners knows the specifics. All we've got are rumors and innuendo.

If they waited, they would have had a much better idea of which technologies were actually going to work

Oh, so you already know PS3 isn't going to work, is that it? Well lemme tell you a couple things, sonny... :) For starters, Cell isn't IA-64 (that was a botch-job from the start, and HP's VLIW stuff has little to nothing to do with what STI is doing with Cell). Second, have you really looked at who is partnering up with Sony? To begin with, Sony itself is the most successful videogames company, possibly ever. PS + PS2 have sold over 160 million units between them, and PS2 is still selling well. That's A LOT of hardware. Toshiba designs and manufactures supercomputers for frig's sakes, and IBM is, well, IBM. Big Blue. They make big iron boxes too, and they're GOOD at it.

You're so quick to point out that Sony doesn't know where tech will move in the next 5 years. Well, STI have a far greater picture of where tech will go than either Nintendo or Microsoft, out of sheer collective experience -- that I can assure you. STI own their own fabs, they have huge permanent staffs of circuit designers, CMOS engineers, research labs out the wazoo. They've got their bases covered better than anybody in the console biz, and are at the forefront of the computing sector in the world today.

("Put the A team on the hard-to-develop-for PS3, put the B team on the easy-to-develop-for consoles."

You don't actually KNOW it will be (crazy) hard to develop for. There's a good chance it'll be harder to develop for than the other next-gen offerings, but perhaps the difficulty will be outweighed by superior performance. Or not. Who knows. Best not to speculate on what we don't know, IMO.

I'm really looking forward to seeing the PS3 technical demos. I expect them to be jewels of hand-coded graphical goodness. The PS2 technical demos were one of my favorite things about last generation.

They weren't actually that good tho. :p Today's games look much better and use loads more fancy effects.
 
Guden Oden, one problem with the "big brain" argument (e.g. Sony, IBM, and Toshiba are all smart, and have done well in the past, so surely they will succeed) is that there are so many examples where this approach has failed. For example, HP and Intel had plenty of big brains working on Itanium, with less than stellar results.

Another problem with the big brain argument is unequal effort. The PS3 is Sony's lifeblood, so Sony undoubtedly has put a ton of effort into designing it. But for IBM it's just a research project plus a CPU manufacturing contract. The "Cell" researchers might even have been overly optimistic in promising future advances in compiler technology.

Not to mention that IBM's efforts seem to be split 3 ways, to support all three next-gen consoles. It will be interesting to see how similar the 3 consoles' CPUs end up being. Maybe IBM sold the exact same CPU to everyone. :) Including Apple. :) :)

My point regarding planning 5 years out being difficult seems to have been misunderstood. My point is that it doesn't matter how smart you are -- you are unlikely to correctly predict the evolution of technology over a five-year span. In order to have an optimal system you have to get everything correct (RAM size and speed, CPU speed, GPU speed, power consumption, etc.). By starting five years ago Sony had to guess where everything would end up in 2005. If they were off by just 10% a year in estimating the rate of technological advance, they could end up mis-predicting a component technology by 50% or more.
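(For the record, the compounding works out to 1.10^5 = about 1.61 -- an error of roughly 60% after five years, which is where "50% or more" comes from.)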

So, for example, they might have assumed that CPU speeds of 6 GHz would be common today. Or that 1 GB of 10 ns RAM would be economical. Or that Blu-Ray would be cost-effective in 2005.

There's a real chance that one or more of the technologies they assumed would be ready for the PS3 did not, in fact, work out as well as they expected. (I don't have any particular technology in mind, I'm just saying.) If this is the case, they might be forced to delay the console for that laggard technology to catch up. Or alternately they might have to ship with less of something than they had designed for (e.g. 1/2 the RAM they expected to have, or Blu-Ray as an expensive option).

Obviously you can err too much on the other side. The Xbox design cycle was so short that it was mostly put together out of off-the-shelf components, which made it cost significantly more than the full-custom designs of the other consoles.
 
Guden Oden said:
To begin with, Sony itself is the most successful videogames company, possibly ever.

That would be Nintendo; nobody else is even close to the total success of Ninty.
 
FatherJohn, I find most of your arguments only superficially relevant to the STI venture. Comparing it to the HP/Intel venture doesn't take into account that they are designing parts for different roles, and designing a part that's, for all intents and purposes, guaranteed to reach economies of scale well in excess of 100 million units.

Nor is it just a Sony product. The STI group is led solely by IBM; its campus is located in Austin, at IBM's central R&D centre. The group took talent from most of IBM's research divisions, IBM has invested in parallel with Sony on the project, and has put in something along the lines of 2-3X the engineering talent of Sony or Toshiba. Their E. Fishkill facility is being expanded for the Cell processor line at a cost of something like three-quarters of a billion dollars.

Not to mention, the Cell project, outside of a few principals, has been worked on by different engineers than the Power processor teams. So there is little commonality: the XCPU is a Power derivative, and the Nintendo CPU likely is too, while Cell has almost nothing in common with them.

Not a convincing argument IMHO.

PS. There's a paper by two Cell researchers on inherently lower-complexity architectures and their ability to achieve vastly better time-to-market results due to their modular nature and the necessity of being on time. Perhaps you should read it.
 
FatherJohn said:
Another problem with the big brain argument is unequal effort. The PS3 is Sony's lifeblood, so Sony undoubtedly has put a ton of effort into designing it. But for IBM it's just a research project plus a CPU manufacturing contract. The "Cell" researchers might even have been overly optimistic in promising future advances in compiler technology.

Cell is not just a research project for IBM. IBM has plans as broad as Sony's for Cell technology, or Cell-ish technology. I'm quite sure IBM has known which way the wind was blowing in the MHz race for a long time now.
 