Does Cell Have Any Other Advantages Over XCPU Other Than FLOPS?

dukmahsik said:
why hasn't cell caught on yet? when will sony use it outside of ps3?

Hold on a sec, PS3 isn't even out yet itself! Once volumes are up, you can expect Sony to announce other products using Cell, but for a while, PS3 will be requiring all the manufacturing capacity Sony can give it.

dukmahsik said:
why did apple turn it down?

They make PCs. They have a large existing software base. They had already started walking a path with Intel. Pure technical merit would not have been top of their list, I'm afraid, as is the case with many technology choices companies make.

Here's an opposing question for you - why did Mercury Systems jump at it?

While I think Cell (in various guises) will see success beyond PS3, don't make the mistake of linking technical success to commercial success (outside of PS3). If we were to do the same for chips in other systems, it'd look a whole lot less rosy for them than it does for Cell.

dukmahsik said:
why isn't ibm pushing it

As far as I can tell, they are?
 
compres said:
How much can we optimize Java, BTW? I would not use Java in performance-intensive tasks; maybe I am mistaken.

Java can run extremely well now because it uses a JIT approach rather than the old 'let's interpret the entire bytecode' model. The bytecode assumes an entirely stack-based machine, which allows it to be translated easily to suit any particular hardware architecture, so good performance can be obtained (dependent on a decent VM). Typically these gains come because the VM can watch the running code to see what is executed often and push more optimisation effort there; e.g. if it notices a function is called frequently, it can translate that function to exploit hardware-specific features such as cache size. .NET has almost identical characteristics in this area because it too uses a bytecode approach (MSIL).

Overall this means Java code now (on a modern VM) has performance comparable to C in most areas, and even better in some, so the argument for not using Java is becoming less and less relevant. The reason it isn't used in performance-intensive tasks is mainly the garbage collector, which can simply steal your CPU time to tidy up, but there is research starting to address this (I think I saw someone demonstrating a real-time GC that would allow Java to be used in environments such as missiles just last week?).
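To make the GC point concrete, here's a toy sketch (my own illustration, not from this thread): the same numeric kernel written allocation-free versus allocation-heavy. A modern JIT compiles the hot loop to native code either way, but the boxed version creates an object per iteration that the collector must later reclaim, which is exactly the pause source being described.

```java
public class JitSketch {
    // Allocation-free: JIT-friendly, no garbage produced.
    static long sumPrimitive(int n) {
        long total = 0;
        for (int i = 0; i < n; i++) {
            total += i;
        }
        return total;
    }

    // Boxing allocates an Integer object per iteration,
    // feeding the garbage collector and risking pauses later.
    static long sumBoxed(int n) {
        long total = 0;
        for (Integer i = 0; i < n; i++) {
            total += i;
        }
        return total;
    }

    public static void main(String[] args) {
        // Both compute the same answer; only the allocation behaviour differs.
        System.out.println(sumPrimitive(1_000_000)); // 499999500000
        System.out.println(sumBoxed(1_000_000));     // same result, more garbage
    }
}
```

Run it with `-verbose:gc` on a hypothetical modern JVM and you'd see collections triggered by the boxed variant but not the primitive one.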
 
Titanio said:
As far as I can tell, they are?

Cell's current comparatively dismal DP performance is what's keeping it from wider adoption and a full push from IBM. Future versions of Cell that are less cost-constrained, like the DD1/DD2 revisions, will make it into more advanced fields that require higher precision, and will get a stronger marketing push from IBM.
 
one said:
That's an interesting question, but my new question is, why don't ATI use unified shader architecture in their primary business?
The nature of the two design teams working independently, and the fact that the Xenos team was not even allowed contact with the other teams during Xenos development. It will come to the PC, but it will take more time.
 
compres said:
ROFL. Guess that's why they (the developers) can get paid six-figure salaries these days, because retards deserve the money.

I don't think they are getting dumber; I think it's harder to coordinate large teams now that we have large projects. Before, a single programmer could develop a game (Atari); now it's impossible.
I completely agree: for a modern game (or large software application in general) to be successful, it needs competent management with a technical background. The lack of effective management can really kill a game.
 
Asher said:
The nature of the two design teams working independently, and the fact that the Xenos team was not even allowed contact with the other teams during Xenos development. It will come to the PC, but it will take more time.
It's not a logical answer to my question. Why didn't the PC team independently launch their own unified thing? :smile:
 
Asher said:
Cell's current comparatively dismal DP performance is what's keeping it from wider adoption and a full push from IBM. Future versions of Cell that are less cost-constrained, like the DD1/DD2 revisions, will make it into more advanced fields that require higher precision, and will get a stronger marketing push from IBM.

I don't disagree with what you're saying, but the poster I was replying to made none of the qualifications you did ;)
 
one said:
It's not a logical answer to my question. Why didn't the PC team independently launch their own unified thing? :smile:
They're on it: R600. DX10. Requires Vista to be fully supported.

Jawed
 
Jawed said:
They're on it: R600. DX10. Requires Vista to be fully supported.
Your answer can be an indirect answer to dukmahsik's question about Cell in this thread... :smile:
 
The biggest advantage of Cell will be in non-videogame applications, when it's used as a dedicated media player.

In games I don't think the difference is big enough to matter unless it's a pretty specific application.
 
The truth is the answer lies mainly in

A) How well the problems of multi-threaded programming are solved in the upcoming years (in general)

and

B) How well the Cell is suited to videogame processing in general.

My money is pretty firmly on Cell not being a very good design. But if they ever do tap its power, it will be because someone figures out how to multi-thread somewhat trivially in future years.
 
seismologist said:
The biggest advantage of Cell will be in non-videogame applications, when it's used as a dedicated media player.
???

I don't know what a media player is going to do with all that FP power.

The best processor and best algorithm in the world can't make a picture look much better without introducing artifacts. Most of the time, video processing makes a picture look worse, especially to experts.

The most processor-intensive thing I can think of would be advanced (AVIVO or PureVideo style) 1080i deinterlacing, and that doesn't need anywhere near Cell's power. Furthermore, FP is a complete waste, because 16-bit integer will be plenty for an 8-bit signal.

EDIT: Well, I suppose it's less of a waste than a different piece of silicon, especially if it's for lower-volume (>$3000) stuff.
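To illustrate the "16-bit integer is plenty for an 8-bit signal" point, here's a toy bob deinterlace (my own sketch, not anyone's actual implementation): missing lines of a field are reconstructed by averaging the lines above and below. Summing two 8-bit samples never exceeds 510, so plain integer math comfortably covers it with no floating point in sight.

```java
public class DeinterlaceSketch {
    // field: the even lines of a frame (8-bit luma samples, 0..255).
    // Returns a full frame where the odd lines are interpolated from
    // their even neighbours (simple "bob" deinterlacing).
    static int[][] bob(int[][] field) {
        int h = field.length, w = field[0].length;
        int[][] frame = new int[2 * h][w];
        for (int y = 0; y < h; y++) {
            frame[2 * y] = field[y].clone(); // copy the real line
            int[] below = field[Math.min(y + 1, h - 1)];
            for (int x = 0; x < w; x++) {
                // Rounded average; max intermediate is 255 + 255 + 1 = 511,
                // well within 16-bit integer range.
                frame[2 * y + 1][x] = (field[y][x] + below[x] + 1) >> 1;
            }
        }
        return frame;
    }

    public static void main(String[] args) {
        int[][] frame = bob(new int[][]{{100, 200}, {50, 100}});
        for (int[] line : frame) {
            System.out.println(java.util.Arrays.toString(line));
        }
    }
}
```

Real motion-adaptive deinterlacers (AVIVO/PureVideo class) blend weave and bob per pixel, but the arithmetic stays in the same small integer range.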
 
Mintmaster said:
???

I don't know what a media player is going to do with all that FP power.

High bitrate/resolution H.264 decoding would be good for Cell. It can rape all sorts of PC's right now.
 
Bohdy said:
High bitrate/resolution H.264 decoding would be good for Cell. It can rape all sorts of PC's right now.

Cell beats every processor at H.264 decoding right now. Just imagine the playback quality of PS3 if Sony codes Cell to handle all the video processing.
 
Am I the only person here who thinks 'rape' is a totally horrific and inappropriate metaphor? Is it not enough to say Cell surpasses or outperforms other processors, rather than likening it to the grossest act of vicious, aggressive violation? Why don't people as a whole express themselves using the right language, instead of falling back on primitive sexual metaphor and swear-words? It's not like there's inadequate vocabulary in the English language to express yourself in other ways.
 
There are at least two problems that I see with parallelizing H.264 encode/decode:
  • The bitstream format doesn't have any provisions for parallel unpacking within a frame. This is especially important if the incoming H.264 bitstream is arithmetic-coded.
  • The intra-prediction blocks of H.264 have serial decoded-data dependencies from one block to the next.
It is entirely possible that Cell will slap, e.g., an Athlon X2 silly on H.264, but I would like to see actual benchmarks before making a judgement.
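The dependency structure above can be sketched with a toy model (my own illustration, nothing like real H.264): each "frame" is an array of block residuals, and intra prediction makes block i depend on decoded block i-1, so blocks within a frame must decode serially. Independent frames, however, can still be farmed out to worker threads, which is roughly the kind of coarse-grained parallelism a Cell port would have to lean on.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelDecodeSketch {
    // Serial within a frame: each block adds its residual to the previously
    // reconstructed block (the intra-prediction data dependency).
    static int[] decodeFrame(int[] residuals) {
        int[] out = new int[residuals.length];
        int prev = 0;
        for (int i = 0; i < residuals.length; i++) {
            out[i] = prev + residuals[i];
            prev = out[i];
        }
        return out;
    }

    // Parallel across frames: a thread pool decodes whole frames concurrently,
    // since in this toy model frames don't depend on each other.
    static List<int[]> decodeAll(List<int[]> frames, int threads) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            List<Future<int[]>> futures = new ArrayList<>();
            for (int[] f : frames) {
                futures.add(pool.submit(() -> decodeFrame(f)));
            }
            List<int[]> decoded = new ArrayList<>();
            for (Future<int[]> fut : futures) {
                decoded.add(fut.get()); // preserves frame order
            }
            return decoded;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        List<int[]> out = decodeAll(List.of(new int[]{1, 2, 3}, new int[]{5, 5}), 2);
        for (int[] f : out) {
            System.out.println(java.util.Arrays.toString(f));
        }
    }
}
```

Real decoders also exploit slice-level and inter-frame parallelism, but arithmetic-coded entropy decoding remains a serial bottleneck per frame, just as the bullet points say.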
 
Shifty Geezer said:
Am I the only person here who thinks 'rape' is a totally horrific and inappropriate metaphor? Is it not enough to say Cell surpasses or outperforms other processors, rather than likening it to the grossest act of vicious, aggressive violation? Why don't people as a whole express themselves using the right language, instead of falling back on primitive sexual metaphor and swear-words? It's not like there's inadequate vocabulary in the English language to express yourself in other ways.

I changed it. You're right, it's not really needed, and I apologise to you and anyone else who was upset about it... Sorry
 