High Performance Computing Potential of Cell

Shompola said:
Is there even an MPI implementation for CELL?

I'm not sure, but no doubt it could be done. There was a paper out there some time ago on a proposed programming model for Cell based on MPI.
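For reference, the kind of message-passing kernel such a model would map onto looks roughly like this. A minimal, generic MPI sketch in C; the ranks, tags, payload size and the "work" are made up for illustration, there's nothing Cell-specific in it:

[code]
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size;
    double buf[256];                      /* arbitrary payload size */

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    if (rank == 0) {
        /* "PPE-like" root hands work out to the other ranks */
        for (int i = 1; i < size; i++)
            MPI_Send(buf, 256, MPI_DOUBLE, i, 0, MPI_COMM_WORLD);
        /* ...and collects the results */
        for (int i = 1; i < size; i++)
            MPI_Recv(buf, 256, MPI_DOUBLE, i, 1, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    } else {
        /* "SPE-like" worker receives its chunk, computes, sends it back */
        MPI_Recv(buf, 256, MPI_DOUBLE, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        for (int i = 0; i < 256; i++)
            buf[i] *= 2.0;                /* stand-in for real work */
        MPI_Send(buf, 256, MPI_DOUBLE, 0, 1, MPI_COMM_WORLD);
    }

    MPI_Finalize();
    return 0;
}
[/code]

The interesting question for Cell is how the sends/receives would be implemented underneath - presumably via DMA between main memory and the SPE local stores rather than over a network.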
 
Arwin said:
Also, if I remember correctly, the EyeToy demo shows a 3D model of the hand in the game, so it does more than recognise a binary pattern on the card. I think this is the technology that makes a 3D image out of a 2D one, also demonstrated earlier with the emotion recognition demo.
You're kinda reaching there! The hand was the user's hand captured via camera. The game recognises the position, orientation and type of card and superimposes a graphic on top. It has no 3D perception capacity, the 2D overlay being drawn over the video feed, so the dragon character always appears in front of the player's hand(s) regardless of the position of the hands in 3D space.
 
Shifty Geezer said:
You're kinda reaching there!

You are correct. Or at least, the on-screen visuals are from the camera, my mistake. I wasn't reaching, just wrong. But you do realise 3D-space interpolation from 2D is all the rage these days; it's not that weird a thought. Both Sony and Microsoft have shown demonstrations that used this.

Sony: http://www.ps3focus.com/archives/131

The hand was the user's hand captured via camera. The game recognises the position, orientation and type of card and superimposes a graphic on top. It has no 3D perception capacity, the 2D overlay being drawn over the video feed, so the dragon character always appears in front of the player's hand(s) regardless of the position of the hands in 3D space.

But the dragon character does react to the hand's movement. I'm just going to assume for now, however, that in Eye of Judgment that's your average edge/motion detection going on, and not 3D-from-2D type stuff.
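For reference, that kind of motion detection usually boils down to differencing successive camera frames and thresholding the result. A minimal sketch in C; the frame size, threshold and the "2% changed" test are all made up for illustration and have nothing to do with Sony's actual code:

[code]
#define W 320       /* assumed greyscale frame dimensions */
#define H 240
#define THRESH 24   /* arbitrary per-pixel difference threshold */

/* Returns non-zero if a noticeable fraction of the given screen region
   changed between two greyscale frames - i.e. "something moved here". */
static int motion_in_region(const unsigned char *prev, const unsigned char *cur,
                            int x0, int y0, int x1, int y1)
{
    int changed = 0;
    for (int y = y0; y < y1; y++)
        for (int x = x0; x < x1; x++) {
            int d = cur[y * W + x] - prev[y * W + x];
            if (d < 0) d = -d;
            if (d > THRESH)
                changed++;
        }
    /* call it "motion" if more than ~2% of the region's pixels changed */
    return changed > (x1 - x0) * (y1 - y0) / 50;
}
[/code]

Checking that test against the screen region the dragon occupies is enough to make it "react" to a hand waving over it, with no 3D reconstruction involved.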
 
Arwin said:
You are correct. Or at least, the on-screen visuals are the camera, my mistake. I wasn't reaching, just wrong. But you do realise 3d space interpolation from 2d is all the rage these days, it's not that weird a thought. Both Sony and Microsoft have shown demonstrations that used this.
Yes indeed, but we're not seeing anything of the sort yet on any console. Perhaps in the future, but Eye of Judgment isn't in that field. As you say, it seems to be just typical motion detection for the interaction, seeing where the motion is relative to the model on screen. I would love to see a virtual 3D space modelled with occlusion though! There was talk of the EyeToy2 having depth perception, possibly through IR, but we haven't heard anything since or seen any demos. And this is off topic so I'll shut up now!
 
SPM said:
In any case, the film industry's standard practice for years has been to use Linux clusters for various movie special effects. The cluster nodes don't need much RAM each because they only run one compute-intensive task each and return results to a central server. 256MB would be plenty for most applications. The huge data or result sets in a complex supercomputing application like weather modelling would be stored on a central server, not on the compute nodes, and would be made available to the nodes in the cluster via the network.

Please don't educate me on this - I work at a CGI studio doing movie-VFX-level work and, as I've mentioned, 2GB is already a problem. 256MB is a joke.

Each render node is used to render a complete frame, for the best efficiency. Thus the node needs to load scene data, texture data and animation data, then tessellate the geometry, calculate various kinds of intermediate data and so on. The nodes have to be very similar to the workstations, except for the lack of fast OpenGL graphics. We're using dual Xeon blade servers, which is pretty much the industry standard as well.
 
Hey Laa-Yosh, would a dual Cell blade with 2GB of RAM be better for you guys than the Xeon blades? Or are there other things that would need to be changed as well?
 
mckmas8808 said:
Hey Laa-Yosh, would a dual Cell blade with 2GB of RAM be better for you guys than the Xeon blades? Or are there other things that would need to be changed as well?
Well, you'd obviously need a renderer that is optimized for Cell and its SPEs, and as I don't know where the compute / memory-fetch balance lies for that sort of problem, it may or may not be faster.
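The compute / memory-fetch question essentially comes down to whether the working set can be streamed through the SPEs' 256KB local stores fast enough. The usual way to hide the fetch cost is double-buffered DMA, roughly like the sketch below, which uses the Cell SDK's spu_mfcio.h intrinsics; the buffer size, tag choice and the process_chunk() routine are made up for illustration, and this is not any real renderer's code:

[code]
#include <spu_mfcio.h>

#define CHUNK 16384   /* 16KB per buffer; two of them fit easily in the 256KB local store */

static volatile char buf[2][CHUNK] __attribute__((aligned(128)));

/* stand-in for the real per-chunk work (shading, transforms, whatever) */
static void process_chunk(volatile char *data, unsigned size);

/* Stream 'total' bytes from main memory at effective address 'ea' through
   the local store, overlapping the DMA of the next chunk with work on the
   current one. Assumes total is a multiple of CHUNK. */
void stream(unsigned long long ea, unsigned long long total)
{
    unsigned long long done = 0;
    int cur = 0;

    mfc_get(buf[cur], ea, CHUNK, cur, 0, 0);        /* prime buffer 0 */

    while (done < total) {
        int next = cur ^ 1;

        if (done + CHUNK < total)                    /* start fetching the next chunk */
            mfc_get(buf[next], ea + done + CHUNK, CHUNK, next, 0, 0);

        mfc_write_tag_mask(1 << cur);                /* wait only for the current buffer */
        mfc_read_tag_status_all();

        process_chunk(buf[cur], CHUNK);              /* compute while the next DMA runs */

        done += CHUNK;
        cur = next;
    }
}
[/code]

If the per-chunk compute time is longer than the per-chunk DMA time, the fetches are effectively free; if not, the SPEs sit idle waiting on memory, which is exactly the balance question above.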
 
mckmas8808 said:
Hey Laa-Yosh, would a dual Cell blade with 2GB of RAM be better for you guys than the Xeon blades? Or are there other things that would need to be changed as well?

Autodesk and Pixar and a few other companies would have to do some optimizing, as our Intel-based farm is used for a lot of separate tasks, like 3D renders, 2D compositing renders, cloth simulation, crowd simulation, etc. etc. That's also why add-on rendering/raytracing accelerator cards never really took off - studios want to be able to use the computing capacity for many different tasks.
(and we need 2GB of RAM per CPU)
 
Laa-Yosh said:
Autodesk and Pixar and a few other companies would have to do some optimizing, as our Intel-based farm is used for a lot of separate tasks, like 3D renders, 2D compositing renders, cloth simulation, crowd simulation, etc. etc. That's also why add-on rendering/raytracing accelerator cards never really took off - studios want to be able to use the computing capacity for many different tasks.
(and we need 2GB of RAM per CPU)

This is the system I want. ;) He he he. (NOT A PHOTOSHOP.)
 
mckmas8808 said:
2GB per CPU? Whoa! I guess Cell will never be used for movie making then. :cry:

Well, there's no reason why not. As previously stated, the wall is really more on the software side of things than anywhere else. If there were demand for it, 2GB of memory per CPU would be perfectly doable.
 
Arwin said:
Surely the Cell Blades IBM makes will have plenty of memory?

http://news.com.com/IBM+bringing+gaming+chip+to+blade+servers/2100-1010_3-6036943.html?tag=nl

... surely the memory bus setup in the PS3 may be optimised for 512MB, but I can't believe that the Cell itself is limited in any such way or would have trouble performing with 4GB or more.

The current blades have 512MB for each Cell.



I think there is also a smaller one with 512MB total. The Mercury blades have 1GB as well afaik.
 
The current blades have 512MB for each Cell.

Note it says "1st generation".

I went to an IBM presentation a while back where they were talking about a potential 2nd generation with a total memory capacity of 16GB.

They also mentioned they "have a design" for an SPE with fully pipelined double-precision support.

According to the recent Berkeley paper, Cell is already very powerful at DP maths (up to 12x a 2.2GHz AMD64), so a fully pipelined version should get a lot of interest.
 
Mercury's Powerblock 200 has 2GB DDR RAM behind the southbridge in addition to 1GB XDR RAM connected to Cell.
 
Arwin said:
... surely the memory bus setup in the PS3 may be optimised for 512MB, but I can't believe that the Cell itself is limited in any such way or would have trouble performing with 4GB or more.
If I remember correctly, Cell utilizes 2x32-bit memory channels, which I guess would in theory limit it to 4GB of memory - I don't know if something similar to physical address extensions is supported in Cell processors to provide a higher limit...
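(For what it's worth, the 4GB figure is just what falls out of assuming a flat 32-bit physical address space:

$$2^{32}\ \text{bytes} = 4\,294\,967\,296\ \text{bytes} = 4\ \text{GiB}$$

whether Cell's addressing actually stops there is a separate question.)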
 
mckmas8808 said:
2GB per CPU? Whoa! I guess Cell will never be used for movie making then. :cry:

That's because PRMan doesn't really support multithreading; it basically treats the system as two separate machines. It's still worth going with dual-CPU machines though, because you save the extra hard drive, power supply and rack space a second box would need.
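In practice, "treating the system as two separate machines" just means the farm launches one single-threaded render process per CPU and lets the OS schedule them. A rough sketch of that idea in C; the "render" binary name and the frame file names are hypothetical stand-ins, not PRMan's actual invocation:

[code]
#include <stdio.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    /* hypothetical per-CPU work items: one frame each */
    const char *frames[2] = { "frame_0001.rib", "frame_0002.rib" };

    /* launch one independent, single-threaded render process per CPU */
    for (int cpu = 0; cpu < 2; cpu++) {
        pid_t pid = fork();
        if (pid == 0) {
            execlp("render", "render", frames[cpu], (char *)NULL);
            perror("execlp");   /* only reached if exec fails */
            _exit(1);
        }
    }

    /* wait for both renders to finish */
    for (int i = 0; i < 2; i++)
        wait(NULL);

    return 0;
}
[/code]

Each process needs its full working set resident, which is why the memory requirement is quoted per CPU rather than per machine.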
 