Cell Server (IBM)

twotonfld said:
Render farm, meet Cell. Cell meet render farm.

Repeat:
Cell is 218 GFLOPS with 32-bit float precision only.
Cell is 20-30 GFLOPS with 64-bit float precision.
Rendering needs at least 64-bit float precision.
So we could say, "Cell, meet DP".
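
For what it's worth, here's a quick back-of-envelope sketch (in Python) of where numbers in that ballpark can come from - the 3.2 GHz clock, 8 SPEs, and the per-cycle FLOP counts below are my assumptions, not published specs:

Code:
# Rough theoretical peaks for Cell under assumed parameters
clock_ghz = 3.2              # assumed clock
spes = 8                     # assumed number of usable SPEs
sp_flops_per_spe_cycle = 8   # 4-wide SP SIMD with multiply-add
dp_flops_per_spe_cycle = 1   # assumption: DP issue is far more restricted

peak_sp = clock_ghz * spes * sp_flops_per_spe_cycle   # ~204.8 GFLOPS
peak_dp = clock_ghz * spes * dp_flops_per_spe_cycle   # ~25.6 GFLOPS
print(f"peak SP ~{peak_sp:.0f} GFLOPS, peak DP ~{peak_dp:.0f} GFLOPS")

The 218 GFLOPS figure presumably also counts the PPE or assumes a slightly different clock; either way these are theoretical peaks, not sustained throughput.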

We'll see how it fares. What I wonder about though is, why only 1GB of memory? SDRAM is so dirt cheap nowadays that even gamers have as much as that in a single-processor system. A server had better have at least 2-4GB to be competitive...
 
Well, Cell's DP is actually pretty good for a single chip - regardless of its substantial drop-off from SP performance.

The memory is an issue of sorts; Cell is constrained to XDR due to the memory controller and interface.

(I feel you and I keep coming back to Cell's viability in the larger marketplace, Laa-Yosh :p )
 
Reason for my interest is that I work in CG animation, which is one of the targeted markets for this server. We'd be all too happy to get cheaper and faster machines in here, it's just that I'm sceptical about such a huge jump...
 
Regarding FLOPS

I couldn't find the DPOPS for an Athlon MP, but seeing as its FLOPS figure was around 3.4 GFLOPS, I don't think its DP performance will be equal (never mind greater). That said, a Linux-based cluster of 50 was used to render and composite the effects in "The Core".

Link: http://www.linuxnetworx.com/news/4.2.2003.32-Linux_Networx_R.html

I know this isn't the latest technology - but I think it might strengthen the case for a dual-Cell blade.
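
For reference, that ~3.4 figure is consistent with a simple peak estimate of 2 FLOPs per cycle at roughly 1.7 GHz - the clock, the ops-per-cycle, and treating the "cluster of 50" as 50 CPUs are all my assumptions here, not from the article:

Code:
# Back-of-envelope Athlon MP peak under assumed clock and issue rate
clock_ghz = 1.7        # assumed Athlon MP clock
flops_per_cycle = 2    # assumed peak FP ops per cycle

per_cpu = clock_ghz * flops_per_cycle   # ~3.4 GFLOPS per CPU
cluster_peak = 50 * per_cpu             # assuming "cluster of 50" means 50 CPUs
print(f"per CPU ~{per_cpu:.1f} GFLOPS, cluster peak ~{cluster_peak:.0f} GFLOPS")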
 
I couldn't find the DPOPS for an Athlon MP, but seeing as its FLOPS figure was around 3.4 GFLOPS, I don't think its DP performance will be equal (never mind greater). That said, a Linux-based cluster of 50 was used to render and composite the effects in "The Core".

You know what, twotonfld, that is a great find. If the cluster you showed in the link could produce The Core's special effects, why couldn't 30 PS3s produce something like The Day After Tomorrow?
 
Linux is pretty much the standard platform for top VFX studios like ILM, Weta and such, and has been for the past 3-4 years. Smaller shops with no UNIX/IRIX past, with more off-the-shelf software and fewer custom-developed applications, tend to stick to WinXP instead because of its wider software palette. There's no Photoshop for Linux, for example.
All in all, supporting Linux will not give Cell-based servers any significant advantage.
 
Hey good to know Laa-Yosh (the CG thing) - that's pretty cool. Now I'll know who to go to if and when these servers do get released in order to ascertain industry perspective on them. ;)
 
mckmas8808 said:
I couldn't find the DPOPS for an Athlon MP, but seeing as its FLOPS figure was around 3.4 GFLOPS, I don't think its DP performance will be equal (never mind greater). That said, a Linux-based cluster of 50 was used to render and composite the effects in "The Core".

You know what, twotonfld, that is a great find. If the cluster you showed in the link could produce The Core's special effects, why couldn't 30 PS3s produce something like The Day After Tomorrow?

Okay, are you really trying to drive me crazy or what?
One of my pals worked on The Core a few years ago, but the studio (Frantic Films) was just one of some 5 or 6 doing all the effects work. But I'm quite sure that those 25 machines weren't rendering the whole movie. Even our small studio has a renderfarm of that size... and I can tell you that it's just enough to do a few minutes of game cinematics in less than 720p resolution.

The Core was a low-budget movie with small VFX studios on board, whereas TDAT was a blockbuster with a far bigger budget and requirements (I think even ILM did some work there after it was taken away from Digital Domain). It used a huge amount of processing power and buttloads of storage and network hardware. So please stop your ridiculous theories about the PS3 rendering anything other than games.
 
The point wasn't about the Linux support - it was about the performance. One Cell easily outpowers one Athlon MP, yet someone decided the Athlon MP had enough power to shove into a cluster.

Anyway - why would you run Photoshop on your render farm? Maybe on an app server or a local machine. But I'd assume you wouldn't want to steal cycles from your cluster.

All this said - more movies like The Core get made than movies like TDAT, so this may have its applications. Why are you convinced that this is impossible?
 
twotonfld said:
The point wasn't about the Linux support - it was about the performance. One Cell easily outpowers one Athlon MP, yet someone decided the Athlon MP had enough power to shove into a cluster.

Oh please, The Core was produced like, back in 2002 or so... it sat on the shelf for almost a whole year, Hilary Swank did Boys Don't Cry well after it.
Compare Cell to Opterons if you want a fair game.

Anyway - why would you run Photoshop on your render farm? Maybe on an app server or a local machine. But I'd assume you wouldn't want to steal cycles from your cluster.

Managing multiple OSs in a studio is something that only large shops can really dare to do. If you've got fewer than a hundred people, you want to stick to a single platform, and in that case your choice is pretty much limited to WinXP.
 
Different tools belong in different shops - you don't put the same IS in a mom-and-pop store that you put in Walmart. That's obvious.

Anyway...

Opterons do 2 DP ops per cycle. At 2.2 GHz that gives you 4.4 GDPOPS.

1 cell is still better.
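
Putting the two side by side (the Opteron figure follows straight from 2 DP ops/cycle at 2.2 GHz; the Cell DP range is the 20-30 GFLOPS quoted earlier in the thread - both are theoretical peaks, not benchmarks):

Code:
# Peak-vs-peak DP comparison using the numbers quoted in this thread
opteron_dp = 2.2 * 2                 # 2.2 GHz x 2 DP ops/cycle = 4.4 GFLOPS
cell_dp_low, cell_dp_high = 20, 30   # GFLOPS DP range quoted earlier

print(f"Opteron: {opteron_dp:.1f} GFLOPS DP")
print(f"Cell:    {cell_dp_low}-{cell_dp_high} GFLOPS DP "
      f"(~{cell_dp_low/opteron_dp:.0f}-{cell_dp_high/opteron_dp:.0f}x the Opteron peak)")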
 
twotonfld said:
Different tools belong in different shops - you don't put the same IS in a mom-and-pop store that you put in Walmart.

There are maybe a dozen big VFX studios around the world. The bread and butter is the 15-100 man shops.

Opterons do 2 DP ops per cycle. At 2.2 GHz that gives you 4.4 GDPOPS.

1 cell is still better.

There is no such thing yet as "1 Cell". Once it's out, you can compare theoretical peak performance. And maybe even start to benchmark with actual apps, if they get ported.


Now tell me, how exactly does this make the PS3 look bad to you? Why does it have to be the be-all and end-all of computers, no matter how much you know about stuff like CGI?
 
This has nothing to do with the PS3 - this has to do with the cell-based blade server mentioned at the start of this thread.

I thought B3D was pretty much past the fanboy crap and about technology - I guess in your case I was wrong.

(and last I checked, IBM has taped out some Cell chips)
 
Bringing up production details from a movie that's 3+ years old as an argument isn't exactly something I'd expect here either.
 
It was a performance comparison - that's all, and there's not a lot of easily accessible info out there.

Besides - I gave you your Opteron numbers - you had no response for those.
 
Opteron's peak GFLOPS isn't too high, that's correct - I'd have expected it to be faster.
Intel mentions 6.2 GFLOPS for its 3.2 GHz Xeons, so a dual 3.6 GHz system should reach 15-16 GFLOPS - today. But I'm quite sure the actual performance is lower.

The point is, it's useless to compare theoretical peak values. We can get back to the performance issue when there's final Cell-based hardware on the market, with proper software support - and then we'll be able to start making price/performance ratio comparisons.
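
Just to show where those Xeon numbers roughly come from: 6.2 GFLOPS at 3.2 GHz works out to about 2 FLOPs per cycle, and scaling that to a dual 3.6 GHz box lands around 14 GFLOPS - in the same ballpark as the figure above (the per-cycle interpretation is my assumption):

Code:
# Sanity check on the Xeon peak figures quoted above
flops_per_cycle = 6.2 / 3.2                # ~1.94, i.e. roughly 2 FLOPs/cycle
dual_36_peak = 2 * 3.6 * flops_per_cycle   # two CPUs at 3.6 GHz
print(f"~{flops_per_cycle:.2f} FLOPs/cycle, dual 3.6 GHz peak ~{dual_36_peak:.1f} GFLOPS")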
 
Laa-Yosh said:
The point is, it's useless to compare theoretical peak values. We can get back to the performance issue when there's final Cell-based hardware on the market, with proper software support - and then we'll be able to start making price/performance ratio comparisons.

While I agree to an extent that it is pointless, it's essential to technology forecasting to use these peak numbers. You're right that actual benchmarks mean a lot more, though. But we don't know when we'll get those, and this seemed worth mentioning given the type of technology mentioned at the beginning of this thread (dual-Cell, Linux server blade).

I wouldn't completely dismiss the idea though as Sony has mentioned the possibility of internally using Cell technology for this sort of thing.
 
I think a lot of good points have been brought up - no need for you two to get overly frustrated with each other. ;)

Anyway, it surprises me that the Opteron's performance would be so low compared to the Xeon - that's almost the opposite of what I expected. But regardless, in real-world performance I'm sure the on-die memory controller must play a strong role in several situations.

Cell should turn out to be a strong chip - that is, if the requisite community/software support builds around it. As for performance, I guess we'll need to await properly compiled, relevant benchmarks. Theoretically, in raw power it looks good thus far, but then again Alpha was by all accounts an architecture that for its time should have had an even more promising future - and where is it now?

I'll be looking forward though to any eventual performance benchmarks.
 
FWIW, I'd expect high-end rendering to be largely limited by I/O bandwidth - hard disk and memory performance - rather than FLOPS.
 
ERP said:
FWIW, I'd expect high-end rendering to be largely limited by I/O bandwidth - hard disk and memory performance - rather than FLOPS.

Exactly, though I'd add network traffic as well. Most studios don't try to mirror their assets to the render nodes' local hard drives, because it'd require continuous synchronization for the daily updates to them.
Most of the data is stored on central servers, which every render node must access when starting a rendering job. A typical high-end studio needs 0.5-10 terabytes of online storage that's accessed all the time by 100-500 machines on the network, so the traffic is huge.
The upside is that the individual blades can get by with a simple 40-120GB hard disk for local storage; no fancy SCSI or RAID stuff is required.
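
A crude illustration of why the network ends up being the bottleneck - every number here (per-job asset size, node count, link speed) is made up purely to show the shape of the problem:

Code:
# Toy model: render nodes pulling assets from a central file server
nodes = 300            # assumed render nodes hitting the server
assets_gb = 2.0        # assumed GB of scene/texture data per render job
link_gbit_s = 1.0      # assumed shared gigabit uplink at the server

total_gbit = nodes * assets_gb * 8       # total data to move, in gigabits
minutes = total_gbit / link_gbit_s / 60
print(f"~{minutes:.0f} minutes just to push assets out for one pass")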
 