Sony talked about Cell and Cell-based workstations

Dream on ;) :LOL:

BTW: with no direct access to the eDRAM, the Broadband Engine (assuming a teraflops peak) will sustain shit.
 
Paul said:

The #1 super computer doesn't use any cache or eDRAM and it achieves 87% efficiency. Who's dreaming now? Get back to me when you can find real world numbers from this SONY supercomputer ;) :LOL:

SONY needs eDRAM because they have never designed a TFLOPS machine and don't know any better. :LOL:
 
SONY needs eDRAM because they have never designed a TFLOPS machine and don't know any better.

They need eDRAM to achieve in one chip the performance that would take a dozen chips with other architectures. If you look at the ratio of theoretical performance to the added transistor budget, you realize that a significant portion of that budget has gone to making sustained performance a reality.
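A rough back-of-the-envelope sketch (my numbers, not Sony's) of why on-chip memory matters at this scale:

```c
#include <stdio.h>

/* Back-of-the-envelope: what bandwidth would a 1 TFLOPS chip need to
 * sustain its peak? Assume (hypothetically) a streaming workload that
 * touches one 4-byte operand per FLOP. */
int main(void)
{
    double peak_flops     = 1e12;  /* 1 TFLOPS peak               */
    double bytes_per_flop = 4.0;   /* assumed: one float per FLOP */

    double needed_bw = peak_flops * bytes_per_flop;
    printf("required bandwidth: %.0f GB/s\n", needed_bw / 1e9);
    /* -> 4000 GB/s, far beyond any external bus of the era, which
     * is exactly the gap that wide on-chip eDRAM is meant to close. */
    return 0;
}
```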

They had prototypes quite a while back, and based on their successful performance the Cell project was kept going. The figures the bigwigs have seen have been good enough to merit billions in continued R&D and fab costs... what this suggests is obvious: CELL, when it comes to the estimated performance, is near or beyond it in the real world...

The #1 super computer doesn't use any cache or eDRAM and it achieves 87% efficiency.

In some cases wasn't it more like 36% and below? Or have I recalled it erroneously?
 
From all information gathered to date the CELL architecture is dedicated to - in Sony's and IBM's own words - digital content. That is its focus, and it does not try to be anything else. Expect it to be incredible for digital content applications, but mediocre in others.

You know, like how a 6+ GFLOPS PS2 running Linux really compares to a "700 MFLOPS only" P3; those with the kit can run a few simple tests and see for themselves.

Lots and lots of problems and applications that a "multimedia-dedicated" architecture has no advantage in. The infamous BLAST comes to mind - unvectorizable and unparallelizable. (Actually, parallel versions of BLAST exist, but the consensus is that the gains are insignificant.) In fact most bio-related sequencing problems fall into that category. I'm sure there are many more.
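To make the "unparallelizable" point concrete, here's a toy sketch (mine, not BLAST itself, which is heuristic) of the Smith-Waterman recurrence underlying sequence alignment. Every cell depends on its left, upper, and diagonal neighbours, so only cells on the same anti-diagonal are independent:

```c
#include <stdio.h>
#include <string.h>

#define MAX(a, b) ((a) > (b) ? (a) : (b))

/* Toy Smith-Waterman local alignment score. H[i][j] depends on
 * H[i-1][j-1], H[i-1][j] and H[i][j-1], so neighbouring cells in a
 * row or column cannot be computed in parallel. */
static int sw_score(const char *a, int m, const char *b, int n,
                    int H[m + 1][n + 1])
{
    int best = 0;
    memset(H, 0, sizeof(int) * (m + 1) * (n + 1));

    for (int i = 1; i <= m; i++) {
        for (int j = 1; j <= n; j++) {
            int match = (a[i - 1] == b[j - 1]) ? 2 : -1; /* toy scores */
            int h = H[i - 1][j - 1] + match;             /* diagonal   */
            h = MAX(h, H[i - 1][j] - 1);                 /* gap        */
            h = MAX(h, H[i][j - 1] - 1);                 /* gap        */
            H[i][j] = MAX(h, 0);
            best = MAX(best, H[i][j]);
        }
    }
    return best;
}

int main(void)
{
    int H[9][9];
    printf("best local score: %d\n",
           sw_score("ACACACTA", 8, "AGCACACA", 8, H));
    return 0;
}
```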

CELL won't displace the Pentiums/Athlons/Power5s/etc, and I believe that was never its aim.

Something from gamesindustry.biz:
http://gamesindustry.biz/content_page.php?section_name=dev&aid=3451
Sony and IBM take on digital content creation with Cell workstation
Rob Fahey 10:42 13/05/2004

Serious shot across the bows of Microsoft and Intel

One of the least-reported announcements of E3 to date is also one of the most worrying for Microsoft - with Sony and IBM announcing co-development on a Cell-based workstation which is aimed at the content creation market.

The workstation, which will ship before the end of the year, will feature an architecture based on the parallel processing Cell chip, and will be designed to power digital content creation for movies, television and videogames.

Cell is the next-generation microprocessor created jointly by Sony, IBM and Toshiba, and it is expected to power a whole range of both consumer and high-end appliances in the future - including the PlayStation 3 game console.

IBM will be building the Cell workstations, with Sony providing the architecture, algorithms, middleware and data structure for digital content creation tools on the platform.

The Cell technology, which has a major focus on working in parallel across high-speed networks, is considered to be ideally suited for jobs such as special effects rendering or content creation for movies or next-generation videogames.

The news of the joint plan will come as a shot across the bows of both Intel - which has recently been enjoying dominance of this market - and Microsoft, which now faces the prospect of high end content creation being done on the same non-Windows platform that will eventually become the development tool for PlayStation 3.

"Microsoft should be really worried by this," one developer told us today. "They've been touting Xbox 2 to their partners and talking about the kind of content they want to see created on the platform - more polygons, higher resolutions, more effects - and our response has been that the tools to create this stuff for games don't really exist yet. Now Sony has effectively created those tools."

The Sony solution is certain to integrate tightly with the PlayStation 3 development system, whereas developers working on Xbox 2 will probably still be tied to Windows - making the task of putting digital content from the Cell workstations (whose proposed role reminds us of the market position occupied by Silicon Graphics workstations in the early nineties, before their performance was overtaken by x86 PC systems and PowerPC Macintosh systems) into Xbox 2 titles much more difficult than the equivalent task on PS3.

The announcement of the workstation also fills in a further piece of the jigsaw in Sony boss Nobuyuki Idei's plan for the company - adding the tools to create digital content to a business model which already includes the publishers of digital content, and the manufacture of the playback devices for that content.


I need to ask how devs feel about the bolded part. Is this flamboyant/melodramatic reporting again?
 
I'm not a professional game developer, but my opinion is that the bolded statement is a bit premature. If the Cell-based workstations are available before real Xbox 2 dev kits, then the unnamed developer might have a point. Until then, neither company has provided all of the tools.
 
zidane1strife said:
SONY needs eDRAM because they have never designed a TFLOPS machine and don't know any better.

They need eDRAM to achieve in one chip the performance that would take a dozen chips with other architectures. If you look at the ratio of theoretical performance to the added transistor budget, you realize that a significant portion of that budget has gone to making sustained performance a reality.

They had prototypes quite a while back, and based on their successful performance the Cell project was kept going. The figures the bigwigs have seen have been good enough to merit billions in continued R&D and fab costs... what this suggests is obvious: CELL, when it comes to the estimated performance, is near or beyond it in the real world...

The #1 super computer doesn't use any cache or eDRAM and it achieves 87% efficiency.

In some cases wasn't it more like 36% and below? Or have I recalled it erroneously?

Just because the CELL project is continuing doesn't mean CELL-powered supercomputers will be competing for the top spot in the Top500. You're grabbing stuff out of thin air, dude. Regarding efficiency figures, IBM's top supercomputers achieve around 55% efficiency using architectures with cache. The #1 supercomputer achieves 87% efficiency without cache. Of course both figures are for optimized apps. Worst-case figures would obviously be proportionally lower for both.
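For reference, "efficiency" in these lists is just Rmax/Rpeak from the Top500 results; plugging in the Earth Simulator's published Linpack figures:

```c
#include <stdio.h>

/* Top500 efficiency is simply Rmax / Rpeak. Earth Simulator's
 * published Linpack figures: Rmax = 35.86 TFLOPS, Rpeak = 40.96 TFLOPS. */
int main(void)
{
    double rmax = 35.86, rpeak = 40.96;                    /* TFLOPS */
    printf("efficiency = %.1f%%\n", 100.0 * rmax / rpeak); /* ~87.5% */
    return 0;
}
```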
 
Just because the CELL project is continuing doesn't mean CELL-powered supercomputers will be competing for the top spot in the Top500. You're grabbing stuff out of thin air, dude. Regarding efficiency figures, IBM's top supercomputers achieve around 55% efficiency using architectures with cache. The #1 supercomputer achieves 87% efficiency without cache. Of course both figures are for optimized apps. Worst-case figures would obviously be proportionally lower for both.

No, I remember something about most hovering in the 30%s (for the best) and below (most others) in most apps. Maybe I'm recalling incorrectly, but that's what I think I read.

Now as for Cell, true, it might not take the top spots, but I'd just like to hear about a multi-TFLOPS-level workstation (I'm sure it'd cause a nice riot if one of them had numbers higher than the Earth Simulator's, even if not directly comparable...)

PS: I'd also like to see new renderfarms using Cell tech. From the early 90s to today we've gone from 40 GFLOPS to 2-3 TFLOPS (yes, I know the numbers are not directly comparable, but stay with me)... With Cell it could go to 2-3 PFLOPS; if even a fraction of this suggested performance is achieved, we'll have an improvement in CG complexity that might finally break the barrier...
 
passerby said:
I need to ask how devs feel about the bolded part. It this flamboyant/melodramatic reporting again?

Having worked in the movie industry, I can understand the comparison with SGI workstations. When opening a Maya file takes several minutes, you want to buy the fastest machine available on the market. Rendering times can easily be reduced by building bigger renderfarms, but to create models and animations you still have to rely on a single workstation. On the movies I worked on, the biggest problem was the sheer amount of data the artists needed to manipulate. They ended up animating low-res proxies and splitting scenes into separate parts, hardly the most efficient way to work.
If the Cell workstation is significantly faster than top of the line PCs, movie studios (and possibly game developers) will buy them by the hundreds.
The productivity gain justifies the cost, as long as the software tools are available.
 
No, I remember something about most hovering in the 30%s(for the best) and below(most others) in most apps, maybe I'm recalling incorrectly, but that's what I think I read.

Actually, I posted the actual figures in this forum a while back. Just do a search.
 
passerby said:
Lots and lots of problems and applications that a "multimedia-dedicated" architecture has no advantage in. The infamous BLAST comes to mind - unvectorizable and unparallelizable. (Actually, parallel versions of BLAST exist, but the consensus is that the gains are insignificant.) In fact most bio-related sequencing problems fall into that category.

Any search using search trees is hard to parallelize, unless of course you have parallel queries ... then it becomes quite trivial.
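A toy illustration of that (my own code, nothing to do with BLAST): with independent queries against a shared read-only tree, each thread just walks the structure on its own and no locking is needed at all.

```c
#include <pthread.h>
#include <stdio.h>

/* Toy binary search tree; queries are read-only, so any number of
 * threads can walk it concurrently without coordination. */
struct node { int key; struct node *left, *right; };

static int contains(const struct node *t, int key)
{
    while (t != NULL && t->key != key)
        t = (key < t->key) ? t->left : t->right;
    return t != NULL;
}

struct job { const struct node *tree; int query, found; };

static void *worker(void *arg)
{
    struct job *j = arg;
    j->found = contains(j->tree, j->query); /* no locks needed */
    return NULL;
}

int main(void)
{
    struct node lo = {10, NULL, NULL}, hi = {30, NULL, NULL};
    struct node root = {20, &lo, &hi};
    struct job jobs[] = {{&root, 10}, {&root, 25}, {&root, 30}, {&root, 5}};
    pthread_t tid[4];

    for (int i = 0; i < 4; i++)
        pthread_create(&tid[i], NULL, worker, &jobs[i]);
    for (int i = 0; i < 4; i++) {
        pthread_join(tid[i], NULL);
        printf("query %d -> %s\n", jobs[i].query,
               jobs[i].found ? "hit" : "miss");
    }
    return 0;
}
```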

When I hear a game developer say "tools", I find it very hard to believe he was talking purely about hardware, BTW. A tool is a hammer, an idiot, or a piece of software.
 
unless of course you have parallel queries ... then it becomes quite trivial.
Of course that's trivial! The problem is with that one single monstrous search (not necessarily BLAST), dictated by circumstances and requirements, that takes forever. But yeah, I understand your point.

EDIT
Thanks to pcostabel's reply. That was informative.
 
PC-Engine said:
eDRAM is just like cache except a lot slower, neither of which guarantees high efficiency ;) :LOL:
eDRAM is just... well... embedded DRAM! It's not a cache; a cache behaves differently from a "standard" memory block.
It's also possible to make a cache-like memory out of DRAM or eDRAM, of course.
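Roughly, the practical difference: a cache is hardware-managed and invisible to the program, while a raw addressable block (eDRAM, a local store) must be managed by software. A sketch of the second style (dma_get() here is a memcpy stand-in, not any real API):

```c
#include <stdio.h>
#include <string.h>

#define TILE 256

static float local_store[TILE]; /* fast on-chip buffer, software-managed */

/* Stand-in for a real DMA engine; on actual hardware this would be an
 * asynchronous transfer from external memory into the local store. */
static void dma_get(float *dst, const float *src, int n)
{
    memcpy(dst, src, n * sizeof(float));
}

/* With a cache you would just read main_mem and let the hardware decide
 * what stays close. With a plain memory block, the program stages data
 * explicitly, and every access below is a guaranteed local hit. */
static float sum_tile(const float *main_mem)
{
    float s = 0.0f;
    dma_get(local_store, main_mem, TILE);
    for (int i = 0; i < TILE; i++)
        s += local_store[i];
    return s;
}

int main(void)
{
    float data[TILE];
    for (int i = 0; i < TILE; i++)
        data[i] = 1.0f;
    printf("sum = %.0f\n", sum_tile(data)); /* 256 */
    return 0;
}
```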

ciao,
Marco
 
Those workstations, are they for licensed developers only, or can I trade some of my PCs for one?

Basically, are those workstations a consumer end product?
 
pcostabel said:
passerby said:
I need to ask how devs feel about the bolded part. It this flamboyant/melodramatic reporting again?

Having worked in the movie industry, I can understand the comparison with SGI workstations. When opening a Maya file takes several minutes, you want to buy the fastest machine available on the market. Rendering times can easily be reduced by building bigger renderfarms, but to create models and animations you still have to rely on a single workstation. On the movies I worked on, the biggest problem was the sheer amount of data the artists needed to manipulate. They ended up animating low-res proxies and splitting scenes into separate parts, hardly the most efficient way to work.
If the Cell workstation is significantly faster than top of the line PCs, movie studios (and possibly game developers) will buy them by the hundreds.
The productivity gain justifies the cost, as long as the software tools are available.


AMEN. A-bloody-MEN.

Can't imagine how my system will handle Maya 6 when it finally comes out. :?
 
And the same thing was said of PS2 - a workstation to make the content on. Yet how many PS2 games have shipped with content made on a PS2 workstation?

The fatal flaw is software. So Sony/IBM are going to convince Maya/Max/XSI/Lightwave/ZBrush/Photoshop/RenderWare Studio/Alienbrain etc. to be rewritten to run on a Cell workstation? Remember, Cell is different enough that simply porting a Linux version will likely be slower than the Windows version. To get good performance you're going to have to deal with Cell at a low level.
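For a feel of what "dealing with Cell at a low level" means in practice: code that is fine on a PC often needs its data layout reorganized before a wide-SIMD, local-store machine can stream it. A generic sketch (nothing Cell-specific, names are mine):

```c
#include <stdio.h>
#include <stdlib.h>

/* Array-of-structures: natural on a PC, but each field is strided in
 * memory, so a vector unit or DMA engine cannot fetch x[] contiguously. */
struct particle_aos { float x, y, z, mass; };

/* Structure-of-arrays: each field is a dense stream, the layout that
 * packed SIMD operations and bulk transfers actually want. */
struct particles_soa { float *x, *y, *z, *mass; int n; };

/* Unit-stride loop over the SoA form; a compiler (or hand vectorizer)
 * can turn these accesses into packed SIMD operations. */
static void scale_positions(struct particles_soa *p, float k)
{
    for (int i = 0; i < p->n; i++) {
        p->x[i] *= k;
        p->y[i] *= k;
        p->z[i] *= k;
    }
}

int main(void)
{
    int n = 4;
    struct particles_soa p = { malloc(n * sizeof(float)),
                               malloc(n * sizeof(float)),
                               malloc(n * sizeof(float)),
                               malloc(n * sizeof(float)), n };
    for (int i = 0; i < n; i++)
        p.x[i] = p.y[i] = p.z[i] = (float)i;
    scale_positions(&p, 2.0f);
    printf("x[3] = %.1f\n", p.x[3]); /* 6.0 */
    free(p.x); free(p.y); free(p.z); free(p.mass);
    return 0;
}
```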

Even if a Cell workstation is cheaper/faster than the equivalent PC or Mac, expect it to take 5 years for the big apps to convert.

By which time we will be talking about PS4 workstations.

Maybe you might have a couple of Cell workstations as nodes in the art pipeline (renderfarm or art-to-in-game-asset system), but I can't see anybody producing actual content on them unless the DCC packages are ported...
 