Sony talked about Cell and Cell-based workstations

DeanoC said:
And the same thing was said of the PS2: a workstation to make the content on. Yet how many PS2 games have shipped with content made on a PS2 workstation?

The fatal flaw is software: are Sony/IBM going to convince Maya/Max/XSI/Lightwave/ZBrush/Photoshop/Renderware Studio/Alienbrain etc. to be rewritten to run on a Cell workstation? Remember, Cell is different enough that simply porting a Linux version will likely be slower than the Windows version. To get good performance you're going to have to deal with Cell at a low level.

Even if a Cell workstation is cheaper/faster than the equivalent PC or Mac, expect it to take 5 years for the big apps to convert.

By which time we will be talking about PS4 workstations.

You might have a couple of Cell workstations as nodes in the art pipeline (render farm or art-to-in-game-asset system), but I can't see anybody producing actual content on them unless the DCC packages are ported...


Very, very true, and the various GScubes really seemed like a NA-NA-NA-NA-NAR WE CAN RENDER THE MATRIX IN REALTIME!! kind of project rather than anything really useful... However, they DID/DO exist.

I don't see Alias doing anything about Cell anytime soon. Hell, they haven't done anything about Macs either, and how many years has it been since Macs have been able to "compete" on a certain level with Win CPUs??
 
PC-Engine said:
Actually, I posted the actual figures in this forum a while back. Just do a search.

The numbers I've seen for the overall efficiency of the Earth Simulator are very, very far from 87%. If you say an IBM design gets around 55%, that probably means an average over some unknown mix of tasks. If you cherry-pick a single task on which the ES achieves 87%, then you have made a very biased comparison.

Also, I am extremely sceptical about your claim that the ES is cacheless. That makes no sense; *EVERYTHING* has cache these days, for many good reasons.

Furthermore, cache per se is not necessarily a lot faster than eDRAM. That depends entirely on circumstances that vary according to the task at hand and how the chip is programmed to deal with that task. Cache merely stores information in the hope it will be needed soon; retrieving info is fast if it is in cache and extremely slow if it is not. The programmer typically has little or no control over what information will actually BE in the cache at any one time. It's all guesswork.

eDRAM, on the other hand, is entirely controlled by the programmer and typically much larger than a cache, so it can hold exactly the pieces of info needed next. It won't be as fast in raw access speed, but fewer delays due to cache misses can very well mean it ends up being as fast or faster than something that relies on cache.

Saying one is better than the other is impossible; they're apples and oranges and can't be compared just like that.
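To make the distinction concrete, here's a rough C sketch of the two models. It's purely illustrative and not code for any real chip: CHUNK, dma_get() and dma_wait() are made-up stand-ins for whatever explicit transfer mechanism a programmer-managed eDRAM/local-memory design would actually expose.

#include <stddef.h>

/* Hypothetical DMA interface -- stand-ins for illustration, not a real API. */
void dma_get(void *dst, const void *src, size_t bytes, int tag);
void dma_wait(int tag);

#define CHUNK 4096   /* elements per on-chip buffer */

/* Cache-based model: walk main memory and hope the cache hides the latency.
   The programmer has no say over what is resident at any given moment. */
float sum_cached(const float *data, int n)
{
    float sum = 0.0f;
    for (int i = 0; i < n; i++)
        sum += data[i];              /* a miss here stalls the pipeline */
    return sum;
}

/* Programmer-managed on-chip memory: double-buffer fixed-size chunks so the
   transfer of the next chunk overlaps with work on the current one.
   Assumes n is a multiple of CHUNK to keep the sketch short. */
float sum_local(const float *data, int n)
{
    static float buf[2][CHUNK];      /* imagine this living in the fast on-chip RAM */
    float sum = 0.0f;
    int cur = 0;

    dma_get(buf[cur], &data[0], CHUNK * sizeof(float), cur);
    for (int base = 0; base < n; base += CHUNK) {
        int next = cur ^ 1;
        if (base + CHUNK < n)        /* kick off the next transfer early */
            dma_get(buf[next], &data[base + CHUNK], CHUNK * sizeof(float), next);
        dma_wait(cur);               /* current chunk is now guaranteed on-chip */
        for (int i = 0; i < CHUNK; i++)
            sum += buf[cur][i];      /* every access is a predictable, fast hit */
        cur = next;
    }
    return sum;
}

The second version is obviously more work to write and get right, which is exactly the trade-off being argued about here.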
 
Gubbi said:
It doesn't have data caches. It's a large-configuration NEC SX-6, built out of 8 GFLOPS vector CPU components:

http://www.sw.nec.co.jp/hpc/sx-e/sx6/hard_chart_e.html

It has oodles of bandwidth though (32 GB/s per CPU).

Cheers
Gubbi

There seems to be a cache on the processor (connected to the scalar unit)..
Interestingly, the vector units look like they only talk to their local memory, so I guess results are transferred by the scalar unit over the crossbar connections to the other cores.
 
DeanoC said:
The fatal flaw is software: are Sony/IBM going to convince Maya/Max/XSI/Lightwave/ZBrush/Photoshop/Renderware Studio/Alienbrain etc. to be rewritten to run on a Cell workstation? Remember, Cell is different enough that simply porting a Linux version will likely be slower than the Windows version. To get good performance you're going to have to deal with Cell at a low level.

SCE's name is next to both middleware and tools in the image. Sony can either buy one of them outright or partner with one of them and subsidize development.

Sure to piss off a lot of developers, but Sony doesn't have the time to kiss up to all of them.
 
MfA said:
SCE's name is next to both middleware and tools in the image. Sony can either buy one of them outright or partner with one of them and subsidize development.

Sure to piss off a lot of developers, but Sony doesn't have the time to kiss up to all of them.

So it's a good move for Sony to lose its entire 3rd-party library? It's not a matter of pissing people off: if you require developers to use a particular set of tools, you simply exclude those developers who don't use them. No development house is going to retrain its art staff on a different package because Sony says so.

Sony isn't big enough to piss off people like Square Enix and EA. They have a MASSIVE investment in artists who aren't going to convert to another package.

People like Renderware, SN Systems, etc. will simply produce a converter if Sony is that stupid.

Check out EA's comments about development costs to understand why forcing the use of Cell workstations would be suicide for PS3.
 
DeanoC said:
So it's a good move for Sony to lose its entire 3rd-party library? It's not a matter of pissing people off: if you require developers to use a particular set of tools, you simply exclude those developers who don't use them. No development house is going to retrain its art staff on a different package because Sony says so.

Converters from PC-based tools to the Sony middleware won't be any harder to write than for any other. They can stick with those tools and do content development for the PS3 the same as for any other console ... they might just be at a competitive disadvantage on the PS3 compared to those who do adapt.

Competitive pressure will force change, not Sony.
 
MfA said:
Competitive pressure will force change, not Sony.

This was my thinking as well. I think some people went overboard in assuming a mandate, or that Sony requires or believes in an immediate and absolute shift to their platform.

I find it much more likely that they'll let the free market do that for them. If STI's Cell is all it's cracked up to be and provides a competitive advantage - which it very likely might, as recent papers on commodity computing have explored - who is to say the content vendors/development toolset providers won't do the work of supporting Cell because it's profitable to do so? Wasn't a similar dynamic behind the move to Linux and commodity x86 computing in the '90s?

That being said, I think your point about an initial Sony/SCE subsidy is a smart one, and perhaps even likely.
 
Gubbi said:
It doesn't have data caches. ( ... ) It has oodles of bandwidth though

Oodles of bandwidth or not, without data caches it'll suck at random memory accesses due to latency. Crossbars may have a ton of throughput, but they don't exactly improve latency either...

With that in mind, it's pretty much a certainty this thing does not get 87% efficiency on average; that's likely only achievable on tasks that stream data in a predictable fashion. In other words, it'd be great for "backing up"/transcoding DVDs. ;)
 
Guden Oden said:
Gubbi said:
It doesn't have data caches. ( ... ) It has oodles of bandwidth though

Oodles of bandwidth or not, without data caches it'll suck at random memory accesses due to latency. Crossbars may have a ton of throughput, but they don't exactly improve latency either...

With that in mind, it's pretty much a certainty this thing does not get 87% efficiency on average; that's likely only achievable on tasks that stream data in a predictable fashion. In other words, it'd be great for "backing up"/transcoding DVDs. ;)

It's a vector processor, and as such it doesn't do random access very well.

But if you can express your problem in matrices and vectors, it will rip through it at high sustained speed.

Cache-based machines are good at exploiting spatial and temporal locality to a limited degree (until your problem busts the size of the caches). Spatial locality is exploited by loading whole cachelines at a time, thereby preloading data items adjacent to the one you just requested (prefetching also helps here). Temporal locality is exploited by having a data item already in cache when it needs to be reloaded or stored to.

Vector machines don't exploit spatial locality to any significant degree. Instead, they have scatter/gather memory units which can do gathering loads and scattering stores, which means that they can pipeline memory requests in a way cache machines can't. And of course oodles of bandwidth. The scatter/gather load/store units come into their own when your problem lends itself to sparse matrices as opposed to dense matrices.
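For what it's worth, here is a toy illustration in plain scalar C (not vector ISA code) of the access pattern in question: a sparse dot product where the interesting loads are y[idx[i]]. On a cache machine each of those loads drags in a whole cache line and mostly wastes it; a vector machine can hand the entire index vector to its gather unit and let the memory system pipeline all the requests.

/* Sparse dot product: x is dense, but only scattered elements of y are needed.
   The y[idx[i]] load is the "gather"; on a vector machine the whole batch of
   addresses goes to the scatter/gather unit in one go instead of being
   serviced one cache miss at a time. */
double sparse_dot(const double *x, const double *y, const int *idx, int n)
{
    double sum = 0.0;
    for (int i = 0; i < n; i++)
        sum += x[i] * y[idx[i]];
    return sum;
}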

To exploit temporal locality, vector CPUs have a limited set of vector registers; on the SX-6 these registers amount to 144 KB (so a fair chunk of data can be stored there). These registers are under explicit programmer (or compiler) control, so they can't really be categorized as a D-cache even though they kind of serve the same purpose.

Cheers
Gubbi
 
Here you can find a video of the speech.

I think that Sony is taking this Cell affair really seriously. They are taking their time with the PS2 life cycle just to make sure that they can develop a technology like no other. I think that PS3 will be vastly superior to Xbox2, but that's just a feeling :)
 
I wonder how targeted the workstation will be at game content development. If it is more general in its design, I can see a lot of computational researchers/engineers wanting to take a closer look.
 
Hey man, *thank you* for the link to the video.

But I agree with you, they ARE taking it seriously. They've spent billions on Cell R&D since '00-'01, and it's a hugely ambitious project.
 
What is so ambitious about CELL compared to the continual evolution of hardware from AMD, ATI, Intel, 3Dlabs, Nvidia, Sun, and PowerVR?

What I'd like to see when the workstations come out is a benchmark targeting general system performance (CPU and VPU) that can be run on both PC hardware and Cell workstations. Then take a PC with top-of-the-line hardware at the time of the CELL announcement and run that benchmark. Continue to do the same thing for each following year with the best PC hardware available. Then plot a simple graph showing the increase in PC computing power from the time of the CELL announcement until its release.
 
Paul said:
Hey man, *thank you* for the link to the video.

But I agree with you, they ARE taking it seriously. They've spent billions on Cell R&D since '00-'01, and it's a hugely ambitious project.

Don't mention it. I think that many overlooked the Sony conference and a lot of the key concepts they explained. Sony has been trying to stretch the PS2's life from the beginning. Chris Deering admitted that they didn't provide any middleware to developers at launch just so developers would take their time to understand the unusual architecture of the machine.
They have not lowered the price of the console too aggressively, just so the hardware gradually penetrates the market and the PS2 life cycle gets stretched; that way they can fill the time gap with the Xbox and GC and have the PS3 technologically competitive with their respective successors. I think there will be a lot to see in the future.
 
What is so ambitious about CELL compared to the continual evolution of hardware from AMD, ATI, Intel, 3Dlabs, Nvidia, Sun, and PowerVR?

The Broadband Engine aims to be the world's first teraflop microprocessor; look at the patent. 4 GHz, 1 TOPS, 1 TFLOPS, eDRAM: this is overly ambitious.

I'm not saying they'll hit the patent embodiment (though it is the goal); I'm simply saying that it's hugely ambitious (it is).
 
Brimstone said:
What is so ambitious about CELL compared to the continual evolution of hardware from AMD, ATI, Intel, 3Dlabs, Nvidia, Sun, and PowerVR?

I'd say the notion of conjuring up an entirely different architecture aimed at taking them all out (performance-wise) in one fell swoop is pretty damn ambitious! :D
 
As far as processing real-time graphics goes, there is no way, transistor for transistor, that Sony/IBM will trump ATI or Nvidia. A CELL-based console should be very flexible, but today we are already starting to see flexibility in VPU designs, and DX10 should allow for even more flexibility.

The overall design of the PS3 will be elegant, but I fail to see how CELL is going to give a performance advantage compared to the constantly improving designs of other companies.
 