Why doesn't SCEI go into supercomputer biz to prove itself?

[image: ps2_cluster_pic_02.jpg]

Hell, poor NCSA scientists who can't afford million-dollar machines are building a 70-machine PSX2 cluster for $50,000. I am sure a 100-teraflop CELL machine for $100K would go over very well with scientists, not to mention that it would earn Kutaragi Ken the prestige of building the world's most powerful computer...
 
The GSCube (16 or 64 PS2 chipsets) was just a start; these relied on SGI workstations/supercomputers as hosts.

Next time, STI, or Sony-IBM, will build their own graphics supercomputers and will totally surpass SGI in brute processing power, but probably not in pixel shaders or image quality. (SGI is now using ATI R3xx parts to build the new Onyx4 UltimateVision systems that will replace the InfiniteReality series.)
 
The GSCube was PS3 test technology, so developers could get used to distributed computing (Cell).

And this thread is just sad. DM, you've sunk to a new low. I would say this is borderline retarded.
 
"Why doesn't SCEI go into supercomputer biz to prove itself?"

Who says they won't? Remember, they are working on Sony's timetable, not DMGA's timetable (whatever that may be).
 
I thought Sony said something about wanting to do some supercomputing or workstation thingamajig with the PS2, during their pre-release hype campaign?
 
I thought the idea was farming out processing to other companies, or something to that effect? I don't see it happening without severe cost benefits, though.
 
chaphack said:
I thought Sony said something about wanting to do some supercomputing or workstation thingamajig with the PS2, during their pre-release hype campaign?

Well, I highly doubt they were telling people to do this all from one console. Meanwhile, notice the GS-Cube? Notice the clustering? There are some things the PS2 can do ridiculously fast for its price, but these aren't things in high demand (nor do most institutions in any way remotely think "we need a fast research computer; pick up some consoles!"), and they are probably too tightly confined to attract much notice. I suppose technically a "supercomputer" would be possible, but what would be the point?

CELL seems to be not only much more broadly useful and programmable, but also highly networkable, which would seem to make clustering all the better. We're STILL not likely to see major companies going straight to consoles, but we'll see what IBM is planning on bringing out, and all the products in general will flesh out in due course. Not to mention CELL needs some proving time, as well as companies building up the confidence and programming time to convince themselves to switch.
 
I'm not so sure supercomputing fits well with SCEI's business model. After all, the market for these kinds of machines is extremely limited, and SCEI is all about the mass market. Supercomputer customers also demand rock-solid reliability on top of everything else...

Supercomputers have checksums and error-correction codes guarding I/O, memory, caches and registers, they have backup memory devices and even entire processors to cut in instantly if there is a hardware failure.

PS3 chips and memory would not be engineered with this in mind. If a cosmic ray twiddles a bit somewhere in a PS3, the machine would either crash or glitch to a greater or lesser extent, but there wouldn't be any lasting harm. Randomly twiddled bits in a supercomputer that might be working on very sensitive data would be something else entirely...
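(To make the bit-flip point concrete: here's a toy C sketch of the detection half of that story. It's purely my own illustration, nothing to do with how real supercomputer hardware is wired; folding a buffer into a single XOR word catches any single flipped bit, while real ECC can additionally correct it on the fly.)

#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

/* Toy end-to-end check: fold every word of a buffer into one XOR word.
   If a stray bit flips anywhere in the buffer, the recomputed value no
   longer matches the stored one, so the corruption is at least detected
   (though not corrected, which is what real ECC hardware adds). */
static uint32_t xor_check(const uint32_t *buf, size_t nwords)
{
    uint32_t acc = 0;
    for (size_t i = 0; i < nwords; ++i)
        acc ^= buf[i];
    return acc;
}

int main(void)
{
    uint32_t data[1024] = { 0 };
    data[17] = 0xCAFEBABE;                 /* some "sensitive" payload  */

    uint32_t before = xor_check(data, 1024);
    data[17] ^= (1u << 5);                 /* the cosmic ray strikes... */
    uint32_t after  = xor_check(data, 1024);

    printf(before == after ? "data looks fine\n"
                           : "bit flip detected!\n");
    return 0;
}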

Besides, PUs would still be limited to single-precision FP, wouldn't they? Or can they do 64-bit floating-point ops too this time 'round? Doing 64-bit maths using 32-bit ops will cause a significant performance drop.
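(And for what it's worth, here's roughly where that performance drop comes from. The usual trick for stretching single-precision hardware is to carry each value as a hi/lo pair of floats and use error-free transforms like Knuth's two-sum. This is a generic C sketch of that well-known technique, not anything Cell-specific: one extended-precision add already burns on the order of a dozen single-precision ops versus one native 64-bit op, and it only works if the compiler isn't allowed to reassociate the FP math, i.e. no -ffast-math.)

/* Error-free transformation (Knuth's two-sum):
   returns s and err such that a + b == s + err exactly. */
static void two_sum(float a, float b, float *s, float *err)
{
    *s = a + b;
    float bv = *s - a;
    *err = (a - (*s - bv)) + (b - bv);
}

/* Add two "double-single" numbers, each stored as a (hi, lo) float pair.
   One extended-precision add costs ~14 single-precision ops, versus a
   single op for a native 64-bit add -- hence the big performance hit. */
static void ds_add(float ahi, float alo, float bhi, float blo,
                   float *chi, float *clo)
{
    float s, e;
    two_sum(ahi, bhi, &s, &e);   /* 6 ops */
    e += alo + blo;              /* 2 ops */
    two_sum(s, e, chi, clo);     /* 6 ops: renormalise the pair */
}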


*G*
 
Grall, SCE might not build supercomputers with CELL, but Sony Pictures might build render farms with it, and IBM has expressed interest in using the CELL architecture themselves as well, so IBM might make supercomputers with CELL and ECC (an implementation detail).
 
Re: Why doesn't SCEI go into supercomputer biz to prove itself?

DeadmeatGA said:
Hell, poor NCSA scientists who can't afford million-dollar machines are building a 70-machine PSX2 cluster for $50,000.

Wait, how is NCSA underbudgeted? Have you ever been to UIUC? It (especially NCSA) is one of the preeminent public research centers in the country. You don't know why they did this; you don't know who sponsored it; you don't know what their eventual goal is; all you know is flame....

Having read your posts for way too long now, what do you know?

PS. Hey, check this out, douche:
http://ncsa.uiuc.edu said:
CHAMPAIGN, IL — The National Center for Supercomputing Applications (NCSA) will install an Intel Xeon-based Linux cluster from Dell with a peak performance of 17.7 teraflops (17.7 trillion calculations per second), NCSA Director Dan Reed announced today.

This new cluster will be one of the world's fastest supercomputers, compared to the peak performance of the machines listed on the most recently released Top 500 list, http://www.top500.org/.

Not bad for the poor fellows whose work on NCSA Mosaic basically lies under all contemporary graphical web browsing. :rolleyes:
 
While SCEI may not make supercomputers, it is pretty clear that supercomputers, or more specifically visualization supercomputers (like the Onyx RealityEngine, InfiniteReality and UltimateVision), will be constructed with the same building blocks used to make the PS3, that is, BEs/PEs and GS3s/Visualizers. It just probably won't come from SCEI.

Perhaps Sony will form SVSCI (Sony Visualization Super Computing Inc.).
 
The problem is, as someone else already pointed out, that 32-bit floats don't offer enough precision for most scientific applications, so while you can get lots of flops for cheap from these PS2 clusters, finding a real use for them is an entirely different issue. With this in mind I doubt that the PS3 version of Cell, at least, will show up in too many computing centres. Pre-visualisation, however, might well be a suitable usage domain (I remember reading that movie-quality CG uses not only >= 64-bit colour/channel precision but also "64-bit geometric precision", not even beginning to think about CAD/CAM applications).
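(If anyone wants to see the 32-bit limit for themselves, a generic C snippet like the one below does it: accumulate a simple count in float and in double. The float total stalls at 2^24 = 16,777,216, because from there an increment of 1 sits exactly on the rounding boundary and gets rounded back down, while the double keeps counting. Nothing PS2-specific about it.)

#include <stdio.h>

/* Count to twenty million in 32-bit and 64-bit floating point.
   The float sum stalls at 16,777,216 (2^24): from there on, +1 lands
   on a rounding boundary and the result rounds back to the old value. */
int main(void)
{
    float  fsum = 0.0f;
    double dsum = 0.0;

    for (int i = 0; i < 20000000; ++i) {
        fsum += 1.0f;
        dsum += 1.0;
    }

    printf("float  sum: %.1f\n", fsum);   /* 16777216.0 */
    printf("double sum: %.1f\n", dsum);   /* 20000000.0 */
    return 0;
}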
 
I'd think CAD/CAM applications would be pretty safe with 32-bit. Even if you were designing something a mile across, a 32-bit float (24-bit mantissa) would still resolve down to roughly 4e-3 of an inch at that distance from the origin. :oops: That's still "pretty" precise, I would think. Perhaps there would be a demanding project that could need a bit more headroom, but 99.9% of CAD projects should fit in that window quite easily.
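(Quick sanity check on that figure, since I redid the arithmetic: at a coordinate one mile = 63,360 inches from the origin, the spacing between adjacent 32-bit floats is 2^-8 inch, about 0.004 in. A throwaway C snippet to confirm it, nothing CAD-package-specific:)

#include <stdio.h>
#include <math.h>

/* Spacing between adjacent 32-bit floats at a coordinate one mile from
   the origin, expressed in inches (1 mile = 63,360 in). */
int main(void)
{
    float mile_in_inches = 63360.0f;
    float ulp = nextafterf(mile_in_inches, INFINITY) - mile_in_inches;

    printf("resolution at 1 mile: %g in\n", ulp);   /* 0.00390625 */
    return 0;
}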
 
Nothing would stop another company from buying 100 BEs with the optical port on them from Sony and then linking them together themselves.
 
Just stumbled upon some nice on-topic links:

Grid Middleware written in Perl for PS2:
http://www.sve.man.ac.uk/Research/AtoZ/Playstation2

Computational Chemistry on the PS2:
http://spawn.scs.uiuc.edu/research/sonyps2/ps2project.htm

NCSA's PS2 Beowulf Cluster on BBC News:
http://news.bbc.co.uk/1/hi/technology/2940422.stm

Interview with PS2 Linux Developers:
http://codingstyle.com/mod[...]article&sid=153

A port of NetBSD to the SONY PlayStation 2:
http://www.netbsd.org/Ports/playstation2/

Slightly off-topic, but nice nonetheless:

Unmodified Xbox (Linux) Cluster:
http://www.cs.uh.edu/~bguillot/xbox/home.html

[image: xbox4a.jpg]
 
Wouldn't it be awesome if we could stack our Xbox 2s like in ChryZ's pic above, have 4x the performance available, and have at least a handful of developers support that feature with much larger, more detailed games?

Remember, in the PS1 and Saturn (and Dreamcast) days, a few developers supported link cables so that you could have two players playing on different screens. There were no more than 2-3 dozen games that did this between the PS1, Saturn and Dreamcast.

Well, if just 10-20 games supported "combined processing" between 2-4 (or more) consoles networked/linked together (not via the internet), we could see some truly mind-blowing games, with detail that a single Xbox 2 or PS3 could not otherwise manage.
 
Wouldn't it be awesome if we could stack our Xbox 2s like in ChryZ's pic above, have 4x the performance available, and have at least a handful of developers support that feature with much larger, more detailed games?

Actually, I think it'd be one of the worst possible things to happen to gaming.
 