Cell will be in the PS3!

It will not be the main system or graphics processor in the Playstation 3.

And what would it be, then ?

What is this architecture they have been secretly working on ?

You do not see the processor being fast enough for 3D graphics processing ?

Do you think Sony has another WELL funded R&D project for PlayStation 3's CPU and GPU ( especially after the Phase 1-3 and GScube projects were both halted ) ?

Also, I think the technology is geared toward 3D tasks...

They mention a Cell based Visualizer with Pixel Engines, Image Caches, CRTCs, etc...

And from the patents you can find this:

[0137] Other dedicated structures can be established among a group of APUs and their associated sandboxes for processing other types of data. For example, as shown in FIG. 27, a dedicated group of APUs, e.g., APUs 2702, 2708 and 2714, can be established for performing geometric transformations upon three dimensional objects to generate two dimensional display lists. These two dimensional display lists can be further processed (rendered) by other APUs to generate pixel data. To perform this processing, sandboxes are dedicated to APUs 2702, 2708 and 2714 for storing the three dimensional objects and the display lists resulting from the processing of these objects. For example, source sandboxes 2704, 2710 and 2716 are dedicated to storing the three dimensional objects processed by, respectively, APU 2702, APU 2708 and APU 2714. In a similar manner, destination sandboxes 2706, 2712 and 2718 are dedicated to storing the display lists resulting from the processing of these three dimensional objects by, respectively, APU 2702, APU 2708 and APU 2714.

[0138] Coordinating APU 2720 is dedicated to receiving in its local storage the display lists from destination sandboxes 2706, 2712 and 2718. APU 2720 arbitrates among these display lists and sends them to other APUs for the rendering of pixel data.

So they do mention some "gaming" related uses too ;) ( do not let the Broadband Engine name mislead you... it reminds me of how they were presenting the Emotion Engine before it was officially announced that it would be included in the PlayStation 2 ).
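To make that dataflow a bit more concrete, here is a tiny C sketch of what paragraphs [0137]-[0138] describe ( purely illustrative and entirely my own: a dedicated group of "geometry" APUs reading 3D objects from their source sandboxes and writing 2D display lists to their destination sandboxes, with a coordinating APU gathering the lists for the rendering stage; none of the type or function names come from the patent ):

/* Hypothetical sketch of the dataflow in [0137]-[0138]; all names are mine. */
#include <stdio.h>

#define NUM_GEOMETRY_APUS   3
#define OBJECTS_PER_SANDBOX 4

typedef struct { float x, y, z; } Vertex3D;         /* contents of a source sandbox      */
typedef struct { float x, y; }    DisplayListEntry; /* contents of a destination sandbox */

/* One geometry APU: read from its source sandbox, write to its destination sandbox. */
static void geometry_apu(const Vertex3D *src, DisplayListEntry *dst, int n)
{
    for (int i = 0; i < n; i++) {
        /* trivial stand-in for a real 3D -> 2D perspective transform */
        dst[i].x = src[i].x / (src[i].z + 1.0f);
        dst[i].y = src[i].y / (src[i].z + 1.0f);
    }
}

/* Coordinating APU: arbitrate among the destination sandboxes and forward
   each display list to the rendering stage (here just printed). */
static void coordinating_apu(DisplayListEntry lists[][OBJECTS_PER_SANDBOX], int n)
{
    for (int apu = 0; apu < NUM_GEOMETRY_APUS; apu++)
        for (int i = 0; i < n; i++)
            printf("render: list %d, entry %d -> (%.2f, %.2f)\n",
                   apu, i, lists[apu][i].x, lists[apu][i].y);
}

int main(void)
{
    Vertex3D src[NUM_GEOMETRY_APUS][OBJECTS_PER_SANDBOX] = {
        {{1,2,3},{4,5,6},{7,8,9},{1,1,1}},
        {{2,2,2},{3,3,3},{4,4,4},{5,5,5}},
        {{9,8,7},{6,5,4},{3,2,1},{0,0,1}},
    };
    DisplayListEntry dst[NUM_GEOMETRY_APUS][OBJECTS_PER_SANDBOX];

    for (int apu = 0; apu < NUM_GEOMETRY_APUS; apu++)      /* dedicated APU group */
        geometry_apu(src[apu], dst[apu], OBJECTS_PER_SANDBOX);

    coordinating_apu(dst, OBJECTS_PER_SANDBOX);            /* APU 2720's role     */
    return 0;
}

Obviously the real thing would run the three geometry APUs in parallel with DMA in and out of the sandboxes; the loop above just shows the data movement.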

Regarding the PlayStation 3 name in the patents...

Inventors: Suzuoki; Masakazu (Tokyo, JP); Yamazaki; Takeshi (Tokyo, JP)
Assignee: Sony Computer Entertainment Inc. (JP)
Appl. No.: 816020
Filed: March 22, 2001

Current U.S. Class: 711/164; 709/214; 709/215; 711/105; 711/148; 711/153; 713/202
Intern'l Class: G06F 012/02
Field of Search: 711/105,148,153,163,164 713/202 709/214,215

[...]


Other References

Mamoru Maekawa, Bunsan Operating System (Distributed Operating System), Tokyo: Kyoritsu Shuppan Co., Ltd., Dec. 25, 1991, pp. 175-182, ISBN 4-320-02570-9.

IBM System/370 Kakucho Architecture (SA22-7085-0, IBM System/370 Extended Architecture Principles of Operation), IBM Japan, Apr. 1984, pp. 3-8 to 3-10.

"IBM Wins Playstation 3 Contract," BBC News, Mar. 12, 2001[/b].



http://makeashorterlink.com/?G54325344

Sony is not THAT scared of writing the PlayStation 3 name in one of the most comprehensive patents they have filed regarding the Cell architecture...
 
My final guess: Cell is the broadband processor that allows multiple PS3s, DVD players, and other home electronics devices to communicate over broadband connections fluently, such as being able to play streaming DVD on-demand movies without having cable TV or a satellite dish, or enhance MMOGs such as a PS3 version of Star Wars Galaxies.

Yes Cell can do that, but it can also work as a good architecture for PlayStation 3's CPU and GPU.

As I said in the other post, there has been mention of "gaming related" stuff, and Okamoto's GDC 2002 presentation mentioned Cell, and not only in regard to a broadband communication chip...

What would you do with 1 TFLOPS ? Especially considering the SIMD capabilities of the APUs ( armed with single-precision FMACs that can each issue one pipelined FP MADD per cycle )... it would be sweet for 3D vector processing... ;)
 
Pardon my ignorance, but can 1 TFLOPS ( which needs 4 BB Engines )* be useful for other stuff, like, say, heavy-duty streaming, or breaking down and rebuilding HD movie/music data?

== GDC 2002 ==
[...] From 2001 to 2005, we'll be developing what he calls "Internet with appliances." Not just PCs, but mobiles, digital televisions, and game consoles will all connect to their own nooks and crannies of the 'net. And beyond 2005, he hinted that the Internet will be organized into what he called "cells," a project that Sony is working on with IBM and Toshiba. Sadly, he got very vague. "Today I cannot mention more detail of cell processor," he said, noting that it'll be unveiled around 2003 or 2004. He did state, however, that the third generation PlayStation would be based on this technology. That means a PlayStation 3 born and bred to be jacked into the 'net.

Again, the crux of Cell talk is about connecting to the internet....


Incidentally, he mentioned the GSCube prior to the Cell talk, and that part is more related to the gaming side of things...
.....

Try as he might, Okamoto couldn't squeeze out that kind of performance in the next generation of hardware; he had to "settle" for 300 times the power of the PSone. He then returned to his developer and asked if his estimation was still accurate. But now the developer had set his sights even higher. Real-time rendering was great, but genuine world simulation would require something even more powerful -- 1,000 times more powerful than even what the PS2 had to offer. That's the key issue Okamoto has to face: "Moore's Law is too slow for us!" he said.

......

One way is with parallel computing, where many processors tackle different parts of a problem simultaneously. To this end, he showed us a diagram of a project Sony calls "GScube." It features 16 PS2s with a video merger all wrapped up in a single box. The device is extremely powerful. But, as Okamoto deadpanned, "programming of this is...very difficult."

especially after the Phase 1-3 and the GScube projects were both halted

Is there any news that the GScube project was really halted?


* 4 BB Engines, wouldn't that be costly???
 
Among the things you can find on the net... good ol' archie mentioned it, I think, as well as other people like Fafalada IIRC ( and others )...

The GScube, as you can even see in that quote, had some programming issues: its underlying architecture was not built for parallel rendering, and that was the cause of the mentioned programming difficulties IMHO. The GScube was an SCE R&D pet project, not the PlayStation 2's successor...

It did successfully demonstrate that parallel rendering could be extremely powerful... a Cell based Broadband Engine would be powerful, and it would in certain ways be a progression of that line of thought, but designed with parallel processing in mind from stage one...
 
1 TFLOPS = 1 Broadband Engine at 4 GHz...

In the patent each APU is rated at 32 GFLOPS and 32 GOPS and we have 8 APUs per PE and 4 PEs...
 
Panajev2001a said:
1 TFLOPS = 1 Broadband Engine at 4 GHz...

In the patent each APU is rated at 32 GFLOPS and 32 GOPS and we have 8 APUs per PE and 4 PEs...

I thought it was 3 GHz ? But anyway, could we maybe see a smaller Cell chip in the PS3 due to costs or process delays ?
 
And beyond 2005, he hinted that The Internet will be organized into what he called "cells," a project that Sony is working on with IBM and Toshiba.......The third generation PlayStation would be based on this technology.

Did you get it? "PS3 based on organised internet cells technology" is what I'm reading.

BAH! Cell or no Cell, that's IT for me in this type of topic. 2005/6 will be judgement day, which is when I'll be back! :oops:
 
APU = 4 FP units... each does 1 FP MADD per cycle, which means 2 FP ops/clock...

2 FP ops/( cycle * FMAC ) * 4 FMACs/APU * 8 APUs/PE * 4 PEs/Broadband Engine * 4 GHz = 1 TFLOPS

2 FP ops/( cycle * FMAC ) * 4 FMACs/APU * 4 GHz = 32 GFLOPS

In the patent each APU in their example is targeted for 32 GFLOPS...
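If you want the whole thing worked out in one place, this trivial snippet just multiplies the patent's example figures together ( the 4 GHz clock is the target in the patent's example, not a confirmed PS3 spec ):

/* Peak-rate arithmetic from the patent example; figures are illustrative only. */
#include <stdio.h>

int main(void)
{
    const double flops_per_fmac_per_cycle = 2.0;    /* 1 FP MADD = multiply + add */
    const double fmacs_per_apu            = 4.0;
    const double apus_per_pe              = 8.0;
    const double pes_per_be               = 4.0;
    const double clock_hz                 = 4.0e9;  /* patent example target clock */

    double apu_gflops = flops_per_fmac_per_cycle * fmacs_per_apu * clock_hz / 1e9;
    double be_tflops  = apu_gflops * apus_per_pe * pes_per_be / 1e3;

    printf("per APU: %.0f GFLOPS\n", apu_gflops);              /* 32 GFLOPS     */
    printf("per Broadband Engine: %.3f TFLOPS\n", be_tflops);  /* ~1.024 TFLOPS */
    return 0;
}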
 
So let's sum up where we are:


80% of the info is pointing at a version of Cell in the PS3
20% is not

Fabs are going to be built starting soon.

A 1 TFLOPS PS3 is not confirmed; it may still happen, and at the same time it may not.

Jvd is the greatest guy in the world


I think I've summed up the posts for all the new people
 
Panajev2001a said:
APU = 4 FP units... each does 1 FP MADD per cycle, which means 2 FP ops/clock...

2 FP ops/( cycle * FMAC ) * 4 FMACs/APU * 8 APUs/PE * 4 PEs/Broadband Engine * 4 GHz = 1 TFLOPS

2 FP ops/( cycle * FMAC ) * 4 FMACs/APU * 4 GHz = 32 GFLOPS

In the patent each APU in their example is targeted for 32 GFLOPS...

Thanks man :) I got the 3 GHz number from somewhere. Maybe the Hammer reviews with all the P3 3 GHz chips in them.
 
PC-Engine said:
Panajev2001a said:
He did state, however, that the third generation PlayStation would be based on this technology.

Using the cellular concept in PS3 doesn't mean it'll be capable of 1 TFLOPS.

Not necessarily, of course, but an implementation of the Cell architecture as described in the patent, with the advanced 65 nm process Sony and Toshiba have worked on, should give hope that they can effectively achieve their goal...
 
jvd said:
So let's sum up where we are:


80% of the info is pointing at a version of Cell in the PS3
20% is not

Fabs are going to be built starting soon.

A 1 TFLOPS PS3 is not confirmed; it may still happen, and at the same time it may not.

Jvd is the greatest guy in the world


I think I've summed up the posts for all the new people

:LOL:
 
First of all, they were not making reference to the Playstation 3 explicitly. They were making reference to an article describing the technology covered in the patent, which happened to have Playstation 3 in the title, because that is how the BBC chose to name the article. That does not prove that it is going to be used in the Playstation 3.

That aside, after reading over that patent, a few things became clearer to me.

First, "Cell" is less of a computer chip, and more of a complete restructuring of the idea of a computer network, including a universal instruction set for all processors in the network. That is, ideally they would like to have every processor in the entire broadband network use the same ISA, instead of having a different one for your main processor, graphics processor, broadband modem, sound processor, etc. Basically, they want to replace Intel architecture, AMD architecture, NVIDIA architecture, ATI architecture, etc. and just have Sony architecture-based processors throughout the entire network system. They see the benefit in having all the instructions being able to be interpreted by all of the processors along the way, in hardware. Instead of breaking your web browser's HTTP request into segments, encapsulating each in a packet, then encapsulating the packet into a frame for transmission over a broadband network, the application would simply send the HTTP request as a "Cell software program", which would be automatically understood in hardware at every level. You would no longer have TCP/IP stacks, or sockets, or ports, or ethernet. Everything would be created, sent, routed, received, and processed in "cells".

Second, they obviously realize their design will never be successful if it's an "all-or-nothing" approach. Therefore, they throw in all these little words like "ideally", "preferably", and "in a preferred embodiment". This gives them some leeway, allowing them to use Cell processors in systems that have other processing architectures, with software written for those other architectures but transferred by, or to, Cell processors in some way. In other words, the patent covers everything from a network interface card that uses a proprietary data link protocol, to an entire network comprising different types of cell processors, performing different tasks but using the same instructions.

Third, some of their "ideal" uses seem a bit silly:

The basic processing module is a processor element (PE). A PE preferably comprises a processing unit (PU), a direct memory access controller (DMAC) and a plurality of attached processing units (APUs). In a preferred embodiment, a PE comprises eight APUs. The PU and the APUs interact with a shared dynamic random access memory (DRAM) preferably having a cross-bar architecture. The PU schedules and orchestrates the processing of data and applications by the APUs. The APUs perform this processing in a parallel and independent manner. The DMAC controls accesses by the PU and the APUs to the data and applications stored in the shared DRAM.

In accordance with this modular structure, the number of PEs employed by a member of the network is based upon the processing power required by that member. For example, a server may employ four PEs, a workstation may employ two PEs and a PDA may employ one PE. The number of APUs of a PE assigned to processing a particular software cell depends upon the complexity and magnitude of the programs and data within the cell.

In a preferred embodiment, a plurality of PEs are associated with a shared DRAM. The DRAM preferably is segregated into a plurality of sections, and each of these sections is segregated into a plurality of memory banks. In a particularly preferred embodiment, the DRAM comprises sixty-four memory banks, and each bank has one megabyte of storage capacity. Each section of the DRAM preferably is controlled by a bank controller, and each DMAC of a PE preferably accesses each bank controller. The DMAC of each PE in this embodiment, therefore, can access any portion of the shared DRAM.

Thus, their preferred design is to have multiple PEs access 64 MB of DRAM. In a server, the ideal number of PEs was 4, which gives a server a total of between 64 and 256 MB of DRAM. Given the fact that they explicitly talk about the PE having its own crossbar memory controller, and multiple PEs accessing one DRAM bank, I can hardly imagine the DRAM they refer to being eDRAM. It sounds to me, then, that this patent filed in 2001 was making reference to a server having a maximum total of 256 MB of DRAM (the preferred case having even less, probably 64 MB), and a workstation having, at most, 128 MB.
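Spelling that arithmetic out explicitly (nothing here beyond the patent's own preferred figures; the upper ends of the ranges assume each PE gets its own shared DRAM, which is my reading of the worst case, not something the patent states):

/* The patent's preferred DRAM figures, worked out explicitly. */
#include <stdio.h>

int main(void)
{
    const int banks_per_dram  = 64;  /* "sixty-four memory banks"           */
    const int mb_per_bank     = 1;   /* "one megabyte of storage capacity"  */
    const int dram_mb         = banks_per_dram * mb_per_bank;   /* 64 MB    */

    const int server_pes      = 4;
    const int workstation_pes = 2;

    printf("one shared DRAM: %d MB\n", dram_mb);
    printf("server (4 PEs):  %d-%d MB\n", dram_mb, server_pes * dram_mb);       /* 64-256 */
    printf("workstation:     %d-%d MB\n", dram_mb, workstation_pes * dram_mb);  /* 64-128 */
    return 0;
}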

Now extend this to the Playstation 3. How much RAM do you think it will have total? If they intend to do anti-aliasing, anisotropic filtering, have extremely high polygon meshes, advanced AI, extensive voice and sound, etc. I can't imagine they would release the system with less than 512 MB of RAM. Yet here they are making explicit reference to workstations with 2 PEs that would ideally only have 64 MB of DRAM total. That seems pretty weak to me.

Finally, they make a nice reference to a graphics workstation:

The chip package of FIG. 8 comprises two PEs 802 and 804 and two VSs 806 and 808. An I/O 810 provides an interface between the chip package and network 104. The output from the chip package is a video signal. This configuration may function as, e.g., a graphics work station.

FIG. 9 illustrates yet another configuration. This configuration contains one-half of the processing power of the configuration illustrated in FIG. 8. Instead of two PEs, one PE 902 is provided, and instead of two VSs, one VS 904 is provided. I/O 906 has one-half the bandwidth of the I/O illustrated in FIG. 8. Such a processor also may function, however, as a graphics work station.

It seems that Figure 8 is the best candidate for an ideal computer game console built using Cell technology, given the examples provided: two general-purpose PEs and two visual-processing VSs containing pixel engines and image caches, which connect to the external DRAM via an internal I/O ASIC & DMAC unit, hypothetically with 64 MB of RAM. It doesn't sound like the monster people are envisioning, but this probably isn't the design they intend to use in the Playstation 3.

After all of that, I'm more inclined to believe Cell will make up a large portion of the Playstation 3, but I'm still not convinced it will be used for the entire system, and there is still no hard evidence that explicitly proves it will be.
 
Shit, 13 posts made while I was typing that... I need to take a speed reading course or something.
 
I will go to bed... one more post... only one more ;)

Crusher, I want to thank you for the nice and intelligent observations you brought to this thread.

The DRAM is IMHO embedded... one of the hints is that I do not really see them having an off-chip bus that is 1,024 bits wide ;)

Also, about the Server chip with 4 PEs: that patent was written in 2001, and at that time their Cell plans had 100 nm manufacturing technology in mind; only later did they revise them to include 65 nm manufacturing technology and lower, so it is possible that by mid 2005 they can release the 4-PE Broadband Engine as the CPU of PlayStation 3.

There is specific mention of External Memory in FIG. 6

BTW, it seems that the cross-bar switch works with the PEs' DMACs and we do not have a cross-bar switch for each PE.

The DMAC controls accesses by the PU and the APUs to the data and applications stored in the shared DRAM.

Reading this "shared", and how they talk about the DRAM HW enhancements for the HW sandboxes, etc... I cannot help but think of e-DRAM ( especially with all those "shared DRAM" comments ).

65 nm manufacturing technology should also allow for a transistor budget that leaves room for e-DRAM to be implemented.

IBM was doing research on e-DRAM, and when talking about their BlueGene project they mentioned how one of the revolutions of Cellular Computing was solving the memory speed vs. CPU speed bottleneck by integrating e-DRAM, thanks also to its smaller cell size compared to SRAM...

e-DRAM, thanks to its huge bandwidth, is a good solution to keep the processor's performance high if you can afford it ( also, I do not believe that the PEs otherwise take up THAT much transistor logic, and there should be space for e-DRAM considering the transistor budget of a 65 nm process [IBM had a page with some figures and projections for e-DRAM and other things] )...
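Just to give an idea of why a bus that wide only really makes sense on-chip, here is the raw bandwidth arithmetic for a 1,024-bit interface ( the clock values are pure assumptions of mine for illustration; the patent does not give a bus clock ):

/* Rough bandwidth arithmetic for a 1,024-bit wide e-DRAM bus.
   The clock frequencies below are assumptions for illustration only. */
#include <stdio.h>

int main(void)
{
    const double bus_width_bits    = 1024.0;
    const double bytes_per_transfer = bus_width_bits / 8.0;   /* 128 bytes per clock */

    const double clocks_ghz[] = { 0.5, 1.0, 2.0 };
    for (int i = 0; i < 3; i++) {
        double gbps = bytes_per_transfer * clocks_ghz[i];      /* GB/s */
        printf("at %.1f GHz: %.0f GB/s\n", clocks_ghz[i], gbps);
    }
    return 0;
}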

Sorry if this post seems to jump from one thing to the other and then back and forth... :(
 
chaphack said:
Because I want to talk about graphics, not games here?

So, I take it that XBox Next and GameCube2 won't have any advanced graphics to speak of? If so, then why aren't you talking about the plethora of pitfalls they could have?

Touch your heart and tell me you seriously think MS hype == Sony hype?

Were you around during the pre PS2 days? Go do a search on pre PS2 hype and pre Xbox hype

Oh yes, since SCE's out there right now uptalking PS3... oh wait...

Oh and did you read my last post entitled "Vince...."? You want quotes of Sony hype, i gave you all that in that topic.

I did, and it's you taking their words and twisting them to fit what you want. I'll go through it here:

Not only will this new CPU have application for games, but it will be the core media processor for future digital entertainment applications,

Im still waiting...

Go to Japan and see just what they're doing with their BB Network.

The massive combined performance of this CPU permits complicated physical calculation, NURBS curved surface generation and 3D geometric transformations, which are difficult to perform in real time with PC CPUs performed at high speeds.

Right... Sony must have oh so suddenly forgotten about the 3D cards back then.

Chappers, give me a break. Name me a 3D card today that natively supports NURBS. And then name me one that did back in 2000, just a bit beyond DX7's rather primitive hardwired T&L front-end.

When this is applied to the processing of geometric and perspective transformations normally used in the calculation of 3D computer graphics (3DCG), the peak calculation performance reaches 66 million polygons per second. This performance is comparable with that of high-end graphics workstations (GWS) used in motion picture production.

I am still waiting for my motion pictured games.

And I'm still waiting for the world to change like nVidia promised when they launched their GeForce256 back in 1999. :rolleyes:

Why don't you crusade againts nVidia's PR?

This new GS rendering processor is the ultimate incarnation of this concept – delivering unrivaled graphics performance and capability. The rendering function was enhanced to generate image data that supports NTSC/PAL television, High Definition Digital TV and VESA output standards. The quality of the resulting screen image is comparable to movie-quality 3D graphics in real time

Ultimate rendering processor? RIGHT!
HDTV and VESA standards? RIGHT!
Movie quality 3D in real time? RIIIIIIIIGHT!

Again, your twisting their words to suit your biased needs:

(a) It doesn't say "ultimate rendering processor". It says "ultimate incarnation of this concept" - they're probably referring to the massive 48 GB/sec of onboard sustainable bandwidth. Way to spin...
(b) Last time I checked there were PS2 games in HDTV/Progressive resolutions with a growing library.
(c) Games like SH3 or Doom3 are approaching 2000 level entry graphics. Of course they're going to uptalk their product. Just like nVidia does here:

GeForce FX: Cinematic Computing for Every User
Are you ready for gaming nirvana: The thrill of cinematic-quality graphics at blazingly fast frame rates? Well the NVIDIA® GeForce FX GPUs makes it possible. Powered by pure adrenaline, the GeForce FX GPUs deliver unheard of 3D graphics performance for all of your current games and apps—with unmatched technologies that trigger the next generation of explosively real game effects.
http://nvidia.com/view.asp?PAGE=entertainment

Chappers - how about you crusade against nVidia and their "false advertising" and "Hype"?!?

In the past, this level of real-time performance was only achieved when using very expensive, high performance, dedicated graphics workstations. However, with the design of the new Graphics Synthesizer, this high quality image is now available for in-home computer entertainment applications. This will help accelerate the convergence of movies, music and computer technology into a new form of digital entertainment

Death to dedicated graphics workstations!!!!!!!!! Oh wait..., fook it, the convergence is the now

What part of this is wrong? The GS is faster in raw specs than basically any workstation of its time; how is this any different than how Microsoft claimed 4.0 GPixel/sec for the XBox? So much hype... oh wait.

"We can create digital content that is like a movie but is really a game," Kutaragi says. "It allows us not only to make the images look beautiful but also to imbue the objects of our world with the behaviors of the real world or any other version of reality you choose." He looks up and deadpans, "I think people will be impressed."

MAMACARAMBA! Im so impressed with that movie, oh wait! damn! its a PS2 game! It is soooooooooo alive it is rEAl!!!!! Damn impressed im!

Whoa... alright. But, to answer your non-existent remark: games like MGS2: SoL and The Getaway are very cinematic and do push this convergent field forward. As have Max Payne, et al.

Sony dubbed this "Emotion Synthesis" - making game characters and environments behave just as they would in the real world. According to the company, this reaches beyond the capacity of current state-of-the-art workstations and approaches the power seen in large-scale super computers used for scientific simulations.

Supercomputer? Few hundred PS2 to Iraq? I see it now!!! TEh WMD is TEH PlAySATAN2!!!! No wonder inspectors failed to find them, what better way to hide them but to disguise them as game consoleS!! SCE is TERROOORISTO!

Wow again, how can I argue with this form of brilliance. Would you like a recommendation for MENSA?

"They can fly for a billion years and they will never repeat the pattern," Harrison says. "The flight isn't random. It's biological behavior. The birds are maintaining spatial separation. They are maintaining rules of follow the leader. If they get lost, they maintain the rule of what happens for them to be able to rejoin the flock. They flap their wings when they need to gain height and glide to slowly descend. None of this was precomputed by an artist."

I wonder which game does that.....

Ask Kristof, who's created some demos on procedural texture generation, about it if you're curious. Again, how is this hype if a demo uses procedurally generated terrain and has some intelligent AI? Is it hype because nobody else has elected to utilize it?

Sony's press material released today sums it up nicely: "Imagine walking into the screen and experiencing a movie in real-time... this is the world we are about to enter."

Sums it up nicely it sure does, it sure does

Ok, again... how is this hype? Because it isn't up to your expectations? How is playing GTA not like being in a movie, with its open-ended gameplay and story? Or ICO or MGS2 or The Getaway? Want me to go on?
 
Now extend this to the Playstation 3. How much RAM do you think it will have total? If they intend to do anti-aliasing, anisotropic filtering, have extremely high polygon meshes, advanced AI, extensive voice and sound, etc. I can't imagine they would release the system with less than 512 MB of RAM. Yet here they are making explicit reference to workstations with 2 PEs that would ideally only have 64 MB of DRAM total. That seems pretty weak to me.

First, in the latest PR released they mentioned they are already eyeing the possibility of switching, as soon as it is ready, to a 45 nm manufacturing process ( from 65 nm ): they expect the processors to be BIG... and in the long term it is cheaper to have the e-DRAM on the chip rather than keeping, for a while, a 1,024-bit off-chip bus running at a high frequency...

Answering your question: what about 32-64 MB of e-DRAM ( for the Broadband Engine chip and the Visualizer chip in FIG. 6 ) + 256 MB of Yellowstone DRAM ( external RAM in FIG. 6 ) ? ;)
 