What is this SCEI patent?

...

Seems like an audio DSP ALU to me.

What this thing does is eliminate the need to shuttle operands back and forth between registers and the ALU; instead it performs the bit operations directly on the incoming/outgoing data stream as it passes through.
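
Purely as my own reading of the patent, here's a toy C sketch of the difference between the conventional load/operate/store round trip and doing the bit manipulation "in transit" on the stream. The function names and the shift/mask are made up for illustration; real hardware would do this inside the datapath, not in software.

```c
#include <stdint.h>
#include <stddef.h>

/* Conventional path: pull each word into a register, run it through the
   ALU, and write the result back out -- three distinct steps per word. */
static void filter_via_registers(const uint32_t *in, uint32_t *out, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        uint32_t word = in[i];            /* load into a register */
        word = (word >> 4) & 0xFFFFu;     /* ALU bit operation    */
        out[i] = word;                    /* store the result     */
    }
}

/* What the patent seems (to me) to describe: the same bit manipulation
   applied to the stream as it flows in or out, with no separate register
   round trip. Software can only mimic the idea. */
static void filter_in_stream(const uint32_t *in, uint32_t *out, size_t n)
{
    for (size_t i = 0; i < n; i++)
        out[i] = (in[i] >> 4) & 0xFFFFu;  /* operate "in transit" */
}

int main(void)
{
    uint32_t src[4] = { 0x12345678u, 0xAABBCCDDu, 0x0F0F0F0Fu, 0xFFFFFFFFu };
    uint32_t dst[4];
    filter_via_registers(src, dst, 4);
    filter_in_stream(src, dst, 4);
    return 0;
}
```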

This is the only truly innovative patent application coming out of CELL; the others are marketing hype. (APU cough cough, APU cough cough)
 
The one thing I'm looking forward to the most from their latest patents is that new wearable/non-wearable 'controller' thingie with tactile feedback.

Even that patent about different software rendering methods applied to the same scene at the same time takes a backseat to it :p

PS3 - let me touch you!
 
Paul, Thanks.
That is the most complete version of the story that I've read.


Marconelly,
Don't get your impressions from me. I'm very impressed by and curious about this graphics model. (A little concerned about synchronization, but impressed.) The reason I didn't delve into it is because I'm a tech fanboy and people take my flippant comments too seriously. (~Vince.)

I'm not well versed in engineering, but I enjoy exploring new ideas.

One thing I recall (I hope correctly) is that the most recent patent also stated that the Video DRAM was quad-ported, which makes sense to me: one 256-bit bus for each APU.

What didn't make sense to me was the 8x 4MB soft partitions.
Eight divisions from four ports???
A guess is that each bus/port might be serial, up and down.
Uh, that didn't make much sense either.
The other nonsense idea was that each port was split: 128 bits up, 128 bits down?

To allow the read/write dual-ported memory of the PS2, its value was effectively cut in half. So there was a read work area and a write work area, with a small third area of data that was retained for recurring data.

Another guess that I've had since the first patent is that the Cell is a 64-bit native system. 256-bit instructions/words seem to be the working chunks. With the PS2, 128 bits were broken into 32-bit words, so I went with the same divide-by-four idea. Also, for addressing and managing tasks on a network, nothing less than 64-bit processing seemed appropriate.

Deadmeat, in that same thread as my glove post, proposed a working model. However, the nature of these APUs is in no way fixed, and they will function as the programmers decide. So to that end it has been more important to me to understand the working structures....

This is where things got interesting. Their hardware design is the all-singing, all-dancing Shiva of engineering deconstruction. First, a clear guess early on was that graphics would have to be divided, overlaid and/or separated, so Z-merging and tile rendering were expected. Contrary to what some people guessed, tile rendering itself is in fact not patented; various methods are, but not the idea. In fact OpenGL has a niche project that covers this method, called WireGL, and that method is protected by a GNU license. So too is the Bunsan (Japanese for "distributed") operating system they intend to use. I've been reading about various options for a long while now. The only one I've shared is the Hurd microkernel, which I haven't pursued due to a lack of recent activity. BTW, I'm pretty sure MS pulled out of the OpenGL committee because they became aware of Sony's interest in it as their new graphics language.

I'll make a rough start on how I suspect Cell might function in a PS3.

Cell
is a parallel system of multiprocessors
sharing access to a single main memory,
with Non-Uniform Memory Access (NUMA)
and a small amount of non-cached/No Remote Memory Access (NORMA) local memory in each processor.
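
To make that memory model a bit more concrete, here's a minimal conceptual sketch in C of what I mean. This is not any real Cell API; the apu_t type, the 128KB local store size, and memcpy standing in for DMA are all placeholders of my own.

```c
#include <stdint.h>
#include <string.h>
#include <stdio.h>

#define LOCAL_STORE_BYTES (128 * 1024)   /* hypothetical local memory size */

typedef struct {
    uint8_t local_store[LOCAL_STORE_BYTES];  /* private to this APU (NORMA) */
} apu_t;

/* "DMA" a chunk of shared main memory into the APU's local store,
   process it there, then copy the result back out. */
static void apu_process_chunk(apu_t *apu, uint8_t *main_mem,
                              size_t offset, size_t len)
{
    if (len > LOCAL_STORE_BYTES)
        len = LOCAL_STORE_BYTES;

    memcpy(apu->local_store, main_mem + offset, len);   /* DMA in        */

    for (size_t i = 0; i < len; i++)                    /* local compute */
        apu->local_store[i] ^= 0xFF;

    memcpy(main_mem + offset, apu->local_store, len);   /* DMA out       */
}

int main(void)
{
    static uint8_t main_memory[1 << 20];   /* stand-in for the shared XDR pool */
    apu_t apu = {0};

    apu_process_chunk(&apu, main_memory, 0, 4096);
    printf("first byte after processing: 0x%02X\n", main_memory[0]);
    return 0;
}
```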

Distributed Operating System
It will function in a firm real-time environment, allowing for missed data.
APUs are reactive op units, but there will also be embedded ICs.
The processing cores (APUs) can work in isochronous operation, using an absolute clock and NOOp flags to govern activity.
However, many operations must be performed synchronously,
especially operations relating to locally/user-displayed graphics.
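
As a toy illustration of that isochronous, firm real-time idea (entirely my own interpretation; the slot count and the ready flags are invented for the example), each core ticks along against an absolute time base and simply issues a NOOp for any slot whose data didn't arrive in time, rather than stalling the schedule:

```c
#include <stdbool.h>
#include <stdio.h>

#define SLOTS 8

/* work[i] is true if the data for time slot i arrived in time */
static void run_isochronous(const bool work[SLOTS])
{
    for (int slot = 0; slot < SLOTS; slot++) {        /* absolute time base */
        if (work[slot])
            printf("slot %d: execute op\n", slot);    /* do the real work   */
        else
            printf("slot %d: NOOp (data missed)\n", slot);  /* firm RT: skip */
    }
}

int main(void)
{
    bool work[SLOTS] = { true, true, false, true, true, false, true, true };
    run_isochronous(work);
    return 0;
}
```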

So as I see it here
1 VS = PU, DMAC, 4 APU, Pixel Engine, 32MB of Quad-ported Video-DRAM, CRTC
1 PE = PU, DMAC, 8 APU
128 MB of XDR = 2x 256bit/64Mbyte chips
1 HDD
1 Blu-ray drive (I’d rather have Toshiba/NEC’s standard but ah well.)
1 Gigabit LAN port (I’d prefer two ports)
(Sound and I/O processing is unknown to me. But I would prefer an embedded solution.)
3 control ports
2 peripheral ports (USB)

Potentials (as in the imaginary limit of the above configuration):
768 GOps/second (half floating point, half fixed point)
30GB/s XDR bandwidth (25GB/s is what most are expecting)
1024-bit bussed memory access to 32MB of on-chip Video DRAM
@ hopefully 1GHz = 128GB/s of display bandwidth.
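
For what it's worth, the 128GB/s figure falls straight out of the bus width and clock I'm assuming (1024-bit bus, 1GHz, one transfer per cycle - all assumptions, nothing official):

```c
#include <stdio.h>

int main(void)
{
    const double bus_bits       = 1024.0;           /* assumed eDRAM bus width  */
    const double clock_hz       = 1.0e9;            /* hoped-for 1 GHz          */
    const double bytes_per_xfer = bus_bits / 8.0;   /* 128 bytes per transfer   */
    const double gb_per_s       = bytes_per_xfer * clock_hz / 1.0e9;

    printf("%.0f GB/s\n", gb_per_s);                /* prints: 128 GB/s */
    return 0;
}
```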

When I have time to really sit down and elaborate on or make revisions to this, I will. Until that time I look forward to reading other people's ideas.
 
David,

That's a lot of interesting, yet realistic speculation... Some questions for you:

1: OS. Previously it had been pretty much confirmed that Linux would be the one driving PS3; now you're saying it's this Bunsan whatsitscalled? Are you sure about this? If so, did the switch occur because of all the lawsuits currently flying around, you think, or for some other reason, or was Linux never REALLY intended to be the OS for PS3? Or is Bunsan perhaps some Japanese flavor of Linux? ;) Ok, so that was many questions all in one. So sue me. :)

2: 3 controller ports? Why 3? That strikes me as sort of an odd number... (Pun intended.)

3: XDR throughput. I would think Sony would aim at 50+GB/s. Think about it, a 256-bit GDDR3 bus delivers that much already TODAY. Ok, not EXACTLY today because no graphics cards have been announced yet with memory that fast, but you can buy the individual memory devices today from Samsung. 30-ish GB/s in one and a half to two years' time will look slow.

4: Connections. You want two gigabit ethernet ports - why? I don't even have to mention it'll never happen because we know it won't, but a gigabit hub will be cheap when PS3 is out, and the bandwidth in just one cable should be sufficient for any purpose, heh heh. Also, I would not be surprised if FireWire appears again on PS3. Maybe DV camcorders can double as EyeToys or something like that.
 
I'm a bit confused with the three controller ports part myself. Maybe that's a little joke, as Sony kinda refused to give the PS2 four ports and went with two instead. So he suggests they won't budge all the way this time, but will add one port? :)

I think 128MB of RAM is way, way too small of an amount. I cannot imagine PS3 being worth a damn without at least 256MB of RAM, and even that is stretching it.
 
Main RAM - half a gig. No contest there, else they're crazy. :)
One gig... Well, that might well be crazy too.

Quarter gig will be way too small though.
 
Quarter gig will be way too small though.

This sounds about right. Remember, they thought that whatever they had in the PSP originally (what was it? 16MB?) would be 'enough', until devs started hazing HQ.
 
Guden Oden said:
2: 3 controller ports? Why 3? That strikes me as sort of an odd number... (Pun intended.)

I think that the 3 controller ports comment refers to: 1P, 2P, special device (EyeToy, new gloves, etc...)

anyway, I really hope Sony goes the 4-player way!!

edit:typo
 
Guden Oden said:
Main RAM - half a gig. No contest there, else they're crazy. :)
One gig... Well, that might well be crazy too.

Quarter gig will be way too small though.
Remember how crazy 32MB sounded back in 1999?
The kind of RAM they’re planning to use isn’t exactly cheap, you know.
256MB of RDRAM plus all the eDRAM might very well be enough.

If wireless controllers are standard with PS3, there really only needs to be one single port, for the docking bay, which I doubt will be integrated with the console (even if super fast recharging batteries are used: http://science.slashdot.org/article.pl?sid=04/04/06/1443207&mode=thread&tid=126 ).
 
Guden Oden,

1. A Unix/Linux/BSD variant will definitely be the OS. In the only granted Cell patent they refer to a “Bunsan” OS; Bunsan is the Japanese word for distributed. The actual OS they were drawing on for inspiration is the Galaxy OS of 1991. A person from that team is helping them develop the Cell microkernel. I haven’t been able to come to grips with the best version of what it is that they may be seeking. A Hurd OS is the OS I’ve been using as a template. However, Object Oriented tagging, queuing and other approaches to distributed/parallel computing hold potential as being more efficient. In the end, whatever OS they name/create/tweak will run or be compatible with Linux apps. Hurd, for example, supports X-Windows.

2. Three control ports is in fact something of a joke; Marconelly & Vysez combined got the idea. My guess is based purely on the hardware’s needs, or at least my interpretation of them. We absolutely need at least 2 controller ports, and at the very least one more for peripherals like DDR mats, light guns, etc. But as a joke I’m suggesting Sony may settle on a quantity with console numeric “synergy” (PS3 with 3 control ports). Like y’all, I agree four ports are more desirable. Or am I the only one that hates plugging and unplugging controllers every time we play a different game?

3. Sony once stated their goal of having main memory performance greater than that of the PS2's VRAM (48GB/s). Just at the moment I’m letting the hardware make the choice for me, and the (XDR) hardware thus far hasn’t been demonstrated to reach 50GB/sec. But this bandwidth really isn’t too important to me, since a TV’s refresh rate is generally 60fps and not 120fps like some PC gamers go for. I feel 30GB/s may well be good enough while reflecting a real industry manufacturing deadline.

4. Two Gigabit ports because I’m a cheapskate who might just daisy-chain a network instead of using a router/hub. True, it’s not as robust as a hub, but my roomies and I could link with each other and get online without the added expense. However, you see that my judgment is realistic enough to expect just the one needed port.

Yes/No, I would expect the newer FireWire standards also, especially because the i.Link in the PS2 was there for the very purpose you imagined it could serve in a PS3. But I’m also respecting the budget and the minimal needs of the hardware to achieve the set goal. To that end, many new digital cameras are USB 2.0 compatible, and since USB is cheaper than FireWire to implement, Sony will likely just go with only the 2 USB ports, I predict.

With the 128MB of memory I’m just respecting the bare needs of the hardware: 64MB for each Element (I consider the VS and PE to both be elements). However, Panajev may need to be consulted on this. In order to have dual-channel memory performance you may need 2x 64MB XDR for each element, which would equal the 256MB that you, I, and the whole industry might generally prefer. Again, I just based the guess on the needs of a 720p HDTV. (BTW, I still consider it possible for deferred rendering to be implemented by a slick enough software programming team.)
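
Just to spell out the arithmetic I'm doing in my head there (the per-element figure and the channel count are my own assumptions, nothing official):

```c
#include <stdio.h>

int main(void)
{
    const int elements       = 2;    /* VS + PE, as I count them      */
    const int mb_per_chip    = 64;   /* assumed 64MB XDR devices      */
    const int single_channel = 1;
    const int dual_channel   = 2;

    printf("single channel: %d MB\n", elements * single_channel * mb_per_chip); /* 128 MB */
    printf("dual channel:   %d MB\n", elements * dual_channel   * mb_per_chip); /* 256 MB */
    return 0;
}
```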

I have to go to class right now. I’ll be back later with maybe a little more.

David_South
 
David_South#1 said:
A Unix/Linux/BSD variant will definitely be the OS.

This is in line with previous speculation, now the forum doesn't feel lost and confused anymore. Thanks for putting us back on track! :)

A Hurd OS is the OS I’ve been using as a template.

To those of us that - eeheh *coughs* unlike me - maybe aren't fully up to speed on this - what is a Hurd OS anyway?

And the (XDR) hardware thus far hasn’t been demonstrated to reach 50GB/sec.

Luckily, the console isn't launching tomorrow. By the time it does launch, XDR will likely be considerably faster than it is now (well, ISN'T, really, but we have paper specs :)). Considering it uses differential signalling and all that, it should be able to go faster using 256 data pins (half data, half data inverted) than what GDDR3 manages with 256 data pins.

I feel 30 may well be good enough

If you compare the fluidity between 30 and 60, it's a totally different ballgame..

and is a real industry limit

Well, only in NTSC land, and only on the video side. Actually not even on the video side, since it's like 29.97 or something weird like that. ;)

2 because I’m a spin thrift. I’d just Daisy Chain my network.

Interesting idea, but really, I don't expect Sony to tailor their hardware after your needs ;) (as you already pointed out yourself). Anyway, the console would have to act as a router at all times for that to work, and at gigabit speeds that requires a handsome amount of processing power we might not want to dedicate during gameplay for example.

In order to have dual channel memory performance you may need 2x 64MB XDRs for each element.

Each chip is a discrete entity and, I think, 16 bits wide like with RDRAM, so you'd basically get multiple channels "for free"... Well, more pins would be needed of course.
 
notAFanB said:
then why use XDR (besides the braggin rights?)

Initially I thought it would be because of costs, since IIRC XDR takes fewer layers for the PCB?

But then IIRC you need a special memory controller chip which adds to the overall cost?

I don't know if what I said is correct or not regarding board costs etc.

However I do know for a fact that Nintendo chose RDRAM for the N64 because of costs. It needed fewer PCB layers. Too bad they didn't consider latency. Don't know what the situation is with XDR though.
 
notAFanB said:
then why use XDR (besides the braggin rights?)

Because the differential setup is more fault-tolerant and can handle higher clock speeds, and it requires fewer pins overall (GDDR has lots of extra signals like any DRAM-based interface; RAS, CAS etc. pins that run at odd frequencies compared to the main data bus, and so on).

XDR is a 'cleaner' approach overall from the hardware point of view like DRDRAM was too, which is probably why Sony likes it.
 
XDR is a 'cleaner' approach overall from the hardware point of view like DRDRAM was too, which is probably why Sony likes it.

While that's all well and good for the PC space, for a fixed console I cannot see the advantages if the performance (of both parts at the time) is near enough identical.

EDIT:

Er, right, the above kinda sounds harsh, but my point is: if XDR scales well, what would be the likely cost reduction over the lifetime of PS3 (including the initial price of purchase)? How does this compare with the projected figures for GDDR3 over the same period?
 