Second Gen Cell info

Wow! This bodes well for FMV cutscenes, though we can see how game costs are set to increase if PS3 is to deal with 48x as much FMV content. :p
 
That's around 3GPix/sec MPEG2 decoder performance with 6SPUs, not too shabby :p
Any clues at what clock speed they ran the chip for this demo?
 
Fafalada said:
Any clues at what clock speed they ran the chip for this demo?
No hint at that.

fxtech said:
Is this Windows Media Player 10? A Cell card for a Windows x86 PC? Mmmm...
Powerpoint + Windows Media Player is a must for a demo! ;)
 
Why 3 gigapixels?

At standard-definition MPEG2 resolution, 480*576 * 48 streams * 30 fps ≈ 0.4 gigapixels/sec, done by 6 SPEs.

Any idea what the performance of a standard x86 CPU would be?
 
Thanks, one. Quite impressive; so each of the 6 SPEs handled 8 streams. I'd assume this is a tech demonstration not only of Cell's power in general but specifically for the HDTV market that Sony and Toshiba have in mind. Instead of flicking through channels, why not display a dozen or so at once and select the one you like? Who needs a TV guide?
 
Fafalada said:
That's around 3GPix/sec MPEG2 decoder performance with 6SPUs, not too shabby :p
Any clues at what clock speed they ran the chip for this demo?

Not by my math; according to the slide those were SD streams, not HD.

I get 480*576*48*30 fps = 398,131,200,

which is ~398 megapixels/sec.

Still impressive though.
 
SDTV resolution is 720x480 (NTSC) and 720x576 (PAL), so this demo would decode closer to half a gigapixel of data. It'd be more interesting to hear about the bitrate, though.
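For reference, the pixel-rate arithmetic from the posts above can be checked with a quick script. The 48-stream and 30 fps figures come from the demo description; which SD resolution the demo actually used is an open question in this thread, so all three candidates are computed:

```python
# Total pixel throughput for 48 simultaneous 30 fps streams at the
# SD resolutions discussed in the thread.
STREAMS = 48
FPS = 30

def pixel_rate(width, height, streams=STREAMS, fps=FPS):
    """Decoded pixels per second summed across all streams."""
    return width * height * streams * fps

resolutions = {
    "480x576 (figure used earlier in the thread)": (480, 576),
    "720x480 (NTSC SDTV)": (720, 480),
    "720x576 (PAL SDTV)": (720, 576),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {pixel_rate(w, h) / 1e6:.0f} Mpix/s")
```

This reproduces the ~398 Mpix/s figure computed above for 480x576, and roughly half a gigapixel for full 720-wide SDTV.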
 
I built a DirectShow filtergraph outputting an SD MPEG2 stream to the Null renderer (to take the video card out of the equation), and on this old 1.4 GHz P4 with a 400 MHz FSB (an engineering sample I got from Intel many years ago), it consumed about 20% of the CPU. That's with all the overhead of DirectShow/Kernel Streaming/etc.

So a modern 3.4 GHz machine could probably decode at least 12 SD MPEG2 streams simultaneously, and a dual core at least 24. Also, I was using an older InterVideo MPEG2 codec, so I don't know if a better one has come out since.

You could probably squeeze out some more by removing OS overhead from the picture.

IMHO, 48 streams is impressive, but not orders of magnitude better than current tech.
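The back-of-envelope estimate in this post can be written out explicitly. The 20% CPU load on a 1.4 GHz P4 is the figure measured above; linear scaling with clock speed and core count is an assumption (it ignores memory bandwidth, codec quality, and OS overhead):

```python
# Rough estimate: if one SD MPEG2 stream takes ~20% of a 1.4 GHz P4,
# how many streams fit on a faster and/or multicore machine?
# Assumes decode throughput scales linearly with clock and cores,
# which is a big simplification.
def max_streams(clock_ghz, cores=1, baseline_ghz=1.4, baseline_load=0.20):
    """Whole streams decodable given the measured baseline load."""
    streams_per_core = (1 / baseline_load) * (clock_ghz / baseline_ghz)
    return int(streams_per_core * cores)

print(max_streams(1.4))           # the measured P4: 5 streams
print(max_streams(3.4))           # modern single core: 12 streams
print(max_streams(3.4, cores=2))  # dual core: 24 streams
```

These match the "at least 12" and "at least 24" figures quoted in the post.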
 
Who the hell deleted my post????? I'll post it again.

The bitrate was probably very low, since the streams had to be shrunk down to thumbnails anyway. Why use high-quality streams when they'll just be scaled down?

Also, SDTV has two resolutions, 640x480 and 704x480, and I'm betting the demo used the former, lower resolution for the same reasons I mentioned above.

http://www.audiovideo101.com/dictionary/sdtv.asp
 
PC-Engine said:
The bitrate was probably very low, since the streams had to be shrunk down to thumbnails anyway. Why use high-quality streams when they'll just be scaled down?
It's a demo, not an application; who cares about that?
The only truth is you'd do anything to downplay Cell, lol :)
That's why your post was deleted, no doubt about it :)
You know, it's the same old story: we're still looking at those beautiful prerendered Madden 2006 screenshots you swore were realtime :LOL:
 
nAo said:
PC-Engine said:
The bitrate was probably very low, since the streams had to be shrunk down to thumbnails anyway. Why use high-quality streams when they'll just be scaled down?
It's a demo, not an application; who cares about that?
The only truth is you'd do anything to downplay Cell, lol :)
That's why your post was deleted, no doubt about it :)
You know, it's the same old story: we're still looking at those beautiful prerendered Madden 2006 screenshots you swore were realtime :LOL:

Some were asking about bitrate, so I gave possible reasons, which are actually reasonable compared to what's been suggested with HD streams.
:LOL:

BTW, I didn't say the Madden screenshots were realtime. I said they could easily be realtime with blur. ;)

BTW wouldn't it be pretty funny if the actual realtime cutscenes looked better than that pic? Of course only PS3 would be able to accomplish that since CELL can stream MPEG2 clips very well. :LOL: ;)
 
Well, from the original post I inferred the decoding was done at native resolution by the 6 SPEs, with the 7th downsampling and tiling. Even if it wasn't done that way, it's still an impressive technical exercise. It'd be interesting to know whether it was done with the original concept prototype shown physically at ISSCC or the refined version, which can run an additional concurrent thread on the PPE.
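The downsample-and-tile step described here amounts to placing 48 thumbnails on one screen. As a sketch of the bookkeeping involved (the 8x6 grid and 1280x720 canvas are my assumptions for illustration, not stated anywhere in the demo description):

```python
# Tile 48 thumbnail streams into an 8x6 grid on a single canvas.
# Grid shape and canvas size are assumed for illustration only.
GRID_COLS, GRID_ROWS = 8, 6
CANVAS_W, CANVAS_H = 1280, 720

def tile_rect(stream_index):
    """Top-left corner and size of the tile for stream 0..47."""
    tile_w = CANVAS_W // GRID_COLS   # 160
    tile_h = CANVAS_H // GRID_ROWS   # 120
    col = stream_index % GRID_COLS
    row = stream_index // GRID_COLS
    return (col * tile_w, row * tile_h, tile_w, tile_h)

print(tile_rect(0))    # stream 0 at the top-left: (0, 0, 160, 120)
print(tile_rect(47))   # stream 47 at the bottom-right: (1120, 600, 160, 120)
```

Each decoded frame would be scaled to the tile size and blitted at its rectangle; the interesting part on Cell would be the 7th SPE doing that scaling for all 48 streams.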
 
Tacitblue said:
Well, from the original post I inferred the decoding was done at native resolution by the 6 SPEs, with the 7th downsampling and tiling. Even if it wasn't done that way, it's still an impressive technical exercise. It'd be interesting to know whether it was done with the original concept prototype shown physically at ISSCC or the refined version, which can run an additional concurrent thread on the PPE.

Well, it's pretty impressive, but not beyond normal technology scaling. You can design DSPs or ASICs to do the same thing, even running at sub-GHz clock speeds.
 
PC-Engine said:
BTW wouldn't it be pretty funny if the actual realtime cutscenes looked better than that pic? Of course only PS3 would be able to accomplish that since CELL can stream MPEG2 clips very well. :LOL: ;)

Eh? Since when do "realtime cutscenes" require MPEG2 decoding?
 
Yeah, there's a lot of specialized DSP hardware out there. Unusual items like the Lenslet EnLight256, or Cell more conventionally, really make me wonder how much a large company like Intel thinks outside the box it's made for itself. Keeping such a legacy architecture going successfully has to take a lot of transistors, a lot of marketing, and basically overdependence by the enormous userbase. It's probably costlier at this point to change to a fully new computing architecture than to keep developing bloatware and chips like the Pentium D or the Athlon X2. Unless something of truly broad appeal comes along that can more or less seamlessly transition commercial and private users to a new paradigm x86 simply can't compete with on current business models, it's gonna be the same old same old.

Which is a shame really, as most things that have tried to compete with x86 have been assimilated with almost Borg-like remorselessness. RISC pushed Intel into the whole P4 NetBurst architecture, with its overly deep pipelines and micro-ops, which they're now trying to backtrack out of somewhat as they ran into trouble. Now multicore is the big thing: first symmetric multicore, and in an Intel slide I saw some time back, fat+lean core "helpers" much like Cell, with 1 or 2 OoO main cores and a dozen or more microcores around them.

Even with process technology advancement and its nearly overwhelming success, how long will it be before x86 runs out of ideas, internally or from external competitors? It might have to be something as radical as the change from vacuum tubes to transistors for all I know; I'm just a layman.
 