KK slips CELL performance at decoding HD streams.

pahcman said:
Cell is not a bad chip. The problem is that people here have been fed too much of Sony's magical abilities over the years.

:rolleyes:

This isn't below most people's expectations. In fact, after the Tosh demo, most people thought it could only decode 6 or 7 HD streams in realtime, so 12 is positively surprising. And people suspected the clock speed might have been very high for that Tosh demo too, but that evidently wasn't the case.
 
Titanio said:
This isn't below most people's expectations. In fact, after the Tosh demo, most people thought it could only decode 6 or 7 HD streams in realtime, so 12 is positively surprising. And people suspected the clock speed might have been very high for that Tosh demo too, but that evidently wasn't the case.

Actually, if you go back and look in that thread, some people were saying under 4GHz and about 10 HD streams.
 
PC-Engine said:
Actually, if you go back and look in that thread, some people were saying under 4GHz and about 10 HD streams.

Maybe I left that thread early, but I just recalled 1 stream per SPE, so 6/7. But 12 @ 3.2GHz is still a lot better than 10 @ 4GHz ;)
 
Titanio said:
Maybe I left that thread early, but I just recalled 1 stream per SPE, so 6/7. But 12 @ 3.2GHz is still a lot better than 10 @ 4GHz ;)

People were saying below 4GHz, not @ 4GHz. ;)
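
For perspective, the arithmetic in this exchange is easy to check. A rough sketch (it assumes the demo used all 8 SPEs, which the thread doesn't confirm):

[code]
# Back-of-the-envelope check on the streams-per-SPE figures above.
# Assumption (not confirmed in the thread): the demo used all 8 SPEs.
spes = 8
streams_demo = 12             # streams decoded in realtime @ 3.2 GHz
print(streams_demo / spes)    # 1.5 streams per SPE, vs the "1 per SPE" guess

# Per-clock comparison: 12 streams @ 3.2 GHz vs the rumoured 10 @ 4 GHz.
print(12 / 3.2)   # 3.75 streams per GHz
print(10 / 4.0)   # 2.5 streams per GHz -> the demo is ~1.5x better per clock
[/code]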
 
P4 @ 3GHz can decode 3 *HD* MPEG-2 streams? Maybe by offloading some of the work to a GPU.
Or all of it. Almost any modern video card these days has hardware MPEG-2 decode acceleration. You just have to pay extra if you want hardware *encoding* as well. Most people forget that detail when speaking of their 266 MHz machines decoding DVD video in realtime.
 
Yes, primarily iDCT and motion compensation acceleration. I tried to find some benchmarks to verify PC-Engine's claims, but sadly, most of the MPEG-2 decode benchmarks I found comparing P4s and AMDs had Radeon 9700s or better, or GeForce 6s, so it was impossible to factor out the offloading of much of the decode pipeline (iDCT, mo-comp, deblocking, etc.). Practically all MPEG-2 players now support hardware iDCT/mo-comp, so it's hard to find a "pure" codec benchmark any longer. Maybe PC-Engine can provide data for his claims.

The reason I'm skeptical is that I have an old 2.4GHz Athlon system sitting around, and playing back 720p MPEG-2 streams severely taxes it.
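
To give a feel for why offloading just iDCT matters, here's a rough worst-case sketch for 720p (assumed numbers: 4:2:0 chroma, every block coded, a naive row-column 8x8 IDCT; real decoders use much faster factorizations):

[code]
# Rough software iDCT cost for a 720p 4:2:0 MPEG-2 stream (worst case).
width, height, fps = 1280, 720, 30
macroblocks = (width // 16) * (height // 16)  # 3600 macroblocks per frame
blocks = macroblocks * 6                      # 4 luma + 2 chroma 8x8 blocks each
ops_per_block = 16 * 64   # row-column: 16 1-D 8-point passes, ~64 mul-adds each
ops_per_sec = blocks * ops_per_block * fps
print(f"~{ops_per_sec / 1e9:.1f} billion multiply-adds/s for iDCT alone")  # ~0.7
[/code]

And that's before mo-comp and deblocking, which is why GPU assist skews any CPU comparison.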
 
Given the fact that there is no medium with enough bandwidth to feed 10 HD streams, I declare this thread absolutely worthless :mrgreen:
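
For the record, the aggregate bandwidth is simple to estimate (a sketch; the ~20 Mbps per-stream figure is an assumed rate for broadcast-grade HD MPEG-2):

[code]
# Aggregate bandwidth for N simultaneous HD MPEG-2 streams.
mbps_per_stream = 20   # assumed broadcast-grade HD MPEG-2 rate
for n in (10, 12):
    total = n * mbps_per_stream
    print(f"{n} streams: {total} Mbps (~{total / 8:.0f} MB/s)")
# 10 streams: 200 Mbps (~25 MB/s); 12 streams: 240 Mbps (~30 MB/s)
[/code]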


Well, can I ask you something? What about MPEG-4 and more advanced codecs? A single HD WMV9 stream needs a Pentium 4 3GHz+, 512MB RAM, and a good ATI/NVIDIA GPU on board. A single H.264 1080p stream doesn't play well at all (in fact it runs at a crappy 10fps...) on my iMac G5 1.8GHz... I want to know about these codecs, not the old MPEG-2.


P.S.: The other fact is that all PCs come with hardware-accelerated MPEG-2 decoding... Think about it.
 
ShootMyMonkey and DemoCoder both have an excellent point. Modern GPUs accelerate some portions of the MPEG-2 decode pipeline, so processing HD-level MPEG-2 streams is not a good CPU benchmark. Conversely, high-bitrate MPEG-4 AVC (H.264) streams bring today's fastest CPUs to a crawl since there is no GPU assist yet. In the lab I've seen actual Blu-ray disc MPEG-2 and MPEG-4 AVC streams being decoded, and you can clearly see & measure the difference.
 