Will the PS3 be able to decode H.264/AVC at 40 Mbps?

Will the PS3 decode H.264/AVC at 40 Mbps?

  • Yes, the PS3 will decode H.264/AVC at 40 Mbps

    Votes: 86 86.9%
  • No, the PS3 won't decode H.264/AVC at 40 Mbps

    Votes: 13 13.1%

  • Total voters
    99
Unfortunately you won't be High Profile compliant (and hence won't be BD compliant) if you don't implement CABAC, so skipping it is not an option.

I wasn't trying to suggest it as an alternative, only give it as an example of entropy decoding.
 
The design didn't have 1080p resolutions as a primary focus. Sony knew that with MPEG-4 AVC they were going to encounter serious obstacles.

As fate would have it, Microsoft was thinking big, as in extreme, Digital Cinema big. They ditched their previous work and started building a codec from the ground up because they discovered that what worked great at lower resolutions hurt image quality as you cranked up the resolution. So Microsoft was targeting high resolutions and optimizing for them. They stayed away from CABAC because of the heavy processing requirements.
You know their WMV-9 codec must be optimized to be smooth on x86 CPUs in their customers' Windows PCs first and foremost, right?
 
Which does bring me to a question though - are there any CABAC papers available that I could use to educate myself about why this algorithm should suck so badly on Cell? I've googled a bit and come up empty-handed so far.

I couldn't find any that are free; you need to have access to them.

However, there is

http://www.bilsen.com/aic/

which uses CABAC and has a page on arithmetic coding and CABAC.

Or alternatively, you can just look at the H.264/AVC Reference Software.

http://iphome.hhi.de/suehring/tml/download/
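If it helps in the meantime, here's a rough C sketch of what a CABAC-style adaptive binary arithmetic decoder looks like. This is emphatically NOT the H.264 reference code: the standard drives the probability update through 64-entry state/transition tables, and this sketch swaps in a crude scaled-probability update just to keep it short, so treat it purely as an illustration of the control flow.

```c
/* Rough sketch of a CABAC-style adaptive binary arithmetic decoder.
 * NOT the H.264 reference code: real CABAC uses the standard's 64-entry
 * probability state/transition tables; this uses a crude scaled
 * probability just to show the shape of the algorithm. */
#include <stdint.h>
#include <stdio.h>
#include <stddef.h>

typedef struct {
    const uint8_t *buf;     /* coded bitstream                              */
    size_t         len;     /* length in bytes                              */
    size_t         bitpos;  /* next bit to consume                          */
    uint32_t       range;   /* current interval width                       */
    uint32_t       value;   /* code point offset inside the interval        */
} Decoder;

typedef struct {
    uint32_t p_lps;         /* estimated LPS probability, scaled to 16 bits */
    int      mps;           /* current most probable symbol (0 or 1)        */
} Context;

static int read_bit(Decoder *d)
{
    if (d->bitpos >= d->len * 8)                      /* pad past the end   */
        return 0;
    int bit = (d->buf[d->bitpos >> 3] >> (7 - (d->bitpos & 7))) & 1;
    d->bitpos++;
    return bit;
}

static void decoder_init(Decoder *d, const uint8_t *buf, size_t len)
{
    d->buf = buf; d->len = len; d->bitpos = 0;
    d->range = 1u << 16;
    d->value = 0;
    for (int i = 0; i < 16; i++)                      /* prime the offset   */
        d->value = (d->value << 1) | (uint32_t)read_bit(d);
}

/* Decode one bin.  Note the serial dependency: each call consumes the
 * range/value state left behind by the previous call, so the bins of one
 * slice cannot simply be split across processors. */
static int decode_bin(Decoder *d, Context *c)
{
    uint32_t r_lps = (uint32_t)(((uint64_t)d->range * c->p_lps) >> 16);
    if (r_lps == 0) r_lps = 1;

    int bin;
    d->range -= r_lps;
    if (d->value >= d->range) {                       /* LPS path           */
        bin       = !c->mps;
        d->value -= d->range;
        d->range  = r_lps;
        c->p_lps += (0xFFFFu - c->p_lps) >> 5;        /* adapt LPS up       */
        if (c->p_lps > 0x8000u) {                     /* swap MPS/LPS       */
            c->mps   = !c->mps;
            c->p_lps = 0xFFFFu - c->p_lps;
        }
    } else {                                          /* MPS path           */
        bin       = c->mps;
        c->p_lps -= c->p_lps >> 5;                    /* adapt LPS down     */
    }

    while (d->range < 0x8000u) {                      /* renormalise        */
        d->range <<= 1;
        d->value  = (d->value << 1) | (uint32_t)read_bit(d);
    }
    return bin;
}

int main(void)
{
    /* Arbitrary bytes: without a matching encoder the decoded bins are
     * meaningless; the point is only the control flow. */
    const uint8_t stream[] = { 0x5A, 0xC3, 0x91, 0x7E, 0x24, 0xB0 };
    Decoder d;
    Context c = { 0x4000u, 0 };                       /* LPS prob ~ 0.25    */

    decoder_init(&d, stream, sizeof stream);
    for (int i = 0; i < 16; i++)
        printf("%d", decode_bin(&d, &c));
    printf("\n");
    return 0;
}
```

The thing to notice is that every decode_bin() call needs the range/value state left behind by the previous call, which is the bin-to-bin serial dependency everyone keeps pointing at.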
 
Brimstone,

I hadn't thought of it from a PS3 playback perspective, although that does make a lot of sense. Initially, BR was strictly MPEG-2 because low bit rates wouldn't be an issue with BD50 media.

Maybe one day, when it's all said and done, someone will write a book with all the back-room stuff that happened between these formats. I'd certainly buy it :) Much more interesting than the ATI/Nvidia stuff!
 
You know their WMV-9 codec must be optimized to be smooth on x86 CPUs in their customers' Windows PCs first and foremost, right?


I'm not sure what you're getting at.



For Sony and the CELL, VC-1 is an incredible gift. They have the flexibility of choosing what best fits their needs.

There isn't an ultimate codec; they are optimized for different things. VC-1 trounced MPEG-2 and H.264 AVC for 1080p playback on Blu-Ray/HD-DVD, but if the shootout were for the best codec for streaming video on a cellphone, H.264 might win. While H.264 AVC is more processor intensive, the resolution of a cellphone is so low it doesn't have much of an impact.

I wonder if the reason Kutaragi had two additional SPEs (one deactivated) added to CELL was to make sure H.264 AVC could be decoded by the PS3. Sony didn't control the fate of MPEG-4. In fact, it may have influenced Sony to go with a more aggressive blue laser design to make sure they could achieve high bit rates with MPEG-2. In essence, MPEG-2 with a blue laser providing high bit rates might have been CELL's safety net.

Ironically, Microsoft may have built the perfect codec to fit the needs of CELL.
 
So err, Nvidia is launching its 7950GT in mid-September, and let's just say it seems to have more than a little in common with RSX...
[attached image]

Its timing to market would be consistent with the ramp-up that RSX would currently be going through. It might also explain why RSX is still under NDA, given the tech it would share with the 7950GT.

Another interesting note is the details on its H.264 acceleration:
[attached image]

Given that current Nvidia chips are allegedly inadequate above 20 Mbps, perhaps this features a modified PureVideo HD implementation in silicon, one which was necessary for RSX to meet Blu-ray's specifications.

Just a theory anyway; if not RSX, then at least a variant of RSX's development maybe? They would have to change the bus interface for commercial use, I imagine, unless it can be left unused and altered in the BIOS or something. Either way, the main point I was getting at is that Nvidia's PureVideo HD doesn't seem to be the same as it was in its previous cards if they're claiming 'best in class'. So it's a reasonable leap of faith to assume that we shouldn't necessarily be looking at current commercial Nvidia chips for H.264 performance.
 
I'm not sure what you're getting at.
It's just causality and priority. You suggest they didn't adopt CABAC for big screens. I suggest they didn't because most home PCs can't handle it, as evidenced by today's situation where playing back H.264 clips without frame drops requires a relatively good PC spec.

Outside of the PC world, for cell phones and digital TV broadcast, decoding work is done by dedicated ASICs, not by x86 CPUs. For those embedded devices CABAC is not a big problem.

EDIT: One more factual correction,
Brimstone said:
MPEG-4 AVC is fine for lower-resolution devices like cell phones, where the screen resolution isn't that large and the bandwidth is going to be limited. So it made sense to go with a more extreme compression algorithm.
This is not true. The Baseline Profile for mobile applications doesn't use CABAC. Only the Main Profile (for SDTV broadcast and recorders) and the High Profile (for HDTV and HD DVD/Blu-ray) support CABAC.
 
Hmm... reading through V3's links, I still remain unconvinced that CABAC will perform badly on a Cell architecture. If CABAC is based on a form of context-sensitive Huffman encoding (64 contexts?), then it should be parallelizable even if the logical data looks like an unbalanced tree and there are sequential dependencies somewhere in the original/straightforward algo.

Found a parallel Huffman implementation here:
http://www.cs.duke.edu/~jsv/Papers/HoV95.pdcfull.pdf
There should be other parallel algos out there too.

It seems that you should be able to set 4-6 SPEs running in parallel on one slice using the above parallel algorithm. If it can be parallelized this way, we should be able to assign each SPE to its own slice too. An SPE can interleave its operation on multiple pixels (i.e., work on another pixel while waiting for the memory result for the first). So I still don't see why it would run badly on Cell, granted I haven't researched further.

In the worst case, one can still use the PPE for the entropy encoding/decoding stage (with the SPEs fully deployed for the other stages). Cell is a 3.2 GHz processor after all (albeit in-order and lacking automatic branch prediction).

Finally, we still don't know whether a 40 Mbps stream demands a higher or lower CABAC load compared to a lower-bitrate stream. Usually a lower bitrate would require a heavier decoding load, right?
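To make the slice-per-SPE idea a bit more concrete, here's a rough sketch of how the work could be farmed out. It's plain C with pthreads standing in for SPE threads (real SPE code would go through the libspe toolchain), and the decode_slice() function below is just a hypothetical stub, not actual decoder code.

```c
/* Sketch of slice-level parallelism: bins inside one slice must be decoded
 * serially, but separate slices carry independent CABAC state, so each slice
 * can go to its own worker (an SPE on Cell; plain pthreads here).
 * decode_slice() is a stand-in stub, not a real H.264 slice decoder.
 * Compile with: cc -pthread slices_sketch.c */
#include <pthread.h>
#include <stdint.h>
#include <stdio.h>
#include <stddef.h>

#define NUM_WORKERS 5          /* e.g. 4-6 SPEs reserved for entropy decoding */

typedef struct {
    const uint8_t *data;       /* this slice's portion of the bitstream       */
    size_t         len;
    long           bins;       /* result: bins decoded (stub: counts bits)    */
} Slice;

/* Stand-in for a real per-slice CABAC decode; each slice would start from
 * freshly initialised range/value/context state, which is what makes the
 * slices independent of one another. */
static void decode_slice(Slice *s)
{
    long n = 0;
    for (size_t i = 0; i < s->len; i++)
        n += 8;                /* pretend: one bin per coded bit              */
    s->bins = n;
}

static void *worker(void *arg)
{
    decode_slice((Slice *)arg);
    return NULL;
}

int main(void)
{
    static const uint8_t stream[4096];               /* dummy coded picture   */
    Slice     slices[NUM_WORKERS];
    pthread_t tid[NUM_WORKERS];

    /* carve the picture into slices; a real stream tells you the boundaries
     * (remainder bytes ignored here for brevity) */
    size_t chunk = sizeof stream / NUM_WORKERS;
    for (int i = 0; i < NUM_WORKERS; i++) {
        slices[i].data = stream + (size_t)i * chunk;
        slices[i].len  = chunk;
        pthread_create(&tid[i], NULL, worker, &slices[i]);
    }
    for (int i = 0; i < NUM_WORKERS; i++) {
        pthread_join(tid[i], NULL);
        printf("slice %d: %ld bins\n", i, slices[i].bins);
    }
    return 0;
}
```

Of course this only buys anything if the encoder actually cut each picture into several slices, since the CABAC state resets per slice; a single-slice picture would still have to be entropy-decoded serially.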
 
It's just causality and priority. You suggest they didn't adopt CABAC for big screens. I suggest they didn't because most home PCs can't handle it, as evidenced by today's situation where playing back H.264 clips without frame drops requires a relatively good PC spec.

Outside of the PC world, for cell phones and digital TV broadcast, decoding work is done by dedicated ASICs, not by x86 CPUs. For those embedded devices CABAC is not a big problem.

EDIT: One more factual correction,
This is not true. The Baseline Profile for mobile applications doesn't use CABAC. Only the Main Profile (for SDTV broadcast and recorders) and the High Profile (for HDTV and HD DVD/Blu-ray) support CABAC.

H.264 AVC was given opportunities in the codec shootouts, and it placed THIRD! That's with them encoding the material and presenting it themselves.

If CABAC provided an advantage, shouldn't H.264 AVC have won the codec shootouts? Regardless of x86, the Microsoft codec still won.

Microsoft found solutions better than CABAC for what they were trying to achieve. The hardware demands had no influence during the codec shootouts.
 
If what RobertR1 says is accurate, AVC may take some time to catch up in optimization. A rushed proposal may suffer from a sub-par implementation. It could be that VC-1 currently outperforms most (or all) AVC High Profile implementations. We should be able to assess their relative merits better once the implementations mature.

For that matter, I do not know how well the PS3 VC-1 decoder performs w.r.t. its AVC implementation. Hopefully Sony has enough time to iron out the kinks (keeping my fingers crossed).
 
H.264 AVC was given opportunities in the codec shootouts, and it placed THIRD! That's with them encoding the material and presenting it themselves.
And that AVC was NOT the AVC High Profile used in BD-ROM, as RobertR1 explained. Besides, I don't think CABAC contributes to image quality; it contributes to higher compression at a given bitrate.
 
Or alternatively, you can just look at the H.264/AVC Reference Software.
...you cruel man :)
And that AVC was NOT the AVC High Profile used in BD-ROM, as RobertR1 explained. Besides, I don't think CABAC contributes to image quality; it contributes to higher compression at a given bitrate.
Indirectly it does since it means, for a given bit rate, you can have more IDCT residuals/motion vectors.

Outside of the PC world, for cell phones and digital TV broadcast, decoding work is done by dedicated ASICs, not by x86 CPUs. For those embedded devices CABAC is not a big problem.
Actually, I heard a rumour that some HD settop boxes were using x86s just to handle this - but take that with a grain of the NaCl <shrug>
 
I think we all need to do a better job of getting our AVCs right when discussing this :) I'll personally make a mental note not to get lazy and to use the proper acronym when discussing either version.

BR spec and post-DVD Forum codec shootout = AVC HP
Before that, and still existing alongside but not part of the BR spec = AVC

We're also forgetting that the PS3 will need to fully support BD-J/BD-Live functionality, which will need to be accessible while playing the movie and also process sound.
 
...you cruel man :)

Indirectly it does since it means, for a given bit rate, you can have more IDCT residuals/motion vectors.


Actually, I heard a rumour that some HD settop boxes were using x86s just to handle this - but take that with a grain of the NaCl <shrug>


Which ones? I'll happily ask on avsforum. From all I can gather, most use either the SoC or Broadcom hardware decoders.
 
Usually a lower bitrate would require a heavier decoding load, right?

Tends to be the opposite for lossy video codecs.

The lower the bitrate, the more stuff is thrown away by the encoder, so the less you have left to decode on the receiving end.

Depending on the codec design, more CPU-intensive techniques sometimes kick in at lower bitrates, but typically, for a given profile and resolution, more bitrate = more processing, not less.
 
Sorry, my bad. I was referring specifically to CABAC (which is more about compression than general decoding).

In the 40 Mbps vs. 15 Mbps case, the former should require less compression/decompression? So does CABAC still kick in? If so, is it more or less intensive than in the 15 Mbps case?
 
Actually, I heard a rumour that some HD settop boxes were using x86s just to handle this - but take that with a grain of the NaCl <shrug>
If it's true Cell has a chance too :smile:
We're also forgetting that the PS3 will need to fully support BD-J/BD-Live functionality, which will need to be accessible while playing the movie and also process sound.
That's what PPE is for.
 
Which ones? I'll happily ask on avsforum. From all I can gather, most use either the SoC or Broadcom hardware decoders.

As far as I know, the main CPUs in both the Toshiba HD DVD player (2.4 GHz P4) and the Samsung BD player just run the interactivity and the player's OS. The heavy lifting of the video decoding is all done by the Broadcom decoder chip.
 
We're also forgetting that the PS3 will need to fully support BD-J/BD-Live functionality, which will need to be accessible while playing the movie and also process sound.

Is sound processed by BD-J?
I thought BD-J/BD-Live only handles the interactivity and the network access? So for video decoding, BD-J/BD-Live is not really used unless there is interactive content.
 