Will the PS3 be able to decode H.264/AVC at 40 Mbps?

Will the PS3 decode H.264/AVC at 40 Mbps?

  • Yes, the PS3 will decode H.264/AVC at 40 Mbps

    Votes: 86 86.9%
  • No, the PS3 won't decode H.264/AVC at 40 Mbps

    Votes: 13 13.1%

  • Total voters
    99
And the reference H.264 encoder is not exactly state of the art. It's pointless comparing formats without specifying which encoder was used. I don't see that answered yet.

Like I said above, the codec comparison you posted has nothing to do with HD DVD or BD, and so we cannot draw any conclusions about the performance of either codec when used for HD DVD/BD based on that data.
 
From what I gather:

- CABAC is extremely difficult to parallelize. You basically process from the first symbol to the last in sequential order, and the algorithm depends on them being processed in order. So splitting it across multiple SPEs is out.
...well, you can parallelise at slice boundaries but as a slice is up to 1 frame in size that doesn't really help...:D :devilish:

- CABAC uses a lot of data dependent branching. SPEs aren't that good at branching.
Indeed. There can be lots of branches per bit of decompressed data in a software implementation.

I don't have the details, just some more questions. Is CABAC the only stage in H.264 AVC decoding?
No. It is one of the early steps, i.e. entropy decoding. H.264 has an alternative (less effective but simpler) scheme called CAVLC.
You mean random memory accesses?
Yes. CABAC maintains a big table of probabilities which can be updated, sometimes on a bit-by-bit basis.
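
To make the serial, branchy nature concrete, here is a minimal sketch (in C) of one CABAC-style binary decode step. It is illustrative only, with a simplified probability update rather than H.264's actual state tables; the point is that every decoded bin reads and rewrites both the coder state and a context's probability estimate, so the next bin can't start until this one finishes:

```c
/* Minimal sketch of one CABAC-style binary decode step. Not the real
   H.264 algorithm (the probability update is simplified). It shows the
   serial dependency -- range/offset and the context are rewritten on
   every bin -- and the data-dependent branching per decoded bit. */

typedef struct {
    unsigned probLPS;   /* probability estimate for the less-probable symbol */
    int      valMPS;    /* which value (0/1) is currently the more probable  */
} Context;

typedef struct {
    unsigned range;     /* current arithmetic-coder interval size */
    unsigned offset;    /* position within the interval           */
} Decoder;

int decode_bin(Decoder *d, Context *ctx)
{
    /* interval split depends on the context's current probability */
    unsigned rangeLPS = (d->range * ctx->probLPS) >> 16;
    int bin;

    if (rangeLPS == 0)
        rangeLPS = 1;                         /* keep the interval non-empty */

    if (d->offset >= d->range - rangeLPS) {   /* LPS path: branch */
        bin        = !ctx->valMPS;
        d->offset -= d->range - rangeLPS;
        d->range   = rangeLPS;
        /* adapt toward the LPS; real CABAC walks a 64-state table
           and may flip which symbol counts as the MPS */
        ctx->probLPS += (0xFFFF - ctx->probLPS) >> 5;
    } else {                                  /* MPS path: branch */
        bin       = ctx->valMPS;
        d->range -= rangeLPS;
        ctx->probLPS -= ctx->probLPS >> 5;    /* adapt toward the MPS */
    }

    while (d->range < 0x8000) {   /* renormalize: a branchy,      */
        d->range  <<= 1;          /* bit-at-a-time loop that in a */
        d->offset <<= 1;          /* real decoder also pulls the  */
    }                             /* next bit from the bitstream  */
    return bin;
}
```

Every path through this takes data-dependent branches, and the context lookups hit a table that changes as you go, which is exactly the access pattern that in-order cores like the SPEs handle worst.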
 
Oh goodie, gurus on the topic. :D

Can someone answer Shifty's question on whether a 40Mbps stream needs as much CABAC work as a 15Mbps one?

Assuming that we use the SPEs (instead of the PPE) to complete the CABAC stage...

How big is the probability table? Can it fit comfortably in 256K? If not, can it be partitioned into multiple 256K ones? Also, can the lookups be batched?

What do you mean by "lots of branches per bit" (any ballpark numbers)? Do the branches involve only mathematical computation? Or is there more branching code that cannot be transformed into predicates?

Sorry for the deluge of questions.

EDIT:
Simon F said:
...well, you can parallelise at slice boundaries but as a slice is up to 1 frame in size that doesn't really help...

What are the challenges if we have 2-3 SPEs working on separate slices in parallel? I.e., each SPE loads and processes chunks of its slice independently.
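
For what it's worth, here is a hypothetical sketch of what slice-level dispatch could look like, using plain pthreads as a stand-in for SPE task management (real Cell code would go through libspe and DMA transfers into each SPE's 256K local store; the names here are made up for illustration). The key property is that slices share no entropy-coder state, so the workers need no synchronization until the frame is reassembled:

```c
/* Hypothetical slice-parallel dispatch. pthreads stands in for SPE
   threads; the structure, not the API, is the point. Each slice is an
   independently decodable unit, so each worker can run its own entropy
   decode with no locking. The catch from above still applies: an
   encoder may emit one slice per frame, leaving nothing to hand out. */

#include <pthread.h>
#include <stddef.h>

#define MAX_SLICES 8

typedef struct {
    const unsigned char *bits;   /* this slice's compressed bitstream */
    size_t               length;
    /* decoded macroblocks, contexts, etc. omitted */
} SliceJob;

static void *decode_slice(void *arg)
{
    SliceJob *job = (SliceJob *)arg;
    /* run CABAC/CAVLC and macroblock reconstruction over job->bits;
       no state is shared with other slices */
    (void)job;
    return NULL;
}

/* num_slices is assumed to be <= MAX_SLICES */
void decode_frame(SliceJob jobs[], int num_slices)
{
    pthread_t workers[MAX_SLICES];
    int i;

    for (i = 0; i < num_slices; i++)      /* one worker per slice */
        pthread_create(&workers[i], NULL, decode_slice, &jobs[i]);

    for (i = 0; i < num_slices; i++)      /* wait for the whole frame */
        pthread_join(workers[i], NULL);
}
```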
 
I'm sorry, but I find this line of reasoning utter nonsense. You don't get to pick and choose what is PR disingenuousness and what is not. Until proven otherwise, we have to accept that the information released by a company is fact.

If we're allowed to call this out, then why stop there? What makes this more unlikely than the console actually existing at all? It again comes down to "Why do we not believe this?", which invokes the burden of proof.

We should never accept what some PR says about an unreleased product as fact. That's just bad. Here is a past example: the ATI AIB slides released a few weeks ago showing how the X1950 XTX smoked a quad nVidia card setup. Am I supposed to believe that's true? Common sense tells me that's not what we are going to see when the card is released on the 14th of September.

I am not saying that ALL of it is PR. We will have to judge what is reasonable to assume, what is reasonable to expect, and which claims are a bit too overeager.

The point is, just because a company tells you something does not make it a fact. You could turn it around: if they say they can do it, then they need to show it, right?

If I claim my company is going to release a phone with 1 million hours of talk time on a single battery, are you going to accept it as fact? Is it up to you to prove me wrong? Or is it up to us to prove we can do it when the product is released?

Anyone with any sense knows that a million hours of talk time is not even remotely possible in a mobile phone using today's tech. But my point still stands.

I think you're just being overly cynical.

Nope. It's the lessons I've had to learn from believing PR crap before and then seeing the final product fall below what the PR said. After a while you start to learn...
 
Okay, I see your point. PR is sometimes filled with outrageous claims.

So why is 40Mbps H.264 AVC not a reasonable expectation for the PS3? What you've done in each of those examples is point out where claims are unreasonable, to the point where existing evidence suggests such a claim is not feasible. Do the same for PS3's Blu-ray conformity, then. Just saying it's balderdash doesn't help.
 
Like I said above, the codec comparison you posted has nothing to do with HD DVD or BD, and so we cannot draw any conclusions about the performance of either codec when used for HD DVD/BD based on that data.
I asked which AVC codec was used in that comparison (beaten by MPEG-2? Under what conditions, bitrate, and settings, and in which century??):
aaaaa00 said:
So it was Microsoft "propaganda" that made AVC lose out at the codec shootouts in the DVD Forum in front of all the studios and CEs??? AVC actually came in third, second went to MPEG-2, and VC-1 took home first. DivX and others were a no-show.
Can we draw any conclusions about the format after this shootout?
 
I asked which AVC codec was used in that comparison (beaten by MPEG-2? Under what conditions, bitrate, and settings, and in which century??):
Can we draw any conclusions about the format after this shootout?

It was done in mid-2005, at HD resolution, 1080p/24. It was the non-"HP" profile, which had been around for a while. The "HP" profile of AVC came into existence due to the standard version losing out at the codec shootout. Before this, there was no AVC HP.

The "HP" profile was added to the specs for political reasons. Basically, studios were leaning on VC1 because it was simply more advanced (variable bit rate) than Mpeg2 (constant bit rate) and other reasons. The BR core camp, Sony/Fox/Panasonic knew they could not allow VC-1 to dominate so AVC HP came about and was added into the specs. The neutral studios wanted AVC HP to be added to the HD DVD specs also, so if needed, they could use the same encode on both formats. Still however they opted to use VC-1 and unless the Panasonic AVC HP encoder can do some magic, they'll continue to use VC-1. Mpeg2 is needed so they can simply bring over all the extras from a standard DVD without having to reencode them into VC-1.

The conclusions are not about the format. We're talking about AVC and AVC HP. Where did you get this from?
 
It was done in mid-2005, at HD resolution, 1080p/24. It was the non-"HP" profile, which had been around for a while. The "HP" profile of AVC came into existence due to the standard version losing out at the codec shootout. Before this, there was no AVC HP.

The "HP" profile was added to the specs for political reasons. Basically, studios were leaning on VC1 because it was simply more advanced (variable bit rate) than Mpeg2 (constant bit rate) and other reasons. The BR core camp, Sony/Fox/Panasonic knew they could not allow VC-1 to dominate so AVC HP came about and was added into the specs. The neutral studios wanted AVC HP to be added to the HD DVD specs also, so if needed, they could use the same encode on both formats. Still however they opted to use VC-1 and unless the Panasonic AVC HP encoder can do some magic, they'll continue to use VC-1. Mpeg2 is needed so they can simply bring over all the extras from a standard DVD without having to reencode them into VC-1.

The conclusions are not about the format. We're talking about AVC and AVC HP. Where did you get this from?
I'm differentiating between AVC as a format (I believe you refer to BD/HD DVD as the format?) and the specific AVC encoder used in that comparison. The talk is about how "AVC" has been outclassed, when in fact that's based on tests of specific AVC encoder(s). Such tests don't imply that encoders can't improve, or that better AVC encoders don't already exist that weren't chosen for the comparison for technical or political reasons. Of course this also applies to VC-1, with the difference that there you have a single supplier, with all its pros and cons (it certainly eases choosing an encoder for comparisons :) )
So it was Microsoft "propoganda" that made AVC lose out at codec shootouts in the DVD Forum infront of all the studios and CE's??? AVC actually came in 3rd, 2nd went to Mpeg2 and VC-1 took home first. Divx and others were a no show.
I read that as MPEG-2 beating AVC at a given (average) bitrate; that's why I question the AVC encoder used and the specifics. Or did you mean the sequence in which the standards got accepted?
PS: Is that test public? Link?
 
Simon F said:
Was that a profile that used CAVLC rather than CABAC for the entropy encoding/decoding?
I'm pretty sure it couldn't have been CABAC given the time-frame, but I need to ask. And besides, it was run on a 300MHz CPU, which happens to suck at random memory accesses (albeit a bit less than the new IBM CPUs).

Which does bring me to a question, though: are there any CABAC papers available that I could use to educate myself on why this algorithm should suck so badly on Cell? I've googled a bit and come up empty-handed so far.
 
Basically, studios were leaning on VC1 because it was simply more advanced (variable bit rate) than Mpeg2 (constant bit rate) and other reasons.
No VBR for MPEG-2? Where did you get that idea? Also, what's the point of your tirade about the old H.264 version, which is not related to the final BD-ROM format as it is now?
 
Okay, I see your point. PR is sometimes filled with outrageous claims.

Hmm, yes, that's what I seem to have typed. But that's not really what I meant to say.

Where I was going is that what they say today about a product that is not yet released may not be what we actually get when that product is released. Lots of times things happen between that point and shipping that can affect the outcome. Some by mere chance or bad luck; others may have been known and just had their truth stretched. So what PR claims may not show up in the final product even if those claims were 100% reasonable. Delays happen, licenses fall through, things sometimes just don't go right in the final days before shipment... Oh, the horror stories I could tell about some of the last-minute issues on our phones that we did not find until going for the final production runs :)


So why is 40Mbps H.264 AVC not a reasonable expectation of the PS3? What you've done on each of those examples is point out where claims are unreasonable. Something to the point where existing evidence suggests such a claim is not feasible.

To be honest, I have no reason why it wouldn't. It sounds like it's well within the PS3's abilities. I just wanted to call for some caution in taking it as a 100% given, was all. The unreasonable examples were there to drive home a point (however, not the exact point I was aiming for).

Again, I was not trying to say anything about the PS3's abilities, just warning not to trust PR (any PR, for that matter) on a product's abilities before release as 100% factual. But since you did not ask for it, I will STFU and start brainstorming how to get that 1-million-hour talk time into our phones; maybe a flux capacitor is the key :p
 
Yeah, I think we were both approaching this from the wrong angle. My statement about 'PR fact' was incorrect and I should have qualified it with the exception of reasonable doubt. I do believe that this is a different scenario from those you highlighted, though, as it's not about boasting performance so much as meeting a required and established standard.
 
No VBR for MPEG2? Where did you get that idea? Also what's the point of your tirade about the old H.264 version which is not related to the final BD-ROM format as is now?

I messed up on the MPEG-2 point; I was thinking of single-pass vs. double-pass. Still, MPEG-2 requires a much higher average bitrate to achieve visual transparency than VC-1 and AVC HP. Also, its peaks are higher.

My "tirade" is simply pointing out that AVC HP existed due to the old H.264 losing out to VC1 and Mpeg2. AVC HP is a good thing. For this topic people need to understand that they can't use the OLD AVC/H.264 encoders to determine CPU load. Anand's article is the closest we have to determining how much resources AVC HP eats up. The Cell however is very different. Again, that's using the Toshiba Encoder not the Panasonic one that the BR camp will be adopting. Unlike VC-1 (for now), there will be various AVC HP encoders.
 
For this topic people need to understand that they can't use the OLD AVC/H.264 encoders to determine CPU load.
You meant OLD AVC/H.264 decoders?

Actually, you are right and wrong. Of course AVC HP and MP (Main Profile) are two different things, which means old data can't be used to determine anything. But if you follow the discussion in this thread, the current topic is CABAC entropy coding, which is SPE-unfriendly according to aaaaa00. Since CABAC is not unique to HP, it's valid to look at old H.264 decoders for a rough CPU load, provided it's confirmed that the test movie is encoded with CABAC. If you think HP-unique tools such as the quantization matrix and the 8x8 IDCT, which are also present in MPEG-2 and VC-1, are unfriendly to Cell or other CPUs, please explain why.
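
As a counterpoint to the CABAC sketch earlier in the thread, here is a rough illustration of why a separable 8x8 inverse transform is the kind of work SIMD cores like the SPEs do well. The butterfly below is a stand-in, not the real H.264 HP transform coefficients; what matters is the shape of the computation: fixed access patterns, no data-dependent branches, and eight independent rows/columns to vectorize across.

```c
/* Stand-in for a separable 8x8 inverse transform (not the actual
   H.264 HP coefficients). Unlike CABAC, this is straight-line
   arithmetic over independent rows and columns: no branch depends on
   the data, so it maps cleanly onto SIMD lanes and deep pipelines. */

static void pass8_1d(int *v)             /* one 8-point butterfly pass */
{
    int i, even, odd;
    for (i = 0; i < 4; i++) {
        even = v[i] + v[7 - i];          /* fixed access pattern, */
        odd  = v[i] - v[7 - i];          /* branch-free           */
        v[i]     = even;
        v[7 - i] = odd;
    }
}

void transform8x8(int block[8][8])
{
    int col[8];
    int r, c;

    for (r = 0; r < 8; r++)              /* row pass: 8 independent rows */
        pass8_1d(block[r]);

    for (c = 0; c < 8; c++) {            /* column pass: 8 independent   */
        for (r = 0; r < 8; r++)          /* columns                      */
            col[r] = block[r][c];
        pass8_1d(col);
        for (r = 0; r < 8; r++)
            block[r][c] = col[r];
    }
}
```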
 
I asked which AVC-Codec was used in that comparison(beat by Mpeg2? - Under what conditions,bitrate,setting and in which century?? )

I'm not privy to the details, but it is my understanding that it was the best H.264 codec the proponents of H.264 could get at the time, with at least as much time to encode, tweak, and optimize as all the other codecs in the test got.

The way the test was run was that each codec group was asked to submit an encode of a series of high-definition, uncompressed video clips, and they got a certain amount of time (weeks or months) to tweak, optimize, and do anything they wanted within the specs of their encoder to demonstrate how good their codec was.

Unlike the codec comparison you posted, this was not a test where the encodes were run by a single party with all settings left at their defaults. Each codec was backed by a company (or companies) pushing for its inclusion in the disc standard, so it was their responsibility to do everything in their power to demonstrate how good their codec was.

Once the encoded clips were submitted, they were shown to an industry panel of experts in a blind test, on many different pieces of video equipment, from studio quality production monitors, to super-high-end projector displays, to flat screens and other HDTVs.

Since each codec group was asked to submit their own best possible encoding, this should remove any doubt as to how well the encodes were optimized, unless you believe the people supporting H.264 were incompetent at optimizing their own codec.

aaaaa00 said:
So it was Microsoft "propaganda" that made AVC lose out at the codec shootouts in the DVD Forum in front of all the studios and CEs??? AVC actually came in third, second went to MPEG-2, and VC-1 took home first. DivX and others were a no-show.

Can we draw any conclusions about the format after this shootout?

Uh, I didn't write this second quote. Please attribute things correctly.
 
To be clear, I have never said that the PS3 will or will not be able to handle H.264 AVC HP at 40 Mbps. I'm sure Sony will dedicate as many development resources as necessary to make it work regardless.

All I am saying is that I've heard from someone who has actually written a high-bitrate H.264 HP software decoder that fully compliant BD H.264 playback won't be a piece of cake to do in software on the PS3, and that there are parts of H.264 HP that will be a complete pain in the ass to implement on CELL.
 
You meant OLD AVC/H.264 decoders?

Actually, you are right and wrong. Of course AVC HP and MP (Main Profile) are two different things, which means old data can't be used to determine anything. But if you follow the discussion in this thread, the current topic is CABAC entropy coding, which is SPE-unfriendly according to aaaaa00. Since CABAC is not unique to HP, it's valid to look at old H.264 decoders for a rough CPU load, provided it's confirmed that the test movie is encoded with CABAC. If you think HP-unique tools such as the quantization matrix and the 8x8 IDCT, which are also present in MPEG-2 and VC-1, are unfriendly to Cell or other CPUs, please explain why.


One: all I'm doing is trying to explain how AVC HP came about. You can have the "upper hand" from a technical perspective all you like. I've never claimed to be an expert on CPUs or the fine details of codecs. I enjoy the political and business side of this format war because it's in my realm and interests me much more. Of course, with that, you pick up certain things along the way.

If people want the ins and outs of BR or HD DVD, they can simply post here and ask directly:

http://www.avsforum.com/avs-vb/showthread.php?t=697036

In fact, if someone is really interested and has time to spare, they can read that thread and gather some good info, along with the subforum where that thread is located.
 
Here is the way I interpret some issues with Blu-ray, and a satellite view of events.

The Sony mantra that MPEG-2 at high bitrates is best suited for HD movies on Blu-ray is a giant PR insurance policy for CELL. MPEG-4 AVC was going full bore with more complex entropy encoding (CABAC), which isn't CELL-friendly. MPEG-4 AVC is fine for lower-resolution devices like cell phones, where the screen resolution isn't that large and the bandwidth is going to be limited, so it made sense to go with a more extreme compression algorithm. The design didn't have 1080p resolutions as a primary focus. Sony knew that with MPEG-4 AVC they were going to encounter serious obstacles.

As fate would have it, Microsoft was thinking big, as in Digital Cinema big. They ditched their previous work and started building a codec from the ground up because they discovered that what worked great for lower resolutions hurt image quality as you cranked up the resolution. So Microsoft targeted high resolutions and optimized for them. They stayed away from CABAC because of the heavy processing requirements.

When the codec image quality shootout results came back, Microsoft with VC-1 was in pole position. Microsoft was the codec leader for the next-gen optical formats.

Fortunately, Sony made VC-1 one of the standard codecs for Blu-ray.
 