Will the PS3 be able to decode H.264/AVC at 40 Mbps?

Will the PS3 decode H.264/AVC at 40 Mbps?

  • Yes, the PS3 will decode H.264/AVC at 40 Mbps

    Votes: 86 86.9%
  • No, the PS3 won't decode H.264/AVC at 40 Mbps

    Votes: 13 13.1%

  • Total voters
    99
Cell has been demoed decoding 12 HD streams simultaneously, so why should one stream give it so much difficulty?
It depends on how those streams relate. E.g. if they were twelve 10 Mbps movies, and the effort of decoding a 10 Mbps movie is 1/100th the effort required to decode one 48 Mbps movie, it's not comparable. I also don't know what codec they were in; they could have been less computationally demanding MPEG-2 clips.
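To put rough numbers on that intuition (purely illustrative; the 1/100th figure above isn't a real measurement, and the cost model below is entirely an assumption), a back-of-envelope comparison might look like this:

```python
# Toy comparison: aggregate decode cost of many low-bitrate streams vs one
# high-bitrate stream. The cost model (cost ~ bitrate^2) is made up purely
# to illustrate that the two workloads aren't automatically comparable.
def decode_cost(bitrate_mbps, exponent=2.0):
    """Hypothetical per-stream decode cost as a function of bitrate."""
    return bitrate_mbps ** exponent

twelve_low = 12 * decode_cost(10)   # twelve 10 Mbps streams
one_high = decode_cost(48)          # one 48 Mbps stream

print(twelve_low, one_high)  # 1200.0 vs 2304.0 under this toy model
```

Under a superlinear cost model the single high-bitrate stream can cost more than all twelve low-bitrate ones combined; under a linear model it costs less. Which is closer to reality depends on codec details nobody in this thread seems to know.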
There were DSPs around three years ago which could decode H.264, and I doubt they'd even match a single SPE.
Unless you know what problem is faced when scaling up the bitrate, it's not fair to say 'this tech could handle simpler material, so this later tech should work on greater material'. I myself don't know what the problem is. I thought that with a higher bitrate you have less compression and so would need less computational effort, but I know squat about these compression methods. It could be that with a higher bitrate you have the same compression taking place but on more and smaller 'chunks'. I also don't know why random access would be an issue with decompression. The data has to be stored in contiguous blocks of some form or other. One pixel isn't going to be placed at one point in memory and the next pixel placed somewhere else entirely.

BTW, H.264 is another name for AVC; H.264 seems to have a small army of different names (including MPEG-4 Part 10).
I'm pretty sure that's not true. AVC is from MS, h.264 is from the MPEG group. They may use the same general compression techniques (or not) but they're different formats.
 
I bet it does it with one finger on a single SPE. I've got a Pinnacle Showcenter at home which is able to decode 1080i WMV-HD without any problems, and this is a passively cooled (not even a real DSP) ARM processor at 200 MHz!
 
I'm sorry to go against the grain here, but if this H.264 AVC is so computationally intensive that it's many times slower than some other codec that at the end of the day gives the same image quality, doesn't it become a case for not using this codec?

I mean, we're not talking about just a bit slower; some people here are talking about many times slower, and at this point I'm wondering: "why bother?!" Unless the people involved are just panicking and overestimating things for no reason.

Seriously, if even Cell has problems, then I can only think that the codec is needlessly inefficient and should be snubbed.
 
It probably should be snubbed. But wasn't it developed for low-bitrate distribution, and thus could conceivably be slower at high-bitrate compression because that's not its primary role? As it looks like 28 Mbps H.264 is visually indistinguishable from 40 Mbps in studies, the idea of using a compression method at a bitrate it wasn't designed for, with negligible difference in results, and which can't be run on most hardware, doesn't make much sense. I'm not quite sure what the point of the original post was, and the capacity of BRD is likely to be used for high-bitrate MPEG-2, which isn't processor demanding. That doesn't stop questioning of whether PS3 can handle it or not. Out of curiosity I'd like to know what these codecs are doing that doesn't map well to the SPEs, regardless of whether BRD pulls it off or not. :D
 
The damage that certain people have done to H.264's image is just impossible to count.

From my understanding, a lot of the anti-H.264 propaganda comes from Microsoft's Windows Media division.

So it was Microsoft "propaganda" that made AVC lose out at the codec shootouts in the DVD Forum in front of all the studios and CEs??? AVC actually came in 3rd, 2nd went to MPEG-2, and VC-1 took home first. DivX and others were a no-show.

AVC did great at SD resolutions but lost out at HD res, so the BR camp ran back and worked on an advanced version of AVC called "AVC HP" (High Profile). AVC HP takes elements from VC-1 to improve. Unfortunately AVC still lost out, but was included in the specs for many reasons, business and political. The Japanese studios, whom MS was not working closely with initially, decided to go the AVC route for their releases. Their VC-1 counterparts easily outclassed them.

Panasonic then started working hard on their own AVC encoder, which you'll see Fox and perhaps Sony using at a later date. If you believe Joe Kane, Disney will be using VC-1 also.
 
Found some info on Wikipedia regarding H.264 / AVC / MPEG-4 Part 10 (they seem to be the same thing under different names):
http://en.wikipedia.org/wiki/H.264

I believe the UMD movies for the PSP are in H.264, and I think the encoded resolution on the UMD is actually bigger than the PSP's native res. It surely seems odd if Cell at above 3 GHz couldn't decode a 40 Mbps H.264 stream, but I guess time will tell.
 
I've read through here and haven't been able to decipher why this capability is relevant in any practical sense.

Because the clips are encoded at variable bitrate, not constant, which means the average bitrate might be 28 Mbps, but in the most action-intensive scenes you might want a peak up in the 36 Mbps range.

For example, a 24 Mbps VBR HD DVD would have a ceiling of only 28 Mbps that it can reach in its most action-intensive scenes, while a 24 Mbps BR movie would be able to go much higher if it needed to.
 
Because the clips are encoded at variable bitrate, not constant, which means the average bitrate might be 28 Mbps, but in the most action-intensive scenes you might want a peak up in the 36 Mbps range.

For example, a 24 Mbps VBR HD DVD would have a ceiling of only 28 Mbps that it can reach in its most action-intensive scenes, while a 24 Mbps BR movie would be able to go much higher if it needed to.

If you buffer a few frames, you can take more time for such peaks and then catch up when the bitrate gets lower. In fact the encoder must guarantee that over a fixed amount of time/frames the average bitrate won't go too high.
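A minimal sketch of that buffering idea (nothing from a real player; the per-frame accounting and all the numbers are assumptions just to show the principle):

```python
# Sketch: a pre-filled buffer absorbs short bitrate peaks as long as the
# long-run average stays within what the drive/decoder can sustain.
# All figures are invented for illustration.
def plays_without_underrun(frame_bits, sustained_mbps, fps=24, buffer_frames=48):
    """True if a buffer pre-filled with `buffer_frames` worth of data never runs dry."""
    per_frame_budget = sustained_mbps * 1_000_000 / fps  # bits deliverable per frame time
    buffered = buffer_frames * per_frame_budget           # head start from pre-buffering
    for bits in frame_bits:
        buffered += per_frame_budget - bits               # refill minus this frame's cost
        if buffered < 0:
            return False                                  # underrun: playback would stall
    return True

# ~28 Mbps average with a short 36 Mbps action-scene peak, sustained rate 30 Mbps:
calm = [28_000_000 // 24] * 200
peak = [36_000_000 // 24] * 48
print(plays_without_underrun(calm + peak + calm, sustained_mbps=30))  # True
```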

Anyway, I can't imagine Cell struggling at H.264 decoding; Toshiba & Sony planned to put it in their playback devices after all (in the long term at least).
 
I believe the UMD movies for the PSP are in H.264, and I think the encoded resolution on the UMD is actually bigger than the PSP's native res. It surely seems odd if Cell at above 3 GHz couldn't decode a 40 Mbps H.264 stream, but I guess time will tell.
UMDs use H.264 AVC Main Profile Level 3; Blu-ray supports up to H.264 AVC High Profile Level 4.1. The max bitrate for UMDs is something like 10 Mbps, and even then it's a slightly less computationally intensive implementation of H.264 compared to Blu-ray's. On top of that, the PSP's 'Media Block' has dedicated silicon for H.264 acceleration anyway, so it's not like it's doing the job in software.
 
The real question is whether there is even a perceptible quality difference between 28 Mbps and 40 Mbps H.264/VC-1. As far as I know, 20-24 Mbps is basically indistinguishable from the master.

BTW, isn't Blu-ray 36 Mbps max transfer?

Well, as a director of photography, for me the difference is like day and night.

I consider BR50 the minimum spec to have acceptable quality for an HD movie, but that's just my point of view as an artist; consumers will eat up everything regardless of its quality.
 
So it was Microsoft "propaganda" that made AVC lose out at the codec shootouts in the DVD Forum in front of all the studios and CEs??? AVC actually came in 3rd, 2nd went to MPEG-2, and VC-1 took home first. DivX and others were a no-show.

AVC did great at SD resolutions but lost out at HD res, so the BR camp ran back and worked on an advanced version of AVC called "AVC HP" (High Profile). AVC HP takes elements from VC-1 to improve. Unfortunately AVC still lost out, but was included in the specs for many reasons, business and political. The Japanese studios, whom MS was not working closely with initially, decided to go the AVC route for their releases. Their VC-1 counterparts easily outclassed them.

Panasonic then started working hard on their own AVC encoder, which you'll see Fox and perhaps Sony using at a later date. If you believe Joe Kane, Disney will be using VC-1 also.
Strange, any comparison I've seen has VC-1/WMV trying to gain ground vs Xvid/DivX, with the H.264 encoders (Ateme and x264) outclassing everything else. (Example)

What codec was your comparison based on? Just because they found a single H.264 encoder that fared badly against VC-1 says nothing about the format or the state of other encoders.
 
Strange, any comparison I've seen has VC-1/WMV trying to gain ground vs Xvid/DivX, with the H.264 encoders (Ateme and x264) outclassing everything else. (Example)

This codec comparison is invalid for HD DVD and BD for the following reasons:

1. The WMV9 encoder is not the VC1 encoder. The VC1 encoder used for HD DVD and BD is substantially more sophisticated than the WMV9 encoder that is currently available.

2. The source clips were already compressed with MPEG2, in fact all the clips were sourced from DVD material. A true test relevant to HD DVD or BD would involve uncompressed source video.

3. The source clips were all DVD resolution, not HD. A true test relevant to HD DVD or BD would involve source material at 1920x1080p.

4. The target bitrates were all extremely low, in the 700-1000 kbps range. A true test relevant to HD DVD or BD would involve target bitrates at least ten, twenty, or even forty times higher than that.

Modern advanced video codecs like H.264 and VC1 significantly change their computational requirements and coding techniques as they scale up and down in bitrate, resolution, and profile, which can cause huge differences in performance and quality.

Techniques that are important at low bitrates and resolutions can become useless or counter-productive as the codec scales up. Likewise, techniques well tuned for high resolutions and bitrates may not perform well at low resolutions or bitrates. Techniques that consume massive amounts of CPU may work well for squeezing the last bits out at low resolutions and bitrates, but may simply not be worth the processing power and cost at high resolutions and bitrates.

It's all about the tradeoffs.
 
Well, as a director of photography, for me the difference is like day and night.

I consider BR50 the minimum spec to have acceptable quality for an HD movie, but that's just my point of view as an artist; consumers will eat up everything regardless of its quality.

Not really; consumers are noticing major discrepancies between the current BD and HD DVD offerings. I spotted it immediately when I compared the two last weekend.

I'm curious, which movies have you watched that were encoded at 40 Mbps?
 
So it was Microsoft "propaganda" that made AVC lose out at the codec shootouts in the DVD Forum in front of all the studios and CEs??? AVC actually came in 3rd, 2nd went to MPEG-2, and VC-1 took home first. DivX and others were a no-show.

AVC did great at SD resolutions but lost out at HD res, so the BR camp ran back and worked on an advanced version of AVC called "AVC HP" (High Profile). AVC HP takes elements from VC-1 to improve. Unfortunately AVC still lost out, but was included in the specs for many reasons, business and political. The Japanese studios, whom MS was not working closely with initially, decided to go the AVC route for their releases. Their VC-1 counterparts easily outclassed them.

Panasonic then started working hard on their own AVC encoder, which you'll see Fox and perhaps Sony using at a later date. If you believe Joe Kane, Disney will be using VC-1 also.

It's Microsoft propaganda because a lot of the arguments against H.264 come from a forum member of AVSForum nicknamed Amirm, who works in the Windows Media division.

How do you know that VC-1 outclassed them?

The only thing I know is that H.264 gives the same quality as MPEG-2 at half the bitrate, and that its use goes beyond HD movies, VC-1 being a derivative of WMV9 HD for digital cinema.

I don't have anything against VC-1 or Microsoft.
 
It's Microsoft propaganda because a lot of the arguments against H.264 come from a forum member of AVSForum nicknamed Amirm, who works in the Windows Media division.

How do you know that VC-1 outclassed them?

The only thing I know is that H.264 gives the same quality as MPEG-2 at half the bitrate, and that its use goes beyond HD movies, VC-1 being a derivative of WMV9 HD for digital cinema.

I don't have anything against VC-1 or Microsoft.

Do you think that, if Amir were making stuff up:

A. "talkstr8t", who is Bill Sheperd from Sun Microsystems and head of BD-J development, would let him get away with it? If you read AVSForum you know that insiders will call each other out all the time and can be pretty rough with each other.

B. "Rio", who works for Panasonic, would allow the same?


VC-1 is loosely related to WMV9. It's evolutionary, yes, but also quite different. Would you like to classify AVC with AVC HP? Is it Amir who made the Japanese AVC-encoded discs look bad in comparison to the VC-1-encoded HD DVD discs?
 
It's Microsoft propaganda because a lot of the arguments against H.264 come from a forum member of AVSForum nicknamed Amirm, who works in the Windows Media division.
He doesn't "work for the Windows Media division", he runs it. ;)

The only thing I know is that H.264 gives the same quality as MPEG-2 at half the bitrate, and that its use goes beyond HD movies, VC-1 being a derivative of WMV9 HD for digital cinema.
Interesting then that you seem to think he's spreading propaganda, given that you admit you don't know much about the topic.
 
1. The WMV9 encoder is not the VC1 encoder. The VC1 encoder used for HD DVD and BD is substantially more sophisticated than the WMV9 encoder that is currently available.
And the reference H.264 encoder is not exactly state of the art. It's pointless comparing formats without specifying which encoder was used. I don't see that answered yet.
2. The source clips were already compressed with MPEG2, in fact all the clips were sourced from DVD material. A true test relevant to HD DVD or BD would involve uncompressed source video.
True, but unless studios release that, I doubt that's going to happen with independent, reproducible reviews.
3. The source clips were all DVD resolution, not HD. A true test relevant to HD DVD or BD would involve source material at 1920x1080p.

4. The target bitrates were all extremely low, in the 700 kbps range. A true test relevant to HD DVD or BD would involve target bitrates at least ten, twenty, or even forty times higher than that.
Points taken, but wouldn't it be more reliable to match resolution to bitrate? In that case DVD bitrate * 6 ~ HD bitrate.
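For what it's worth, here's the arithmetic behind that rough scaling (the factor of six simply comes from the pixel-count ratio; the 1000 kbps figure is just the test's bitrate from point 4, not a recommendation):

```python
# Back-of-envelope for "DVD bitrate * 6 ~ HD bitrate": scale the test bitrate
# by the ratio of pixel counts. Purely illustrative.
dvd_pixels = 720 * 480          # 345,600
hd_pixels = 1920 * 1080         # 2,073,600
ratio = hd_pixels / dvd_pixels  # 6.0

dvd_test_kbps = 1000
print(f"equivalent HD bitrate ~ {dvd_test_kbps * ratio / 1000:.1f} Mbps")  # ~ 6.0 Mbps
```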

...It's all about the tradeoffs.
And tuning... I simply can't imagine half-decent H.264 encoders having problems beating Xvid/DivX. Now this is surely showing my bias :LOL:, but VC-1 just seems to be forced on the market like WMA was (i.e. it can't be for technical reasons, as it offers little gain over MP3, especially compared to other alternatives like Vorbis). As said, I don't see any reliable comparisons between VC-1 and H.264 yet (including the points you just brought up), and I don't think I will in the near future. Some HD-resolution comparisons would be a nice start though.
 
Not really; consumers are noticing major discrepancies between the current BD and HD DVD offerings. I spotted it immediately when I compared the two last weekend.

Hi scooby, if you read my post carefully, I didn't say the consumers will not be able to tell the difference; I said they will buy those movies regardless of the quality.

In my opinion, both HD DVD and BR25 look weak, and BR50 will be the minimum spec to have acceptable quality.

I'm curious, which movies have you watched that were encoded at 40 Mbps?

I was addressing your assertion that "20-24 Mbps is basically indistinguishable from the master", but to answer your question, I have seen my movies encoded in tons of different formats, bitrates, and codecs (even just for the release of the trailers).

Hope this helps.

Bye,
Ventresca.
 