1080p/60 HDMI Clarification

london-boy said:
And today's keyword is:

HD DVD's representative


Please, when someone other than the bloody competition comes out with such stupid stuff, maybe I'll believe it. Thinking that converting a progressive scan signal to interlaced is "extensive" and then extrapolating that something will not work from that makes me laugh. You should know better.

With all due respect, the HD DVD representative was only given ONE line in that entire paragraph, and it was in regard to their OWN material: "Regarding when we're going to see 1080p, HD DVD's representative have indicated to me that at least the format's initial releases will be 1080i, not 1080p".
So for the stupid stuff you attribute to the representative, maybe you should redirect your angst towards the author of the piece, since that is where the rest of the paragraph comes from: the author's point of view/hypothesis.
 
ERP said:
To be fair he was right. Go check out a Samsung 1080p DLP TV: they won't actually take a 1080p signal, just 1080i, which they upconvert.

Some of this year's models will take a 1080px30 signal (the HP ones at least). I don't know of a set that will take 1080px60, simply because there are no available signal sources, whether the connector can support it or not.

1080 is still a bit of a standards mess.

VGA, dual-link DVI, or possibly DVI with altered blanking intervals. No consumer devices besides PCs and maybe scalers will output 1080p, though. And you're probably looking at computer displays if you want something that will actually accept those resolutions and display them natively; I don't think there's even a single CRT HDTV that will display 720p natively.

BTW, considering only CRTs can display 1080i natively, it seems like a silly resolution to have made an HD standard, since CRTs are being phased out fast and are resolution limited anyway. 1080p at 30fps would make more sense, or 540p at 60fps. The main reason for 1080i is bandwidth, right? So shouldn't those two signals consume the same bandwidth and be more beneficial to newer display technologies?
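To put some rough numbers on that, here's a quick back-of-the-envelope sketch in Python (active pixels only, ignoring blanking and audio, and treating "540p" as 1920x540, which is my assumption):

# Rough pixel-rate comparison of the formats being discussed.
def pixel_rate(width, height, rate):
    return width * height * rate  # pixels per second

modes = {
    "1080i60 (60 fields of 1920x540)":  pixel_rate(1920, 540, 60),
    "1080p30 (30 frames of 1920x1080)": pixel_rate(1920, 1080, 30),
    "540p60  (60 frames of 1920x540)":  pixel_rate(1920, 540, 60),
    "1080p60 (60 frames of 1920x1080)": pixel_rate(1920, 1080, 60),
}

for name, r in modes.items():
    print(f"{name}: {r / 1e6:.1f} Mpixels/s")

# 1080i60, 1080p30 and 540p60 all come out to ~62.2 Mpixels/s,
# while 1080p60 doubles that to ~124.4 Mpixels/s.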

Anyhow, there's no reason for Microsoft to worry about supporting 1080p; it's not like the internal resolution of the Xbox 360 will be anywhere near that. In fact, I wouldn't be surprised to start seeing games drop the internal rendering resolution to standard def, especially for games that rely on a high framerate. Of course, maybe it'll go the other way and we'll see 720p 60fps and 1080i 30fps games.
 
The way I understand things as of right now is

HD-DVD movies will be encoded at 1080i meaning you get 1920 x 1080 resolution

Blu-Ray movies will be encoded at 720p at first so you get 1280 x 720 resolution


Now later on I assume Blu-Ray movies will be re-released at 1080p sometime in the future.


If a person is a hard core movie junkie and wants the best viewing experience on a very large screen, they'll want an HD-DVD player connected to a 1080p display. An HDTV like a Sony SXRD will take that 1080i signal and display the full resolution picture of stunning quality. I have no idea how anyone can draw the conclusion that 1920 x 1080 doesn't look incredibly better than 1280 x 720 on a large screen HD-TV.


Even though CRTs accept 1080i signals, their native display resolution isn't necessarily 1920 x 1080. A lot of them are half of that because it's so hard to manufacture a TV tube that can scan fast enough at such a high resolution.
 
Brimstone said:
The way I understand things as of right now is

HD-DVD movies will be encoded at 1080i meaning you get 1920 x 1080 resolution

Blu-Ray movies will be encoded at 720p at first so you get 1280 x 720 resolution


Now later on I assume Blu-Ray movies will be re-released at 1080p sometime later.

Really? I haven't heard 720p being mentioned once in relation to Blu-ray at all yet. In fact I've only heard 1080p. Where did you get that info from?
 
Brimstone said:
The way I understand things as of right now is

HD-DVD movies will be encoded at 1080i meaning you get 1920 x 1080 resolution

Blu-Ray movies will be encoded at 720p at first so you get 1280 x 720 resolution


Now later on I assume Blu-Ray movies will be re-released at 1080p sometime in the future.


If a person is a hard core movie junkie and wants the best viewing experience on a very large screen, they'll want an HD-DVD player connected to a 1080p display. An HDTV like a Sony SXRD will take that 1080i signal and display the full resolution picture of stunning quality. I have no idea how anyone can draw the conclusion that 1920 x 1080 doesn't look incredibly better than 1280 x 720 on a large screen HD-TV.


Even though CRTs accept 1080i signals, their native display resolution isn't necessarily 1920 x 1080. A lot of them are half of that because it's so hard to manufacture a TV tube that can scan fast enough at such a high resolution.


HD-DVD and Blu-Ray movies will all be encoded at 1080p.
Everything else will be up to the customer's setup (player, HDMI/component, TV, etc.).
 
Titanio said:
Really? I haven't heard 720p being mentioned once in relation to Blu-ray at all yet. In fact I've only heard 1080p. Where did you get that info from?


I could very well be wrong, but didn't some guy from Sony's movie studios a while back mention that all of Sony's movies are being encoded at 720p?


Blu-Ray is using MPEG-2, so data compression isn't the same as Microsoft's VC-1 that Time Warner's movies use. Sony may have dual layer disks in the lab, but I doubt they can release a dual layer movie at a cheap cost right now. So initial releases have to fit into 25 Gigabytes or less with MPEG-2.
 
Brimstone said:
I could very well be wrong, but didn't some guy from Sony's movie studios a while back mention that all of Sony's movies are being encoded at 720p?

No, I don't think so.

If either camp has been louder about content being at 1080p, it's been the Blu-ray camp. Their CES press releases were laced with constant references to it... I think they're planning all content to be 1080p.
 
Titanio said:
No, I don't think so.

If either camp has been louder about content being at 1080p, it's been the Blu-ray camp. Their CES press releases were laced with constant references to it... I think they're planning all content to be 1080p.

I'm not suggesting Blu-Ray won't support 1080p, it's just my understanding that, due to MPEG-2 being used, encoding at 1080p60 would go beyond what a single layer Blu-Ray disk can hold. The quality of the encoding is a trade-off in the amount of space it takes up. So the higher the bit rate, the more space is consumed.

On this issue I could be totally wrong, so if someone knows please shed some light on the matter.
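For what it's worth, here's the rough space math as I understand it, sketched in Python (the 2-hour runtime and the audio allowance are just illustrative assumptions):

# How much average bitrate fits on a single-layer 25GB Blu-Ray disc?
DISC_GB = 25              # single-layer BD-ROM capacity
RUNTIME_S = 2 * 3600      # assume a 2-hour movie
AUDIO_MBPS = 3            # assume ~3Mbps set aside for audio (illustrative)

disc_bits = DISC_GB * 1e9 * 8                 # capacity in bits
avg_total_mbps = disc_bits / RUNTIME_S / 1e6  # average bitrate that exactly fills the disc
avg_video_mbps = avg_total_mbps - AUDIO_MBPS

print(f"Average total bitrate that fits: {avg_total_mbps:.1f} Mbps")
print(f"Average video bitrate after audio: {avg_video_mbps:.1f} Mbps")
# Roughly 27.8 Mbps total, ~24.8 Mbps for video on average. So the disc
# doesn't impose a resolution limit as such; the question is whether an
# MPEG-2 encode at 1080p looks good enough at that kind of average bitrate.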
 
Brimstone said:
I'm not suggesting Blu-Ray won't support 1080p, it's just my understanding that, due to MPEG-2 being used, encoding at 1080p60 would go beyond what a single layer Blu-Ray disk can hold. The quality of the encoding is a trade-off in the amount of space it takes up. So the higher the bit rate, the more space is consumed.

On this issue I could be totally wrong, so if someone knows please shed some light on the matter.

Well I remember them announcing that they'd mastered the first blu-ray movie, at 1080p. I've never heard the suggestion before that using mpeg2 would constrain them as far as resolution goes.
 
Titanio said:
Well I remember them announcing that they'd mastered the first blu-ray movie, at 1080p. I've never heard the suggestion before that using mpeg2 would constrain them as far as resolution goes.

They can go at a higher resolution, but at what quality level? If you lower the "bit rate" to compress the data more, you get artifacting with MPEG-2. It's a trade-off of picture quality vs. data compression.
 
Brimstone said:
They can go at a higher resolution, but at what quality level? If you lower the "bit rate" to compress the data more, you get artifacting with MPEG-2. It's a trade-off of picture quality vs. data compression.
You use a very high bitrate such as 40Mbps only in critical places with VBR, so it doesn't consume space linearly with the bitrate. BD-ROM allows up to 40Mbps for video; OTOH HD DVD is up to 28Mbps.
[Attached images: ceb3_04.jpg and ceb3_03.jpg - bitrate comparison charts from watch.impress.co.jp]
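To illustrate the VBR point with a toy sketch in Python (the scene split and the base bitrate are made-up numbers; only the 40Mbps BD-ROM ceiling comes from the post above):

# Short high-bitrate peaks barely move the total size under VBR.
RUNTIME_S = 2 * 3600                  # assume a 2-hour movie
PEAK_MBPS, PEAK_FRACTION = 40, 0.05   # hit the 40Mbps ceiling 5% of the time (made up)
BASE_MBPS = 16                        # made-up bitrate for the remaining 95%

avg_mbps = PEAK_MBPS * PEAK_FRACTION + BASE_MBPS * (1 - PEAK_FRACTION)
size_gb_vbr = avg_mbps * 1e6 * RUNTIME_S / 8 / 1e9
size_gb_cbr = PEAK_MBPS * 1e6 * RUNTIME_S / 8 / 1e9

print(f"VBR average: {avg_mbps:.1f} Mbps -> {size_gb_vbr:.1f} GB")
print(f"Constant 40 Mbps for the whole film -> {size_gb_cbr:.1f} GB")
# ~17.2 Mbps average -> ~15.5 GB, versus 36 GB if you naively multiplied
# the 40Mbps peak by the whole runtime -- which is the point about space
# not scaling linearly with the peak bitrate.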
 
one said:
You use a very high bitrate such as 40Mbps only in critical places with VBR, so it doesn't consume space linearly with the bitrate. BD-ROM allows up to 40Mbps for video; OTOH HD DVD is up to 28Mbps.

Who cares? Using VC-1 with a bitrate less than 20Mbps, there isn't any perceivable difference from a D5 master. BR NEEDS higher bitrate and higher capacity to compensate because of an inferior MPEG2 codec used by SONY and maybe other studios. What kind of bitrate would you need to equal a D5 master using MPEG2? It ain't 16Mbps VBR.
 
NANOTEC said:
Who cares? Using VC-1 with a bitrate less than 20Mbps, there isn't any perceivable difference from a D5 master. BR NEEDS higher bitrate and higher capacity to compensate because of an inferior MPEG2 codec used by SONY and maybe other studios. What kind of bitrate would you need to equal a D5 master using MPEG2? It ain't 16Mbps VBR.
Why should we forget the studios that use VC-1/H.264 on BD, such as Warner? Look at the AVC 12Mbps case in
http://www.watch.impress.co.jp/av/docs/20060310/ceb3_04.jpg

Basically, the argument Brimstone raised about MPEG2 and the claim you've brought up here about BD supposedly needing higher bitrates to compensate for MPEG2 are completely unrelated issues.
 
NANOTEC said:
BR NEEDS higher bitrate and higher capacity...

How can BR "need" higher bitrate and capacity? It is just a medium. It supports mpeg2 and mpeg4, so it is a moot point. It is flexible to whatever you wish to accommodate. This is a positive thing.

With your sort of logic, one can just as well say HD-DVD "needs" VC-1 due to a shortage of bitrate and capacity. :rolleyes:

It all comes down to choices, priorities, and objectives as to what consumer product we actually see hit the shelves. It's got damn near nothing to do with what the media is capable of, at this point.

As for the 1080p60 topic, it is largely moot for movies for the time being. Movies will continue to come from 1080p24, so everybody will be able to share in the HD goodness through standard HDMI ports. If need be, 1080p30 can certainly suffice.
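If my memory of the standard CEA-861 timings is right, the pixel-clock math backs that up; a rough sketch in Python (the blanking totals and the 165 MHz single-link ceiling are from memory, so treat them as assumptions):

# Approximate TMDS pixel clocks; totals include blanking intervals.
def pixel_clock_mhz(h_total, v_total, frames_per_second):
    return h_total * v_total * frames_per_second / 1e6

modes = {
    "1080p24": pixel_clock_mhz(2750, 1125, 24),  # ~74.25 MHz
    "1080p30": pixel_clock_mhz(2200, 1125, 30),  # ~74.25 MHz
    "1080i60": pixel_clock_mhz(2200, 1125, 30),  # 30 full frames/s sent as 60 fields
    "1080p60": pixel_clock_mhz(2200, 1125, 60),  # ~148.5 MHz
}

SINGLE_LINK_LIMIT_MHZ = 165  # single-link DVI/HDMI TMDS limit (assumption from memory)

for name, clk in modes.items():
    print(f"{name}: {clk:.2f} MHz (single-link limit {SINGLE_LINK_LIMIT_MHZ} MHz)")

# 1080p24/30 sit at the same ~74.25 MHz as 1080i60, comfortably under the
# single-link limit; 1080p60 needs ~148.5 MHz, still under 165 MHz on paper
# but apparently more than many current receiver chips will actually accept.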
 
one said:
Why should we forget the studios that use VC-1/H.264 on BD, such as Warner? Look at the AVC 12Mbps case in
http://www.watch.impress.co.jp/av/docs/20060310/ceb3_04.jpg

Basically, the argument Brimstone raised about MPEG2 and the claim you've brought up here about BD supposedly needing higher bitrates to compensate for MPEG2 are completely unrelated issues.

Why compare it to 12Mbps AVC on a 15GB HD DVD? Why not compare it to 18Mbps VC-1 on a 30GB HD DVD? Is it because HD DVD has a 5GB advantage when using the superior VC-1? Finally, there is no point in going past 20Mbps using VC-1 when you've already got parity with a D5 master at equal or lower bitrates. In other words, even if you talk about 50GB BR movies in the future, you still won't be able to beat a D5 master no matter how high the bitrates get. You will need that 50GB when using MPEG2, though, which may still not be enough to equal a D5 master.
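For reference, here's how those bitrate/capacity pairings work out, sketched in Python (the 12 and 18Mbps figures and the disc sizes are the ones from this thread; the 2-hour runtime, the audio allowance, and the ~25Mbps MPEG2 figure are illustrative assumptions of mine):

# Does a given average video bitrate fit on a given disc for a 2-hour movie?
RUNTIME_S = 2 * 3600   # assumed runtime
AUDIO_MBPS = 3         # assumed audio allowance (illustrative)

cases = [
    ("12 Mbps AVC on 15GB HD DVD (single layer)",  12, 15),
    ("18 Mbps VC-1 on 30GB HD DVD (dual layer)",   18, 30),
    ("~25 Mbps MPEG2 on 25GB BD (single layer)",   25, 25),
]

for name, video_mbps, disc_gb in cases:
    used_gb = (video_mbps + AUDIO_MBPS) * 1e6 * RUNTIME_S / 8 / 1e9
    print(f"{name}: ~{used_gb:.1f} GB used of {disc_gb} GB")

# 12+3 Mbps -> ~13.5 GB of 15 GB; 18+3 Mbps -> ~18.9 GB of 30 GB;
# 25+3 Mbps -> ~25.2 GB, i.e. a high-average-bitrate MPEG2 encode of a
# 2-hour film overshoots a single-layer BD unless the average comes down.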

randycat99 said:
How can BR "need" higher bitrate and capacity? It is just a medium. It supports mpeg2 and mpeg4, so it is a moot point. It is flexible to whatever you wish to accommodate. This is a positive thing.

Because BR's original design goal was high bitrate using MPEG2. MPEG4/VC-1 wasn't added until later when they realized MPEG2, even at a high bitrate, just wasn't gonna cut it compared to HD DVD. HD DVD's original design goal was advanced codecs used with the existing DVD manufacturing infrastructure.
 
randycat99 said:
It's still a moot point, regardless of what design beginnings you would spin for it. :rolleyes:

No, it's not moot at all. In fact, using MPEG2 like SONY does on a 25GB BRD will be inferior to VC-1 on the same disc, no ifs or buts. Moving to 50GB is needed for MPEG2 to try to reach the same quality.
 
one said:
Why should we forget the studios that use VC-1/H.264 on BD, such as Warner? Look at the AVC 12Mbps case in
http://www.watch.impress.co.jp/av/docs/20060310/ceb3_04.jpg

Basically, the argument Brimstone raised about MPEG2 and the claim you've brought up here about BD supposedly needing higher bitrates to compensate for MPEG2 are completely unrelated issues.

Well that chart is obviously slanted to show Blu-Ray in the best possible light.

It's showing a single layer HD-DVD and the compromises that have to be made with 15 Gig of space. But HD-DVD has 2-layer and 3-layer discs available to gain more storage space. So a dual layer disc has a little more storage space than a single layer Blu-Ray disc.

It's not stating what resolution they encoded Spiderman at, and this I'm really curious about. Can anyone find a link for what the initial releases are being officially encoded at?
 
NANOTEC said:
No, it's not moot at all. In fact, using MPEG2 like SONY does on a 25GB BRD will be inferior to VC-1 on the same disc, no ifs or buts. Moving to 50GB is needed for MPEG2 to try to reach the same quality.

The average movie buyer isn't going to be able to pick out the difference, so it is a moot point. If it turns out to be an issue, BR has the mpeg4 avenue to explore, as well. So again, it is a moot point. :rolleyes:
 