Will Warner support Blu-ray?

Status
Not open for further replies.
I think the moderate outlook in all of this is one that takes "perspective" into account. Considering that even simplistic forms of MPEG manage to throw out 90% of the original video information and turn water into wine from the last 10% of the data they save (figuratively speaking), it's pretty clear that the lion's share of data compression has already been leveraged. Pushing for newer and newer codecs that squeeze things vanishingly smaller is not really the direction we should want to be going from a high-performance video standpoint (this is patently obvious to the video-biz professional, but not to joe-consumer who just wants to rip bootleg movies to CDs). More "efficient" codecs are cool from an academic standpoint, and actually useful for applications that genuinely face low data-rate ceilings (digital satellite and cable feeds? on-demand video service over IP?), but they are the complete antithesis of the drive for high-performance video for the future.

The direction we should be going is to find ways to support higher data rates for video, which would in turn open up the options for less aggressive compression settings. At "generous" data rates, the quality differences between MPEG-2 and MPEG-4 are rather trivial; the real gains in quality will come simply from higher data rates. The real goldmine of video nirvana won't be found in that 50/60/70x compression range, but by finding ways to get to that 10-20x range with real 4:4:4 color space, real detail preservation in textures (not just hard edges), and real resolution that hangs in there during screen motion. We don't have this now, and we certainly won't get it by going further down the path of tighter compression with more "efficient" codecs.

10-20x may seem "low", or it may not seem impressive, but this is where the perspective comes in... consider that getting past a mere 2x compression is quite stupendous for lossless scenarios; can you imagine the insanity we are asking for by pushing things to 50/60/70x?! (The point being: 10-20x is quite a nice leap from 2x, while still being sane about giving that signal "room to breathe".) The goal here is not to get to "100x" just for compression's sake, but to get the best video to our TV. We're not going to get that by getting sold on dreams of codecs that throw out 99% and polish the last 1% into something remotely resembling the original signal.
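To put those ratios in concrete terms, here's a rough back-of-envelope sketch in Python. The ~1.5 Gb/s uncompressed figure assumes a 1920x1080, 24 bits/pixel, 30 fps source; real masters vary in bit depth, chroma subsampling, and frame rate, so treat these numbers as illustrative only:

```python
# Back-of-envelope: compression ratios implied by common HD delivery bitrates,
# assuming an uncompressed 1920x1080, 24 bits/pixel, 30 fps source.
# (Illustrative assumption, not a measured figure.)

def uncompressed_rate_bps(width=1920, height=1080, bits_per_pixel=24, fps=30):
    """Data rate of the uncompressed source in bits per second."""
    return width * height * bits_per_pixel * fps

def compression_ratio(delivery_bps):
    """How many times smaller the delivered stream is than the source."""
    return uncompressed_rate_bps() / delivery_bps

print(uncompressed_rate_bps() / 1e9)    # ~1.49 Gb/s uncompressed source
print(round(compression_ratio(80e6)))   # 80 Mb/s delivery  -> ~19:1
print(round(compression_ratio(40e6)))   # 40 Mb/s delivery  -> ~37:1
print(round(compression_ratio(19e6)))   # 19 Mb/s broadcast -> ~79:1
```

So the "generous" data rates discussed in this thread already sit in that 20-40x band, while broadcast rates are pushing toward 80x.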
 
Last edited by a moderator:
iknowall said:
you can understand that a 20:1 compression with MPEG-2 preserves the image quality better than a 50:1 compression with VC-1.

So I guess the old .FLC format, with even less compression, is superior in quality to both formats?
 
aaronspink said:
No, I'm pretty sure you are missing the point. For a given bit rate, the more advanced codec can achieve higher image fidelity. This isn't rocket science.

No, I'm pretty sure you are missing the point

What does it mean that the codec is more advanced? It can give a higher compression ratio with fewer artifacts, but this also translates into a loss of video quality, so it won't give you the same quality as the original. And at high bitrates, where using a high compression ratio is pointless, all this advancement in the codec becomes pointless, since higher bitrate = lower compression.

the same thing happens in audio codecs, where, over time, more advanced codecs are created that give either better quality at the same bit rates or the same quality at lower bit rates.

First, the comparison doesn't make much sense, because you can compress audio a lot more without any evident problem, but with video you quickly see the artifacts of the compression.

Second, CD audio still uses the same uncompressed WAV format it has always used, and it gives you the best quality; no MP3 has the same quality as the original WAV, and an expert can tell the difference.

MPEG2 is very old as far as video compression goes.

And Digital Betacam is even older, and it blows MPEG-2 and MPEG-4 out of the water with its 3:1 compression, 4:2:2 10-bit color space, and a data rate of 90 Mbps.

So ?


It may be his work, but it doesn't mean he knows what he's talking about or is even being honest with what he knows.

Aaron Spink
speaking for myself inc.


The problem is he actually does this in practice; he is not just stating an opinion, he is saying "we use this because we found out that this gives the best result", which gives him a lot more credibility than you, who don't have any real-world experience.
 
aaronspink said:
Then explain AAC and MPEG-2 Layer 3 audio compression, and how AAC is able to achieve either higher fidelity at the same data rates or the same fidelity at lower data rates.

These things happen, technology advances, our ability to compress better at the same quality advances.

Aaron Spink
speaking for myself inc.

the comparison doesn't make much sense, because you can compress audio a lot more without any evident problem, but with video you quickly see the artifacts of the compression.

And NO, it won't give you more quality if the data rate is almost the same as the original master.
 
aaronspink said:
And I'm saying you don't know wtf you are talking about. It is possible for a different lossy codec to have either the same quality at higher rates of compression or better quality at the same rates of compression. If you don't want to believe it, want to refute volumes of research, or just want to troll, go ahead.

Aaron Spink
speaking for myself inc.

:rolleyes:

I gave a link about the fact that when you use a compression ratio higher than 3:1 you start to lose quality, so unless the codec is something magic, yes, the more you compress, the more quality you lose once you pass this 3:1 ratio.

"Absolute absence of losses is a very strong requirement, therefore it is often hard to achieve compression ratios above 3:1. "

http://www.compression.ru/video/codec_comparison/lossless_codecs_en.html

Why don't you get a clue?
 
iknowall said:
:LOL: Higher fidelity compared to what? To the original master?

The original uncompressed master blows the MPEG-2 encoded version out of the water.


:oops: Are you being sarcastic?

Obviously he meant that between the AAC and the MP3, the AAC one has better quality at the same bitrates, or the same quality at lower bitrates.

Of course the original master will always be the best version.
 
london-boy said:
:oops: Are you being sarcastic?

Obviously he meant that between the AAC and the MP3, the AAC one has better quality at the same bitrates, or the same quality at lower bitrates.

Of course the original master will always be the best version.

I thought he was talking about MPEG-2 video, not audio.

And btw, AAC won't give you better quality if you are near the data rate of the original master, because the higher the data rate, the less compression you need.

The argument is always the same:


Does MPEG-4 give you a better result than MPEG-2 at 700 kbps?

Sure.

Does this mean that MPEG-4 gives you better quality at 80 Mbps?

NO.

So stop debating this point again and again.
 
iknowall said:
I thought he was talking about MPEG-2 video, not audio

He mentioned AAC and MPEG Layer 3...

Either way, even if he was talking about video, obviously he didn't mean that the compressed formats look or sound better than the original master! Let's use a bit of common sense and read things properly before jumping at people's throats.
 
london-boy said:
He mentioned AAC and MPEG Layer 3...

I did not pay attention to "Layer 3" and read only "AAC and MPEG2", so I thought he was talking about why MPEG-2 video and AAC audio codecs can give better results.

So I asked: better result compared to what?

Either way, even if he was talking about video, obviously he didn't mean that the compressed formats look or sound better than the original master!

So the question is pointless, because I already stated that there is a point past which you start to lose quality.

Let's use a bit of common sense and read things properly before jumping at people's throats.
How many times do I have to repeat the same concept again and again because people don't want to use common sense?
 
iknowall said:
No, I'm pretty sure you are missing the point

What does it mean that the codec is more advanced? It can give a higher compression ratio, but this also translates into a loss of video quality, so it won't give you the same quality as the original. And at high bitrates, where using a high compression ratio is pointless, all this advancement in the codec becomes pointless, since higher bitrate = lower compression

MPEG2 can have pretty much any compression ratio you want. So can MPEG4. At any given compression ratio, MPEG4 will result in higher quality.



First, the comparison doesn't make much sense, because you can compress audio a lot more without any evident problem, but with video you quickly see the artifacts of the compression.

Audio compression has been a field that has had more research and lower computational demands. Video codecs are constantly improving, and will continue to improve until the point that full-frame, full-color video is doable with undetectable artifacts at reasonable bit rates.

Second, CD audio still uses the same uncompressed WAV format it has always used, and it gives you the best quality; no MP3 has the same quality as the original WAV, and an expert can tell the difference.

Actually, experts cannot tell the difference. They haven't been able to in all the double-blind testing that has been done. Some people claim they can, but can't demonstrate it. And a large segment of audio is played but once in Red Book format; after that it's compressed to MP3 for use on computers and portable music devices. I know plenty of people who haven't listened to an actual CD for years. Then there are the online music stores selling MP3s for consumer download. CDs are merely a distribution medium.

The problem is he actually does this in practice; he is not just stating an opinion, he is saying "we use this because we found out that this gives the best result", which gives him a lot more credibility than you, who don't have any real-world experience.

I know how "technologists" work, and that is mostly as PR. Listen, the guy already has the library in MPEG2 format, he doesn't want to convert it, and he was on the losing side of the standards battle for inclusion of the MS codec into BR. I wouldn't take what he says at face value. There are a lot of other people out there who disagree with him.

Aaron Spink
speaking for myself inc.
 
iknowall said:
:rolleyes:

I gave a link about the fact that when you use a compression ratio higher than 3:1 you start to lose quality, so unless the codec is something magic, yes, the more you compress, the more quality you lose once you pass this 3:1 ratio.

"Absolute absence of losses is a very strong requirement, therefore it is often hard to achieve compression ratios above 3:1. "

http://www.compression.ru/video/codec_comparison/lossless_codecs_en.html

Why don't you get a clue?

Your link doesn't say anything about lossy codecs, and it has no bearing on the topic at hand, so stop posting it and claiming it is the über insight.

All that link talks about is lossless codecs. Big flipping deal. We are talking about lossy codecs here. There is no rule that states that if one codec has 3:1 or better compression vs another codec, it's poorer. It all comes down to how advanced the codec is.

Aaron Spink
speaking for myself inc.
 
iknowall said:
the comparison doesn't make much sense, because you can compress audio a lot more without any evident problem, but with video you quickly see the artifacts of the compression.

And NO, it won't give you more quality if the data rate is almost the same as the original master.

The data rate of the master is roughly 1.5 Gb/s. We're talking video data rates in the range of 19 Mb/s which is a compression ratio of 80.

Aaron Spink
speaking for myself inc.
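As an aside, the arithmetic behind that 80:1 figure checks out; a quick sketch, taking the ~1.5 Gb/s master rate and 19 Mb/s delivery rate quoted in the post above as given:

```python
# Compression ratio implied by a ~1.5 Gb/s uncompressed master
# delivered at 19 Mb/s (both figures quoted in the post above).
master_bps = 1.5e9
delivery_bps = 19e6
print(round(master_bps / delivery_bps))  # -> 79, i.e. roughly 80:1
```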
 
Can anyone give links to clips encoded at high bit rates of H.264 HD and MPEG-2 HD video, so we can see for ourselves which is better?

It would be good if they are at the same resolution.
 
aaronspink said:
MPEG2 can have pretty much any compression ratio you want.
No it can't, please get a f*cking clue.

You can't use MPEG-2 with just any compression ratio; it has a limit, and if you pass this limit all you will see is a bunch of artifacts instead of a video.


Anyone with basic experience in video encoding would know this.

I already gave a link about this; you can search for it in one of my previous posts.

So can MPEG4. At any given compression ratio, MPEG4 will result in higher quality.

No, at the same compression ratio with a high bitrate like 80 Mbit/s it won't give you any more quality, unless it has something magic.

Audio compression has been a field that has had more research and lower computational demands.

And what does this have to do with the fact that you can compress audio a lot more than video with no evident loss in quality?

Video codec are constantly improving and will continue to improve until the point that full frame, full color video is doable with undetectable artifacts are reasonable bit rates.

:LOL:
Professional video codecs are constantly improving in the opposite sense, and every new codec gives you less compression.

HDCAM gives you 4:2:2 8-bit at 135 Mbps.

HDCAM SR, the evolution of HDCAM, uses a lot less compression and gives you 4:4:4 10-bit at 440 Mbps.

And everything you get in the consumer environment came from the technology used in the professional environment.


Actually, experts cannot tell the difference. They haven't been able to in all the double blind testing that has been done. Some people claim they can, but can't demonstrate it.

Actually, an expert can tell the difference; how many experts do you know?
You don't need to demonstrate that an MP3 will never have the quality of the original WAV, since it is a fact.

And a large segment of the audio, is played but once in redbook format. After that its compressed to MP3 for use on computers and portable music devices. I know plenty of people that haven't listened to an actual CD for years. Then their are the online music stores selling MP3s for consumer download. CD are merely a distrobution medium.

But this doesn't change the fact that CD WAV audio has better quality than MP3 audio.

I know how "technologists" work and that is mostly as PR.

All you say is based only on your opinion and not on real working experience.

Listen, the guy already has the library in MPEG2 format, he don't want to convert it, and he was on the losing side of the standard battle for inclusion of the MS codec into BR.


Hmm, no. No one makes a library in MPEG-2 format. MPEG-2 is a delivery format, not a mastering format.

You don't have any clue about how a professional environment works.

He already has a hi-def uncompressed master of all his work, and he will encode in MPEG-2 when he knows exactly what the target specs are.

If the target is a 50 GB Blu-ray disc he will use different compression and bitrate than if the target is a 25 GB Blu-ray disc.

I wouldn't take what he says at face value.

Every major Hollywood studio, like Lucas and Warner, agrees with him on the fact that MPEG-2 with a high bitrate gives you the best quality in the d-cinema environment.


There are a lot of other people out there that disagree with him.

Sorry, but all the major Hollywood studios agree with him.

Here is what Don Eklund said: “Advanced (formats) don’t necessarily improve picture quality. Our goal is to present the best picture quality for Blu-ray. Right now, and for the foreseeable future, that’s with MPEG-2.”

Here is a quote from the Avica site:

Q. What is used now and why?

A. The codec currently approved by the major Hollywood studios for digital cinema use is HD MPEG2 at high bit-rates. There are a number of other codecs that are proprietary or have never been approved for major motion picture releases.

Avica - and our interoperability partners - use HD MPEG2 at 80Mb/sec MP@HL.

http://www.avicatech.com/jpeg2000.html#q15
 
aaronspink said:
The data rate of the master is roughly 1.5 Gb/s. We're talking video data rates in the range of 19 Mb/s which is a compression ratio of 80.

Aaron Spink
speaking for myself inc.

No, we are talking about a 50 GB Blu-ray disc, which allows 2 hours of 40 Mbit/s MPEG-2 with a 40:1 compression ratio.

Disney has already stated they will use the 50 GB disc, once it is out, to release Blu-ray movies.

Please get a clue.
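For what it's worth, the capacity side of that claim is easy to sanity-check (a Python sketch; "GB" here means decimal gigabytes, 10^9 bytes, which is how disc capacities are quoted):

```python
# Does 2 hours of 40 Mbit/s MPEG-2 video fit on a 50 GB dual-layer disc?

def stream_size_gb(bitrate_bps, hours):
    """Size of a constant-bitrate stream in decimal gigabytes."""
    return bitrate_bps * hours * 3600 / 8 / 1e9

video_gb = stream_size_gb(40e6, 2)
print(video_gb)        # -> 36.0 GB of video
print(video_gb <= 50)  # -> True, with ~14 GB left for audio and extras
```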
 
aaronspink said:
Your link doesn't say anything about lossly codecs, and it has no bearing on the topic at hand, so stop posting it and claiming it is the uber incite.

:rolleyes:
The link states that over 3:1 compression you lose quality, and every lossy codec has more than a 3:1 compression ratio.


And since you stated this:

"It is possible for a different lossy codec to have either the same quality at higher rates of compression or better quality at the same rates of compression."

this is false; under high-bitrate conditions you won't have the same quality.

MPEG-4 at 26 Mbit/s won't have the same quality as MPEG-2 at 80 Mbit/s. Get over it.

And this is because a 60:1 MPEG-4 compression will always have more quality loss than a 20:1 MPEG-2 compression, so no, it is false that "it is possible for a different lossy codec to have the same quality at higher rates of compression" if we are using a high bitrate like 80 Mbit/s or 40 Mbit/s.

Why will a 60:1 MPEG-4 compression always have more quality loss than a 20:1 MPEG-2 compression?

Because the MPEG-4 codec is not a lossless codec, so it can't apply a higher compression ratio and keep the same quality, and this is why I gave you the link about the lossless codecs.

Get a clue.
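Those 60:1 and 20:1 figures can be recovered from the two bitrates, assuming the ~1.5 Gb/s uncompressed source rate quoted earlier in the thread; a quick sketch:

```python
# Compression ratios implied by the two delivery bitrates, assuming the
# ~1.5 Gb/s uncompressed source rate quoted earlier in the thread.
SOURCE_BPS = 1.5e9

def ratio(delivery_bps):
    return SOURCE_BPS / delivery_bps

print(round(ratio(26e6)))  # MPEG-4 at 26 Mb/s -> ~58:1 (roughly the "60:1")
print(round(ratio(80e6)))  # MPEG-2 at 80 Mb/s -> ~19:1 (roughly the "20:1")
```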



All that link talks about is lossless codecs. Big flipping deal. We are talking about lossy codecs here. There is no rule that states that if one codec has 3:1 or better compression vs another codec, it's poorer. It all comes down to how advanced the codec is.

Aaron Spink
speaking for myself inc.


:rolleyes: It states that over 3:1 compression you lose quality, and every lossy codec has more than 3:1, so it proves that MPEG-4, with more compression than MPEG-2, has less quality, given that we are talking about high-bitrate conditions. Why don't you get it?
 
If you stopped repeating over and over "u don't have a F*kin clue" and "get a clue" and "u don't know what the f**k u're talking about", I think people would be nicer to you and your points would be much easier to understand.

Just drop the attacks and name calling. You can very well discuss things without having to say and repeat those things.
 
iknowall said:
No, we are talking about a 50 GB Blu-ray disc, which allows 2 hours of 40 Mbit/s MPEG-2 with a 40:1 compression ratio.

Disney has already stated they will use the 50 GB disc, once it is out, to release Blu-ray movies.

Please get a clue.

It would be very nice if you could somehow prove what you are saying.

I haven't seen any comparison between AVC/VC-1/whatever vs MPEG-2 @ 40 Mbit.

There are good reasons why MPEG-2 is used throughout the industry: it's a well-established standard, and it's easy to implement in hardware and software solutions for the same reason (cheap!). But nowhere have I read that the reason was superior quality vs the new codecs.

Imho you are making claims without anything to back them up. You say at 80 Mbit you won't notice any difference; ehmm, so I guess an MPEG-2 encode @ 80 Mbit is pixel perfect?
 