Will Warner support Blu-ray?

Status
Not open for further replies.
iknowall said:
I gave links that state what I am saying is true; you are pulling it all out of your ass.

Then, I'll link to my previous post, that says, "boy, you're wrong, all wrong". Since you put so much trust in links, maybe that will convince you.

The fundamental issue is that you don't understand the fundamental issues and quote from websites that have nothing to do with any of the discussions at hand.
 
DemoCoder said:
Well, according to your theory, they should have used JPEG instead of JPEG-2000, because according to your idiotic theory, JPEG-2000 is more lossy than JPEG because JPEG-2000 has a higher compression ratio (about 20% higher) for a given PSNR.

Well, according to your theory they should have used higher compression than 20:1, because according to your idiotic theory you get more quality the more you compress the image.

But they use 5:1 compression... why?


So Mr Know It All, why did dcinema use JPEG-2000? They didn't mandate lossless JPEG, which they could have done. Instead, they mandate only the 9/7 (lossy) wavelet filter.

MJPEG video already exists, you stupid moron. It has a lot of limitations.

First, JPEG uses DCT-based compression, whereas JPEG2000 is a new wavelet-based compression.

Second, JPEG2000 has scalable, multi-resolution compression.

Third JPEG2000 offers Region of Interest (ROI) coding. You can define regions within an image which need to be coded with a higher quality than the remaining parts of the image. These ROIs are placed early in the codestream so that they will be decoded or refined first in the decoding process.

And so on for a lot of other aspects that I won't waste time writing out, since you can search for them yourself.

Could be several reasons: If bandwidth is cheaper than CPU, then a more computationally intensive decoder imposes a higher overall cost.

Bandwidth is not cheaper: a RAID disk array capable of managing the 250 Mbit/s data rate is absolutely not cheaper than a simple chip that decodes a more compressed stream.

You make absolutely no sense.


Simplicity of workflow,

Simplicity of workflow my ass; having a bigger file makes the whole delivery a lot more complicated and expensive.

In a real-world situation, which you obviously can't know because you talk out of your ass and don't have real working experience, you have to deliver the digital copy of the film to all the theaters, which store it on their servers.

Today you can send it over a satellite connection, which takes a lot of time because the file is very big, or send a capable operator to the theater with the movie stored, via a special program, on a hard drive, from which it is transferred to the theater server's drives.

Transferring the movie by hard drive is certainly faster than using a satellite connection, but you also need a specialized person who knows how the Avica server works, and transferring the movie still takes time.

Having a much bigger file would make all of this far more complicated than it is today: the satellite connection would take a whole lot more time to transfer a single movie, and copying directly to the theater server from an external drive would also take a lot more time.
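To put rough numbers on the delivery problem described above (all figures here are illustrative assumptions: a 2-hour feature at the DCI maximum of 250 Mbit/s and a hypothetical 45 Mbit/s satellite link):

```python
# Rough transfer-time math for digital cinema delivery (illustrative numbers).

def transfer_hours(file_gbits: float, link_mbits_per_sec: float) -> float:
    """Hours needed to move a file of `file_gbits` gigabits over a link."""
    return file_gbits * 1000 / link_mbits_per_sec / 3600

movie_seconds = 2 * 3600                            # 2-hour feature (assumption)
dci_rate_mbit = 250                                 # DCI JPEG2000 max data rate
file_gbits = movie_seconds * dci_rate_mbit / 1000   # 1800 Gbit = 225 GB

sat_link_mbit = 45                                  # hypothetical satellite link
print(f"File size: {file_gbits / 8:.0f} GB")
print(f"Satellite transfer: {transfer_hours(file_gbits, sat_link_mbit):.1f} h")
```

The exact link speed is an assumption, but the shape of the argument holds: the bigger the encoded file, the longer every delivery path takes.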



because authoring MPEG-2 requires more intervention than non-temporal/predictive coders. Features: intraframe-based coding is more error resilient and supports seekable streams.



:rolleyes:
Yeah, and the fact that the quality is better with less compression doesn't matter at all, right?

It is just a coincidence that you get higher quality; it was never taken into account.

Like I said, you claim that more compression = more loss and inferior quality, and then you trot out DCI, which uses JPEG2000 instead of JPEG. If they were interested in a minimum compression ratio, why didn't they pick JPEG over JPEG-2000 then?

:rolleyes:
In fact they went to a lower compression ratio: from 20:1 compression using MPEG-2 to 5:1 compression using Motion JPEG2000.

If that is not interest in using a lower compression ratio, I don't know what is.

They don't use JPEG for the same reason you don't use MPEG-1 instead of MPEG-2: MJPEG video has too many limitations.


I'll tell you why: your thesis that more compression implies that loss must increase is wrong.

Then why did they go from 20:1 compression to 5:1 compression?

Your ridiculous thesis implies that, with the greater compression capability of Motion JPEG2000, they would use more compression instead of less.

But they went from 20:1 to 5:1.

Frankly, I'm not very impressed by DCI. They spend 80% of their specification on security, and a few pages for compression/image quality, and the rest on infrastructure/transport which shows that the spec is not concerned with image quality, but with security and cost savings that come from digital distribution.

So I repeat: if they don't care about using less compression, why did they go from 20:1 to 5:1?
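For a sense of what those two ratios mean in data-rate terms, here is a rough sketch; the source format (2K 4:4:4, 12-bit, 24 fps) is an assumption for illustration, so the exact numbers depend on the actual master:

```python
# Rough data rates implied by 20:1 vs 5:1 compression of a 2K source.
# Source format (2048x1080, 4:4:4, 12-bit, 24 fps) is assumed for illustration.
w, h, channels, bits, fps = 2048, 1080, 3, 12, 24
uncompressed_mbit = w * h * channels * bits * fps / 1e6   # ~1911 Mbit/s

for ratio in (20, 5):
    print(f"{ratio}:1 -> {uncompressed_mbit / ratio:.0f} Mbit/s")
```

Whatever the exact source numbers, the point stands: at 5:1 the encoded stream carries four times the data of the same source encoded at 20:1.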


DCT doesn't do ANY compression you stupid moron.

:rolleyes:
Yes it does, stupid moron.

The purpose of DCT-based coding is reducing the redundancy of the information: it discards video information that is redundant, which you won't notice is missing. That makes the video smaller, and that is compression.

Also, as proof that you can call DCT a form of compression, I give you this quote:

" At the heart of JPEG2000 is a new wavelet-based compression methodology that imparts a number of benefits over the discrete cosine transform (DCT) compression methods used in JPEG.

http://www.us.design-reuse.com/articles/article4595.html
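Both halves of this argument can be seen in a few lines of numpy: the DCT by itself is perfectly invertible (it loses nothing), while quantizing its coefficients is what introduces the loss and produces the runs of zeros that the entropy coder then shrinks. A minimal sketch on one 8x8 block, with an illustrative quantization step:

```python
# DCT -> quantize -> inverse: the transform is lossless, quantization is not.
import numpy as np

def dct_matrix(n: int) -> np.ndarray:
    """Orthonormal DCT-II basis matrix (rows are basis vectors)."""
    k = np.arange(n)
    m = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    m[0] /= np.sqrt(2)
    return m * np.sqrt(2 / n)

D = dct_matrix(8)
block = np.add.outer(np.arange(8.0), np.arange(8.0)) * 16   # smooth 8x8 gradient

coeffs = D @ block @ D.T             # forward 2-D DCT
restored = D.T @ coeffs @ D          # inverse transform
assert np.allclose(restored, block)  # the DCT alone loses nothing

q = 32                               # coarse quantization step (illustrative)
quantized = np.round(coeffs / q)
lossy = D.T @ (quantized * q) @ D    # decode from quantized coefficients
zeros = int((quantized == 0).sum())
print(f"{zeros}/64 coefficients quantized to zero")  # zeros -> compressibility
```

On this smooth block almost all coefficients quantize to zero, which is exactly the property the later entropy-coding stage exploits.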


FFT, DCT, DWT are just transformations into frequency/scale domains. Why do you persist in pretending to know what you're talking about, like referencing Shannon's information and rate distortion theories, when you don't have a clue as to how those theories work. Google fishing expeditions are not helping your case.

The only one who has no clue here is you, pretending to know more than the Sony vice president and the MS VP.

Talking out of your ass is not helping your case.

MPEG-2 DCT introduces distortions because the DCT and IDCT introduce inverse transform mismatch errors. H.264's pseudo-DCT was designed to avoid these problems.

The compression step for spatial coding in MPEG is not DCT, it's quantization followed by entropy encoding.

But at high bitrates those problems are so limited that you won't have any type of artifact to fix.

Number 1, it doesn't. Tests conducted by SMPTE and ISO/ITU show that MPEG-4 AVC/FRext @ 16Mbps is nearly indistinguishable from the uncompressed master in viewing tests.

16 megabits or megabytes per second?

If I take an uncompressed master and encode it at 16 Mbit/s, a 90:1 compression, it looks like ass compared to the master.

If I take an uncompressed master and encode it at 16 MB/s, which is 128 Mbit/s, roughly an 11:1 compression, it will surely look close to the quality of the original master at such a low compression ratio.

But it won't look any better than MPEG-2 video at the same bitrate; that would also look close to the quality of the master.
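The arithmetic behind the two readings of "16 per second" can be sketched in a couple of lines. The ~1440 Mbit/s uncompressed master rate is an assumption, back-derived from the 90:1-at-16-Mbit/s figure quoted above; note that 16 MB/s works out to 128 Mbit/s, roughly 11:1 against that master:

```python
# Compression ratios for the two readings of "16 per second".
master_mbit = 90 * 16    # ~1440 Mbit/s master, implied by 90:1 at 16 Mbit/s

def ratio(encoded_mbit: float) -> float:
    """Compression ratio against the assumed uncompressed master."""
    return master_mbit / encoded_mbit

print(f"16 Mbit/s -> {ratio(16):.0f}:1")                        # 90:1
print(f"16 MB/s = {16 * 8} Mbit/s -> {ratio(16 * 8):.2f}:1")    # 11.25:1
```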







Number 2, the situations aren't comparable. DVCPRO is a 4:1:1 format. Try comparing DVCPRO 10-bit 4:1:1 to 10-bit 4:1:1 H.264 FRext.

Wrong

DVCPRO HD uses 4:2:2 8-bit color sampling. It is common knowledge that the DV standard uses an 8-bit color format.

Please get a clue.

By the way, give me a comparison where MPEG-4 with the same bitrate and color sampling gives a better result.

The whole reason that DV formats exist is to provide non-linear editing support. This means interframe coding can't be used.

No, the whole reason is to have a cheap digital mastering and editing format, and no mastering format would ever use an interframe codec.

Not only that, but your ignorant worship of DVCPRO HD ignores a fundamental flaw of DVCPRO HD: it throws away 50% of the pixels. It downsamples 1920x1080 frames into 1280x1024. H.264 does not throw away 50% of the luma information in the signal.

The only ignorant here is you. Please get a clue.

DVCPRO HD doesn't downsample 1920x1080 to 1280x1024; DVCPRO HD cameras record at 1280x1080, which gets up-rezzed to 1920x1080.

This was done to make a cheaper camera; it is not a limitation of the codec itself.

To make a less expensive 1080p camera, Panasonic built a camera that, instead of recording a full 1920x1080 frame, records at 1280x1080, which gets up-rezzed to 1920x1080 by the codec.

HDCAM also records at 1440x1080, which gets up-rezzed to 1920x1080 on playback.

The camera does this internally; it is not a limitation of the codec itself.

The only HD camera that records a real 1920x1080 resolution is the Thomson Viper FilmStream camera. Please get a clue.
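The up-res step described above is just horizontal interpolation: 1280 recorded samples per line are stretched to 1920 on playback, so a third of the output samples are synthesized rather than recorded. A toy sketch with linear interpolation (real decoders use better filters):

```python
# Stretch one 1280-sample scanline to 1920 samples by linear interpolation.
import numpy as np

recorded = np.linspace(0.0, 255.0, 1280)    # one stored scanline (toy data)
positions = np.linspace(0.0, 1279.0, 1920)  # where the 1920 output samples fall
uprezzed = np.interp(positions, np.arange(1280), recorded)

print(recorded.size, "->", uprezzed.size)   # 1280 -> 1920
```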


I provided two links. One link showing that OBJECTIVE PSNR tests show that H.264 beats MPEG-2 handily. Another, SUBJECTIVE visual test with human subjects shows that H.264 beats MPEG-2.

Look moron, google fishing doesn't help you.

I fail to see where your link says what the advantage is at a high bitrate with no distortion present.

Your link absolutely doesn't say what type of video artifact MPEG-4 can fix to make the video better, when the video doesn't have any artifacts to fix.

Look moron, google fishing doesn't help you. Your statement is trivially wrong. Take JPEG with Huffman entropy encoding vs JPEG with arithmetic entropy encoding, using the same quantization. The two decompressed images will be bit-for-bit IDENTICAL, but the compression ratio will be greater for JPEG with arithmetic entropy encoding.

Your statement is trivially wrong if you don't specify the compression ratio you use.

It will hold from some point on; it may be a very low compression point, but that point is still there, unless the codec does something magic.
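The point about the entropy stage being lossless is easy to demonstrate: compressing the same data with a weaker vs. a stronger lossless back end changes the compression ratio without changing a single decoded bit. A sketch using zlib compression levels as a stand-in for Huffman vs. arithmetic coding (an analogy, not the actual JPEG entropy coders):

```python
# Same input, two lossless back ends of different strength: the decoded
# bytes are identical, only the compressed size (the ratio) differs.
import zlib

data = b"quantized DCT coefficients " * 200   # toy payload

fast = zlib.compress(data, level=1)           # weaker entropy stage
best = zlib.compress(data, level=9)           # stronger entropy stage

assert zlib.decompress(fast) == data          # bit-for-bit identical
assert zlib.decompress(best) == data
print(len(data), len(fast), len(best))        # better ratio, zero extra loss
```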

No, YOU need to provide the link that says the opposite, because YOU are the one making the claims that MPEG-2 is better than MPEG-4.

I don't need a link: MPEG-2 at high bitrate is used in dcinema, and the video doesn't have any type of artifact.

I don't need a link to prove something that EXISTS, and everyone who doesn't believe me can go and see for himself.

YOU are the one claiming that MPEG-4 at the same 80 Mbit/s bitrate can make the video look even better, so provide a link stating that MPEG-4 can improve quality in a video that already looks perfect.

Provide a link stating what type of video artifact MPEG-4 can fix to make the video better, when the video doesn't have any artifacts to fix.
 
aaronspink said:
Then, I'll link to my previous post, that says, "boy, you're wrong, all wrong". Since you put so much trust in links, maybe that will convince you.

The fundamental issue is that you don't understand the fundamental issues and quote from websites that have nothing to do with any of the discussions at hand.

The fundamental issue you don't understand is that someone who talks out of his ass doesn't have more credibility than someone who actually does this for a living.

You can say that MPEG-4 gives a better result at 80 Mbit/s because "DemoCoder says so", but that doesn't change the fact that when I send an HD-D5 master for dcinema encoding, I get the best result using the MPEG-2 HD codec.

He doesn't do this for work, and he doesn't work in the movie industry, so to me, someone with actual working experience in the industry, he has no credibility compared to someone who does this for real.

Merely saying "this is true because DemoCoder says so" doesn't make his argument valid or true.

This is not a random quote taken from a website or "google fishing"; this is a real-world statement made by an MS VP who actually does this for work:

"Even the MS VP said at AVS that VC-1 bitrates over the mid to high teens don't yield better results."
 
The sheer display of blockheadedness must be some sort of record. I have to admit that I have never seen someone so wrong argue so much about something they clearly know so little about, while claiming to be some sort of industry expert.

/salute
 
Thread Locked

This thread is going nowhere.
It's just the same arguments reformulated again and again, in a more passive-aggressive form each time.

If there's new info about Blu-ray, or if someone wants to discuss the merits of the format with regard to video games, just start a new thread.
 