Will Warner support Blu-ray?

iknowall said:
I am done with this point as well.

With a high bitrate, less compression = more quality, and MPEG-2 is less compressed.

The only case where you get the same quality with more compression is with a lossless codec.

Wrong. You simply don't understand how DCT-based lossy codecs work. All codecs (MPEG-1, MPEG-2, MPEG-4, VC-1, et al.) consist of a pipeline in which portions of the algorithm are reversible (non-lossy) and other portions are not. The lossy portion of MPEG-4 AVC is actually less lossy than MPEG-2's; for example, the new integer 4x4 DCT introduces less round-off error and less numeric mismatch between the DCT and IDCT.
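To make the reversible-vs-lossy split concrete, here is a minimal sketch (assuming NumPy/SciPy, and using a generic 8x8 DCT rather than the actual H.264 4x4 integer transform): the transform stage round-trips to within floating-point noise, and it is the quantization of the coefficients - the knob that sets the compression ratio - that actually destroys information.

Code:
import numpy as np
from scipy.fftpack import dct, idct

def dct2(b):  return dct(dct(b, axis=0, norm='ortho'), axis=1, norm='ortho')
def idct2(c): return idct(idct(c, axis=0, norm='ortho'), axis=1, norm='ortho')

rng = np.random.default_rng(0)
block = rng.integers(0, 256, size=(8, 8)).astype(np.float64)

# The transform alone is (nearly) reversible...
print(np.abs(idct2(dct2(block)) - block).max())        # ~1e-13

# ...the loss comes from quantizing the coefficients:
# a coarser step = more compression = more error.
for step in (1, 8, 32):
    recon = idct2(np.round(dct2(block) / step) * step)
    print(step, np.abs(recon - block).max())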

The biggest difference in principle between MPEG-2 and MPEG-4 AVC, efficiency-wise, is MPEG-4 AVC's superior interframe prediction, a process which is entirely lossless, because MPEG-4 widens the search window beyond the next or previous frame and allows the block size to vary.

So H.264 can deliver better quality at the same bitrate, or the same quality at a lower one.
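For the curious, interframe prediction boils down to block matching. A toy sketch (illustrative block and search-window sizes, not H.264's actual algorithm):

Code:
import numpy as np

def best_match(ref, block, y0, x0, radius=8):
    # Exhaustive block-matching search: find the offset in the reference
    # frame whose pixels best predict `block` (smallest sum of absolute
    # differences). Real encoders search faster, but the idea is the same.
    bh, bw = block.shape
    best = (0, 0, float('inf'))
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = y0 + dy, x0 + dx
            if 0 <= y <= ref.shape[0] - bh and 0 <= x <= ref.shape[1] - bw:
                sad = np.abs(ref[y:y+bh, x:x+bw] - block).sum()
                if sad < best[2]:
                    best = (dy, dx, sad)
    return best

rng = np.random.default_rng(0)
prev = rng.random((64, 64))
cur = np.roll(prev, (2, -3), axis=(0, 1))          # simulate a camera pan
print(best_match(prev, cur[16:24, 16:24], 16, 16)) # -> (-2, 3, 0.0): exact match

Only the motion vector and the (here zero) residual get coded; the search itself discards nothing, which is why widening it - more reference frames, variable block sizes - buys efficiency without adding loss.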
 
iknowall said:
But not enough to know HDV.

What the hell does HDV vs DV have to do with this thread anyway?

Your arguments are so flawed it hurts the eyes.

In your world MPEG-1 quality is better than MPEG-2 because it compresses the video stream less. Old formats like Video-1 from Microsoft are even better; they hardly compressed the video (it looked like shit, though).

If you don't understand that the new VC-1 and h.264 have the potential to give better picture quality, then you are working in the wrong place.

I suggest you read this:

http://en.wikipedia.org/wiki/H.264

It even points out where the h.264 codec is more advanced than MPEG-2.

And forget anything about a 100 GB BR disc for the foreseeable future...
 
DemoCoder said:

Improving television with larger screens and better resolution requires a huge increase in transmission bit rates. The bit rates are, however, limited by the available broadcast spectrum or network connection. The only recourse is lossy image compression, most commonly JPEG, MPEG-2, Wavelets or Fractals. "Lossy" by name and lossy by nature. The more the image is compressed, using lossy methods, the worse the image quality.

http://www.autosophy.com/videcomp.htm

If this is wrong, then why does the uncompressed master always have the best quality?

Why does D-Cinema use JPEG 2000 at 250 Mbit/s instead of MPEG-2 at 80 Mbit/s?

Why is the new standard less compressed?

If the future and the quality are obtained using more compression and more advanced codecs, why do they use a less compressed codec?

Simply because your statement is false, and the facts are with me.

You simply don't understand how DCT-based lossy codecs work. All codecs (MPEG-1, MPEG-2, MPEG-4, VC-1, et al.) consist of a pipeline in which portions of the algorithm are reversible (non-lossy) and other portions are not. The lossy portion of MPEG-4 AVC is actually less lossy than MPEG-2's; for example, the new integer 4x4 DCT introduces less round-off error and less numeric mismatch between the DCT and IDCT.

MPEG-4's compression ratio is itself limited by Shannon's rate-distortion theory.
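As a neutral reference, the bound in question, for a memoryless Gaussian source under mean-squared-error distortion, is

D(R) = \sigma^2 \cdot 2^{-2R}

i.e. each extra bit per sample cuts the achievable distortion by a factor of four. The bound applies to every codec equally; codecs differ only in how closely they approach it at a given rate.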


MPEG-4 introduces new and more advanced algorithms because it compresses the image more, and unless the compression is lossless, compressing more always loses more quality.

MPEG-4 needs better error prediction because it is made to use more compression, with more possible errors.

But the less compression you apply, the fewer errors you can have, and the less importance the error prediction assumes.

If you have no errors to fix, having an algorithm that can fix more errors does nothing.

This is the point that you don't want to get.


How do lossy codecs destroy information?


To begin with, DV-25 and MPEG-2 use a 4:1:1 or 4:2:0 pixel representation, meaning that they throw away 3 out of every 4 chroma samples compared to a full 4:4:4 pixel representation. Then they chop up the image into square tiles and throw away most of the detail by approximating each tile with a few coarsely gradated sinusoidal waves, to compress the 4:1:1 or 4:2:0 video by a factor of 5 (for DV-25), and typically by a factor of 10 to 100 for MPEG-2. Due to roundoff errors, lossy codecs actually degrade the image further each time it is decompressed and recompressed.

http://www.bitjazz.com/sheervideo/support/faq.shtml
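The chroma bookkeeping in that quote is easy to check (per-frame sample counts; the frame size is just an example):

Code:
W, H = 1920, 1080
for name, chroma in (("4:4:4", 1.0), ("4:2:2", 0.5),
                     ("4:2:0", 0.25), ("4:1:1", 0.25)):
    samples = W * H * (1 + 2 * chroma)          # 1 luma + 2 chroma planes
    print(f"{name}: {samples/1e6:.2f}M samples/frame"
          f" ({(1 + 2 * chroma)/3:.0%} of 4:4:4)")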




The only way to have a better lossy portion is to have less complicated compression.

And MPEG-4 is an evolution of MPEG-2, and it is made to compress the image more, not less.
The biggest difference in principle between MPEG-2 and MPEG-4 AVC, efficiency-wise, is MPEG-4 AVC's superior interframe prediction, a process which is entirely lossless, because MPEG-4 widens the search window beyond the next or previous frame and allows the block size to vary.
So H.264 can deliver better quality at the same bitrate, or the same quality at a lower one.

:rolleyes:

And here is where it's clear you don't have any clue.

MPEG-4 AVC also has superior interframe prediction compared to DVCPRO HD.

Does this make MPEG-4 look better than DVCPRO HD at the same bitrate?

No, DVCPRO HD blows MPEG-4 out of the water.

Interframe prediction produces a 'prediction error' picture.

It has to be more advanced in MPEG-4 because MPEG-4 needs to apply heavier compression.

But the less compression you apply, the fewer errors you can have, and the less importance interframe prediction assumes.

If you have no errors to fix, having an algorithm that can fix more errors does nothing.

So the more advanced interframe prediction is useless at 20:1 compression, and doesn't make MPEG-4 better than MPEG-2 at high bitrate with low compression.
 
-tkf- said:
What the hell does HDV vs DV have to do with this thread anyway?

Read the thread again; you will understand what HDV has to do with GOP MPEG-2 compression.


Your arguments are so flawed it hurts the eyes.

My arguments are so solid that the new standard used for DCI is a less compressed format than the previous one.

In your world MPEG-1 quality is better than MPEG-2 because it compresses the video stream less. Old formats like Video-1 from Microsoft are even better; they hardly compressed the video (it looked like shit, though).

No, because MPEG-2 actually uses a higher resolution and a higher bitrate, and takes more space on the disc.

You can put about 50 minutes of good-quality MPEG-1 on a CD, and only about 20 minutes of MPEG-2.
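(Those figures check out, assuming a ~700 MB CD and typical bitrates of about 1.5 Mbit/s for MPEG-1 and 4-5 Mbit/s for SD MPEG-2:)

Code:
cd_bits = 700 * 8e6                              # ~700 MB disc
for codec, mbps in (("MPEG-1", 1.5), ("MPEG-2", 4.5)):
    print(codec, round(cd_bits / (mbps * 1e6) / 60), "min")
# MPEG-1: ~62 min, MPEG-2: ~21 min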

Sorry, but saying that Video-1 barely compresses the pictures and that the video looks like shit is a contradiction.

If the codec doesn't apply any type of modification to the video, the quality doesn't change.

So if the video quality becomes like shit, it is obviously not a codec that applies no change to the video.

The old Digibeta standard, with its 3:1 lossless compression and its 4:4:4 10-bit sampling, blows h.264, VC-1, MPEG-2, and MPEG-4 out of the water.

If you don't understand that the new VC-1 and h.264 have the potential to give better picture quality, then you are working in the wrong place.

I suggest you read this:

http://en.wikipedia.org/wiki/H.264

It even points out where the h.264 codec is more advanced than MPEG-2.

And forget anything about a 100 GB BR disc for the foreseeable future...

You forget that more advanced doesn't mean it gives better quality at high bitrates, where these advancements are useless.

And this quote supports my claim:

Don Eklund apparently said

"Advanced (formats) don’t necessarily improve picture quality."
 
-tkf- said:
If you don't understand that the new VC-1 and h.264 has the potential to give a better picture quality then you are working in the wrong place.

I suggest you read this:

http://en.wikipedia.org/wiki/H.264

It even points out where the h.264 codec is more advanced than MPEG2

It's already been said, but I think it's worth asking again since your point seems to be made a lot - I thought the whole point of MPEG-4/h.264, at least, was to offer up to the same quality as MPEG-2, but with a lower bitrate/more compression? If this is the case, it makes perfect sense to use MPEG-2 while you can fit everything on the disc (particularly since experience with MPEG-2 is also far greater than with MPEG-4). I'm sure that later, as studios look to pack more content onto a single disc, they'll look to these more "advanced" formats in order to get better compression, but the reasons won't have much to do with picture quality, I don't think.

I'm not very familiar with VC-1, though. Does it offer better quality, or similar-to-lower quality at significantly lower bitrates?
 
Titanio said:
It's already been said, but I think it's worth asking again since your point seems to be made a lot - I thought the whole point of MPEG-4/h.264, at least, was to offer up to the same quality as MPEG-2, but with a lower bitrate/more compression? If this is the case, it makes perfect sense to use MPEG-2 while you can fit everything on the disc (particularly since experience with MPEG-2 is also far greater than with MPEG-4). I'm sure that later, as studios look to pack more content onto a single disc, they'll look to these more "advanced" formats in order to get better compression, but the reasons won't have much to do with picture quality, I don't think.

Exactly.

In real-world encoding situations, MPEG-2 at the right bitrate gives you the best result; like you said, "particularly since experience with MPEG-2 is also far greater than with MPEG-4". Anyone with real MPEG-2 encoding experience knows how much better the result is.

There is absolutely no point in using a more compressed codec when you have more space.

Compression destroys the original video information.
 
Titanio said:
I'm not very familiar with VC-1, though. Does it offer better quality, or similar-to-lower quality at significantly lower bitrates?

VC-1 is essentially WMV 9. Microsoft keeps a Showcase section here where you can download 720p and 1080p versions. "Step into Liquid" is a quite famous example, but this is mostly because it was one of the first demos. Another is the Terminator 2 hi-def.

VC-1/WMV 9 should offer better quality at the same bitrates. It looks better at very low bitrates and allows for greater resolution at high bitrates. The strength is in how well it scales: you can have PSP/iPod-type videos at very low bandwidth and hi-def at high bitrates, something MPEG-2 will have a difficult time matching. There may be some differences (compared to MPEG-2, which is very mature and can produce a very nice soft image) in how colors come through under certain conditions, where many of these high-compression algorithms have problems with gradients in low luma ranges. However, this may be more about the encoder used rather than a strict codec limitation. For the most part, these problems have been eliminated and the image looks very good; you just need the right hardware to decode it.

It's difficult to make a final assessment without reference encoder tools to play with. If you look at something like MPEG-2, the encoder used to create the video can make a huge difference. I assume these showcase videos are about as good as it gets and it looks very good to me.

EDIT:

Here is a nice page that summarizes (at the bottom) the scalability of VC-1/WMV9. You might want to take it with a grain of salt, because it is a tradition to overstate the technical capabilities of a codec, but it should give you a rough idea. Otherwise, using the Windows Media Encoder to create some videos yourself should familiarize you with the codec. It may not be the best encoder in the world, but it's not bad either.
 
Titanio said:
It's already been said, but I think it's worth asking again since your point seems to be made a lot - I thought the whole point of MPEG-4/h.264, at least, was to offer up to the same quality as MPEG-2, but with a lower bitrate/more compression?

40 Mbit h.264 should "blow MPEG-2 out of the water" even if the experience is on the MPEG-2 side. The question is whether we will be able to see the difference, and whether the source will be good enough to show any difference.

The sad truth is we're discussing something totally unlikely: Hollywood releasing 50 GB dual-layer discs when they can settle for 25 GB and get quality that is "good enough".

Or a 25 Mbit MPEG-2 stream for a 2-hour movie with no extras :)
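(The arithmetic behind that figure, for anyone checking: a 25 Mbit/s video stream for two hours just squeezes under a 25 GB single layer.)

Code:
size_gb = 25e6 * 2 * 3600 / 8 / 1e9   # bits/s x seconds -> bytes -> GB
print(f"{size_gb:.1f} GB")            # 22.5 GB, before audio and overhead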
 
-tkf- said:
40 Mbit h.264 should "blow MPEG-2 out of the water" even if the experience is on the MPEG-2 side. The question is whether we will be able to see the difference, and whether the source will be good enough to show any difference.

Why should it? Because it is more advanced? Because it uses a more complicated and advanced compression?

So it should look better than DVCPRO HD too.

MPEG-4 is more advanced and more efficient than DVCPRO.

MPEG-4 has a lot more functions and advancements than DVCPRO HD, and DVCPRO is a much less complicated codec than MPEG-4, but it blows MPEG-4 out of the water.

It's clear that at high bitrates all of those "advancements", designed for low-bitrate situations, are useless, since these professional codecs that use high bitrates don't have them.
 
-tkf- said:
40 Mbit h.264 should "blow MPEG-2 out of the water" even if the experience is on the MPEG-2 side.
How much processing power does it take to play a 40 Mbps h.264 stream?
 
-tkf- said:
40 Mbit h.264 should "blow MPEG-2 out of the water" even if the experience is on the MPEG-2 side. The question is whether we will be able to see the difference, and whether the source will be good enough to show any difference.

The sad truth is we're discussing something totally unlikely: Hollywood releasing 50 GB dual-layer discs when they can settle for 25 GB and get quality that is "good enough".

Or a 25 Mbit MPEG-2 stream for a 2-hour movie with no extras :)

Actually, I heard there's a point of diminishing returns for the advanced codecs.

Even the MS VP said at AVS that VC-1 bitrates over the mid to high teens don't yield better results.

Instead, they're usually talking about 12-16 Mbps for their HD-DVD releases. And there are some reports about SBC using 5 Mbps VC-1 to deliver HDTV streams for their IPTV service.
 
-tkf- said:
40 Mbit h.264 should "blow MPEG-2 out of the water" even if the experience is on the MPEG-2 side. The question is whether we will be able to see the difference, and whether the source will be good enough to show any difference.

Here's the contradiction: if you are expecting the differences to be so slight as to question whether people will be able to catch the difference, or whether the material will be demanding enough to expose a difference, how can it "blow MPEG-2 out of the water"? Either the difference is distinct and obvious, or the differences are well into the point of diminishing returns. You can't have it both ways.

The datapoint that puts context to this is the 40 Mb/s figure. At such generous datarates, the differences between mpeg2 and 4 will be slight, if any (speaking of current consumer resolutions, of course - this can change as you would expect if we are talking about >1080p applications). In that range, there is nearly nothing for mpeg4 to do better, because both codecs are in a comfortable range for that resolution to get the job done properly.

Where mpeg4 would really show off its advantages is when you have to wrench down the datarate to 15, 10, 5 Mb/s. That's where mpeg2 will have increasing problems keeping up for a given resolution. Additionally, it is not a matter of picture quality being "better" at that point. It is a matter of maintaining/retaining picture quality from the "unconstrained" 40 Mb/s scenario.

I think one of the points that iknowall has been trying to get across (albeit maybe not directly) is that when we are getting down to such aggressive compression ratios, there is no sense of "better" picture quality, rather how much picture quality can be salvaged compared to the original content. It's all downhill, so to speak, as you move further down the chain beyond the codecs that professional equipment uses to store the program.

From that standpoint, more compression (as it is presented in consumer formats) is your enemy no matter if it is coming from mpeg2 or mpeg4 or mpeg100. Proposing ways to compress even further inherently goes the opposite direction to picture quality. The only variable left is how well/how gracefully the codec performs at ever smaller datarates (higher compressions). Naturally, more advanced will perform better, but at that point, it is a matter of maintaining reasonable picture quality, not "improving" picture quality (because if you were really interested in "picture quality", you wouldn't be flirting with the lower datarates in the first place). The envelope of "picture quality" inherently lies in the realm of generous datarates, period.
 
-tkf- said:
40 Mbit h.264 should "blow MPEG-2 out of the water" even if the experience is on the MPEG-2 side. The question is whether we will be able to see the difference, and whether the source will be good enough to show any difference.

The sad truth is we're discussing something totally unlikely: Hollywood releasing 50 GB dual-layer discs when they can settle for 25 GB and get quality that is "good enough".

Or a 25 Mbit MPEG-2 stream for a 2-hour movie with no extras :)

Sadly, 25 GB discs are not going to be ubiquitous thanks to the HD-DVD9 and BD9 "red laser" formats. Warner Bros., for example, wants to release its content on the DVD9 format, and this means VC-1 or H.264.

Fear not, look here http://ftp3.itu.ch/av-arch/jvt-site/2004_07_Redmond/JVT-L033.doc

H.264 outperforms 24 Mbps D-Theater MPEG-2 at all bitrates above 8 Mbps, as measured by perceptual quality tests. In fact, viewers rated H.264 at 16 Mbps as differing from the original uncompressed source by 0.7%, as opposed to 12% for 24 Mbps MPEG-2.

This means that "red laser" HD discs will still look the same as or better than 24 Mbps MPEG-2.
 
randycat99 said:
The datapoint that puts context to this is the 40 Mb/s figure. At such generous datarates, the differences between mpeg2 and 4 will be slight, if any (speaking of current consumer resolutions, of course - this can change as you would expect if we are talking about >1080p applications). In that range, there is nearly nothing for mpeg4 to do better, because both codecs are in a comfortable range for that resolution to get the job done properly.

Not true, because no matter how high you boost the datarate, MPEG-2 has a minimal distortion floor that is worse than MPEG-4's. For example, if you give MPEG-2 an arbitrarily high bitrate, it will not give you lossless or near-lossless images, and its distortion will be higher than H.264 FRExt's. This is because of the inferior DCT/IDCT match of MPEG-2. Given that perceptual tests of MPEG-2 still show, experimentally, a perceivable difference at 24 Mbps, the extrapolation that this will be eliminated at higher data rates is not yet supportable.

Moreover, 40 Mb/s isn't enough for MPEG-2 if you want to use 4:2:2 or 4:4:4 with 10-12 bit samples. Going to 4:2:2 will double data rates and 4:4:4 quadruple them.
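For scale, a rough sketch of the uncompressed rates in question (1080p24, 10-bit samples; the numbers are only illustrative):

Code:
W, H, FPS, BITS = 1920, 1080, 24, 10
for name, spp in (("4:2:0", 1.5), ("4:2:2", 2.0), ("4:4:4", 3.0)):
    raw = W * H * FPS * spp * BITS              # uncompressed bits/second
    print(f"{name}: {raw/1e9:.2f} Gbit/s raw -> {raw/40e6:.0f}:1 needed at 40 Mb/s")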

What people defending MPEG-2 don't seem to realize is that H.264 contains many proposals that were slated for MPEG-2 but got dropped for time constraints. Here we are all talking about solving quality problems with enormously expensive hardware upgrades (manufacturing 100 GB discs), when software solves the problem much better.


From that standpoint, more compression (as it is presented in consumer formats) is your enemy no matter if it is coming from mpeg2 or mpeg4 or mpeg100. Proposing ways to compress even further inherently goes the opposite direction to picture quality.

This is simply fallacious. The idea that going beyond MPEG-2 requires a loss of quality assumes that MPEG-2 is already on the edge of the rate-distortion curve, which is unknown and most likely not true. The major difference between MPEG-2 and H.264 temporal coding is the extension of the search space from 1-2 frames up to 32 reference frames. None of this throws away more data, any more than extending the window of a lossless windowed compression algorithm does. H.264 also allows block sizes to vary (which allows more opportunities for the predictor) and offers more prediction modes (9 instead of 4). CABAC also adds efficiency with no loss of quality.
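To see why a bigger menu of prediction modes is "free" quality-wise, here's a toy sketch (hypothetical 1-D predictors, nothing like H.264's actual nine modes): per block, the encoder keeps whichever mode leaves the smallest residual, so adding modes can only shrink, never grow, what has to be coded.

Code:
import numpy as np

rng = np.random.default_rng(1)
row = np.cumsum(rng.normal(size=256))      # a smooth-ish row of samples

predictors = {                              # three made-up modes
    "DC":   lambda x: np.full_like(x, x.mean()),
    "left": lambda x: np.concatenate(([x[0]], x[:-1])),
    "grad": lambda x: np.concatenate((x[:2], 2 * x[1:-1] - x[:-2])),
}

# Per 16-sample block, pick the mode with the smallest residual; the total
# never increases as modes are added to the menu.
for k in (1, 2, 3):
    modes = list(predictors)[:k]
    total = sum(min(np.abs(row[i:i+16] - predictors[m](row[i:i+16])).sum()
                    for m in modes)
                for i in range(0, 256, 16))
    print(f"{k} mode(s): residual to code = {total:7.1f}")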

MPEG is a combination of JPEG and temporal compression. For 'I' frames, I think you'll find that H.264 won't be that much better than MPEG-2 (but it also won't be worse in terms of quality). For temporal compression, we started with M-JPEG and no interframe encoding. Then we progressed to using 1-2 frames for the predictor. H.264 simply extends the amount of data that can be considered for the predictor. It does not toss away more data.

The result is that H.264 is able to take better advantage of temporal redundancy than MPEG-2, as opposed to perceptual invisibility, which is what the "lossy" parts of audio and video compression strive for (tossing away inaudible frequencies, tossing away chroma resolution).

not "improving" picture quality (because if you were really interested in "picture quality", you wouldn't be flirting with the lower datarates in the first place). The envelope of "picture quality" inherently lies in the realm of generous datarates, period.

In the real world, you don't have the luxury of ignoring datarate limitations. All lossy compression codecs are concerned with achieving the best picture quality possible at a given data rate, not "achieving the best picture quality, period". The latter question is irrelevant, since it would dictate just using lossless compressors.
 
wco81 said:
Actually, I heard there's a point of diminishing returns for the advanced codecs.

Instead, they're usually talking about 12-16 Mbps for their HD-DVD releases. And there are some reports about SBC using 5 Mbps VC-1 to deliver HDTV streams for their IPTV service.


16 Mbps has been found to be the sweet spot for H.264 and VC-1. For example, H.264 at 16 Mbps was tested with viewers by the BDA, who found it was rated only 0.7% under the original uncompressed source, whereas MPEG-2 at 24 Mbps was 12% worse.
 
Titanio said:
It's already been said, but I think it's worth asking again since your point seems to be made a lot - I thought the whole point of MPEG-4/h.264, at least, was to offer up to the same quality as MPEG-2, but with a lower bitrate/more compression? If this is the case, it makes perfect sense to use MPEG-2 while you can fit everything on the disc (particularly since experience with MPEG-2 is also far greater than with MPEG-4). I'm sure that later, as studios look to pack more content onto a single disc, they'll look to these more "advanced" formats in order to get better compression, but the reasons won't have much to do with picture quality, I don't think.

The point of MPEG-4 was to extend a lot of the technologies that were used in MPEG-2 such that there was lower total error in the solution. This allows either higher quality at the same bitrates or the same quality at lower bitrates.

Aaron Spink
speaking for myself inc.
 
iknowall said:
Read the thread again; you will understand what HDV has to do with GOP MPEG-2 compression.

My arguments are so solid that the new standard used for DCI is a less compressed format than the previous one.

No, because MPEG-2 actually uses a higher resolution and a higher bitrate, and takes more space on the disc.

You can put about 50 minutes of good-quality MPEG-1 on a CD, and only about 20 minutes of MPEG-2.

Sorry, but saying that Video-1 barely compresses the pictures and that the video looks like shit is a contradiction.

If the codec doesn't apply any type of modification to the video, the quality doesn't change.

So if the video quality becomes like shit, it is obviously not a codec that applies no change to the video.

The old Digibeta standard, with its 3:1 lossless compression and its 4:4:4 10-bit sampling, blows h.264, VC-1, MPEG-2, and MPEG-4 out of the water.

You forget that more advanced doesn't mean it gives better quality at high bitrates, where these advancements are useless.

And this quote supports my claim:

Don Eklund apparently said

"Advanced (formats) don’t necessarily improve picture quality."

Sorry boy, you're wrong, all wrong.
 
DemoCoder said:
Not true,

Improving television with larger screens and better resolution requires a huge increase in transmission bit rates. The bit rates are, however, limited by the available broadcast spectrum or network connection. The only recourse is lossy image compression, most commonly JPEG, MPEG-2, Wavelets or Fractals. "Lossy" by name and lossy by nature. The more the image is compressed, using lossy methods, the worse the image quality.
http://www.autosophy.com/videcomp.htm

If this is wrong, then why does the uncompressed master always have the best quality?

Why does D-Cinema use JPEG 2000 at 250 Mbit/s instead of MPEG-2 at 80 Mbit/s?

Why is the new standard less compressed?

If the future and the quality are obtained using more compression and more advanced codecs, why do they use a less compressed codec?

Simply because your statement is false, and the facts are with me.


because no matter how high you boost the datarate, MPEG-2 has a minimal distortion floor that is worse than MPEG-4's. For example, if you give MPEG-2 an arbitrarily high bitrate, it will not give you lossless or near-lossless images, and its distortion will be higher than H.264 FRExt's. This is because of the inferior DCT/IDCT match of MPEG-2. Given that perceptual tests of MPEG-2 still show, experimentally, a perceivable difference at 24 Mbps, the extrapolation that this will be eliminated at higher data rates is not yet supportable.

Wrong. The distortion is introduced by the DCT compression, and the less compression you use, the less distortion you have. If you use a 40:1 compression you surely have more distortion than if you use a 7:1 compression. So what you say is wrong: the higher the bitrate, the lower the distortion.

You don't have any distortion on the original master; the distortion is an effect of the compression.

DVCPRO HD, like MPEG-2, also has a DCT compression less advanced than MPEG-4's, and everything you say about MPEG-2 is valid for DVCPRO HD.

So why does DVCPRO HD blow MPEG-4 out of the water?

Distortion IS an artifact introduced by the compression. The more compression you have, the more errors you have to fix; but the less compression you apply, the fewer errors you can have, and the less importance the error prediction and correction assume.

Compression techniques that allow this type of degradation are called lossy. This distinction is important because lossy techniques are much more effective at compression than lossless methods. The higher the compression ratio, the more noise added to the data.

http://www.dspguide.com/datacomp.htm





Moreover, 40 Mb/s isn't enough for MPEG-2 if you want to use 4:2:2 or 4:4:4 with 10-12 bit samples. Going to 4:2:2 will double data rates and 4:4:4 quadruple them.

What people defending MPEG-2 don't seem to realize is that H.264 contains many proposals that were slated for MPEG-2 but got dropped for time constraints. Here we are all talking about solving quality problems with enormously expensive hardware upgrades (manufacturing 100 GB discs), when software solves the problem much better.

MPEG-4 introduces new and more advanced algorithms because it compresses the image more, and unless the compression is lossless, compressing more always loses more quality.

MPEG-4 needs better error prediction because it is made to use more compression, with more possible errors.

But the less compression you apply, the fewer errors you can have, and the less importance the error prediction assumes.

If you have no errors to fix, having an algorithm that can fix more errors does nothing.

This is the point that you don't want to get.

The higher the compression ratio, the more noise added to the data.

http://www.dspguide.com/datacomp.htm


This is simply fallacious. The idea that going beyond MPEG-2 requires a loss of quality assumes that MPEG-2 is already on the edge of the rate-distortion curve, which is unknown and most likely not true. The major difference between MPEG-2 and H.264 temporal coding is the extension of the search space from 1-2 frames up to 32 reference frames. None of this throws away more data, any more than extending the window of a lossless windowed compression algorithm does. H.264 also allows block sizes to vary (which allows more opportunities for the predictor) and offers more prediction modes (9 instead of 4). CABAC also adds efficiency with no loss of quality.

And here is where it's clear you don't have any clue.

MPEG-4 AVC also has superior interframe prediction compared to DVCPRO HD.

Does this make MPEG-4 look better than DVCPRO HD at the same bitrate?

No, DVCPRO HD blows MPEG-4 out of the water.

Interframe prediction produces a 'prediction error' picture.

It has to be more advanced in MPEG-4 because MPEG-4 needs to apply heavier compression.

But the less compression you apply, the fewer errors you can have, and the less importance interframe prediction assumes.

If you have no errors to fix, having an algorithm that can fix more errors does nothing.

So the more advanced interframe prediction is useless at 20:1 compression, and doesn't make MPEG-4 better than MPEG-2 at high bitrate with low compression.


You provide no link, no proof, nothing that states that what you say is true; you are pulling it all out of your ass without providing any proof to confirm your claims.

I provided links that state that what I am saying is true:

"The more the image is compressed, using lossy methods, the worse the image quality"
http://www.autosophy.com/videcomp.htm

The higher the compression ratio, the more noise added to the data.

http://www.dspguide.com/datacomp.htm



You need to provide a link that clearly states that at 80 Mbit/s MPEG-4 has better image quality.

And another link that states that the 40:1 compression used by MPEG-4 doesn't destroy more video information than the 20:1 compression used by MPEG-2.

I have already provided links that clearly state that what I am saying is true, but you never did.

Unless you provide links that prove what you are saying, what you are saying is bullshit.
 