Will Warner support Blu-ray?

Yes, you should keep within specified lengths for digital cable. An intermittent digital link is a bit different than picture quality.

Your 4:2:0 argument fell flat on its face, though.

You were most certainly adamant that HD is "perfect"; otherwise your stance would have allowed for my comments as well, without the vehement spasms. Nor would your knee-jerk reaction to question my equipment, my eyes, and my experience have had to occur. No one has argued that HD is not better than SD. That's a no-brainer. The subtlety that you keep overlooking is how much. All things considered, it has shown itself to be incremental, so far. You seem to be in disagreement with this.

All you or anybody has to do is l_o_o_k__a_t__t_h_e__g_r_a_s_s. I know it pains you to acknowledge it. The evidence is right there. There's no denying it. HD has not proven itself to be all win-win. A great deal of it is still based on compromise. You don't even need high-end equipment to bear that out. If you are happy with HD, more power to you. However, if you truly are rational about the whole thing, you should have no problem acknowledging there are some pretty (not in a good way) rough edges. This is not a matter of feeling "ok" about it just because HD manages to come out better than SD. That's just the wrong way to look at it. HD should be considered a success because it simply looks good, not because it managed to edge out its predecessor SD (let alone digital SD*, which might as well be the red-headed stepchild of real SD). Nor should you need $10k worth of equipment to reach an impressive level of HD. The whole point of digital is to ensure consistency at even the entry level.

* The one exception being DVD, which has fared quite decently compared to its various broadcast incarnations. This leaves the one hope that whatever HD disc format comes about will really show off what is possible in HD, and have you scratching your head as to how the OTA/cable/satellite comparisons can even be considered under the same HD umbrella.
 
randycat99 said:
Yes, you should keep within specified lengths for digital cable. An intermittent digital link is a bit different than picture quality.

Backpedal, my son; it doesn't work. That's exactly why you have to buy higher quality cable. You can buy DVI/HDMI cable rated for 12ft, 15ft, and longer. The specified lengths depend on the *quality of the cable*. Some 6ft cables (the minimum spec) exhibit problems. Cable quality matters. And sometimes you even need amps/repeaters/extenders.

You can't win by playing semantics. The fact is, you tried to claim that cable quality doesn't matter for digital data when, in fact, that is totally wrong.

You've been proven wrong because of your lack of experience, or maybe because you Google too much blog opinion on cabling and happen to parrot someone who also didn't know what they were talking about.

In the realm of speaker wire, yes, a lot of cable marketing is bogus. It is not the case with video, and is even more obvious with digital cable quality.

Your 4:2:0 argument fell flat on its face, though.

Nope, OTA ATSC has twice the chroma resolution of SD. That was my claim, and it is true. It is also true that if you scale a 720p image to 480p in YCrCb/YUV space, you can reconstruct a 4:2:2 ratio. That was my other claim, that a scaled down HD signal will approximate 4:2:2 SD.
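To put rough numbers on that claim, here is a quick back-of-the-envelope sketch in Python (assuming 4:2:0 halves the chroma plane in both axes and that the downscale is a straight resample; the figures are illustrative, not measurements):

Code:
    # 4:2:0 stores one chroma sample per 2x2 block of luma samples.
    hd_luma = (1280, 720)
    hd_chroma = (hd_luma[0] // 2, hd_luma[1] // 2)      # (640, 360)

    # Target: a 480p SD frame at 4:2:2 (chroma halved horizontally only).
    sd_luma = (720, 480)
    sd_422_chroma = (sd_luma[0] // 2, sd_luma[1])       # (360, 480)

    print(hd_chroma, sd_422_chroma)

The HD source carries a 640x360 chroma plane, while true 4:2:2 at 480p would need 360x480, so there is surplus chroma horizontally and a shortfall vertically - which is why "approximate" is the operative word in the claim.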

You were most certainly adamant that HD is "perfect"

Nope, total imagination on your part. A strawman argument. You cannot find a single quotation from me claiming OTA HD is perfect. In fact, I claimed the opposite in the past, which is why I like Blu-Ray, since Blu-Ray opted for 1080p with 4:2:2 as one of their profiles. You've also seen several other posters here admit that OTA HD quality is variable and depends on who the provider is. So you aren't telling anyone anything they don't already know. What I said is that HD is way better than SD, and it is. My HD broadcasts blow away OTA SD, SD DTV, Cable, and Satellite. And they also are obviously better than most DVDs. This doesn't mean perfection, since obviously I've frequently been a lobbyist for 1080p, Blu-Ray, and using those 50gb for insanely high bitrates.


So all of your many pages of verbosity are based on a bogus assumption (as well as a lot of ignorance about HD technology and HD experience).

Nor should you need $10k worth of equipment to reach an impressive level of HD. The whole point of digital is to ensure consistency at even the entry level.

Your ignorance is astounding. Digital is just the transmission format. That doesn't guarantee a high quality reconstruction of the digital signal on playback devices. This has nothing to do with HD. You need to spend $$$ on equipment even if you want good *SD* picture quality. You can't seem to get it through your thick skull that the quality of your TV, DVD player, Video Processor, et al, *all matter*. You seem to think that if you go out and buy a $26 DVD player and pair it with a $150 TV, you will have the optimal PQ for SD.


You've been proven wrong multiple times, you've been shown to be ignorant, you've been shown to be arguing against a strawman, and you've been shown to not even read what I wrote (like my chroma resolution claim). It seems the only way you can win an argument is to argue with a scarecrow strawman, instead of with what I actually wrote. So please, quit while you're ahead.
 
joebloggs said:
Randy what are you arguing about?

I've read your to and fro with DemoC and I'm still not sure what point you're trying to make.

I don't think he has any idea himself :)

Sounds more like philosophy to me.
 
DemoCoder said:
Backpedal, my son; it doesn't work. That's exactly why you have to buy higher quality cable. You can buy DVI/HDMI cable rated for 12ft, 15ft, and longer. The specified lengths depend on the *quality of the cable*. Some 6ft cables (the minimum spec) exhibit problems. Cable quality matters. And sometimes you even need amps/repeaters/extenders.

With digital, you stay within the prescribed cable lengths. Either it works or it doesn't. You can start stretching things and extending things, and that is where you encounter the transmission problems you are describing. This is apart from "noise" or "artifacts" in the actual video signal.

You can't win by playing semantics. The fact is, you tried to claim that cable quality doesn't matter for digital data when, in fact, that is totally wrong.

You baited and then shifted your discussion point. Not very sportsmanly of you. Digital cable has standard lengths for a reason. It's a silly game to push cable lengths yet further than the design spec, tweak cable qualities, and measure off data integrity tradeoffs.

You've been proven wrong because of your lack of experience, or maybe because you Google too much blog opinion on cabling and happen to parrot someone who also didn't know what they were talking about.

As have you, as well. Big talk, demonstration of pseudo-experience, personal attacks in place of real material.

In the realm of speaker wire, yes, a lot of cable marketing is bogus. It is not the case with video, and is even more obvious with digital cable quality.

It all leads down the same path, just like the speaker wire scene. You stick with standard digital cable of standard build and it simply works, or it is a faulty product. Go beyond that, and you are simply catering to all that secret voodoo cable design crap that runs rampant in the speaker wire scene.

Nope, OTA ATSC has twice the chroma resolution of SD.

Keep dreaming. 4:2:0 is virtually the same as 4:1:1, unless you want monochrome HD images.

That was my claim, and it is true.

You are mistaken. You wish it to be true, even though the damning numbers have been put right before your face. That's just pure obstinacy. Do you honestly think that 3rd digit is just "optional"? It's got to come from somewhere, and that just happens to be where the 2nd digit comes into play. It's just interleaved together, not some magical 2x deal.

It is also true that if you scale a 720p image to 480p in YUV space, you can reconstruct a 4:2:2 ratio. That was my other claim, that a scaled down HD signal will approximate 4:2:2 SD.

Sure, sounds good. You have the extra chroma information, when going downward. Might as well use it. Either way you are still looking at 480p on an HDTV, which must look horrid to you, anyway.

Nope, total imagination on your part. A strawman argument. You cannot find a single quotation from me claiming OTA HD is perfect.

It is self-evident in the vigor of your responses. There was no reason for you to respond with such venom if you acknowledged it wasn't perfect. This was not the case for you. Hence, you launched into a tirade, and only now are you backpedaling to the position that you know it isn't perfect. You also diverted with the assertion that I must be saying that SD is better than HD, which was simply incorrect. If that is what you interpreted from me, then you did not read what I was saying in my post. There's no way around that.

You've also seen several other posters here admit that OTA HD quality is variable and depends on who the provider is.

...after much plying, yes. You can very well see how they would just as soon remain silent on the issue, given the abusive rhetoric that immediately ensues upon any "dissent". The posts I made created the bubble where people would admit that out loud. Your posts, otoh, served only to stomp out any such discussion.

So you aren't telling anyone anything they don't already know.

You should note that there were also people here who, indeed, didn't know. All of this comes from that persistent mindset that HD automatically equals good, that whatever you see in HD is what "good" looks like.

What I said is that HD is way better than SD, and it is.

It's "better", sure, but the statement falls incorrect as soon as "way" sneaks in there.

My HD broadcasts blow away OTA SD, SD DTV, Cable, and Satellite. And they also are obviously better than most DVDs.

I'm sure you are very enthusiastic about it all. Others may experience the disparity on a more linear scale, however.

So all of your many pages of verbosity are based on a bogus assumption (as well as a lot of ignorance about HD technology and HD experience).

Not any more so than your own biases, to be sure. "Experience" is a funny thing. After all, you still haven't figured out the ramifications of 4:2:0, yet you are absolutely certain that it gives you 2x more of something.

Your ignorance is astounding.

Only to be matched by your arrogance.

Digital is just the transmission format.

So you have an analog HDTV? Clearly, digital is more pervasive than simply the delivery medium. It is a dominating part of the device itself.

That doesn't guarantee a high quality reconstruction of the digital signal on playback devices.

The whole point is to guarantee the reconstruction. You are still thinking back in the days of analog stereo equipment (where one could reason that higher and higher levels of esoteric hardware and $'s bring you closer and closer to some nirvanic "pure reconstruction"). You've been sold the same old-age marketing applied to new-age HD. It sounds mysterious and magical, but ultimately it's simply there to ensure company xyz continues to bleed $abc from some guy like you on a regular basis.

This has nothing to do with HD. You need to spend $$$ on equipment even if you want good *SD* picture quality. You can't seem to get it through your thick skull that the quality of your TV, DVD player, Video Processor, et al, *all matter*.

It matters to a point. After which, it is all technical and monetary masturbation.

You seem to think that if you go out and buy a $26 DVD player and pair it with a $150 TV, you will have the optimal PQ for SD.

I said this somewhere? Exactly. I didn't. This is you putting words in my mouth.

You've been proven wrong multiple times, you've been shown to be ignorant, you've been shown to be arguing against a strawman, and you've been shown to not even read what I wrote (like my chroma resolution claim). It seems the only way you can win an argument is to argue with a scarecrow strawman, instead of with what I actually wrote. So please, quit while you're ahead.

...anything else you'd like to add? ...eat sleeping kittens? ...take pictures of naked children? Discussions are so simple when you can simply conjure a laundry list of evils for the other guy, right? Perhaps a different tack would yield you greater credibility, rather than just coming off as a pretentious ass? Hmmm? ...a little different approach? [use rising pitch Stewie voice]

(moral: the ability to be abusive does not buy you credibility)
 
randycat99 said:
With digital, you stay within the prescribed cable lengths. Either it works or it doesn't.

All you are doing is trying to weasel out. The prescribed length of a cable is proportional to its construction characteristics. If you want fewer errors, you buy a cable with better control over impedance, for example. The fact is, for any given length, you can buy cable on the market that performs subpar. As cables get longer, the ability to construct one that is completely error free grows much more difficult. That's why none of the HDMI cables on the market at 40ft are completely error free. A 40ft cable is completely within the HDMI spec. That's why people with HT installations opt for Cat5/6 extension, or optical DVI/HDMI.

See, there's your theory ("either you buy a cable that performs perfectly or not") and there's the reality ("most usable CHEAP cables for HT installations have non-zero error rates").

You baited and then shifted your discussion point. Not very sportsmanly of you. Digital cable has standard lengths for a reason.
...
It's a silly game to push cable lengths yet further than the design spec, tweak cable qualities, and measure off data integrity tradeoffs.

The HDMI spec has no maximum design spec.

HDMI.ORG said:
Yes. HDMI technology has been designed to use standard copper cable construction at long lengths. In order to allow cable manufacturers to improve their products through the use of new technologies, HDMI specifies the required performance of a cable but does not specify a maximum cable length.

It is not silly for people to have requirements to run digital video signals 20-50 foot or more in their house. Since by definition, cables at that length are not perfect, there are a range of cables on the market that exhibit different levels of performance compared to the spec.

If you are willing to spend about $300, you can buy 50ft HDMI cable that will be nearly (but not totally) error free. If you spend $50 for 50ft, you will be watching snow.


Keep dreaming. 4:2:0 is virtually the same as 4:1:1, unless you want monochrome HD images.

...
After all, you still haven't figured out the ramifications of 4:2:0, yet you are absolutely certain that it gives you 2x more of something.

A 720p image has a chrominance resolution of 640x360. A SD DVD has chrominance resolution of 360x240. So like I said, HD has twice the chroma resolution of DVD, in the same manner that it has twice (actually, 2.67 for 720p) the luminance resolution of DVD. It's actually worse when you consider the interlaced chroma problem that DVD has.
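The arithmetic behind those figures is easy to check. A small Python sketch (assuming plain 4:2:0 on both formats, i.e. the chroma plane halved in both axes; these are nominal frame sizes, nothing measured):

Code:
    def chroma_420(luma_w, luma_h):
        # 4:2:0: one chroma sample per 2x2 block of luma samples
        return luma_w // 2, luma_h // 2

    hd = chroma_420(1280, 720)     # (640, 360)
    dvd = chroma_420(720, 480)     # (360, 240)

    print(hd[0] / dvd[0], hd[1] / dvd[1])         # ~1.78x horizontal, 1.5x vertical
    print((hd[0] * hd[1]) / (dvd[0] * dvd[1]))    # ~2.67x total chroma samples

Depending on whether you count per axis or by total samples, the ratio comes out between roughly 1.5x and 2.67x, which is presumably the sense in which "twice the chroma resolution" is meant.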

Unless you are laboring under a very, very silly definition of resolution, my statement is plain enough for you to understand. But likely you've again concocted some strawman that you're arguing against.



The whole point is to guarantee the reconstruction.

No, the point of digital is to allow error free transmission of information. It *may* allow the same reconstruction everywhere, but the *REALITY* is, it doesn't. It's simply not part of the standard. DVD players and TVs differ wildly in their reconstruction quality. There is no standard that specifies exactly how a DVD player is implemented; for example, deinterlacing technique is completely left up to the manufacturer, it is not part of the DVD spec. That's why there is a proliferation of deinterlacing algorithms. The logic that the player uses to detect film cadence, for example, is ill-defined. It's a proprietary value-add.

Then there is the whole can of worms of having a display calibrated to reproduce colors and contrast exactly as the original film intended. Digital TVs simply do not solve this problem on their own, by being digital. Every single TV I have seen come from the factory needs to be calibrated. Even after calibration, many are not able to match the desired response needed for good reconstruction. For example, the color decoder of many TVs is simply incapable of being tuned, or tuned to match ATSC standards.

So like I said, the idea that "oh, it's digital, and if I buy a digital TV, and DVD player, reconstruction will be exact" is hopelessly naive. That would require a rock solid standard specification for all parts of the digital pipeline, including how digital input in the TV results in an exact color value emitted onscreen (or better yet, exact tristimulus value). We just don't have it.
 
Shifty Geezer said:
At the moment the current media news suggests you'll be able to buy HD-DVD with 50% of the movie industry available, or BRD with 100%. It's a hands-down victory. Perhaps Toshiba are going to sign on with BRD's hardware format, which was the trouble? Better to have some of the pie than none.
But then again, Toshiba are close friends with Sony after all. They'll come to something. I wouldn't be surprised if Toshiba got some of Blu-ray's royalties if the format wins.
 
Do you two (DemoCoder/Randy) really need to continue with your off topic war? For the sake of this already destroyed thread, take it elsewhere (try PMs!).

The arguments you guys are making are beyond a semantics debate; it's just getting silly now. You two are dissecting single sentences now to try to prove your points (which at this point I'm not even sure what they are).

Please, for the love of things holy just agree to disagree (even though if you guys bothered to actually read each others posts without just replying for the sake of replying you'd probably find you two aren't really disagreeing with each other).
 
DemoCoder said:
(on cable length)

The premise remains the same- you stick with standard lengths (which tend to be short) and you won't have problems. If you have to go longer, you use repeaters (it's the way digital networking works, as well). That is the optimum. That is normal operation. Once you go much longer, you put yourself at increasing risk of transmission problems. Certainly there will be a market for increasingly expensive and "esoteric" cable packages for people who want to explore that sort of thing. The fact is, it hasn't "improved" your PQ or made it "smoother" or "clearer" (or whatever other pleasing adjective marketing wants to throw on the table). You are simply trying to recover yourself back to normal operation (had you simply stuck to standard short cables).

The HDMI spec has no maximum design spec.

Gee, I wonder why...to create a market for fancy cables? This is unlike computer networking where the operational lengths are quite specific. They don't play that silly game when the objective is data integrity.

It is not silly for people to have requirements to run digital video signals 20-50 foot or more in their house. Since by definition, cables at that length are not perfect, there are a range of cables on the market that exhibit different levels of performance compared to the spec.

...or you do it the right way with repeaters. What you have described is certainly not the norm for most people, so bringing it up wrt most people is kind of silly. Chances are there will be numerous location options at hand vs. running a 50 ft digital cable, if the situation is that contrived.

If you are willing to spend about $300, you can buy 50ft HDMI cable that will be nearly (but not totally) error free.

So you've settled for a compromised picture. Better make it $600 to be "extra sure" you get your bits, right?

A 720p image has a chrominance resolution of 640x360. A SD DVD has chrominance resolution of 360x240. So like I said, HD has twice the chroma resolution of DVD, in the same manner that it has twice (actually, 2.67 for 720p) the luminance resolution of DVD. It's actually worse when you consider the interlaced chroma problem that DVD has.

Ok, sure, it has twice that of SD...but you need more because the overall resolution has gone up, as well. It was 4:2:0 in SD, and it is 4:2:0 in HD. So that means it exactly kept pace. There was no advantage there. Did you honestly think it was an option for HD to increase pixel resolution by 2-3x, but keep the color resolution the same as SD??? That would essentially be 4:1:0, which would be VHS level.

Your position is essentially akin to remarking that a 400 hp car will be twice as fast as a 200 hp car, while absolutely ignoring that the 400 hp car happens to also be twice the weight. Yeah, it's 2x the power, but the net improvement is that it has simply kept pace with the weight. There was no real advantage there.
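In the same back-of-the-envelope terms as before (again assuming plain 4:2:0 on both formats; purely illustrative arithmetic), the "kept pace" point looks like this:

Code:
    def chroma_fraction(luma_w, luma_h):
        # At 4:2:0 the chroma plane is always 1/4 of the luma sample
        # count, regardless of frame size.
        return ((luma_w // 2) * (luma_h // 2)) / (luma_w * luma_h)

    print(chroma_fraction(720, 480))     # 0.25 for SD
    print(chroma_fraction(1280, 720))    # 0.25 for HD

Absolute chroma resolution rises with the frame, but chroma relative to luma stays fixed at 25% - the power-to-weight analogy in numbers.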

Unless you are laboring under a very, very silly definition of resolution, my statement is plain enough for you to understand. But likely you've again concocted some strawman that you're arguing against.

Your statement is conceptually flawed.

No, the point of digital is to allow error free transmission of information.

That's simply another way of saying "integrity of reconstruction", unless you are adding interference resistance into the mix.

It *may* allow the same reconstruction everywhere, but the *REALITY* is, it doesn't.

It's closer than you let on. It's staying inside a very tight window or it ends up not working at all. You aren't going to observe many, many varied and fine shades of PQ. This mindset is going back to that "analog" approach to things.

DVD players and TVs differ wildly in their reconstruction quality.

This, we will attribute to your penchant to overemphasize differences.

There is no standard that specifies exactly how a DVD player is implemented; for example, deinterlacing technique is completely left up to the manufacturer, it is not part of the DVD spec.

The basic process is fairly straightforward. You can do a lot of extra stuff to tweak it. So maybe that is what you are referring to.

That's why there is a proliferation of deinterlacing algorithms. The logic that the player uses to detect film cadence, for example, is ill-defined. It's a proprietary value-add.

The extra tweaking is. Doesn't every company want you to believe they have their own top secret sauce, anyway? It's all a matter of perspective.

Then there is the whole can of worms of having a display calibrated to reproduce colors and contrast exactly as the original film intended.

...calibration of analog-style functions. The digital part is going to be the digital that left the factory. You aren't calibrating the "digital" hardware any more than you need to "calibrate" your PC to run the operating system correctly.

For example, the color decoder of many TVs is simply incapable of being tuned, or tuned to match ATSC standards.

Whether or not this actually matters in real use depends on whether you are speaking of tens of percent or tenths of a percent. All of that flies out the window anyway, when the user tweaks their color settings for personal preference or puts the white point into torch mode because it looks "punchier".

So like I said, the idea that "oh, it's digital, and if I buy a digital TV, and DVD player, reconstruction will be exact" is hopelessly naive.

The reconstruction is correct. The digital signal you get is the digital signal that was sent. There is the step to calibrate to the room and such. What the user tweaks, and the amount of room the user is given to tweak from there out, is the variable. The hardware differences you speak of are on the order of tenths of a percent - not going to be particularly perceptible. Given the content established in the bulk of this discussion, the dominating factor is weighted far to the side of HD broadcast quality for the time being, before hardware differences can really be observed. Maybe once HD disc delivery has happened, there will be a chance to really see differences in hardware, but then it will be plainly obvious that this is the realm of splitting hairs to find the differences.
 
Bloody hell!
Do you guys mind taking your marital fights to PM? Not only have you long gone largely off-topic (console forum, remember?), but I don't think anyone except you two is actually reading these immense posts.
 
Though this debate is off topic and contrary to forum FAQ rules, I figure there's no real harm as long as it's contained to this one thread and I can ignore them, and sooner or later a Mod will come and close this down.
 
Note: for forum members tired of reading long messages, there is an educational section at the bottom of the message about colorspace that you may find interesting.

randycat99 said:
Gee, I wonder why...to create a market for fancy cables? This is unlike computer networking where the operational lengths are quite specific.
Computer networking uses error correction, so incremental improvements over and above a category of wiring don't matter because the transport layer is built to tolerate the errors. If you've got a protocol which has no error correction ability (DVI/HDMI), it is a different issue entirely. Substandard network cables just reduce the channel capacity, but since achieving the theoretical network bandwidth never occurs in practice, all network applications are written to deal with suboptimal bandwidth. With DVI/HDMI, the resolutions at the top of the ATSC chart take up almost all of the bandwidth and there is no error correction, so suboptimal cable quality is an important issue.
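To see how little headroom there is, consider the pixel-clock arithmetic for single-link DVI/HDMI, which tops out at a 165 MHz TMDS clock. A rough Python sketch (the blanking-included totals below are the standard CEA-861 timings; treat the exact percentages as illustrative):

Code:
    CAP_MHZ = 165.0    # single-link DVI/HDMI TMDS pixel-clock ceiling

    # (total pixels per line, total lines, frames per second)
    formats = {
        "480p60":  (858, 525, 60),      # ~27 MHz
        "720p60":  (1650, 750, 60),     # ~74.25 MHz
        "1080i60": (2200, 1125, 30),    # ~74.25 MHz (60 fields = 30 frames)
        "1080p60": (2200, 1125, 60),    # ~148.5 MHz
    }

    for name, (h_total, v_total, fps) in formats.items():
        mhz = h_total * v_total * fps / 1e6
        print(f"{name}: {mhz:6.2f} MHz ({100 * mhz / CAP_MHZ:3.0f}% of cap)")

At 1080p60 you are already at roughly 90% of the ceiling, and with no error correction in the protocol, every flipped bit lands on screen.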

...or you do it the right way with repeaters.

Repeaters cost much more than cable. A cheap DVI repeater to go 50ft costs $249 today and still doesn't totally eliminate sparklies. That's why many people opt for optical DVI. People who need their TV to sit more than 9ft from their other equipment, but don't want to pay several hundred dollars for repeater/extender solutions, are going to buy long DVI/HDMI cables, period.



Re: proprietary deinterlacing said:
The basic process is fairly straightforward. You can do a lot of extra stuff to tweak it. So maybe that is what you are referring to.

Nope, it's not a "tweak". When progressive DVD players first hit the market, they used simple bob and weave techniques to do deinterlacing. Per-pixel directional deinterlacing came along much later and is not "extra stuff to tweak it". It is much more computationally demanding, which is why Faroudja made a lot of money selling custom hardware accelerators to do it. Again, your ignorance shines. Today, Faroudja is a commodity, but this is just one example of the variation in DVD quality. The HQV DVD is great at exposing player deficiencies.
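For readers who haven't seen the distinction, the gap between the two baseline techniques fits in a few lines of Python/numpy (a bare-bones sketch; real players layer cadence detection and per-pixel motion adaptation on top of this):

Code:
    import numpy as np

    def weave(top_field, bottom_field):
        # Weave: interleave the two fields into one frame. Perfect for
        # static material, but moving edges "comb" because the fields
        # were captured 1/60s apart.
        h, w = top_field.shape
        frame = np.empty((2 * h, w), dtype=top_field.dtype)
        frame[0::2] = top_field
        frame[1::2] = bottom_field
        return frame

    def bob(field):
        # Bob: line-double a single field. No combing, but half the
        # vertical resolution and visible line flicker.
        return np.repeat(field, 2, axis=0)

Directional per-pixel deinterlacing effectively chooses between strategies like these per pixel and interpolates along detected edges, which is where the extra computational cost comes from.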


...calibration of analog-style functions. The digital part is going to be the digital that left the factory. You aren't calibrating the "digital" hardware any more than you need to "calibrate" your PC to run the operating system correctly.
...
The reconstruction is correct. The digital signal you get is the digital signal that was sent.

The hardware differences you speak of are on the order of tenths of a percent- not going to be particularly perceptible.

Nope, you clearly don't know what you're talking about. With respect to computers, most operating systems now have a color management system (CMS) and have ICC profiles, and most computer imagery is authored for the sRGB color space. So these days, you don't need to manually calibrate a computer monitor, because your computer gets an ICC profile for the monitor, and is able to map the sRGB gamut into your monitor's gamut @ factory settings. This was not the situation a few years ago, when digital artists would buy expensive colorimetry equipment to measure their monitors and expensive Adobe software in order to do gamut matching. (The CMS system on Windows still doesn't handle the case where you futz with the controls of your monitor, unless you have a monitor which signals the computer that settings are adjusted.)

The situation for consumer electronics is vastly different. sRGB is the colorspace defined for HD signals. sRGB is a colorspace derived to match the gamut of Trinitron-style CRT monitors. LCD/LCoS, DLP, PDP, and OLED displays have very different gamuts. But unlike your computer, DVD players have no idea of the ICC profile of your display, and existing HDTVs do not do ICC profile matching of incoming component/HDMI/VGA because they do not know what the source colorspace is! For consumer electronics, there is no end-to-end color management.
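To make "gamut matching" concrete, here is roughly what a CMS does when it has both profiles, sketched in Python/numpy. The sRGB-to-XYZ matrix is the standard one; the display matrix is a made-up stand-in for a panel's measured primaries - exactly the information a profile-less HDTV does not have:

Code:
    import numpy as np

    SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                            [0.2126, 0.7152, 0.0722],
                            [0.0193, 0.1192, 0.9505]])

    # Hypothetical panel primaries - in reality these would come from an
    # ICC profile or a colorimeter measurement.
    DISPLAY_TO_XYZ = np.array([[0.4870, 0.3000, 0.1700],
                               [0.2300, 0.6900, 0.0800],
                               [0.0000, 0.0800, 0.9600]])

    def srgb_to_display(rgb):
        """rgb: gamma-encoded sRGB values in 0..1."""
        # 1. undo the sRGB transfer curve
        linear = np.where(rgb <= 0.04045, rgb / 12.92,
                          ((rgb + 0.055) / 1.055) ** 2.4)
        # 2. into device-independent XYZ
        xyz = SRGB_TO_XYZ @ linear
        # 3. into the display's own linear RGB; out-of-gamut colors clip
        return np.clip(np.linalg.solve(DISPLAY_TO_XYZ, xyz), 0.0, 1.0)

    print(srgb_to_display(np.array([1.0, 0.0, 0.0])))   # where "sRGB red" lands

Without that second matrix, a DVD player or TV simply cannot perform step 3 - the "no end-to-end color management" problem in a nutshell.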

So the situation today with consumer electronics is vastly different compared to the ColorSync/Kodak CMS/etc. automation we have in MacOS/Windows today. It is simply NOT CORRECT that reconstruction will be correct. Just a few years ago, Mac and Windows users used to see vastly different images on the Web, because the default MacOS assumed gamma is 1.7-1.8, whereas for Windows it was 2.2-2.5. Even today, you'll still have problems if images are not authored in sRGB.
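The size of that gamma mismatch is easy to demonstrate with a simple power-law model (ignoring the real curves' linear toe; the numbers are illustrative):

Code:
    # The same pixel code on a gamma-1.8 vs a gamma-2.2 display.
    def displayed(code, gamma):
        return (code / 255.0) ** gamma

    v = 100
    print(displayed(v, 1.8))    # ~0.186 relative luminance (old MacOS default)
    print(displayed(v, 2.2))    # ~0.128 relative luminance (Windows)

The same mid-tone comes out roughly 45% brighter on one platform than the other - hardly a tenths-of-a-percent effect.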


Secondly, you've obviously never calibrated a digital TV. The calibrations are digital, done via software. Many HDTVs can't be calibrated via analog methods. Lamp brightness can't be changed, lamp color spectrum emission is fixed. Color filters/color wheels which are biased can't be altered.

With the exception of video projectors, where people may install optical filters on the lens, calibration of fixed pixel HDTV technology revolves around software calibration, which means that the TV employs a functional mapping between the incoming colorspace and the user's calibrated colorspace. My HDTV, in fact, allows me to pause the display, click on a pixel in the image, and assign it a new color. That's how detailed the calibration is. Color temperature, gamma ramp, all of it is just digital image manipulation. Many new DVD players have an IRE gamma ramp equalizer that allows you to calibrate the contrast response inside the DVD player itself. This is useful for some people who have HDTVs where the gamma ramp can't be calibrated.
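The "functional mapping" being described is, at its simplest, a lookup table. A minimal sketch of a software gamma-ramp calibration (assuming the panel's native response is a plain power law, which real panels only approximate):

Code:
    import numpy as np

    target_gamma = 2.2    # the response the viewer should see
    native_gamma = 2.5    # hypothetical measured panel response

    codes = np.arange(256) / 255.0
    # Pre-distort each input code so the panel's native response lands on
    # the target: (v ** (target/native)) ** native == v ** target
    lut = np.round(255.0 * codes ** (target_gamma / native_gamma)).astype(np.uint8)

    # Every incoming pixel value is passed through `lut` before it
    # reaches the panel.

A color-temperature or per-color calibration is the same idea with a matrix or 3D table instead of a 1D ramp.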

The fact is, digital TVs must still be calibrated, and calibration does not entail just "analog bits". The room you put the TV in has little to do with it. The calibration is needed because of color gamut and gamma response mismatch.

Even if we had the nirvana of the future, TVs which can handle ICC profile gamut matching, the fact is there is no one standard for how to best match an sRGB color to a destination gamut. There are multiple mechanisms, and all of them have issues. So again, users will need to tune gamut matching to their desired preference.

Finally, it does not come down to a .1% difference, or even a 10% difference. Film material is telecined for CRT. PDP, LCD, and DLP all have varying degrees of push. LCD pushes blue, PDP pushes green, and DLP pushes red. This means PDPs have a slight green tinge, LCDs are bluer, and DLP makes people more red-faced. In particularly bright scenes, the push is obvious, especially actors' faces which are brightly lit at D65 or D32.

This is all very obvious to anyone who sees them side by side; color temperature fuckup is one of the most easily recognizable artifacts, far easier for people to see than MPEG-2 artifacts at hi-res. I find it amazing that a guy who complains so much about MPEG-2 artifacts, and claims other people debating him can't recognize those artifacts, himself seems blind to color gamut problems. Getting close to the TV to see artifacts is one thing, but actors with green or red tinged faces is quite another.

Gamma ramp is the same problem. Immediately obvious. Just watch the Lord of the Rings Mines of Moria sequence on a display with an uncalibrated gamma ramp. The result is a crushed-out mess, and in a sequence that is mostly dark, all kinds of detail is lost, like the wrinkles and imperfections in the fabric of the fellowship's clothing. Once you've watched it on a good setup, watching it on a display which crushes shadow detail is obvious.
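Shadow crush is just the bottom of the ramp collapsing. A tiny illustration (video levels put reference black at code 16; the mis-set black level here is a made-up example):

Code:
    import numpy as np

    codes = np.arange(16, 64)    # the darkest band of a video signal
    black_level = 32             # mis-calibrated: black point set too high

    # Everything at or below the chosen black level maps to zero, so
    # codes 16..32 - an entire band of shadow detail - become
    # indistinguishable black.
    out = np.clip(codes - black_level, 0, None)
    print(int((out == 0).sum()), "of", len(codes), "shadow codes crushed")

On a properly set display each of those codes would be a distinct near-black shade; on the crushed one they are identical, which is exactly the lost fabric detail described above.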

I've had my say. Hopefully the forum members found some of this educational, and hopefully they learn to ignore Randy, who is spreading misinformation, is not an expert in the area he is proclaiming knowledge in, and doesn't appear to have any experience either.

Go read the Gamma FAQ or Color FAQ yourself if you don't want to take my word for it.
 
Without taking sides, I don't see why a topic like this should be closed because it went a little off-topic. We already have two other threads on the Warner deal and there's nothing left to discuss there either. This on the other hand was quite interesting and once again I've learned quite a bit reading it. I'm sure if anyone wanted to talk about Warner, they would have (or used one of the other threads around).

Kudos to Randycat99 and DemoCoder for keeping it civil.
 
Thank you for that post DC...very good information which shows how people automatically assume that since it's digital it should be error free and immune to audio/visual degradation. Heck, without error correction a device like a CD player would output pops and static like vinyl even though it's digital. You also hit the nail on the head about the false assumption that since it's digital, there's no calibration needed, therefore everyone will get the same output results.
 
Yeah, the DVI/HDMI spec has many shortcomings. The DisplayPort standard that everyone at VESA is backing (ATI, NVidia, Dell, HP, et al) overcomes some of the shortcomings, but still doesn't seem to include any forward error correction scheme. This will make it unsuitable as a replacement for DVI/HDMI in the consumer electronics space, since it offers few real advantages over DVI/HDMI. Architecturally though, it is a step forward, since it is micro-packet based on an isochronous stream instead of the "raw" bitstream you get on DVI.

In PC space, it includes a bidirectional communications channel that will allow colorspace and calibration settings negotiation, among other things. Peer to peer device negotiation would be nice in the CE space, but I'd rather see both error correction and negotiation.
 
Yep, VESA DisplayPort is one reason why I'm not in a rush to move everything to HDMI. That and HDMI seems to be such a moving target with quite a few incompatibilities at this moment.
 