Will Warner support Blu-ray?

Status
Not open for further replies.
randycat99 said:
Evidently, you are not. No way around that. Sorry.

The fact that you claim to see them only in broadcast SD and DVD, but not in HD, proves only that you allow yourself to see them where you want and ignore them where you don't.

I never claimed not to see them, in fact, I claimed the opposite: I claimed that Fox and UPN were taken to task for their poor coding. What I specifically claimed is that I see *less* artifacts in general in HD.

That should be reason enough, bar none, to dismiss your observations as compromised.

And since you apparently don't own HD equipment nor are a regular viewer of HD material, I think this is reason enough to dismiss your observations. You can't watch a few minutes at Best Buy or Circuit City and use that to form a conclusion. Your posts read like a guy with SD equipment and sour grapes.
 
wco81 said:
MS wants to eliminate Java in the worst possible way. If Java succeeds, then people don't need Windows as much. Sun sued MS over Java.

Now if Blu-Ray wins, Windows will have to have Java in order to play BD-Java software from Blu-Ray discs.

Ohhh ok I see now.

Thanks.;)
 
I think interpolating between frames would primarily increase memory requirements,
because you would need to keep previous frames around. It would also put the visual response to controller input slightly out of sync, though this may happen anyway: a game might display 30 fps but poll AI, the controller, and other systems at different rates. Nevertheless, having frames interpolated like this may affect how you sync these to make things "feel" right. Then again, we're only talking about generating one transition frame between normal frames, so it may not be an issue.
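The single-transition-frame idea above can be sketched in a few lines. This is a minimal illustration, not how a real interpolator works (real ones use motion compensation, not a plain blend), and the frame representation here is just a flat list of pixel intensities for simplicity:

```python
# A minimal sketch of the single-transition-frame idea: blend the previous
# and next rendered frames to synthesize one in-between frame. Frames are
# modeled as flat lists of pixel intensities; a real implementation would
# use motion compensation rather than a plain blend.

def interpolate_frame(prev_frame, next_frame, t=0.5):
    """Return the blended transition frame at time t between two frames."""
    return [(1 - t) * p + t * n for p, n in zip(prev_frame, next_frame)]

# The latency cost mentioned above: the transition frame cannot be shown
# until next_frame exists, so both frames must stay buffered, adding memory
# use and a fraction of a frame of display lag.
prev = [0, 0, 100, 200]
nxt = [100, 50, 100, 0]
mid = interpolate_frame(prev, nxt)   # [50.0, 25.0, 100.0, 100.0]
```

Note how the memory and sync concerns from the post fall out directly: two full frame buffers must be live at once, and the displayed image always trails the most recently rendered frame.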
 
Last edited by a moderator:
DemoCoder said:
I never claimed not to see them, in fact, I claimed the opposite: I claimed that Fox and UPN were taken to task for their poor coding. What I specifically claimed is that I see *less* artifacts in general in HD.



And since you apparently don't own HD equipment nor are a regular viewer of HD material, I think this is reason enough to dismiss your observations. You can't watch a few minutes at Best Buy or Circuit City and use that to form a conclusion. Your posts read like a guy with SD equipment and sour grapes.

You are backpedaling now. You'll have to trust me that my HD experience far exceeds "a few minutes at Best Buy..." That venue is still a good way to observe what "the general people" see when they decide it's time for an HDTV, though. You simply cannot deny that. The status of "ownership" grants you nothing in credibility (as with a variety of consumer electronics). It simply means you bought something. From there, you either love it or learn to live with it. There is no cosmic rule that you suddenly become more informed about its nature (otherwise we are left with: one need only own something, make any wild claim one pleases, and stand behind some ridiculous credibility-by-ownership badge). That comes from an entirely different process. Many people own HDTVs. Few people are truly aware of what it has actually delivered into the living room. You shouldn't feel so threatened when someone can show some honest introspection about your beloved display masterpiece (seriously, it's just a hunk of electronics technology; get a grip).

It matters little that you take the "bash anyone who dissents" posture. It's never going to mask what you see when you look at the grass and focus on things as they glide from one side of the screen to the other. There's some serious hash going on there, something you just don't see on DVD. It shouldn't have to be that way.
 
Backpedaling, my ass. Search B3D for my comments on compression artifacts in the past. You fail to understand the difference between "has artifacts", "has fewer artifacts than other formats", and "has zero artifacts".

You just admitted you don't have any experience with HD. If you don't own HD equipment, and regularly watch a large sample of varying sources, you have no basis for comparison.

Your grass analogy is just as stupid. It sounds like you went to Best Buy, looked at NFL HD Sunday Ticket, and concluded on that basis that HD has inherent artifacts worse than DVD.

I gave you a counterexample already: OTA HD "The Lost", a TV series set on a lush green island filled with grass and vegetation, vs. DVD "The Lost". The DVD version shows *MORE* artifacting than the OTA HD version. *PERIOD*. OTA HD movies that are upconverted from the D5 master are superior to DVD in every way. They have twice the chroma on both axes.

In summary, you are a bullshitter. You don't own an HDTV, don't own an HD receiver, don't own an HD TiVo, don't own an upconverting DVD player, don't own a D-Theater deck. So unless you rented this hardware for a few months, you have no relevant personal experience, yet you argue as if whatever token anecdotal experience you have with HD applies everywhere. And then you take your non-experience and have the balls to claim that videophiles who view HD on a regular basis don't know what they're seeing. This, despite the fact that such people (myself included) regularly plop down thousands of dollars for small improvements in artifacts, noise reduction, black level, gamma ramp, etc., because we are annoyed by artifacting. I have regularly railed against compression artifacts.

This, despite the fact that you have argued against DVI/HDMI as being improved over component in the past, even though several people told you that DVI/HDMI removes artifacts seen in even the best component cables.

You deny artifacts on one hand, because they apply to SD, but overhype artifacts in HD. Yes, you still get artifacts in HD, but they are better than SD DTV and DVD artifacts, and nothing compared to "Never Twice The Same Color" NTSC issues.

Maybe you should stop posting about your HD expertise in these forums until you actually have some HD experience to speak of.
 
Sounds like you had to tweak an extraordinary array of equipment to get the result you have now (essentially, finding the right assortment of band-aids to fix something that shouldn't need that many fixes to run correctly). Unfortunately, this won't be the extent that everyone else will go to in their HD experience. The fact remains, these issues exist and are visible in the kinds of equipment you can pick up in a store. Running down anyone else who brings these up doesn't help anything.

I'm glad your "The Lost" experience was a gratifying one. It is only one example, though. It shouldn't be surprising if the HD version actually prevails for once.

Your basis for my HD experience is quite contrived, however. You should just leave it at that, since you are straying more into the realm of mundane character assassination than proving a point. This doesn't prove much other than your overbearing arrogance at throwing money at a problem (and then denying the problem ever occurred).
 
That's the problem. Your assessment is oversimplified (anybody can make a blanket statement like that, but it ends up not being particularly meaningful or useful other than to reaffirm existing beliefs). If you aren't sure of my point, then it's not like you can respond to it.
 
mckmas8808 said:
Yep. But why? Are they really losing that much money by not having their i-HD layer on BDs? Or did they do this to slow any PS3 news this could have generated?

Nah, I don't think profit concerns are the root issue for Microsoft at all; it's really just a puzzle piece that makes it less likely their ideal of the new digital paradigm becomes reality. Plus, you have to remember that Microsoft so dominates their core business sphere that most of their growth has to come from new areas. Sony is one of the stronger players in two of those areas at the moment, and rapid adoption/success of Blu-ray over HD-DVD puts them in a position of strength for the next couple of years to determine how some things get implemented.

I don't think Microsoft has anything inherently against Blu-ray per se, but nor do I think they were expecting this rapid build-up in momentum. Certainly they don't want to deal with a stronger Sony if they can help it, but even more so, I think they just hate being forced by the market to move in a certain direction, in terms of adoption, rather than doing the forcing themselves.
 
Forgive me for butting into the ongoing argument, but artifacting should be an issue of the encode, not specifically whether it's HD or SD. If the bitstream is poor on either, there'll be macroblocking and lots of other weird artifacts. I am probably being ignorant, but this is my experience. Of course, it's also an issue of the compression algorithm used; Xvid, DivX, H.264, and so on all have their inherent quirks. Anyway, that's the experience I've walked away with, though granted it's from video work on my computer, not dedicated HD AV equipment.
 
You claim I can't see artifacting. I tell you that I have very sensitive awareness of these issues, and as proof I cite my previous posts on why I buy HDMI over component, or why I and other videophiles go apeshit over crappy black levels, bad gamma ramps, poor CR, posterization, et al. I've been running calibrations and image quality/decoder checks on my theater setup for 3+ years.

And now, because I've bought better equipment that can more accurately reproduce a given SD or HD signal, you're claiming this is proof that the HD experience is bad?

A good cable, better video processor, or superior TV helps SD just as much as it helps HD. Long before HD, videophiles were spending tons of money on their home theater to get the perfect picture.

Your statements are like saying that someone who listens to CD audio and spends a lot of money on a good receiver, amp, and speakers only proves how "these issues exist and are visible in the kinds of equipment you can pick up in a store." Well, boo-hoo, but this has fuck all to do with the innate quality of CD encoding; it has to do with the playback equipment.

OTA ATSC encoding permits a much higher quality signal than standard SD TV or DVD. The fact that you need to invest in a high quality tuner or TV to get the best experience is not the fault of the ATSC standard.

The fact that some stations and some shows have poor quality coding while others have high quality is also not the fault of ATSC. You find the same variation in DVD which is why SuperBit versions exist. It's why "extreme editions" and remastered editions of blockbusters exist. NTSC conversion varies too.

Summary:
1) You don't own any HD equipment, and apparently don't have any extensive experience.
2) Having to buy good equipment for noise-free, high-PQ reproduction is orthogonal to ATSC.
3) ATSC has twice the bitrate of DVD, and D1/D5 masters converted to ATSC will have twice the chroma resolution of SD; if you downscale it to 480p, you end up as if you had a DVD with 4:2:2 instead of 4:2:0.
4) Both ATSC and DVD use the same codec, MPEG-2. But DVDs max out at 8.4 Mbps to store 2 hrs of video on a dual-layer disc. If you add in multiple audio tracks, bonus features, and widescreen/fullscreen versions, like most DVDs, you end up with far less. Most DVDs actually use 1/3 the bitrate of ATSC broadcasts.
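Taking the figures cited in this thread at face value (roughly 19.4 Mbps for an ATSC channel, 8.4 Mbps as the DVD video ceiling, and the "1/3 of ATSC" typical-DVD case), the per-pixel bit budgets both sides keep arguing about can be sketched as a back-of-envelope check. The frame geometries are nominal and real encoders spend bits very unevenly, so treat these as rough orders of magnitude:

```python
# Back-of-envelope bits-per-pixel figures using the bitrates cited in this
# thread: ~19.4 Mbps nominal ATSC channel rate, 8.4 Mbps DVD video ceiling.
# Frame geometries are nominal; real MPEG-2 encoders allocate bits unevenly.

def bits_per_pixel(bitrate_bps, width, height, frames_per_sec):
    """Average coded bits available per pixel per frame."""
    return bitrate_bps / (width * height * frames_per_sec)

atsc_720p60 = bits_per_pixel(19.4e6, 1280, 720, 60)    # ~0.35 bpp
atsc_1080i30 = bits_per_pixel(19.4e6, 1920, 1080, 30)  # ~0.31 bpp (30 full frames/s)
dvd_ceiling = bits_per_pixel(8.4e6, 720, 480, 30)      # ~0.81 bpp
dvd_typical = bits_per_pixel(19.4e6 / 3, 720, 480, 30) # the "1/3 of ATSC" case
```

Interestingly, the raw per-pixel budget is tighter for HD than for a ceiling-rate DVD, which is the tension both posters are circling: HD's quality advantage comes from total bitrate and resolution, while the MPEG-2 budget per pixel is actually stretched thinner.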

The fact is, HD is undeniably better PQ than SD, period. There is no argument. Your posited scenarios of optimal SD content compared against the worst HD content you can find are bogus.

So please, try actually getting some relevant experience about HD before shooting your mouth off.
 
Mefisutoferesu said:
Forgive me for butting into the ongoing argument, but artifacting should be an issue of the encode, not specifically whether it's HD or SD.

Exactly. Doubly so, since broadcast HD and SD DVD both use the same codec, but the former has 2x the maximum bandwidth of the latter, and in practice DVD uses 1/3 the bitrate. Upcoming changes are adding a 4:2:2 profile to HD formats as well.
 
DemoCoder said:
You claim I can't see artifacting. I tell you that I have very sensitive awareness of these issues, and as proof I cite my previous posts on why I buy HDMI over component, or why I and other videophiles go apeshit over crappy black levels, bad gamma ramps, poor CR, posterization, et al. I've been running calibrations and image quality/decoder checks on my theater setup for 3+ years.

And now, because I've bought better equipment that can more accurately reproduce a given SD or HD signal, you're claiming this is proof that the HD experience is bad?

A good cable, better video processor, or superior TV helps SD just as much as it helps HD. Long before HD, videophiles were spending tons of money on their home theater to get the perfect picture.

Your statements are like saying that someone who listens to CD audio and spends a lot of money on a good receiver, amp, and speakers only proves how "these issues exist and are visible in the kinds of equipment you can pick up in a store." Well, boo-hoo, but this has fuck all to do with the innate quality of CD encoding; it has to do with the playback equipment.

OTA ATSC encoding permits a much higher quality signal than standard SD TV or DVD. The fact that you need to invest in a high quality tuner or TV to get the best experience is not the fault of the ATSC standard.

The fact that some stations and some shows have poor quality coding while others have high quality is also not the fault of ATSC. You find the same variation in DVD which is why SuperBit versions exist. It's why "extreme editions" and remastered editions of blockbusters exist. NTSC conversion varies too.

Summary:
1) You don't own any HD equipment, and apparently don't have any extensive experience.
2) Having to buy good equipment for noise-free, high-PQ reproduction is orthogonal to ATSC.
3) ATSC has twice the bitrate of DVD, and D1/D5 masters converted to ATSC will have twice the chroma resolution of SD; if you downscale it to 480p, you end up as if you had a DVD with 4:2:2 instead of 4:2:0.
4) Both ATSC and DVD use the same codec, MPEG-2. But DVDs max out at 8.4 Mbps to store 2 hrs of video on a dual-layer disc. If you add in multiple audio tracks, bonus features, and widescreen/fullscreen versions, like most DVDs, you end up with far less. Most DVDs actually use 1/3 the bitrate of ATSC broadcasts.

The fact is, HD is undeniably better PQ than SD, period. There is no argument. Your posited scenarios of optimal SD content compared against the worst HD content you can find are bogus.

So please, try actually getting some relevant experience about HD before shooting your mouth off.
I think some of the issues regarding the encoding and lesser-quality OTA HD signals rest squarely on the shoulders of the OTA providers. Here in Colorado there is an "uproar" among OTA HD tuner owners, because the providers are all using very low-powered signals, barely meeting the federal requirements.

Whenever someone questions whether PQ is truly different from SD to HD, I pop in one of my D-Theater tapes and let them SEE the difference in quality. The best examples are movies like Spy Game or even X-Men 2. The difference is primarily in people's faces: in the DVD version of Spy Game, Redford looks pretty good, while in the D-VHS version you can see just how old he is getting. Same thing with X-Men 2; when there are close-ups of people's faces, the detail is simply amazing. On the flip side, my Gosford Park looks, dare I say, exactly the same as the DVD. I believe that has to do not only with the encoding, but with the intention of the director while filming a "gritty" drama.

Having only component output/input on my 40000U and CRT HDTV respectively, I absolutely love my analog. On my LCD TV, I would concur that DVI looks "better", but I don't have a higher-end D-VHS deck with HDMI or DVI, so I can't compare D-Theater on both sets except through component, and the CRT far exceeds the PQ of my LCD TV. Both my sets are small, though (CRT = 30", LCD = 26"). Although, if I play my cards right, I will be able to grab a 37" LCD, or I might go DLP.
 
DemoCoder said:
And now, because I've bought better equipment that can more accurately reproduce a given SD or HD signal, you're claiming this is proof that the HD experience is bad?

This is a frequent trap consumers fall into when they fall off the deep end and become hopeless fanatics (i.e., the snake-oil audiophile syndrome). They just buy stuff habitually because they believe it will unveil just a few more percent of the missing experience. I'm not saying this is you, but submitting the amount of equipment you have bought as proof alone of your dedication and "awareness" of technology can discredit you just as easily as it credits you. It's better to just not go there. Either your equipment works or it doesn't.

A good cable, better video processor, or superior TV helps SD just as much as it helps HD. Long before HD, videophiles were spending tons of money on their home theater to get the perfect picture.

...and not always succeeding. Sometimes it ends up making a difference; sometimes it was just the "purchase of the month". At that point, it is much like throwing darts at a dartboard, especially in the higher tiers of pricey merchandise.

Your statements are like saying that someone who listens to CD audio and spends a lot of money on a good receiver, amp, and speaker only proves how "these issues exist and are visible in the kinds of equipment you can pick up in a store."

Except the CD example you give is actually a reasonable expenditure to "do it right", except for the dubious qualifier of "a lot of money". Your HD experience seems to suggest you've gone beyond that, spent "lots more money", into the range where diminishing returns set in for equipment of dubious effect. It really doesn't create more "credibility" for you, other than demonstrating you have an active hobby to pour money into because you can.

Well, boo-hoo, but this has fuck all to do with the inate quality of CD encoding, it has to do with the playback equipment.

...and there's the real kicker. CD encoding is in such a different realm than HD encoding that you should have realized the example was bound to be detrimental to your point. CD is more comparable to a straight DV feed, whereas MP3 would be more analogous to how HD works. The former has a BIG check and balance to ensure quality makes it through the chain; the latter is a free-for-all, anything-goes zone for quality or the lack thereof.

OTA ATSC encoding permits a much higher quality signal than standard SD TV or DVD. The fact that you need to invest in a high quality tuner or TV to get the best experience is not the fault of the ATSC standard.

It's technically able to. Whether it does seems to be highly variable in current practice. With regard to high-quality tuners or TVs, that would seem to suggest you should be able to see more of the good AND the bad with such equipment (i.e., it's more revealing). There's no doubt you see more of the good, but if the bad stuff is getting masked, then how faithful is the equipment, ultimately? The fact remains you can see the good AND the bad on even mediocre equipment. That's not a good sign. It's a good reason to be honest and activist for HDTV, rather than shamelessly promotional.

The fact that some stations and some shows have poor quality coding while others have high quality is also not the fault of ATSC.

Except it isn't a "fact" as you have stated it. There's more to it than just stations and shows. Not everybody and not everything can be accessed through just ATSC. You need to also include satellite and cable sources, because that is what people may be using as well. It ALL reflects on the state of HDTV. No argument that ATSC seems to be the best of the bunch currently. It's also good you finally concede that the technical spec is not a guarantee of what is achieved in real use. Now you can begin to realize it is, indeed, possible for HD in real use to have its weak areas, despite its technical specifications.

You find the same variation in DVD which is why SuperBit versions exist. It's why "extreme editions" and remastered editions of blockbusters exist.

The pros and cons are mixed on that, too. That they simply "exist" is not proof of anything. You could also argue that they exist so you will buy another copy of what you already have, except with the better mastering standards of the day (a window of 10 years or so should be worth something, higher bitrate or not). I'm not going to claim that must be the "proof"; rather, it's likely a mixture of numerous motivations to reissue a title.

1) you don't own any HD equipment, don't apparently have any extensive experience

You could say this of anybody who disagrees with you. Even if they DO have HD equipment, it surely must pale in comparison to the kingly setup you must have... :rolleyes: The counterpoint is that you have become far more invested emotionally in your gear than even monetarily. Your observations have been compromised, if not just a leeeeeetle bit biased.

2) having to buy good equipment for noise-free high PQ reproduction is orthogonal to ATSC

Good equipment can be noise-free in and of itself. It cannot "hide" noise (unless by specific intent) that is already in the feed itself. So the end use is hardly orthogonal (though, nice try with putting the word into use). Either way, you seem to concede that you have acquired equipment that intentionally "masks" the "noise", so that you can claim you do not see it in your HD observations. That's all well and good, but it isn't particularly representative of the HD experience for people at large.

3) ATSC has twice the bitrate of DVD,...

...yet it is also covering, what, "3 to 5x" the resolution (amount of source data)? The numbers aren't "adding up", but few people care to acknowledge this tidbit. Squeezing 3-5 lbs of s*** into a 2 lb bag should give most people pause. Something else is going to have to give...

and D1/D5 masters converted to ATSC will have twice the chroma resolution of SD,...

To my recollection, the D1/D5 has twice the chroma resolution. ATSC is essentially the same chroma as DVD/SD/what consumers have been exposed to for quite some time.

...and if you downscale it to 480p, you end up as if you had a DVD with 4:2:2 instead of 4:2:0

...not if ATSC is 4:2:0 to begin with. Perhaps you have 4:2:2 confused with consumer DV cameras or something. Even if you are right, you are essentially bragging about color resolution still sitting squarely at one-half of the much-heralded HD resolution. Yeah, it's better, but an effective 640x360 or 1440x540 for color isn't particularly "hi-def".

To add to this, not only do you have 3-5x the pixel data going in, but 2x the color data, yet only 2x the overall bitrate... "10" isn't going to fit into "2" without something else giving... hmmm...
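The chroma figures being disputed in this exchange can be checked directly. This sketch just encodes the standard subsampling geometry (4:2:0 halves chroma on both axes, 4:2:2 halves it horizontally only); which scheme each format actually uses is the point of contention above:

```python
# Chroma plane dimensions under common subsampling schemes. 4:2:0 halves
# chroma resolution on both axes; 4:2:2 halves it horizontally only.

def chroma_plane(width, height, subsampling="4:2:0"):
    if subsampling == "4:2:0":
        return (width // 2, height // 2)
    if subsampling == "4:2:2":
        return (width // 2, height)
    return (width, height)  # 4:4:4

chroma_plane(1280, 720)    # (640, 360) -- the "effective 640x360 color" above
chroma_plane(1920, 1080)   # (960, 540)
chroma_plane(720, 480)     # (360, 240) for DVD
```

Comparing (640, 360) or (960, 540) against DVD's (360, 240) shows where each side's numbers come from: the HD chroma planes are substantially larger than DVD's even at 4:2:0, yet still only half the luma resolution being advertised.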

4) Both ATSC and DVD, for example, use the same codec, MPEG-2. But DVDs max out at 8.4Mbps to store 2hrs of video on a dual layer disc. If you add in multiple audio tracks, bonus features, widescreen/fullscreen versions, like most DVDs, you end up with far far less. Most DVDs actually use 1/3 the bitrate of ATSC broadcasts.

So you have 3-5x the pixel data plus 2x the color data that ATSC has to handle. Sounds like it is going to be pretty well stretched with only 3x the bitrate.

The fact is, HD is undeniably, better PQ than SD, period.

No one is saying it isn't better. How much better is a matter of great dispute. The single word "undeniably" is quite an oversimplified term for the matter at hand.

There is no argument. Your posited scenarios of optimal SD content compared against the worst HD content you can find are bogus.

You act like "worst HD" is something that is hard to come by lately. If it is quite widespread, that would seem to be an issue that matters. You can choose to disavow that it even exists, if you throw enough money at the situation or hold up the specs as the gold standard that simply makes this "impossible", but that does no good to real people using real equipment from real stores connected to real HD providers.

So please, try actually getting some relevant experience about HD before shooting your mouth off.

Ironically, the same can be said of you. [cue eaten Pacman sound]
 
Is that a new B3D Console record for the longest post? Acert certainly has some serious competition going on here :p
 
randycat99 said:
I'm not saying this is you, but submitting that you have bought x amount of equipment as proof alone of your dedication and "awareness" of technology can discredit you just as easily as it credits you. It's better to just not go there. Either your equipment works or it doesn't.

Either you own equipment, or you don't. You're like a guy who shoots his mouth off about what it's like to fly an airplane, even though he doesn't have a pilot's license and has no experience flying an airplane, other than reading about it. Or a guy who waxes on about the Mona Lisa even though he's never seen it.

You simply don't have any experience. One cannot argue about what "looks better" by looking at specs. It's subjective and there are many variables involved. People who have both SD and HD know it looks better.

It's technically able to. Whether it does seems to be highly variable in current practice.

All technology permits both good and bad renditions. The best 3D hardware on the planet doesn't mean every game is visually stunning. CD doesn't guarantee an excellently mastered soundtrack. 35mm film doesn't mean your exposures are perfect every time. SD content is highly variable. I own several hundred DVDs, and quality varies incredibly. Two DVDs of the same content at the same bitrate can look different because of the quality of the film transfer/master. Two DVDs with different bitrates can look different because the compression is throwing away less.

The fact is, yes, these occur in both HD and SD, but it's irrelevant. A well-mastered HD signal looks better than a well-mastered SD signal, period. In an apples-to-apples comparison, HD is better. Your attempts to "warn" people that, OH MY GOSH, one can get poor content in HD are silly. Anyone with an HD set is aware of it already. Anyone with satellite is well aware of how your mileage may vary (especially your SD mileage; UPN SD Enterprise used to look TERRIBLE, worse than VHS).


All of this is irrelevant because you are talking out of your ass. You don't own any HD equipment, and yes, experience does matter. You have provided no scientific evidence or studies of HD content to back up your assertions. And you have no personal experience to back them up either, so ergo, you have no leg to stand on.

Can you point to any ABX-style tests for HD? I thought so..

Wrt high quality tuners or TV, that would seem to suggest that you should be able to see more of the good AND bad with such equipment

The simple fact is, a TV with better CR, black levels, and gamma ramp, properly calibrated, provides a more faithful reconstruction of the original signal. A cable with less noise yields fewer cable-induced artifacts. A superior scaler does a better job reversing the conversion from film to interlaced video. Higher-quality equipment simply reduces the amount of error added to the process. You get less noise in the image; you get less washed-out shadow detail or highlight detail. You get a color gradient and color temperature more in line with the director's intentions.

The most famous examples are CRTs and plasmas with poor DC voltage restoration, or RCA/S-Video signal cable losses (chroma, primarily).

Except it isn't a "fact" as you have commented it. It's got more to do with it than just stations and shows. Not everybody and not everything can be accessed through just ATSC. You need to also include satellite and cable sources, because that is what people may be using, as well. It ALL reflects on the state of HDTV.

So? Yes, Satellite HD is currently lower quality than OTA HD. But Satellite HD looks *MUCH MUCH* better than Satellite SD. So for people who only have Satellite, and have no OTA reception, Satellite HD is a tremendous improvement over Satellite SD, even if they don't even have an HDTV to display it on.

Do you have DirecTV, Mr. Expert? Have you ever seen the difference between CBS in SD and the same show in HD? Have you watched Discovery Channel SD and Discovery Channel HD? The differences are night and day.



The counterpoint is that you have become far more invested emotionally into your gear than even monetarily. Your observations have been compromised, if not just a leeeeeetle bit biased.

Wrong. I buy new gear all the time and throw away the old. The amount of money is trivial for me: $450 for an HD TiVo. This represents 2% of my HT budget. I had a WXGA projector and a Samsung HDTV well before I owned a D-Theater deck or an HD TiVo. I used them to enjoy good SD content better: larger screen, better image response, widescreen.

Either way, you seem to concede that you have acquired equipment that intentionally "masks" the "noise", so that you can claim you do not see it in your HD observations. It's all well and good, but isn't particularly useful for HD experiences for people at large.

No, I have acquired equipment that doesn't *ADD* noise. Try re-reading.

...yet it is also covering, what, "3 to 5x" the resolution (amount of source data). The numbers aren't "adding up", but few people care to acknowledge this tidbit. Squeezing 3-5 lbs of s*** into a 2 lb bag should bring pause to most people. Something else is going to have to give...

Fix your math. It's 2.67x to 3x. There is no 1080p60 ATSC OTA format. The bitrates for 1080i and 720p are approximately the same. You repeat that 3x-5x nonsense several times. More evidence of your ignorance.
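The "2.67x to 3x" figure can be reproduced from the broadcast frame geometries. This sketch counts 1080i per 1920x540 field, which is one common way to compare it against 480-line SD on a per-delivery basis (counting full interlaced frames would give a larger ratio, which is part of why the two posters' numbers diverge):

```python
# Pixel-count ratios behind the "2.67x to 3x" figure. 1080i is counted per
# 1920x540 field so it compares against 480-line SD on a per-delivery basis.

sd_frame = 720 * 480            # 345,600 pixels
hd_720p = 1280 * 720            # 921,600 pixels
hd_1080i_field = 1920 * 540     # 1,036,800 pixels

ratio_720p = hd_720p / sd_frame           # ~2.67
ratio_1080i = hd_1080i_field / sd_frame   # 3.0
```

Counting a full deinterlaced 1920x1080 frame instead of one field would put the 1080i ratio at 6x, which is closer to the "3 to 5x" range argued earlier; the disagreement is really about which accounting to use.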

To my recollection, the D1/D5 has twice the chroma resolution. ATSC is essentially the same chroma as DVD/SD/what consumers have been exposed to for quite some time.

ATSC is a 4:2:0 format, but it still has 2x the chroma resolution of DVD. The result is better color gradients and less "clayface"/posterization. Posterization is all over the place in SD.

This is my last response to you. It's getting overly long, and you're hardheaded and have no experience in the field you are discussing. I've got years of experience as a videophile, you've got none.

What you do appear to have is sour grapes. Maybe because you own a shitty TV and can't afford a better one, I dunno, but your frequent rants against HD are hard to understand in any other context. You'd get a lot more mileage if you were a disgruntled HD owner pissed off at currently available content. The thing is, this ATSC encoding thread is a strawman. In past threads in this forum, which discussed 1080p 4:2:2 formats for Blu-ray, you pooh-poohed higher resolution itself. You simply don't think 720p or 1080p as a resolution offers any big improvement.

In other words, you've got an axe to grind about HD.
 
DemoCoder said:
Either you own equipment, or you don't. You're like a guy who shoots his mouth off about what it's like to fly an airplane, even though he doesn't have a pilot's license and has no experience flying an airplane, other than reading about it. Or a guy who waxes on about the Mona Lisa even though he's never seen it.

...or perhaps I do, and your repeated accusations have been utter bull? You can marginalize all you like, but it will never change the fact that there is actually another side to the story, one that you simply refuse to acknowledge. Your mind is absolutely closed to the mere possibility that someone could actually have forayed into HD just as you have, and may actually see some things that need improvement. What does that say about you, honestly? Should there ever be someone with a few words of dissent, you frantically fire away with ad hominem attacks like those we see above. This is poor, poor behavior on your part.

You simply don't have any experience. One cannot argue about what "looks better" by looking at specs. It's subjective and there are many variables involved. People who have both SD and HD know it looks better.

One cannot argue from such permanently affixed rose colored glasses, either. So I guess that counts you out, as well.

All technology permits both good and bad renditions. The best 3D hardware on the planet doesn't mean every game is visually stunning. CD doesn't guarantee an excellently mastered soundtrack. 35mm film doesn't mean your exposures are perfect every time. SD content is highly variable. I own several hundred DVDs, and quality varies incredibly. Two DVDs of the same content at the same bitrate can look different because of the quality of the film transfer/master. Two DVDs with different bitrates can look different because the compression is throwing away less.

Ok...finally you are coming off of your extremist position to agree with me.

Your attempts to "warn" people that, OH MY GOSH, one can get poor content for HD are silly. Anyone with an HD set is aware of it already. Anyone with satellite is well aware of how your mileage may vary (especially your SD mileage: UPN SD Enterprise used to look TERRIBLE, worse than VHS).

Evidently they are not aware, or are in denial about it. It took you this long to actually admit there are problems here, in between your vicious attacks questioning my equipment and my eyesight, and frequent spews of "you're a liar, you're a bullshitter". It's like performing an exorcism on someone possessed by the "HD or death" spirit. Such lengths should certainly never be required when talking to someone of rational mind on a topic.

All of this is irrelevant because you are talking out of your ass.

Once again, back to the denial phase.

You don't own any HD equipment, and yes, experience does matter. You have provided no scientific evidence or studies of HD content to back up your assertions. And you have no personal experience to back them up either, so you have no leg to stand on.

...and the discrediting of character and ad hominem attacks. Green projectile vomit, on deck... I have plenty of experience (and hopefully your measure of experience isn't by some foolish metric such as $'s spent, $'s pissed away on things that didn't work...). The observations from my experience are placed right in your view. They are just not to your liking, hence your knee-jerk reaction to vehemently deny their relevance.

Can you point to any ABX-style tests for HD? I thought so..

You'll need more than that. How about double-blind ABX? I don't think you have it, either. You simply have your cherry-picked tests to reaffirm your existing beliefs and where your $'s got spent.

The simple fact is, a TV with better CR, black levels, gamma ramp, and properly calibrated provides a more faithful reconstruction of the original signal.

...this after you just got done saying how specs don't tell you everything? Specs can mislead you, too. Specs + marketing, even worse. Can you really say you haven't been exposed to such for every thousand you've thrown down this HD quest? That little twinge you just felt was your conscience, btw.

A cable with less noise, yields lower cable induced artifacts.

Is this a Monster Cable ad for a freakin' digital signal you are hinting at? The more you reveal here, the worse it looks for you, I'm thinking.

A superior scaler does a better job reversing the conversion from film to interlaced video.

Film to video is a matter of deinterlacing and 3:2 pulldown, not fancy scaling. The scaling certainly allows more fancy contorting of the original signal, however. Therein is a goldmine of marketing, as well...I'm sure you've had a cup or two...
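For readers keeping score on the telecine point: 3:2 pulldown maps 24 fps film onto 60 fields/s interlaced video by holding film frames for alternately three and two fields, and inverse telecine (what a deinterlacer reverses) detects and undoes that cadence. A minimal sketch of the cadence (the frame labels and list representation are illustrative only):

```python
# Minimal sketch of 3:2 pulldown: 24 fps film frames -> 60i fields.
# Each film frame is held for alternately 3 and 2 video fields,
# so 4 film frames become 10 fields (i.e. 5 interlaced frames).

def pulldown_32(frames):
    """Expand a list of film frames into the 3:2 field cadence."""
    fields = []
    for i, frame in enumerate(frames):
        repeat = 3 if i % 2 == 0 else 2
        fields.extend([frame] * repeat)
    return fields

print(pulldown_32(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

An inverse-telecine stage looks for exactly this repeating AAABB pattern in the field stream and discards the duplicate fields to recover the original 24 fps frames.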

The higher quality equipment simply reduces the amount of error added to the process. You get less noise in the image, you get less washed out shadow detail or highlight detail. You get a color gradient and color temperature more in line with the director's intentions.

Surely it does. However, it won't help all the "noise" and hash that has become an inherent part of the HD signal itself. So what you are left with is 3 million ways from Sunday to "blend out" the artifacts. Perhaps the picture ends up "better" overall, but you've succeeded in leaning very far out of the realm of "fidelity" and into the realm of "heavily processed facsimile". ...This pretty much sums up the source of your own inability to "see" the inherent artifacts. All the things you did that you thought were "removing noise" and "preventing errors" were many attempts to paint layer upon layer of processing, blending, and edge enhancement. It looks as pretty as the skin on a centerfold model, but with all the airbrushing, you are pretty far from viewing the real thing. I don't fault you for ending up on this plane, but at least know where you ended up in all of this and be able to acknowledge it.
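On the shadow-detail point a few posts up: the effect of display gamma on near-black levels can be shown with a toy power-law calculation (the gamma values and code levels here are invented for illustration, not measurements of any real display):

```python
# Toy power-law display model: light output = (code/255) ** gamma.
# With too-high gamma, adjacent near-black code values map to almost
# the same light output, i.e. shadow detail gets crushed.
# Gamma values and code levels are illustrative, not measured.

def to_light(code, gamma):
    """Map an 8-bit video code value to relative light output."""
    return (code / 255) ** gamma

for gamma in (2.2, 3.0):
    step = to_light(20, gamma) - to_light(16, gamma)
    print(f"gamma {gamma}: near-black step {step:.6f}")
# The step shrinks several-fold at the higher gamma: levels 16 and 20
# become nearly indistinguishable on screen.
```

This is the sense in which calibration preserves "shadow detail": it keeps the mapping from code values to light close to the one the content was mastered for.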

So? Yes, Satellite HD is currently lower quality than OTA HD. But Satellite HD looks *MUCH MUCH* better than Satellite SD.

It's already been cited that satellite SD is horrid to begin with- far worse than even analog SD in its heyday (perhaps you need to be clued in that what is making it look as bad as it is, is it being "satellite", rather than being "SD"). So it's a ridiculous point- like saying you outsmarted a retard in a race through a maze. Saying HD beats satellite SD is hardly a testament to quality. A top-quality movie transfer to VHS beats satellite SD- it's that bad.

So for people who only have Satellite, and have no OTA reception, Satellite HD is a tremendous improvement over Satellite SD, even if they don't even have an HDTV to display it on.

Sure, but it still looks bad. ...not just bad for HD, but just bad. Yet, people don't mind (they actually didn't mind for SD, either) about the quality. That tells volumes about how well they can distinguish "quality". They are told that "this is HD", so they simply believe that is what "good" looks like. The whole mental process for that is so fubarred. Then you got Dish owners and DirecTV owners fighting back and forth about how superior their feed is...absolutely oblivious that both feeds are really quite the crap.

Do you have DirecTV, Mr. Expert? Have you ever seen the difference between CBS in SD and the same show in HD? Have you watched Discovery Channel SD and Discovery Channel HD? The differences are night and day.

Actually, I do have it. Sure there is a difference. As you seemingly miss over and over, satellite SD is utter crap to begin with because it is "satellite", rather than just being "SD". You are in some dire straits if you need to compare to satellite SD to come out looking better. :LOL: The flipside: Discovery Channel HD still looks like crap, HD or not. It's embarrassing at times, it's so bad. Simply managing to look better than "SD" is hardly a consolation, for this case. You have to actually reach outside the "crap" zone, period, for this to make sense.

Wrong. I buy new gear all the time and throw away the old. The amount of money is trivial for me. $450 for an HD TiVo. This represents 2% of my HT budget. I had a WXGA projector and Samsung HDTV well before I owned a D-Theater deck or HD TiVo. I used them to originally enjoy good SD content better: larger screen, better image response, widescreen.

This habitual behavior pretty much confirms the caricature even further. Of course the money is trivial to you. So is the equipment. Flavor of the month... We all see it. It's plainly obvious.

No, I have acquired equipment that doesn't *ADD* noise. Try re-reading.

No one said anything about adding noise.

Fix your math. 2.67x to 3x. There is no 1080p60 ATSC OTA format. The bitrates for 1080i and 720p are approximately the same. You repeat that 3x-5x nonsense several times. More evidence of your ignorance.

Every proud HD owner I've met is hair triggered to bandy the 5x the resolution argument. I don't doubt you have done it here at least once. This is simply just deserts for you.

ATSC is a 4:2:0 format, but it still has 2x the chroma resolution of DVD. The result is better color gradients and less "clayface"/posterization. Posterization is all over the place in SD.

4:2:0...essentially the same as NTSC...functionally the same as 4:1:1 (unless you are actually suggesting 2x the chroma at the expense of dropping the sampling of one primary color, outright? What's your preference- all green or all red HD picture? Can't just drop sampling one color outright so you can sample the other 2x). It's essentially the same as DVD, just a different manner of synchronizing opposing color components. :oops: So what is this "improvement" your eyes told you were seeing? Job well done, my friend! You bought the "2x chroma" tag hook, line, and sinker. We now rest from any more talk from you and your "handle" of technical specifications...
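For reference on the chroma dispute: treating both DVD and ATSC HD as 4:2:0, the chroma plane is half the luma resolution in each dimension, so any ratio between the formats comes purely from frame size (resolutions assumed here: 720×480 for DVD, 1920×1080 for ATSC):

```python
# Chroma-plane sizes under 4:2:0 subsampling: the chroma plane is
# (width/2) x (height/2) of the luma plane, so a resolution advantage
# in luma carries over to chroma at the same ratio.

def chroma_plane(width, height):
    """Return (chroma_width, chroma_height) for 4:2:0 subsampling."""
    return width // 2, height // 2

dvd = chroma_plane(720, 480)     # (360, 240) chroma samples
atsc = chroma_plane(1920, 1080)  # (960, 540) chroma samples

print(dvd, atsc)
# Horizontal chroma-resolution ratio: 960 / 360
print(round(atsc[0] / dvd[0], 2))  # 2.67
```

In other words, the subsampling *scheme* is the same, but the absolute number of chroma samples per frame differs with the frame size.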

This is my last response to you. It's getting overly long, and you're hardheaded and have no experience in the field you are discussing. I've got years of experience as a videophile, you've got none.

Oh yeah?! Well I own a Porsche and have a trophy wife! That trumps all!!! :rolleyes:

In past threads in this forum, which discussed 1080p 4:2:2 formats for Blu-Ray, you poo-poohed higher resolution itself. You simply don't think 720p or 1080p as a resolution offers any big improvement.

Clearly, you took my words out of context.

In other words, you've got an axe to grind about HD.

...and you have the rose-colored goggles for it. Really, you think that remark puts you on a different pedestal???
 
Last edited by a moderator:
Randy, what are you arguing about?

I've read your to and fro with DemoC and I'm still not sure what point you're trying to make.

He's saying HD is better than SD.

I *think* you're saying HD is not perfect.

What's the big conflict in those two assertions?
They can both be valid at the same time, and whether HD is perfect or not, if somebody wants better quality than SD, then I don't see any alternative other than HD.

Reading the last few posts in this thread has made me want to pull my hair out :(
 
He thinks I said "HD is flawless" and then backtracked to "some HD programs are poorly coded" and now I "agree with him", which is a strawman argument.

I never, ever said that HD is "flawless" or that HD-encoded signals don't have any artifacts. *NEVER*. That is the strawman he's been belaboring.

What I have said is "HD is better than SD, HD looks much better than SD". That good-quality HD encodings look better than good-quality SD encodings. Apples to apples. The man has gone off the deep end bashing a strawman. Where are Randy's big objections to the state of SD encoded signals, all of which suffer from the *same* problems he argues HD does, only more so in many circumstances? In an effort to avoid having his argument blown out of the water, he is now falling back on the idea that HD advocates claim there are no flaws in HD signals.

Then I catch him in a sign of ignorance. He claims 5x as many pixels won't fit in 2x the bitrate, when everyone knows that ATSC formats are not 5x the pixels; they are 3x at best, and 2.67x normally. After being shown to be wrong, he goes and says "Every proud HD owner I've met is hair triggered to bandy the 5x the resolution argument." Well, this isn't about what other people have said, it's about WHAT YOU SAID, and you were WRONG. The fact is, on average, ATSC OTA has 2x-3x the bitrate of SD DVD, and ATSC has 2x-3x the pixels of DVD, so on balance, the constraints for the compressor are the same.
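The pixel-count figures above check out arithmetically, counting one interlaced field as half a frame's lines, since that is what is transmitted per 1/60 s:

```python
# Pixels per 1/60 s "delivery unit" for ATSC HD formats vs. 720x480 SD.
# An interlaced format transmits half its frame lines per field.

SD = 720 * 480                 # 345,600 pixels

hd_720p  = 1280 * 720          # 921,600 pixels per progressive frame
hd_1080i = 1920 * (1080 // 2)  # 1,036,800 pixels per interlaced field

print(round(hd_720p / SD, 2))   # 2.67
print(round(hd_1080i / SD, 2))  # 3.0
# A full 1920x1080 progressive frame would be 6x SD, but there is no
# 1080p60 ATSC OTA format, hence the 2.67x-3x range rather than 5x.
```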


Then to counter, Randy-never-owned-an-HD-talking-out-of-his-ass claimed scalers don't do pulldown and deinterlacing, when practically every high-end scaler on the market does; in addition, they do pulldown fixes, audio syncing fixes, and dozens of other fixes. He argues as if playback video equipment can't fix and remove artifacts introduced during the film-to-video process. Anand, for example, vividly shows you that the decoder/playback/filtering can, and does, make a difference. I own a DVDO iScanHD+. It most certainly does deinterlacing, pulldown, scaling, and plenty of other functions. The quality of your playback hardware does make a difference.

The final straw is when I mention cable quality. He immediately leaps to the conclusion that, since HD signals are digital, a) there can't be any analog cables in the process (false: one can have component video connections) and b) digital cables don't have "noise".

HDMI/DVI cables *ARE* susceptible to noise, and Randy is a bloody moron. Run any bog-standard HDMI cable over 9ft and you'll instantly know what I'm talking about. In fact, over 6ft you may face errors on many cables. These manifest themselves as missing pixels. You get shimmering white pixels popping all over your image!
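The "shimmering pixels" failure mode can be illustrated with a toy simulation: random bit errors on an uncorrected digital link flip individual pixel bits, and a flipped high-order bit turns a dark pixel into a bright sparkle. The error rate and frame size below are invented for illustration, not measured link characteristics:

```python
import random

# Toy model of a marginal digital video link: each 8-bit pixel value
# independently suffers a single random bit flip with some probability.
# A flip in a high-order bit turns a dark pixel bright -- a "sparkle".

def corrupt(pixels, bit_error_rate, rng):
    """Return a copy of pixels with random single-bit flips applied."""
    out = []
    for p in pixels:
        if rng.random() < bit_error_rate:
            p ^= 1 << rng.randrange(8)  # flip one random bit
        out.append(p)
    return out

rng = random.Random(0)
frame = [16] * 100_000              # a flat near-black frame
noisy = corrupt(frame, 1e-4, rng)   # error rate is invented, not measured
sparkles = sum(1 for p in noisy if p != 16)
print(sparkles)  # a scattering of visibly wrong pixels per frame
```

Even a tiny per-pixel error probability yields visible defects on every frame, which is why marginal cables are so noticeable in practice.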

But this is the problem with Randy. He has no first-hand experience with HD. He's never set up a home theater, and has no idea about the kind of issues you will run into, like cable noise on *digital cables*. Many HD theaters in fact run HD video over optical cables, or over Cat5e/Cat6, to eliminate noise in HDMI/DVI cables.

He wants to claim it is an ad hominem attack, but Randy is the one making bold assertions against the majority consensus about HD, and the burden of proof is on those making the extraordinary claims. It is not an ad hominem attack to point out that he, in fact, has no real experience setting up and using HD equipment.

He's been proven incorrect over and over, and he has no experience.
 