An HDMI 1.3 MEGAreview!

Ok, so this review focuses on two important features that came to fruition around the time of HDMI 1.3 - 1080p24 (which does not need HDMI 1.3) and bitstreamed nextgen audio codecs (which do need HDMI 1.3). Therefore this review of these two techs will be split into two separate sections.

First off, equipment used:
Display: Sony 60" KDS-60A3000 (HDMI 1.3/1080p24/120Hz/LCOS HDTV)
HD DVD Player: Toshiba HD-A35 (HDMI 1.3/1080p24)
Pre/Pro: Integra DTC-9.8 (HDMI 1.3/TrueHD/DTS-HD MA)
Power Amp: Parasound HCA-2205A (300W x 5 @ 4 ohms / 220W x 5 @ 8 ohms)
Speakers: PSB Stratus Goldi, PSB C6i, PSB Image S50, Velodyne SPL-1200R

Moving on,

1080p24
========
The holy grail of video formats, the fabled 1080p24. This format allows the HD DVD player to transmit the original 24fps of film to your display with no processing. If you have a display that both accepts 1080p24 and also has a refresh rate that is a multiple of 24 - such as the 120Hz TV above - then you will get perfect 1080p24 with no 3:2 judder as seen on standard 1080p sets. Note that simply accepting 1080p24 alone is not enough, as a display with a 60Hz refresh will still demonstrate 3:2 judder even with 1080p24 input.
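For the programmers in the thread, here's a toy Python sketch (my own illustration, not how any TV actually schedules frames) of why the refresh rate being a multiple of 24 matters:

```python
# Toy model: assign each display refresh tick to the film frame visible
# at that instant, then count how many ticks each frame is shown for.
# Uneven repeat counts = 3:2 judder; even counts = smooth motion.
def repeat_pattern(fps, refresh_hz, frames=8):
    ticks = int(refresh_hz / fps * frames)
    shown = [int(t * fps / refresh_hz) for t in range(ticks)]
    return [shown.count(f) for f in range(frames)]

print(repeat_pattern(24, 60))   # [3, 2, 3, 2, 3, 2, 3, 2] -> judder
print(repeat_pattern(24, 120))  # [5, 5, 5, 5, 5, 5, 5, 5] -> smooth
```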

So in theory we agree that 1080p24 output into a TV with 1080p24 input and a refresh rate of 120Hz results in the perfect cinema experience. But how does it fare in practice against standard 1080i? Remember, in theory 1080i should also be able to match 1080p24 in quality and smoothness if the TV does inverse telecine properly. Read on.

In order to truly test the mettle of 1080p24 vs 1080i60, a variety of material was watched, but in particular I paid close attention to the Mission Impossible 3 HD DVD Vatican stairs scene (chapter 8 of the HD DVD). This pan down the very detailed stairs can bring some TVs to their knees.

For the 1080p24 tests the HD DVD player was set to output 1080p24 and the TV's CineMotion processing was disabled for optimal preservation of the 1080p24 signal; it was confirmed via the TV's info display that it was displaying at 1080p24. For the 1080i60 tests the HD DVD player was set to output 1080i and the TV's CineMotion processing was set to Auto2 to enable processing of 1080i to 1080p24 via the inverse telecine built into the Sony. The Sony's Motion Enhancer was disabled because it artificially interpolates between frames, and we are not looking for that result here.

After careful examination of the HD DVD player in 1080i and 1080p24 modes I came to two conclusions. First, when fast-forwarding and rewinding in 1080i mode the HD-A35 displays a lot of interlacing artifacts, and this is not seen in 1080p24 mode. However, and much more importantly, during actual film playback 1080i output mode was indistinguishable from 1080p24 output mode on the Sony KDS-60A3000. I watched the Vatican stairs scene of MI3 time and time again to look for a hint of additional moire or twitter in the 1080i signal that was not there in the 1080p24 signal, but there was none to be found. Looking at overall motion, it appears that CineMotion Auto2 did its job, as there was no 3:2 judder to be found. I also checked out The Fast and the Furious: Tokyo Drift to see some of the pans, and they looked beautifully smooth in both 1080i and 1080p24.

SO, you ask, why in the hell would you even bother doing the 1080i vs. 1080p24 mode test? To frustrate a BD supporter hellbent on proving the necessity of 1080p transmission? Well, the real reason is that there is actually a functional disadvantage to locking the player in at 1080p24. That disadvantage is that not all HD material is 1080p24!! Some 30fps material I ran into on HD DVD includes: Dreaming Arizona, Dreaming Nevada, Galapagos, Nature's Journey, Nine Inch Nails: Beside You in Time, as well as most of the HDScape and Living Landscapes titles. I'd wager most of the other concert titles and HDNet titles are 30fps as well, though I can't confirm that. So what happens when you view these 30fps titles while the player is locked in at 1080p24? Well, ugly stuttering is what happens. But that is not where the complications end. The PiP features (IME/U-Control) of HD DVD are also often recorded at 30fps, and those too can stutter while the movie itself is playing back smoothly; one could always cope with this since it's only PiP, but it still looks nicer to have smooth video in PiP. The only way to fix this is to go into the player's setup menu and switch to 1080i for 30fps material to avoid the stutter. The CineMotion Auto2 mode is intelligently able to discern 24fps from 30fps material, so no matter what, you get smooth output via 1080i; however, if the player is locked at 1080p24 you are out of luck getting smooth output on 30fps material.
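To see the stutter in numbers, here's another toy sketch (again just an illustration) of what a 24p-locked output must do to a 30fps source:

```python
# Toy model: a player locked to 24p output must fill 24 output frames per
# second from a 30fps source, so one of every five source frames vanishes.
def frames_shown(src_fps, out_fps, n_out=10):
    # which source frame lands in each output frame slot
    return [int(i * src_fps / out_fps) for i in range(n_out)]

print(frames_shown(30, 24))  # [0, 1, 2, 3, 5, 6, 7, 8, 10, 11]
# source frames 4 and 9 are dropped -> visible stutter on 30fps material
```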

With the Sony appearing to resolve full detail and properly perform inverse telecine on 1080i signals to 1080p24, I saw no reason to leave my HD DVD player set to 1080p24. Sure, it felt warm and fuzzy to be getting the native frames off the disc, but that warm fuzziness was shattered by the functional annoyance of having to switch out of 1080p24 mode to watch 30fps material and PiP extras without stuttering - generally only AFTER watching the material stutter along for the first minute and realizing you now have to stop the movie, change the settings, and restart it in 1080i... No need for me to do that with my setup.

So my $435 player that outputs true 1080p24 is now set to output the same 1080i that the HD-A3 outputs. Hey, at least I got the nifty ABT1018 scaler chip though, eh? Note that some TVs may not have the same quality of inverse telecine that the Sony KDS-60A3000 has, and in that case 1080p24 output may be useful. In my case, though, it really did more harm than good.

So we move on to the next section, where I REALLY got my money's worth on the HD-A35: bitstreamed high-def audio.

Nextgen Audio Bitstreamed
=======================
For this next portion of the test, I listened to a variety of material with the player set both to bitstream mode, with decoding done in the Integra pre/pro, and to PCM mode, with decoding done in the player. However, I chose one title to focus closely on, and this time the title was the first 10 minutes of the Terminator 2: Judgment Day HD DVD (UK Edition). The title has a nice DTS-HD Master Audio track, and I was able to directly compare the lossy core vs. the lossless MA track.

Results? Well, I'd have to say that after intense comparison of the first 10 minutes of the film, they sounded mostly identical. I did know exactly what I was looking for, as I am familiar with lossy compression techniques, and I was able to home in on a 5-second period of time where the MA track had more detail than the lossy track - prior to the first T1000 we see crushing a human skull, there are some ambient background noises; there is one particular tone of very high frequency that is reproduced for 5 seconds with slightly more detail on the MA track. However, since it is a weird ambient background noise, there is no way anyone would tell the difference unless doing this specific type of A/B comparison, especially since you need to crank it to hear the difference on this 5-second passage. The music, dialogue, lasers, sound effects, etc., all sounded the same for the most part on both versions.

So, was this section created to agitate "lossless or bust" audio fans? I mean, why would you even be comparing these if you can do the nextgen codecs bitstreamed? Because, like locked 1080p24, there is a functional disadvantage here too, and it is far worse. When bitstreamed audio is enabled you lose all sound from PiP extras. You also lose all button sounds from menus (some may actually like this). Since the internal mixer of the player is bypassed, you don't get to actually hear any of the nextgen extras with bitstream audio enabled, and that sucks. PiP commentaries become wholly useless without any actual audible commentary.

If I get minimal improvement in audio quality via bitstream, why would I want to gimp the PiP extra features on virtually all the HD DVDs that have them? That makes no sense. So again, the warm fuzzies of "direct digital bitstream!!" get torn down by its real-world functionality problems and the lack of significant improvement in quality. All I can say is forget bitstream!

So, again, my $435 HD DVD player gets set to the same PCM 5.1 player-decoded output that the $199 HD-A3 would be set to... Normally I'd be pissed if I'd spent $1699 on a new preamp looking for bitstream audio and ended up not using it, but in this case I needed to upgrade anyway, as my Parasound lacked HDMI entirely and had bass management problems plus decoder issues due to the chipset being so darn old.

So, in my case, what was the true reason that made it worth spending an extra $250 on the HD-A35 over the HD-A3?

The HD-A35's true strength over HD-A3: Light up front panel HD DVD logo. :)
(and in all fairness, better DVD upscaling chip & Analog 5.1 outputs)

Well that about does it, folks. Now that I have my HD-A35 essentially set up like an HD-A3, I think I can say that a lot of this HDMI 1.3 crap is simply a gimmick to get people to buy more gear. While I can see the fun in tinkering with it, functionally, having the TV do the deinterlacing and the player do the decoding simply seems to work best with HD DVD, and quality does not seem adversely affected. You can take my MEGAreview or leave it, but on the whole the cutting-edge technologies here seem more gimmicky than useful. Realize of course that not all gear is built equal: some TVs will deinterlace poorly, while some receivers may have bass management issues with PCM 5.1. My specific equipment, however, did not have these issues.

In parting I will say that the three new HDMI 1.3 units I purchased and used in this review - Sony KDS-60A3000, Toshiba HD-A35, and Integra DTC-9.8 - are all top-notch, awesome-performing units. I would highly recommend all of them to anyone, and I think they were all worth the money... although I must admit that if the HD-A3 were substituted for the A35 it probably wouldn't look or sound any different, aside from on upscaled DVDs where the A35's scaler chip might excel. It just so happens that the best configuration is not necessarily the latest technology in this case. Just worry about getting a TV that can do good inverse telecine and supports a refresh rate that is a true multiple of 24fps, paired with an HDMI receiver that accepts PCM 5.1, and you should be fine. Also, to be clear, 120Hz is definitely worth the money! I recommend Sony's 120Hz LCD/LCOS and Pioneer's 72Hz plasma HDTV sets because I know that both of them do proper inverse telecine - not sure how the Mitsubishi/Samsung 120Hz sets fare here.

Hopefully this review was enlightening and saved people a few bucks, too! :)

UPDATE: Performance on Standard DVD, a comparison of Standard DVD deinterlacing!

With this wide array of hardware at my disposal, I have the ability to test how these units fare on standard DVD. So, with this in mind, I decided to watch some standard DVDs until I found noticeable deinterlacing artifacts. I found some right at the start of the standard DVD side of "The Departed" HD DVD/DVD combo; the Warner Bros logo had obvious stairstepping.

So I viewed this scene four different ways - all scaled by the HD-A35 (ABT1018 scaler chip) but deinterlaced by different components:
1) 1080p24 passed to TV from HD-A35, CineMotion off, Motion Enhancer off / Deinterlaced by HD-A35
2) 1080i60 passed to TV from HD-A35, CineMotion Auto2, Motion Enhancer off / Deinterlaced by Sony KDS-60A3000
3) 1080p60 passed to TV from HD-A35, CineMotion Auto2, Motion Enhancer off / Deinterlaced by HD-A35
4) 1080i60 passed to DTC-9.8 from HD-A35, 1080p60 passed to TV by DTC-9.8, CineMotion Auto2, Motion Enhancer off / Deinterlaced by DTC-9.8

In short, test cases 1-3 all displayed the stairstepping while test case 4 did not. Why? Well, DVD is a much more difficult beast to deinterlace than HD DVD, since you do not always have native 24p stored on disc; often with standard DVD there is flagging of partial fields/frames in the signal. It therefore becomes a much more difficult job for the deinterlacer compared to HD DVD, where the job is easy because the film is stored 24p on disc and the 60i signal is generated by the player. Since DVD is stored 480i60 on disc, in all methods the signal will need to be deinterlaced, as you cannot simply grab the native stream off the disc as with HD DVD. So methods 1 & 3 use the deinterlacer in the HD-A35, while method 2 uses the deinterlacer in the TV and method 4 uses the deinterlacer in the DTC-9.8. The acclaimed Reon chip in the DTC-9.8 did the best job of all. So, if you do not have access to a Reon chip in your pre/pro or receiver and watch a lot of standard DVD, you might want to pick up a Reon-based HD DVD player for best deinterlacing. Some popular ones are the Toshiba HD-XA2, Onkyo DV-HD805, and Samsung BD-UP5000.
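As a rough illustration of the film-mode detection trick these deinterlacers rely on (a toy Python sketch of the general idea only - not Silicon Optix's actual algorithm), spotting a 3:2 cadence boils down to finding the repeated field in every group of five:

```python
# Toy 3:2 cadence detector: in a telecined stream, the 1st and 3rd fields
# of each 5-field group are identical. Real chips also hunt for the phase
# of the pattern and re-check constantly; this assumes aligned groups.
def looks_like_32_cadence(fields, groups=4):
    for g in range(groups):
        base = g * 5
        if fields[base] != fields[base + 2]:  # repeated field missing
            return False
    return True

# film frames A-D telecined into fields (t = top field, b = bottom field):
telecined = ["At", "Ab", "At", "Bb", "Bt", "Cb", "Ct", "Cb", "Dt", "Db"] * 2
print(looks_like_32_cadence(telecined))  # True -> switch to film mode
```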

In the end, for my particular setup, it looks like the best quality combined with the most convenience on ALL sources may be achieved by letting the Reon in the DTC-9.8 do the scaling/deinterlacing work to 1080p, then passing that to the Sony where it can use its CineMotion Auto2 technology to remove duplicate frames from the signal on film sources for 24p output.

On a note unrelated to this update, I can confirm that the DTC-9.8 does not accept 1080p over component; therefore I decided to bypass the scaler in the Xbox 360 by setting it to output its native 720p and allowing the Reon to handle the scaling to 1080p; almost all 360 games are 720p, and those that do support 1080p, like Virtua Tennis 3, run better at 720p anyway. This seems to work well and is more convenient than running 1080p component directly to my HDTV.
 
Alas, alas, I didn't get offered the Amazon $149 deal on the A35 anyway!

Tho I might consider an A3 some time this holiday season, or post-holiday sale time.

I'd like to see the HD DVD movie sales numbers close the gap first this holiday season.
 
Thanks for the review.
The HD-A35's true strength over HD-A3: Light up front panel HD DVD logo. :)
(and in all fairness, better DVD upscaling chip)
And analog outputs. If the decoding/DACs in the player are good, then putting that extra $150 there could stave off having to buy that brand spanking new, terrifyingly expensive HDMI 1.3 receiver when you're perfectly happy with your Denon AVR-1803 for 97.9% of all your other needs.

Also, on the topic of DVD, the 24p output might be a boon. There are quite a lot of badly authored SD discs out there with messed-up cadences and whatnot. Even if the TV does proper 3:2 pulldown removal and deinterlacing, they tend not to be so hot at handling FUBARed content (there was always a market for quality video processors in DVD players, after all).

Go buy/borrow the DVE and/or HQV discs, then come back and update your review... ;)
 
Thanks for the review.
And analog outputs. If the decoding/DACs in the player are good, then putting that extra $150 there could stave off having to buy that brand spanking new, terrifyingly expensive HDMI 1.3 receiver when you're perfectly happy with your Denon AVR-1803 for 97.9% of all your other needs.

True, I forgot to mention that. Although one problem with analog 5.1 is that many preamps/receivers with analog 5.1 input fail to properly boost the LFE channel by +10dB as is done with digital inputs, so you often need to compensate at the player side for this inconsistency between digital and 5.1 analog sources. This can be a pain sometimes. But it can be compensated for, and thus for users of older receivers the HD-A35 is a great choice - for me personally, though, this feature was not useful.
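For the curious, the math behind that +10dB compensation is trivial (a throwaway sketch, nothing player-specific):

```python
# The digital-input path gives LFE a +10dB boost that many analog 5.1
# inputs skip, so the player-side compensation is a simple linear gain.
def db_to_linear(db):
    return 10 ** (db / 20)  # voltage/amplitude ratio

lfe_gain = db_to_linear(10)
print(round(lfe_gain, 3))  # 3.162 -> LFE samples scaled ~3.16x in the player
```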

Also, on the topic of DVD, the 24p output might be a boon. There are quite a lot of badly authored SD discs out there with messed-up cadences and whatnot. Even if the TV does proper 3:2 pulldown removal and deinterlacing, they tend not to be so hot at handling FUBARed content (there was always a market for quality video processors in DVD players, after all).

I actually threw in some of the tougher standard DVDs and they seemed to work quite nicely with the Sony. But you seem to forget one thing - DVDs are not stored in 24p like HD DVDs and BDs. DVDs are stored at 480i60 on disc. Therefore, with upscaled standard DVD, outputting locked 1080p24 instead of 1080i60 just moves the deinterlacing from the TV into the DVD player. Unlike HD DVD/BD, it is not possible to get a native 24p stream directly off a standard DVD, because the video is natively stored at 60i on DVD.

Go buy/borrow the DVE and/or HQV discs, then come back and update your review... ;)

I have DVE; I just didn't see much use for artificial tests when real-life ones tell so much more about the performance of a device. I like real-life torture scenes (again, the MI3 HD DVD/BD chapter 8 Vatican stairs) better than artificially generated torture scenes. The Silicon Optix HQV disc will obviously be extremely biased towards Silicon Optix HQV-brand deinterlacers (Reon/Realta), so I would not consider that a reliable source for general deinterlacing tests/performance. It would be like running an Nvidia-made benchmark to gauge an ATI card's performance/quality.
 
DVDs are not stored in 24p like HD DVDs and BDs. DVDs are stored at 480i60 on disc.
It's been a while since I read up on the DVD spec, but they kind of are. The video for (much/most) 24p content isn't actually telecined and stored half-height at twice the field rate, but it is indicated in the bitstream that it's supposed to be. Then pulldown and interleaving are applied in the decoding stage. This saves bandwidth/space on the disc.
I just didn't see much use for artificial tests when real-life ones tell so much more about the performance of a device. I like real-life torture scenes
Fair enough. I'll submit that they are useful for pinning down the source of an observed discrepancy and quantifying it, though. (Both of which are really a good thing.)
so I would not consider that a reliable source for general deinterlacing tests/performance. It would be like running an Nvidia-made benchmark to gauge an ATI card's performance/quality.
Must... resist... obvious... joke...

It's not that the tests aren't valid, though. They just used a set of cases that their own products did well on. I've seen others argue in favor of other cases where that might not be the case, but that doesn't necessarily invalidate the tests that are used, except as the only benchmark in strictly comparative cases. I haven't seen anyone claim that they're mostly anecdotal/artificial (which would invalidate them), but the cases presented are definitely less of an issue in the real world than running an HQV disc would have you believe.
 
It's been a while since I read up on the DVD spec, but they kind of are. The video for (much/most) 24p content isn't actually telecined and stored half-height at twice the field rate, but it is indicated in the bitstream that it's supposed to be. Then pulldown and interleaving are applied in the decoding stage. This saves bandwidth/space on the disc.

Standard DVD is not stored natively 24p and hence can't be output that way like HD DVD/BD can. Rather than go into detail, this link does a good job:

(See How Progressive Players Work section)
http://www.hometheaterhifi.com/volume_7_4/dvd-benchmark-part-5-progressive-10-2000.html

It's not that the tests aren't valid, though. They just used a set of cases that their own products did well on. I've seen others argue in favor of other cases where that might not be the case, but that doesn't necessarily invalidate the tests that are used, except as the only benchmark in strictly comparative cases. I haven't seen anyone claim that they're mostly anecdotal/artificial (which would invalidate them), but the cases presented are definitely less of an issue in the real world than running an HQV disc would have you believe.

They are a good tool, but need to be used alongside many other tools. In my case, I used primarily problematic movie content as my benchmark.
 
(See How Progressive Players Work section)
Yes, and? Read what I said again. Often this isn't 'physically' done; rather, it is indicated in the MPEG-2 bitstream. It saves (a not insignificant amount of) bandwidth and saves having to do separate encodes for PAL/NTSC from 24p content. The vast majority of commercial DVDs from film content are stored this way. Actually telecining the video would just be a huge waste.
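To illustrate what I mean (a simplified toy sketch - real MPEG-2 streams also juggle field order via top_field_first):

```python
# Simplified "soft telecine": 24 progressive frames/s are stored as-is,
# and per-picture flags tell the decoder to play each frame out as 3
# fields, then 2, then 3... yielding 60 fields/s with no duplicate data.
def soft_telecine_flags(n_frames):
    # per stored frame: (repeat_first_field, fields emitted on playback)
    return [(i % 2 == 0, 3 if i % 2 == 0 else 2) for i in range(n_frames)]

flags = soft_telecine_flags(24)
print(sum(n for _, n in flags))  # 60 fields out of 24 stored frames
```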
 
Cool. Now you should buy a $399 PS3, which also has 1.3 and 1080p24 after all, and compare :p

Seriously, I'd be interested to see how the PS3 stacks up now, after a year, to standalone players like these.
 
Yes, and? Read what I said again. Often this isn't 'physically' done; rather, it is indicated in the MPEG-2 bitstream. It saves (a not insignificant amount of) bandwidth and saves having to do separate encodes for PAL/NTSC from 24p content. The vast majority of commercial DVDs from film content are stored this way. Actually telecining the video would just be a huge waste.

Was hoping to avoid this but... From the link:

http://www.hometheaterhifi.com/volume_7_4/dvd-benchmark-part-5-progressive-10-2000.html

Why Deinterlacing is Necessary

A common question we get asked is, "Why can't the DVD player just take the progressive frames off the disc and send them out without ever converting them to interlaced in the first place?" The reason, in a nutshell, is that there are too many examples of discs where some or all of the frames are not stored progressively. Even if the original material was sourced from film, there is no requirement at all that the frames be stored like Example 1, above (see link). It's relatively common for films to be dumped onto the disc using an encoding similar to Example 3 (see link). As mentioned before, most major Hollywood releases look more like the first example, but it's just that they have better encoding software, which recognizes the 3-2 pattern and removes the extra repeated fields for compression efficiency. It's not done to improve progressive playback; that just happens to be a useful side-effect.

And, of course, there's plenty of material on DVD that was originally shot on video, or was shot on film, converted to video, and then edited on video. This material requires fairly sophisticated video-mode deinterlacing algorithms if it's going to look good. Cheaper deinterlacing chips skimp in this area.

Film-Mode Deinterlacing

To display a perfect progressive image from a film-sourced DVD, the player needs to figure out which fields in the MPEG stream go together to make each film frame. In theory, the progressive_frame flag should tell the player that the frames on the disc were originally from a film, and will go together, but as we’ve mentioned, that flag is not always optimized for progressive scan playback.

So what the best players do is use a standard MPEG-2 decoder to generate digital interlaced video and then feed that video to a deinterlacing chip. The chip makes decisions constantly about whether the video was originally from film by looking for repeated fields. In the standard 3-2 cadence, the 1st and 3rd fields are identical. If the deinterlacing chip sees a constant stream of 5-field sequences in which the 1st and 3rd fields are identical, it switches to film-mode deinterlacing.

Once it’s in film mode, the deinterlacer just combines fields 1 and 2 to make one progressive image, outputs that for 3 progressive frames, then combines fields 4 and 5 to make another progressive image, and outputs that for two frames. Then it repeats the process with the next 5 fields. The player is still outputting frames in a 3-2 pattern, but it’s creating 60 full progressive frames per second instead of 60 fields per second. Once the chip is in film mode, the deinterlacing algorithm is incredibly simple, and the complete film frame is recreated without loss or compromise. Film mode is the one area of deinterlacing that can be objectively perfect.

If the film is encoded with a 2-2 pattern, the job gets much harder. There are no repeated fields as there are with a 3-2 pattern, so much more sophisticated analysis is used. This is one case where good flags would be a huge help to deinterlacing. Sadly, the flags are just not correct often enough, and 2-2 material is much more often flagged badly than 3-2 material, for reasons we don't completely understand.

The most common, and most distracting, artifact one encounters in film mode happens when the deinterlacer blithely combines together two fields that weren't meant to go together, usually because the 3-2 sequence is interrupted and the deinterlacer doesn't adapt quickly enough. When this happens, the odd numbered lines of the image are from one moment in time, and the even numbered lines are from a different moment. If something in the image is moving, it looks like there are spiky lines sticking out from the sides of the object like the tines of a comb. Hence the effect is usually called combing, though it is also sometimes referred to as feathering.

From this informative article, we can see that the best "progressive" DVD players decode a 480i60 signal from the data stored on disc, which can be a hybrid mix of different fields & frames, then internally deinterlace that to 480p. As the article states, you cannot simply pull the progressive frames off the disc because they are not necessarily stored in a progressive fashion. In a way, progressive is almost 'hacked' in, as the format was originally geared towards 480i60, and it shows in the great inconsistencies in DVD encoding. This is unlike HD DVD/BD, which both have the signal stored natively 24p on disc, where it can be output directly by the player.

So, when an HD DVD or BD player displays a DVD in upscaled 1080p24, it does the same thing due to the inconsistencies in DVD encoding - it decodes the DVD as 480i60, then deinterlaces/upscales to 1080p24. For HD DVD/BD, the player can simply dump the 1080p24 right off the disc; DVD's encoding is much more complex and confusing and requires that the data essentially be decoded as 480i60 then deinterlaced for best results, as stated by the link.

So, in the end, when you are talking about 1080i vs 1080p24 output for upscaled standard DVD, you are simply talking about whether the DVD player or TV does the deinterlacing of the decoded 480i60 DVD signal - whichever has a more robust deinterlacer will have the better final output.
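To tie the article's film-mode description together, here's a toy Python sketch of the weave step it describes (my illustration of the idea, not any chip's actual code):

```python
# Once a 3:2 cadence is locked, film mode is trivial: weave fields 1+2 of
# each 5-field group into one progressive frame (held for 3 output frames)
# and fields 4+5 into another (held for 2) - a lossless reconstruction.
def film_mode_weave(fields):
    out = []
    for base in range(0, len(fields) - 4, 5):
        frame_a = (fields[base],     fields[base + 1])  # weave 1st + 2nd
        frame_b = (fields[base + 3], fields[base + 4])  # weave 4th + 5th
        out += [frame_a] * 3 + [frame_b] * 2            # 3:2 output repeat
    return out

fields = ["At", "Ab", "At", "Bb", "Bt", "Cb", "Ct", "Cb", "Dt", "Db"]
print(len(film_mode_weave(fields)))  # 10 output frames from 10 fields
```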
 
Well that about does it, folks. Now that I have my HD-A35 essentially set up like an HD-A3, I think I can say that a lot of this HDMI 1.3 crap is simply a gimmick to get people to buy more gear.
Duh. ;)

Do you know how much money the AV cable manufacturers are making on this?

I never understood the hoopla over 1080p24. 1080i can transmit more information, so you're not losing anything. Deinterlacing a steady cadence is easy. I often hear "informed" arguments that deinterlacing flags are not ideal and so 24p reconstruction doesn't always work correctly. This is such a bogus argument, since a player capable of producing a steady 24p stream of video data is capable of producing an ideal 3:2 telecine cadence on its own.

There's no need for a new signal standard. If there are complications for the TV to deinterlace, then those would apply to the player too. Getting 24p from a DVD or BD or HD DVD is a problem for the player, not the 1080i signal.
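To put that in code terms (a toy sketch, not anyone's firmware): given clean 24p frames, generating a textbook 3:2 cadence for 1080i output is trivial, so the TV always has an ideal pattern to lock onto:

```python
# Toy telecine: turn 24p frames into a perfect 60-fields/s 3:2 cadence
# with strictly alternating top/bottom fields - exactly the "ideal input"
# a TV's inverse-telecine logic wants to see.
def telecine_32(frames):  # frames: list of (top_field, bottom_field)
    fields, top_next = [], True
    for i, (top, bot) in enumerate(frames):
        for _ in range(3 if i % 2 == 0 else 2):  # 3, 2, 3, 2, ...
            fields.append(top if top_next else bot)
            top_next = not top_next
    return fields

print(len(telecine_32([("t", "b")] * 24)))  # 60 fields from 24 frames
```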
 
So, in the end, when you are talking about 1080i vs 1080p24 output for upscaled standard DVD, you are simply talking about whether the DVD player or TV does the deinterlacing of the decoded 480i60 DVD signal - whichever has a more robust deinterlacer will have the better final output.
Yes, basically. That's pretty clear from my original post, is it not? The entire point was that a higher-end DVD player is probably (most of the time) better equipped than the TV to handle all those inconsistently encoded DVDs your long quote talks about. Additionally, outputting 24p is one less processing step in the player and one less processing step in the TV if the final display rate is to be a multiple of 24.

Then you said that "DVDs are not stored in 24p", but the fact of the matter is that many/most of them are. I was merely pointing out that this is a decoding/output issue, not a storage issue.

I think we agree on the merits; we just had a crossing of lines on the terminology.
 