Today's crazy scoop or internet hoax: 360 on-board HDMI?

expletive said:
Just to be clear, and to restate Guden's point...

When implemented correctly, HDMI is superior. The problem, up until this point, has been poor implementations of it.


Seriously, I think most people would not be able to tell the difference between these interconnects. While I agree the connection should, in theory, give you a "cleaner" image, the same could have been done with DVI, but umm... DRM isn't as widely supported with that format. ;)
 
RobertR1 said:
While it might be one application, it is one that affects millions of households. The Samsung Blu-ray player also seems to do a better job over component instead of HDMI depending on your display, so theory and practice diverge yet again: http://www.thedigitalbits.com/mytwocentsa123.html#comp Which HDMI are you a fan of? 1.0, 1.1, 1.2, 1.3, or the later ones that'll surely come down the line? HDMI is a pain in the ass, and since it's hardware, you can't do much about it if your current device has issues with it. HDMI isn't as rosy as the CE makers make it out to be.

HDMI is still a work in progress. One day it will be all it's cracked up to be, but until then it's hit or miss.

I am using HDMI today and I have none of the problems they describe, so either they are doing something completely wrong, or they have busted hardware. I mean, come on, claiming that HDMI has less picture detail than component, and that 1080p shows less detail than 1080i? More signal processing and filtering has to happen to generate a component signal than to generate an HDMI signal. HDMI is essentially a framebuffer dump of the exact digital data in RAM, encoded via TMDS.
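To make that concrete, here's a minimal Python sketch of the stage-1 TMDS encoding (transition minimization; the real link adds a second DC-balancing stage and a tenth bit, which I've left out). The names are mine, but the point stands: decoding recovers the original byte bit-for-bit, so the link itself cannot alter the color data.

```python
# Minimal sketch of TMDS stage-1 encoding and its exact inverse.
# The DC-balancing stage (the 10th bit) of real TMDS is omitted.

def tmds_encode(byte):
    bits = [(byte >> i) & 1 for i in range(8)]
    use_xnor = sum(bits) > 4 or (sum(bits) == 4 and bits[0] == 0)
    q = [bits[0]]
    for i in range(1, 8):
        nxt = q[i - 1] ^ bits[i]
        q.append(1 - nxt if use_xnor else nxt)
    q.append(0 if use_xnor else 1)  # bit 8 tells the decoder which rule was used
    return q

def tmds_decode(q):
    xor_used = (q[8] == 1)
    bits = [q[0]]
    for i in range(1, 8):
        b = q[i] ^ q[i - 1]
        bits.append(b if xor_used else 1 - b)
    return sum(b << i for i, b in enumerate(bits))

# Every possible byte survives the round trip unchanged.
assert all(tmds_decode(tmds_encode(b)) == b for b in range(256))
```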

Perhaps they should have compared apples to apples, 1080i@HDMI vs. 1080i@component, to see what is causing it. Perhaps the 1080p output is busted, or is being scaled differently than 1080i, and this is leading to the differences. They have not done the investigation necessary to blame HDMI.

My own personal experience is that HDMI looks better than component. I have 2 HD displays (a 720p 60" DLP and a 720p 120" screen) and have both HDMI and component hooked up to them. Component video is less stable: there is an ever-so-slight shimmer to some pixels, as well as visible noise. Now, maybe it's because I have a 10-foot screen that makes it easy to see, but the difference is obvious. There is, in fact, little perceptible color difference; it mostly comes down to noise and instability. HDMI has noise issues too, but they are different. If you try to run an HDMI cable too far, you'll get dropped pixels, but if you stay within cable specs, it will be far more stable and noise-free than component. For component cables, if you keep them short, there'll be less of a difference, with little to no noise. It's just that with digital, this is more of an exact science, since the artifacts are far more obvious.

Still, for purists, the idea of possible multiple D->A, A->D conversions is irritating.
 
It's not about being a purist.

What if the console of your choice last gen had not featured S-Video (the European GCN) or RGB SCART / component? Would that have affected your choice then?

Would you have settled for a bad-quality composite connection for your EU GCN if your TV only had one RGB SCART and that was already occupied by your DVD player? (I think that's one of the reasons that kept me from ever buying a GCN.) Or settled for S-Video for your PS2 because the RGB SCART didn't work for DVDs? (That's what made me buy a standalone DVD player sooner than I would've needed to. I'm still using the PS2 with S-Video, because it's more convenient to route my two S-Video devices via my AV receiver and connect my DVD player directly to my TV, which has only one RGB-enabled SCART. Even though I know I'd get a better PS2 picture via RGB SCART, I've had to prioritize for convenience.)

If the PS3 didn't have an HDMI connector, again I would need to prioritize and compromise on quality when I get my new projector.

That's just my case, but my setup is nothing exotic: a very mainstream home theater setup, with connectivity that I'm sure will be pretty standard in homes that have an HDTV display and other gear in addition to a games console that'll need to connect.
 
rabidrabbit said:
That's just my case, but my setup is nothing exotic: a very mainstream home theater setup, with connectivity that I'm sure will be pretty standard in homes that have an HDTV display and other gear in addition to a games console that'll need to connect.

I think your situation may be exotic even if your setup in general isn't. I just don't see people running out of component inputs before HDMI inputs these days. Your display, with a single HDMI input, a DVI input, and only one component input, is probably the only one of its kind! :)
 
Well, the situation is a bit different in Europe, where there are mad headbutting Frenchmen and SCART.
Anyway, as the general public sees it, HDMI = HDTV: if your next new display, or the device you're connecting to it, doesn't have HDMI, it's already out of date.
HDMI has been marketed as the standard connector that will replace today's video and audio connectors, and that is what it's going to be, too.
 
Theoretically, HDMI should yield a better picture than component due to a reduction in analog noise and the elimination of repeated D/A and A/D conversions. In practice, this depends largely on what components you have and how you have them set up. HDMI is little more than a mess right now, with an ever-evolving spec and inconsistent implementation from the manufacturers of HDMI-equipped products up and down the signal chain. When done well, HDMI is superb. When not done well...
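To see why the analog path is theoretically worse, here's a toy Python sketch (the noise figure is invented purely for illustration) of what repeated D/A and A/D hops do to a single pixel value. One clean hop may be visually transparent, but the errors accumulate:

```python
import random

def analog_hop(value, noise_lsb=0.4):
    analog = value + random.gauss(0, noise_lsb)  # D->A plus cable/amp noise
    return min(255, max(0, round(analog)))       # A->D re-quantization

random.seed(1)
pixel = 180
for hops in (1, 2, 4):
    value = pixel
    for _ in range(hops):
        value = analog_hop(value)
    print(f"{hops} hop(s): {value} (error {value - pixel:+d})")
```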

Component is still a very viable connection solution of extremely high quality. Even in well-implemented HDMI setups, if component is implemented with equal care, the differences between the two are very slight at best. If you are seeing major differences between component and HDMI in your setup, or are getting lots of noise, then I would suggest that your component setup is less than optimal.

Having said all of that, there are still lots of reasons to use HDMI. Convenience is certainly at the top of the list, particularly when using HDMI sources with HDMI receivers, allowing you to route audio and video via a single cable. If your display supports it, there is also display control information (metadata) sent with the signal to enable your display to optimize its settings for the incoming signal. And of course, eliminating D/A to A/D to D/A conversions is a bonus, though I find that it is only a major benefit if you are going to do some image manipulation (video processing and scaling), as you can maximize the effectiveness of this processing and usually bypass the display's internal processing completely.

If you are just looking for the convenience of running a single cable to your display and/or simplifying switching, then you can select one of the newer receivers that not only has HDMI switching but will upconvert analog video signals to HDMI before sending them to the display. If you have the budget for it, you can even get a surround processor with a full-blown SOTA video processor built in. If you're going to watch anything encoded with MPEG-2 (including Blu-ray), then you'll need advanced features like mosquito-noise reduction and macroblocking reduction in your image processing to get the best picture.

Bottom line: having a 360 with an HDMI output wouldn't be a bad thing, but having a 360 without it isn't the end of the world either. And this is coming from a man who plays his 360 on a 106" screen over component connections. Of course, my display is a native 720p DLP projector, so I don't have to scale my output, which, as has been pointed out, is worse when done poorly than any D/A or A/D artifact from connecting through component.
 
Phantom Gamer said:
Theoretically, HDMI should yield a better picture than component due to a reduction in analog noise and the elimination of repeated D/A and A/D conversions. In practice, this depends largely on what components you have and how you have them set up. HDMI is little more than a mess right now, with an ever-evolving spec and inconsistent implementation from the manufacturers of HDMI-equipped products up and down the signal chain. When done well, HDMI is superb. When not done well...
How do they bodge it up, though? Digital-to-digital transmission should have no variation in the data. And if they're not changing the data to add problems, where do the problems come from?
 
Yes, my question is: how can TMDS chips, which already work well and which people are using for DVI LCD monitors on their desktops, result in severe color degradation? These chips do nothing more than take a digital framebuffer and encode it for wire transmission. They don't convert color spaces or run color filters, so why the hell would the color from the HDMI output be so different from component? For component, one actually has to convert sRGB data into luminance/chrominance formats, so there are not only the D->A and A->D conversions, but also the mathematical errors introduced by repeated colorspace transforms.

This, if anything, would cause component colors to be fubared. So "how could they botch this up?" is a real good question. Frankly, it sounds bogus to me. D->D transmission with wire encoding of the digital data should not result in worse reconstruction of the sRGB framebuffer than sRGB digital -> YUV colorspace digital -> analog modulation -> analog-to-digital -> YUV to sRGB.
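For what it's worth, here's a rough sketch of that round trip, using full-range BT.601 coefficients for simplicity (real DVD chains use limited range, and HD uses BT.709). Even before any analog modulation, the 8-bit rounding alone means some colors don't survive:

```python
def rgb_to_ycbcr(r, g, b):
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 + 0.564 * (b - y)
    cr = 128 + 0.713 * (r - y)
    return tuple(min(255, max(0, round(v))) for v in (y, cb, cr))

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return tuple(min(255, max(0, round(v))) for v in (r, g, b))

# Count sampled colors that change after one digital round trip.
changed = sum(
    1
    for r in range(0, 256, 17) for g in range(0, 256, 17) for b in range(0, 256, 17)
    if ycbcr_to_rgb(*rgb_to_ycbcr(r, g, b)) != (r, g, b)
)
print(f"{changed} of {16 ** 3} sampled colors do not survive the round trip")
```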
 
Shifty Geezer said:
How do they bodge it up, though? Digital-to-digital transmission should have no variation in the data. And if they're not changing the data to add problems, where do the problems come from?

Does it matter? The bottom line is that the implementation sucks across a wide range of CE products. It's really irrelevant 'why' it sucks; it just does, in a lot of cases.

It's all fine to say people have 'broken' hardware, but what are they supposed to do? Replace it? You're stuck with what you have, and a lot of the hardware out there just doesn't seem to handle HDMI well. When even bleeding-edge tech like BR has issues, that speaks loudly to me about the current state of HDMI.

As a consumer, I'm more comfortable sticking to component as I don't have to risk buying 'broken' hardware.
 
DemoCoder said:
This, if anything, would cause component colors to be fubared. So "how could they botch this up?" is a real good question. Frankly, it sounds bogus to me. D->D transmission with wire encoding of the digital data should not result in worse reconstruction of the sRGB framebuffer than sRGB digital -> YUV colorspace digital -> analog modulation -> analog-to-digital -> YUV to sRGB.

In my experience, the botch-ups have mostly been due to poor handshaking between the display and the players. There's nothing like staring at blue or dark grey screens with error text for hours, trying to get the devices talking to each other at the correct resolutions, especially at a client's place under a tight schedule!
 
Shifty Geezer said:
How do they bodge it up, though? Digital-to-digital transmission should have no variation in the data. And if they're not changing the data to add problems, where do the problems come from?


Mostly it's just that device A cannot negotiate HDCP correctly with device B, so it just doesn't work.
My cable box won't sync reliably with my projector over DVI, for example.
It's also possible for the resolution negotiation to be messed up, so you don't get the output resolution you expect.
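As a very rough illustration (the names and numbers below are made up, not the actual EDID/HDCP protocol), the source-side logic amounts to something like this sketch. If either step fails, you get a blank or blue screen rather than a slightly degraded picture; digital links fail loudly:

```python
import random, time

def read_edid(sink):
    return sink.get("modes", [])  # the resolutions the display claims to accept

def authenticate_hdcp(sink, tries=3):
    for _ in range(tries):
        if random.random() > sink.get("flakiness", 0.0):
            return True           # keys exchanged, link encrypted
        time.sleep(0.1)           # real devices retry on a timer
    return False

def bring_up_link(sink, wanted="1080i"):
    modes = read_edid(sink)
    mode = wanted if wanted in modes else (modes[0] if modes else None)
    if mode is None or not authenticate_hdcp(sink):
        return "blank output (handshake failed)"
    return f"outputting {mode}"

print(bring_up_link({"modes": ["720p", "1080i"], "flakiness": 0.6}))
```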
 
Strictly speaking, HDMI supports not only the RGB 4:4:4 color space but also the component color space (YCbCr), in both 4:4:4 and 4:2:2 encoding, so a color space conversion need not be done prior to sending a signal out of either the HDMI or component video outputs. Unlike DVI, HDMI is a consumer-electronics-oriented output; it even supports 480i output, where DVI does not.

The ATSC spec for digital TV uses the component video color space, so any video signal prepared in the sRGB color space for transmission to an HDTV set must be converted to the component color space, regardless of how that signal is then sent out, digital or analog. Some sources and displays will give you the option to use RGB instead, but in some displays this simply means that the display is prepared to receive the signal and then convert it into the proper color space before passing it on. In some sources, this setting can mean that the signal is converted back into RGB after having been converted into component, rather than just not being converted at all. Is that the proper way to do things? Not at all. Does it happen? Certainly.
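Since 4:2:2 came up above, here's a toy sketch of what that encoding costs (simple pairwise averaging; real encoders use proper filters, but the resolution loss is the same). The chroma planes are stored at half horizontal resolution, so two neighboring pixels share one chroma value:

```python
def subsample_422(chroma_line):
    # Store one chroma value per pair of pixels.
    return [round((chroma_line[i] + chroma_line[i + 1]) / 2)
            for i in range(0, len(chroma_line) - 1, 2)]

def upsample_422(half_line):
    # Nearest-neighbor reconstruction: each stored value covers two pixels.
    out = []
    for c in half_line:
        out += [c, c]
    return out

line = [16, 240, 16, 240, 16, 240, 16, 240]  # alternating chroma extremes
print(upsample_422(subsample_422(line)))     # -> all 128s: the detail is gone
```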
 
Reference? I have seen examples of DVD players that output RGB over HDMI. In any case, then, the likely source of the original article's confusion could be a colorspace mismatch between the display and the playback device.

See http://www.avsforum.com/avs-vb/archive/index.php/t-486428.html

I guess my objection is to the idea that ATSC is relevant for wired packaged-media playback. For broadcast media, yes, but any device with a DVI/HDMI output should be capable of treating the display like a monitor. I can plug my computer into the HDMI port of my TV and it deals with sRGB output fine.
 
Here's a link to a decent primer on HDMI: http://www.audioholics.com/techtips/specsformats/HDMIinterfaceguide.php

This is a decent article on color space differences and conversion between HDTV and PC: http://www.videsignline.com/howto/183700392

There's nothing that says a DVD player can't output RGB over HDMI, since RGB is supported. However, the DVD is encoded in YUV (YCbCr), and a color space conversion is then performed on the signal to convert it to RGB for output. This is a pretty decent explanation of DVD color space: http://www.dvd-replica.com/DVD/colorspace.php
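One concrete way that conversion goes wrong in practice, and a common suspect for washed-out or crushed pictures (this sketch is illustrative, not any particular device's behavior): video encodes black at code 16 and white at 235, while PCs use 0-255. If the source and display disagree about which range is in use, every level on screen is off:

```python
def video_to_pc(level):
    # Correct expansion: 16-235 video range -> 0-255 PC range.
    return min(255, max(0, round((level - 16) * 255 / 219)))

black, white = 16, 235
print("expanded correctly  :", video_to_pc(black), video_to_pc(white))  # 0, 255
print("passed through as-is:", black, white)  # grey blacks, dim whites
```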

Your display may accept sRGB just fine, and depending on how it's designed, sRGB may be treated as a higher-quality signal requiring less internal processing by the display. Or it may not. But that differs from display to display, which is all I've really been saying. You won't know, nor should you assume, that using an HDMI connection or outputting your video in a certain color space will be better than component and its color space until you try it. The odds are that HDMI will be better, especially the newer your equipment is, since HDMI 1.3 should be the last major revision of the spec for a while.
 
scooby_dooby said:
Does it matter? The bottom line is that the implementation sucks across a wide range of CE products. It's really irrelevant 'why' it sucks; it just does, in a lot of cases.

In my experience, component implementation is also broken, except it's not just in a lot of cases: I've never actually seen it done right at all. So I'll go digital ASAP, because although there are some teething issues, it can and *does* work correctly with the right setup.
 
scooby_dooby said:
How so? I haven't heard a fraction of the horror stories about component that I've heard about HDMI.

I have yet to see a component implementation where I can put two different pixels next to each other and have them actually displayed discretely, without bleeding into each other.

For DVD it doesn't matter, because the source material has already been lossily compressed (though for the record, I use a digital signal path from my DVD player to my screen anyway). For "next-gen" content, frankly, I expect better. I don't like the fact that what amounts to an interconnect is allowed to throw my content away.
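To illustrate what I mean by bleeding, here's a toy sketch (the filter kernel is invented; a real cable and DAC have their own response) of what an analog link's low-pass filtering does to a single bright pixel that a digital link would deliver untouched:

```python
def analog_blur(line, kernel=(0.15, 0.7, 0.15)):
    # Convolve the scanline with a small low-pass kernel, clamping at the edges.
    padded = [line[0]] + list(line) + [line[-1]]
    return [round(sum(k * padded[i + j] for j, k in enumerate(kernel)))
            for i in range(len(line))]

line = [0, 0, 0, 255, 0, 0, 0]         # one bright pixel on a black scanline
print("digital:", line)                # delivered exactly as stored
print("analog :", analog_blur(line))   # energy leaks into the neighbors
```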

The only horror stories I've seen about HDMI are things not working, usually because the devices didn't follow the specification and aren't compatible. They're fringe cases, IMO, and won't apply to much equipment in the longer term. Mostly it does seem to work. And the upshot of that is that the pixels on the display are the colour I told them to be, all the way back in the framebuffer. Which is no less than I ought to be able to expect.
 