DVI, HDMI 1.3 and HD displays

Scott_Arm

I'm looking into buying a new computer at some point and buying a small HDTV as a monitor. I thought the monitor would come first, but I don't really follow HD specs that much and I've just recently discovered HDMI 1.3.

Do you think HDMI 1.3 is worth waiting for if the display is going to be used primarily as a computer monitor with the occasional HD movie?

Is DVI w/ HDCP even capable of pushing the colour fidelity that HDMI 1.3 is capable of?


You know, I like reading about technology and advancements, but this technology turnover rate is really starting to bother me. I don't have an HDTV yet and the industry keeps finding ways to make me hold off. I mean, they should be making standards and sticking with them for a significant period of time, even if they can make something better.
 
I'm looking into buying a new computer at some point and buying a small HDTV as a monitor. I thought the monitor would come first, but I don't really follow HD specs that much and I've just recently discovered HDMI 1.3.

Do you think HDMI 1.3 is worth waiting for if the display is going to be used primarily as a computer monitor with the occasional HD movie?


If it's just for internet browsing and all that, then there wouldn't be much point. If you work with Photoshop professionally, then I guess it might be useful...? Maybe... Depends on how occasional that HD movie is...

Is DVI w/ HDCP even capable of pushing the colour fidelity that HDMI 1.3 is capable of?

Nope.
 

I'm going to try and find out more about this, as I'm not convinced that is correct. The colour depth may depend more on the device capabilities than on the actual interface. For instance, the Radeon X1000 series can drive "deep colour" (up to 16-bit per component) displays via DVI.
 
I was just curious as to whether the video card manufacturers would have to move over to HDMI to take advantage of deep colour displays.

It seems like DVI can do 7.4 Gbps and more than 24 bits per pixel in dual-link mode (http://en.wikipedia.org/wiki/DVI). HDMI 1.3 can do 10.2 Gbps (http://en.wikipedia.org/wiki/HDMI#HDMI_1.3a). Only HDMI type B connectors are compatible with DVI dual-link connections, but according to the wiki, type B connectors are not in use. Sounds like a pretty complex problem. Do any video cards even use DVI dual-link outputs?
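For a rough sense of the numbers, here's a back-of-the-envelope sketch in Python. It assumes the standard CEA-861 1080p60 timing (2200 x 1125 total pixels at 60 Hz, i.e. a ~148.5 MHz pixel clock) and simply reuses the link figures quoted above; those figures mix raw and payload rates, so treat the comparison as approximate rather than definitive:

```python
# Rough bandwidth check for 1080p60 at different colour depths (illustrative figures only).
# Assumed CEA-861 1080p60 timing: 2200 x 1125 total pixels per frame at 60 Hz.
PIXEL_CLOCK_HZ = 2200 * 1125 * 60          # ~148.5 MHz

LINKS_GBPS = {
    "DVI single-link (~165 MHz)": 165e6 * 24 / 1e9,   # ~3.96 Gbps of pixel data
    "DVI dual-link (figure quoted above)": 7.4,
    "HDMI 1.3 (figure quoted above)": 10.2,
}

for bits_per_pixel in (24, 30, 36, 48):
    needed_gbps = PIXEL_CLOCK_HZ * bits_per_pixel / 1e9
    fits = [name for name, cap in LINKS_GBPS.items() if cap >= needed_gbps]
    print(f"{bits_per_pixel} bpp: {needed_gbps:.2f} Gbps needed; "
          f"fits on: {', '.join(fits) or 'none of the above'}")
```

On those assumptions, plain 24-bit 1080p60 fits on a single DVI link, but any of the deep-colour depths need either dual-link DVI or HDMI 1.3.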


I'm one of those people that wants to know that anything I purchase that costs in the range of hundreds of dollars or more is going to last me a reasonably long time. When you get into the thousands, it had better be pretty future proof. It's annoying to buy something so expensive and have it outdated so quickly. It's like they're bringing the annoying hardware cycle of the PC space to consumer electronics.
 
I'm fairly certain all X1000 series cards have dual-link DVI connectors, or at least the majority of them do. My X1950 Pro has two DVI-I dual-link connections and supports HDCP (though I believe that is not the norm on most cards still, but don't quote me on that one).
 
Scott_arm, how 'small' of an HDTV are you considering?

Sharp in the US has two 32" sets with 1920x1080 resolution coming up.

While HDMI 1.3 is nice, it's the other things that are lumped in with it that make it nicer. xvYCC will be there, meaning the backlight has to be able to support such colors. If it's not a very wide gamut CCFL, then it's LED. And because it has 1.3, it will probably be a high-end-ish display, so 120Hz will probably be included. Then it will probably use the latest panels with higher contrast ratios and maybe even 10-bit panels...

You're not likely to get one without the other things thrown in.

Are video cards going to support xvYCC in the near future?

Looking at Dell's new 2707, which has a wide gamut backlight, the menu lists a setting for changing color formats to RGB or YPbPr over DVI (does that make sense?)
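(For what it's worth, RGB vs YPbPr over the link is just two encodings of the same pixel values, which the display converts between internally. A minimal sketch of the BT.709 relation, assuming full-range values in [0, 1]:)

```python
# RGB <-> YPbPr is just a change of encoding; BT.709 coefficients, full-range [0, 1] values assumed.
def rgb_to_ypbpr(r, g, b):
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b     # luma
    pb = (b - y) / 1.8556                        # blue-difference chroma
    pr = (r - y) / 1.5748                        # red-difference chroma
    return y, pb, pr

print(rgb_to_ypbpr(1.0, 0.0, 0.0))   # pure red
print(rgb_to_ypbpr(0.5, 0.5, 0.5))   # mid grey -> both chroma components are zero
```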
 


Well, it has to fit on my desk to be usable as a monitor. Do they measure widescreen monitors width-wise rather than diagonally? I don't know if a 32" display would fit on a desktop very well. Seems like I'd be way too close for a screen of that size. 26" was the upper end that seemed reasonable to me, but I haven't been looking yet, just thinking about it.
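(For reference, sets are measured diagonally. A quick sketch of the geometry for a 16:9 panel, ignoring the bezel, using the two sizes mentioned:)

```python
# Convert a quoted diagonal to width/height for a 16:9 panel (panel only, bezel not included).
import math

def panel_size(diagonal_inches, aspect=(16, 9)):
    w, h = aspect
    scale = diagonal_inches / math.hypot(w, h)   # diagonal of the aspect-ratio triangle
    return w * scale, h * scale

for d in (26, 32):
    width, height = panel_size(d)
    print(f'{d}" 16:9 panel: about {width:.1f}" wide x {height:.1f}" tall')
```

So a 32" panel is roughly 28" wide, versus roughly 23" for a 26" panel.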

I'm not very HDTV literate, so can you explain CCFL backlighting, or point me to a link that explains the term?

I'm not sure when video cards will support xvYCC.
 
CCFL is just Cold Cathode Fluorescent Lamp.

I don't think you'll find HDMI 1.3 on desktop monitors for a while. It will be top-end TVs first, judging from CES last week.

The closest on the desktop is the NEC 26", Dell 27" or the newer 3007-HC. I doubt they even accept any deep color formats, even though they can display a higher color range.
 

So, as far as mainstream electronics goes, deep colour and HDMI 1.3 acceptance and features are a long way off. As much as the new colour format sounds great, I wish they'd kept it under wraps and out of the HDTV space for a long time. I always hate buying something when the "better" version has already been announced.

Another somewhat related question about this whole thing: Do the current HDTV content encoding formats, like H.264, support the new colour format, or will new encoding methods be required?
 
That's a very good question. When I read about Sony's AVCHD H.264 camcorders supporting xvYCC and the PS3 outputting it in future firmware upgrades, I just assumed Blu-ray/HD DVD was designed with it in mind. Back to AVS Forum!
 
If it's not a very wide gamut CCFL, then it's LED.

The problem is that CCFL backlights, although they can reproduce a wider colour gamut, are still "old style" LCD backlights, which means that black levels, and therefore contrast, will still be "LCD style".

Things need to move to dynamic arrays of LED lights to ensure that blacks really are black (as the LED would turn down or off in dark areas), and it's good to see that manufacturers are moving in that direction relatively quickly...
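(A toy illustration of why dimming the LEDs behind dark zones helps: effective contrast is white luminance over black luminance, and the numbers below are made-up assumptions for illustration, not measurements:)

```python
# Toy illustration of local dimming: effective contrast = white luminance / black luminance.
def contrast(white_nits, black_nits):
    return white_nits / black_nits

backlight_nits = 500.0
panel_leakage = 0.001     # assumed: the LCD panel leaks ~0.1% of the backlight when "black"

# Conventional CCFL: the backlight is constant everywhere, so black = leakage of the full backlight.
print(f"CCFL-style: {contrast(backlight_nits, backlight_nits * panel_leakage):.0f}:1")

# LED array with local dimming: LEDs behind a dark zone drop to, say, 5% output,
# so the leaked light in that zone drops by the same factor.
dimming = 0.05            # assumed local-dimming floor
print(f"LED local dimming: {contrast(backlight_nits, backlight_nits * dimming * panel_leakage):.0f}:1")
```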
 
Things need to move to dynamic arrays of LED lights to ensure that blacks really are black (as the LED would turn down or off in dark areas), and it's good to see that manufacturers are moving in that direction relatively quickly...

LEDs will introduce another set of problems though: heat and power usage. Until they pulse them, anyhow.
 
It seems like DVI can do 7.4 Gbps and more than 24 bits per pixel in dual-link mode (http://en.wikipedia.org/wiki/DVI). HDMI 1.3 can do 10.2 Gbps (http://en.wikipedia.org/wiki/HDMI#HDMI_1.3a). Only HDMI type B connectors are compatible with DVI dual-link connections, but according to the wiki, type B connectors are not in use. Sounds like a pretty complex problem. Do any video cards even use DVI dual-link outputs?
OK, I caught up with one of the display guys.

Most graphics cards these days do have at least one dual-link DVI output on them; many above entry level now actually support two. However, "deep-colour" displays for the PC are achieved by using single-link resolutions but (kinda) spilling the bit width into the second link. HDMI 1.3 is still a single link but increases the signalling rate for the TMDS transmitter to over 300 MHz, which kind of makes them incompatible.

There may be displays that have dual-link DVI input capabilities, but I'm not sure I'd count on that.
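To put rough numbers on the signalling-rate point: a minimal sketch, assuming the usual 148.5 MHz 1080p60 pixel clock and that deep colour over a single TMDS link simply scales the clock with the bit depth (the 165 MHz single-link DVI and 340 MHz HDMI 1.3 ceilings are my assumptions here, not figures from the post above):

```python
# How a single-link TMDS clock scales with colour depth at 1080p60 (assumed CEA-861 timing).
BASE_CLOCK_MHZ = 148.5          # 1080p60 at 24 bpp
SINGLE_LINK_DVI_MAX = 165.0     # MHz, assumed single-link DVI ceiling
HDMI_1_3_MAX = 340.0            # MHz, assumed HDMI 1.3 TMDS ceiling

for bpp in (24, 30, 36, 48):
    clock = BASE_CLOCK_MHZ * bpp / 24       # deep colour just runs the same link faster
    print(f"{bpp} bpp -> {clock:.1f} MHz "
          f"(single-link DVI OK: {clock <= SINGLE_LINK_DVI_MAX}, "
          f"HDMI 1.3 OK: {clock <= HDMI_1_3_MAX})")
```

On those assumptions, even 30 bpp pushes a single link past the 165 MHz DVI limit (hence spilling into the second link on DVI), while 48 bpp at ~297 MHz still fits under the raised HDMI 1.3 clock.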
 

Thanks for the information.

Now to find out about xvYCC and H.264 or VC-1(?). I'm curious to see what the issues are in providing content to these new fancy HDMI 1.3 displays.
 
How do you find out what revision of HDMI a given device uses?
 
What if it doesn't? Should one assume it is the latest revision if one has a recent monitor model?
 
Yeah, I guess it's no use having HDMI 1.3 if you don't have a display capable of deep colour and stuff like that.
 

Absolutely. And I wouldn't bother with "old style" LCD either (as in a non-LED backlight), because they will still have issues with black levels, which affects contrast and therefore the ability of the set to display all the colours it's marketed as being able to display.

If you want a TV for HDMI 1.3 and Deep Colour, then you have to wait until the sets can do proper blacks, which is not going to happen until LED arrays are used. Or OLED. Or anything, really, except old-style LCD backlit screens.
 