DisplayPort

epicstruggle

If this thread would be better suited elsewhere, please kindly move it.

http://www.vesa.org/press/displayportaug.htm
DisplayPort allows high quality audio to be available to the display device over the same cable as the video signal. It delivers true plug-and-play with robust interoperability, and is cost-competitive with existing digital display interconnects. Designed to be available throughout the industry as an open, extensible standard, DisplayPort is expected to accelerate adoption of protected digital outputs on PCs to support viewing high definition and other types of protected content through an optional content protection capability, while enabling higher levels of display performance.

DisplayPort enables a common interface approach across both internal connections, such as interfaces within a PC or monitor, and external display connections, including interfaces between a PC and monitor or projector, between a PC and TV or between a device such as DVD player and TV display. The standard includes an optional digital audio capability so high definition digital audio and video can be streamed over the interface, and it provides performance scalability so the next generation of displays can feature higher color depths, refresh rates, and display resolutions. It also features a small, user-friendly connector optimized for use on thin profile notebooks in addition to allowing multiple connectors on a graphics card.


Cnet is also carrying a story on it. LINK


epic
 
I'm sure it's a very wonderful new port, but what's wrong with moving over to HDMI for now?

Anand even did a story about just such a move in January: http://anandtech.com/multimedia/showdoc.aspx?i=2321

OK, so HDMI does 5Gb/s instead of DisplayPort's 10Gb/s, but since 5Gb/s is enough for 1080p plus 7-channel HD audio (SACD or DVD-A), I'm lost as to the advantages of DisplayPort. And HDMI's HDCP encryption is sufficient for current content, or at least it seems that way given how it has become the DVD player and HD-TV standard.
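A quick back-of-envelope check backs this up (my own figures for raw pixel and PCM rates; blanking and link-encoding overhead are ignored):

```python
# Rough sanity check that 1080p60 video plus multichannel audio
# fits comfortably inside ~5 Gbit/s.

def video_bandwidth_bps(width, height, refresh_hz, bits_per_pixel):
    """Raw pixel-data rate, ignoring blanking and link-encoding overhead."""
    return width * height * refresh_hz * bits_per_pixel

def audio_bandwidth_bps(channels, sample_rate_hz, bits_per_sample):
    """Raw PCM audio rate."""
    return channels * sample_rate_hz * bits_per_sample

video = video_bandwidth_bps(1920, 1080, 60, 24)   # ~2.99 Gbit/s
audio = audio_bandwidth_bps(8, 96_000, 24)        # ~18.4 Mbit/s

print(f"video: {video / 1e9:.2f} Gbit/s")
print(f"audio: {audio / 1e6:.1f} Mbit/s")
print(f"total: {(video + audio) / 1e9:.2f} Gbit/s")  # well under 5 Gbit/s
```

So even eight channels of 96kHz/24-bit audio add only a rounding error on top of the video.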

"FCC just mandated that all digital cable ready TVs sold after July 2005 must have DVI-HDCP or HDMI-HDCP capability."

Plus, as far as I'm aware, HDMI allows considerable handshaking between devices, so presumably it can support the same plug-and-play negotiation as older PC video connections?

Looking at the DisplayPort spec, I assume it can do higher resolutions than 1080p, or eventually be used for dual video streams from a single port, but is that sufficient reason to start using it in the short term? It seems to me that, for the consumer, a move to HDMI for the next 3-5 years would let PCs and HD-TV equipment interoperate as never before, and be a significant improvement on now, irrespective of the eventual need to move to a higher standard for more resolution later on.

Come on VESA... move over to HDMI for video, with iLink for digital audio output on future sound cards :) Then I can buy an HD-TV and a high end iLink amp and be happy about interoperability with any HTPC for at least the next 5 years.
 
But HDMI is essentially repackaged DVI, and there doesn't seem to be much controversy about THAT interface, considering it's on every graphics card, and on quite a few TVs, projectors, DVD players etc. So what's the deal with HDMI?
 
Unfortunately this will most likely be implemented because it's going to make it easier to control content.

epic
 
PC-Engine said:
HDMI has bandwidth limits beyond 1920x1080. Single link cannot go higher; it hits a wall.
Not really true. Currently, all (?) HDMI implementations use Type A connectors, but HDMI also defines a Type B connector, which has two TMDS links (i.e., corresponding to dual-link DVI). That gets the bandwidth up to roughly the same as the new proposed standard.
That's not to say there are no good reasons for the new standard (I guess there's a reason I've never heard of a TMDS chip NOT from Silicon Image...), but it can't really be a bandwidth problem with HDMI...
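Rough numbers, using the standard figures as I understand them (165 MHz single-link TMDS pixel-clock ceiling, DisplayPort's four 2.7 Gbit/s lanes, 8b/10b coding on both):

```python
# Link-rate comparison: dual-link TMDS (DVI/HDMI Type B) vs. DisplayPort.
# Blanking overhead is ignored; these are raw payload figures.

TMDS_CLOCK_HZ = 165e6          # single-link TMDS pixel clock ceiling
TMDS_CHANNELS = 3              # R, G, B data channels
BITS_PER_CHANNEL = 10          # 8 data bits carried as 10 link bits (8b/10b)

single_link_raw = TMDS_CLOCK_HZ * TMDS_CHANNELS * BITS_PER_CHANNEL  # 4.95 Gbit/s
single_link_payload = single_link_raw * 8 / 10                      # 3.96 Gbit/s
dual_link_payload = 2 * single_link_payload                         # 7.92 Gbit/s

dp_raw = 4 * 2.7e9             # four lanes at 2.7 Gbit/s each = 10.8 Gbit/s
dp_payload = dp_raw * 8 / 10   # 8.64 Gbit/s after 8b/10b coding

print(f"dual-link TMDS payload: {dual_link_payload / 1e9:.2f} Gbit/s")
print(f"DisplayPort payload:    {dp_payload / 1e9:.2f} Gbit/s")
```

So DisplayPort's payload comes out a bit ahead, but it's in the same ballpark as dual-link TMDS, not a different league.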
 
epicstruggle said:
Unfortunately this will most likely be implemented because it's going to make it easier to control content.

epic

Content control isn't part of the base standard and is therefore optional (and hopefully it stays that way).

I think the real reasons they want the new standard are these (quoted from the CNET article):

"The specification, called DisplayPort, lets high-quality audio share the same cables as video signals and allows for data transfer rates of up to 10.8 gigabits per second using a total of four lanes. That's fewer lanes than many current cable interfaces."

"Picture quality is improved with DisplayPort, VESA member Bill Lempesis said, because the specification allows for higher bandwidth and refreshes images instead of reloading them, which makes for better performance on the screen." I don't know how image quality is "improved" by refreshing rather than reloading, but I'd imagine it cuts down on data transfers.
 
mczak said:
Not really true. Currently, all (?) HDMI implementations use Type A connectors, but HDMI also defines a Type B connector, which has two TMDS links (i.e., corresponding to dual-link DVI). That gets the bandwidth up to roughly the same as the new proposed standard.
That's not to say there are no good reasons for the new standard (I guess there's a reason I've never heard of a TMDS chip NOT from Silicon Image...), but it can't really be a bandwidth problem with HDMI...

That's why I said single link. Yeah, sure, you can get past the wall, but nobody wants a Type D connector with 4 separate links and a $200 cable. :rolleyes:

Read the post from A688 above.
 
This is an incredibly welcome development.
If Microsoft could get its act together on a resolution-independent GUI (I'm not holding my breath), perhaps we will finally see widespread use of displays with both higher pixel density and higher pixel count.

I'm willing to bet the farm that Microsoft will fumble the ball with Vista as well though.

Still, today's situation, with display technology being limited by the #%&!! connector, is ludicrous. I hope they get this ball rolling, and quickly.
 
Entropy said:
This is an incredibly welcome development.
If Microsoft could get its act together on a resolution-independent GUI (I'm not holding my breath), perhaps we will finally see widespread use of displays with both higher pixel density and higher pixel count.

I'm willing to bet the farm that Microsoft will fumble the ball with Vista as well though.

Still, today's situation, with display technology being limited by the #%&!! connector, is ludicrous. I hope they get this ball rolling, and quickly.

But given the price of >24 inch screens, it seems to me that it's pretty logical that typical graphics cards don't need to cater to resolutions higher than 1920x1080 (or 1920x1200 at a pinch) just yet. If you can afford a 30 inch screen, you can afford a dual-link DVI video card.

Besides, Nvidia has put a dual link DVI port on every 7800 series model released so far, which as far as I'm concerned is a very good sign.
By the time most of us have screens which go above 2xxx resolution, we should all have TMDS chips onboard our graphics cards which can handle it. And that might be for DVI, HDMI or Displayport - in the end the PC makers will have to allow for conversion and compatibility to avoid a total fiasco.
 
skazz said:
But given the price of >24 inch screens, it seems to me that it's pretty logical that typical graphics cards don't need to cater to resolutions higher than 1920x1080 (or 1920x1200 at a pinch) just yet. If you can afford a 30 inch screen, you can afford a dual-link DVI video card.

Besides, Nvidia has put a dual link DVI port on every 7800 series model released so far, which as far as I'm concerned is a very good sign.
By the time most of us have screens which go above 2xxx resolution, we should all have TMDS chips onboard our graphics cards which can handle it. And that might be for DVI, HDMI or Displayport - in the end the PC makers will have to allow for conversion and compatibility to avoid a total fiasco.
With all due respect, you are thinking too narrowly.
Wouldn't you like 200 dpi screens? 300 perhaps? With anti-aliasing producing print-like quality, even better in some respects?
It's possible right now, but is hampered by industry inertia, mostly in software. And, of course, the connector.
Wouldn't you like, once panels with extremely fast switching (using OLED or whatever) are out, to experience smoother animation without being limited by - the connector?

Why, oh why, when proposing a digital replacement for the cheap-ass VGA connector, did they settle for a standard with less than half the bandwidth of its analog predecessor? Ick! Yes, in true PC tradition it served the immediate purpose in the cheapest possible way, but it was a limitation from the start in some niches, and it hasn't taken long even for the relatively mainstream market to catch up and chafe against its limitations. Why should Microsoft care about a resolution-independent GUI if, thanks to a connector standard, nobody can drive the monitors where it would offer real benefits? Why care about stereoscopic 3D when monitors flicker horribly because of too-slow refresh rates? Why bother with...

The list goes on. Don't think about it from the standpoint of where we are, think about it from the standpoint of where you'd like to go and fix whatever needs fixing to get there. Being limited by the connector is ridiculous.
 
As far as I'm concerned, routing the audio to the monitor is a waste of time; all you are doing is making the audio path longer, as it then goes from the monitor to an amp that can handle Dolby Digital or DTS. It implies a tie-in between video and audio; otherwise, who wants to buy their audio solution and video solution at the same time? Unless you are a Wal-Mart electronics section super-fan, no one is going to want an el cheapo digital audio solution built into their monitor or TV, or be willing to add $1000 to the cost of every TV they buy every 5 years.

If all computer audio were digital that would be cool, and it would be even cooler if the 3D hardware of the video card were used for 3D audio (I see a few practical issues there, but it's certainly possible if you're really motivated and have control over all the hardware and the OS/drivers). Unless you have that, you are looking at noisy patch cables from the audio card to the video card, or bouncing audio around the system bus, for what? So you can use the tiny speakers on your LCD monitor and eliminate one cord?

The way I see it, the only point of this is to get HDCP or other crap into your computer, that's all. Faster cables? We use single-link DVI-D now just fine; if we want twice that, there is dual-link DVI, and if we need even faster than that in the future, maybe another cable, if we aren't all using laptops by then. :)
 
Entropy said:
With all due respect, you are thinking too narrowly.
Wouldn't you like 200 dpi screens? 300 perhaps? With anti-aliasing producing print-like quality, even better in some respects?
It's possible right now, but is hampered by industry inertia, mostly in software. And, of course, the connector.
Wouldn't you like, once panels with extremely fast switching (using OLED or whatever) are out, to experience smoother animation without being limited by - the connector?
There's no way in hell 200+ dpi fast-switching screens covering a large-ish physical area could be technically feasible in any foreseeable future while costing less than an absolute fortune.

Even a "low" 1920*1200 @ a rather pedestrian 60Hz requires a 4.4Gbit/s connection, assuming you want 32bpp. A screen DPI of 300 would quickly push pixel counts into the tens of millions. Add 'extremely fast switching' to this and scanout alone would require many gigabytes/sec of memory bandwidth just to feed the screen, and any fullscreen read-modify-write redraws would mean quadrupling that bandwidth. It'd be a challenge to design a video card with LOCAL memory that could deliver this level of performance, much less a cable interface that could do it reliably over an extended distance...
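To put rough numbers on that (my own arithmetic; a 16:10 aspect ratio is assumed for the hypothetical high-DPI panel):

```python
import math

def frame_bandwidth_bps(width, height, refresh_hz, bits_per_pixel):
    """Raw scanout rate for one screen, ignoring blanking intervals."""
    return width * height * refresh_hz * bits_per_pixel

# 1920x1200 @ 60Hz, 32bpp -> the ~4.4 Gbit/s figure above
print(f"{frame_bandwidth_bps(1920, 1200, 60, 32) / 1e9:.1f} Gbit/s")

def pixels_at_dpi(diagonal_in, dpi, aspect=(16, 10)):
    """Total pixel count for a given diagonal size and pixel density."""
    w, h = aspect
    diag_units = math.hypot(w, h)
    width_in = diagonal_in * w / diag_units
    height_in = diagonal_in * h / diag_units
    return round(width_in * dpi) * round(height_in * dpi)

# A 30-inch 300 dpi panel really does land in the tens of millions of pixels
print(f"{pixels_at_dpi(30, 300) / 1e6:.1f} Mpixels")
```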
 
I seem to recall an IBM study suggesting that 120 or 133 DPI (at typical monitor viewing distances) was the plateau where adults stopped seeing any quality improvements with increasing pixel density. And although there is certainly room for improvement in the colour gamut area, I suspect that diminishing returns will be reached quite quickly.
 
Temporary Name said:
I seem to recall an IBM study suggesting that 120 or 133 DPI (at typical monitor viewing distances) was the plateau where adults stopped seeing any quality improvements with increasing pixel density. And although there is certainly room for improvement in the colour gamut area, I suspect that diminishing returns will be reached quite quickly.

Well, I've used higher density screens, and I love them.
It's the dot-matrix -> laser difference, to those of you old enough to have experienced that paradigm shift. And for exactly the same reasons.

Your reasoning, Guden, is the reason we have DVI in the first place. It is perfectly valid; hell, it's the ingrained tradition in PC space. Once a problem acutely needs to be solved, you look at the mainstream part of that problem and solve it in the cheapest possible way, which is why PC evolution looks the way it does. PC users regard this as the natural order of things, where people with other experiences tend to regard it as one kludge built upon the previous kludge, which didn't really replace the kludge before that, which hangs around and has to be supported... Some problems never get addressed at all because they don't affect all that large a part of the total user group, so either development grinds to a halt or those users have to make their own expensive and incompatible solution even though a platform-wide solution would be trivial.
For things where there are real difficulties or significant expense involved in doing what you want, this is understandable; but when a cheapo connector is a very real force in how displays and OS display models evolve, it's pathetic.

All IMHO obviously.
 
Himself said:
As far as I'm concerned, routing the audio to the monitor is a waste of time; all you are doing is making the audio path longer, as it then goes from the monitor to an amp that can handle Dolby Digital or DTS. It implies a tie-in between video and audio; otherwise, who wants to buy their audio solution and video solution at the same time? Unless you are a Wal-Mart electronics section super-fan, no one is going to want an el cheapo digital audio solution built into their monitor or TV, or be willing to add $1000 to the cost of every TV they buy every 5 years.

HDMI combines audio and video signals already. I assume the reason VESA wants to combine audio and video is home theaters, where a SINGLE cable goes from the cable box/satellite box/whatever to a receiver and a SINGLE cable goes from the receiver to a TV. Just because the cable CAN carry audio doesn't mean it will, and even though a cable might carry audio to the TV, that doesn't mean you can't mute the sound or turn it all the way down on the TV and just use the speakers connected to your receiver.

PS: Pretty much ALL TVs have shitty speakers built into them already, so I don't see why you are getting your panties in a twist over "audio solutions" being built into TVs.
 
Thankfully, i-Link was developed for HD audio connections. Now if only the hi-fi industry would all start using it, not to mention PC sound card makers... ;)

I regard the audio channel in HDMI/DisplayPort as analogous to the audio in SCART. You can use it for a simple source->display connection if you have no desire for better sound. Very handy for a digital playback/video recorder device+TV in a bedroom, for instance.

Oh, and there are loads of SCART -> S-Video + analogue audio splitter cable options to make cable runs slightly easier, but I imagine splitting the audio out of HDMI/DisplayPort into, say, an i-Link plug won't be possible?
 
a688 said:
HDMI combines audio and video signals already. I assume the reason VESA wants to combine audio and video is home theaters, where a SINGLE cable goes from the cable box/satellite box/whatever to a receiver and a SINGLE cable goes from the receiver to a TV. Just because the cable CAN carry audio doesn't mean it will, and even though a cable might carry audio to the TV, that doesn't mean you can't mute the sound or turn it all the way down on the TV and just use the speakers connected to your receiver.

HDMI is just as lame. So instead of routing the audio to a TV and from the TV to an amp, you'd rather route video to an amp and from the amp to the TV? What's the point in doing either; keep the cables separate. Having breaks along the line for audio or video is generally bad news. Granted, it's not so bad if it's all digital, but it certainly defeats the point of using quality cables. You are still using two cables, and it's no cheaper. This kind of thing only makes sense for TVs with their crappy built-in sound, or LCD monitors with their tiny speakers; I'm sure someone uses those...

If they came up with optical cables for video that would be interesting.
 
skazz said:
Thankfully, i-Link was developed for HD audio connections. Now if only the hi-fi industry would all start using it, not to mention PC sound card makers... ;)
Indeed. It baffles me to no small degree why sound card manufacturers spend time and money developing complicated hacks like real-time AC-3 encoding instead of just sticking a FireWire port on their hardware, offering undiluted multichannel audio. I realise, of course, that SPDIF connectors are ubiquitous on the speaker/amplifier end, but that could change quickly with a little will.

I regard the audio channel in HDMI/DisplayPort as analogous to the audio in SCART. You can use it for a simple source->display connection if you have no desire for better sound. Very handy for a digital playback/video recorder device+TV in a bedroom, for instance.
True also. Millions of people connect their VCRs, DVD players and set-top boxes to their televisions with a SCART cable, getting basic audio that way. That's the idea behind the HDMI audio capability too. I can't understand the people complaining about it. The fact that it is capable of carrying audio doesn't mean you need to use it to do that.
 