DisplayPort

Time to resurrect this old thread. :)
http://www.engadget.com/2007/04/04/vesa-approves-displayport-1-1-kiss-those-dvi-and-vga-ports-good/
http://home.businesswire.com/portal...d=news_view&newsId=20070402006096&newsLang=en
The Video Electronics Standards Association (VESA) announced today that its membership has approved version 1.1 of the DisplayPort™ interface standard for use in new designs of flat panel displays, projectors, PCs and CE devices.

DisplayPort 1.1 gives manufacturers of LCD panels, monitors, graphics cards, PC chipsets, projectors, peripherals, components, and consumer electronics a next generation digital interface that is designed to replace LVDS, DVI, and eventually VGA. DisplayPort provides the ability to connect to both internal and external displays with a common digital interface. This common interface capability means that DisplayPort can carry pixels directly from any display source to any LCD panel, simplifying the design complexity that is present today. As presented at CES 2007, a DisplayPort Interoperability Guideline that recommends best practices for providing DVI and HDMI connectivity via the DisplayPort connector and simple cable adapters is nearing completion.
 
DisplayPort is all about cost, especially IP costs. However, HDMI is entrenched now and won't go away. Most upcoming CE living room equipment will now have HDMI, and because of the network effect, no one will be able to ship a display without paying the HDMI "tax." So there is zero benefit to DisplayPort in those markets, since they must support HDMI.

DisplayPort only has a chance in the PC realm, but even there, I have my doubts. Display makers will have to continue to support DVI/HDMI (umm, dongle adapter? Uh yeah, just like I love using DVI->HDMI adapters too), and that means GPU manufacturers will need to support DVI/HDMI SKUs. DisplayPort would just add another SKU, complicating the lineup for a market with zero penetration at the moment. The GPU manufacturers have a vested interest in wanting to get away from HDMI IP, but the reality is they can't, so I predict they'll ship token cards with DP support and quietly bury it.

DisplayPort would have had a real chance if VESA hadn't taken so damn long. If they had shipped 1.1 a year ago and gotten some HW out in the market, it might have had a chance. It's just not offering enough benefits over HDMI for consumers to care.
 
DisplayPort would have had a real chance if VESA hadn't taken so damn long. If they had shipped 1.1 a year ago and gotten some HW out in the market, it might have had a chance. It's just not offering enough benefits over HDMI for consumers to care.
Exactly, this thread is almost 2 years old. It is unlikely that DisplayPort will be adopted at this point, but there is always hoping.
 
It is unlikely that DisplayPort will be adopted at this point, but there is always hoping.
Hoping why?

There's no tangible benefit for any end-user to have this thing. Maybe there would have been in the past, if it had been available, but now HDMI 1.3 is coming, which is much faster than standard DVI/HDMI, should anyone have issues with the current version's performance (unlikely at the moment).

All it accomplishes is fragmenting the market by being incompatible with the overwhelmingly dominant competitor, which will most likely be fatal for it, knowing the way the tech sector works.
Peace.
 
No benefit? How about cost? Every single HDMI-capable system that is sold has to pay royalties on the patent: royalties/licensing fees from players, display devices, receivers, etc. Add to that royalties/licensing fees for DVD, HD-DVD, Blu-ray, Dolby, THX, etc...

Presumably Displayport being a VESA standard would mean no royalties or fees to pass along to the consumer.

Personally, I'm all for less fees being passed along to us.

And I'm not sure what home theatre enthusiasts you know, but every single one that I know passes both audio and video to the amplifier/receiver, usually from 2-5 sources. Then a SINGLE video cable is passed on to the TV.

Thus allowing us to switch both the audio source and the video source at one point. This also simplifies cable routing.

Currently, with my 5 devices, I have around 15 cables going into my receiver and 1 cable going to my TV. It'd be great if everything used HDMI/DisplayPort, so that I'd have just 5 cables going into the receiver and (still) only 1 cable to the TV.

Likewise, if I ever DO want to use my TV for computing, it'd be nice to only have to route 1 cable.

Regards,
SB
 
And how many home A/V enthusiasts who pass all their audio and video through a switcher built into an A/V receiver are constrained by HDMI licensing or cable costs? I mean, let's get real. A good THX Ultra2 receiver with A/V switching and support for all the new sound codecs is going to cost you $1k-3k, unless you're talking shitty theater-in-a-box style receivers like the crappy Panasonics.

We're talking $1-3k for the receiver, $1k minimum for *tolerable* 7.1 speakers, plus several $k for the display, yet you think these high-margin devices will be affected by HDMI licensing? They're already paying fees for codecs, iLink, and god knows what else, and they will have to have at least *1* HDMI port, so they're going to pay for it no matter what.

DisplayPort makes zero sense in the home theater market. For PCs and the embedded mobile space, I could see an argument; an iPod with DisplayPort would make a lot more sense. But videophile HW is not going to drop HDMI, and the cost is an insignificant fraction of these devices.
 
True, the cost of licensing/royalties for each individual thing is insignificant, but in total it adds up to a rather large chunk.

I'd rather see a trend started towards less licensing rather than a trend for more licensing.

Then again, since each additional thing is so "insignificant," I guess it's all right if more and more fees get passed on.

If we're lucky, with that type of thinking every single thing that goes into our electronic devices will end up carrying a fee for every single little thing.

At least that's the current trend, and it's exactly that sort of thinking which is why it's happening.

Either way, I'll take HDMI or Displayport or whatever single cable solution comes out over the mess of cables that was used previously.

And I'll always hope that a royalty-free solution wins out over the competition, although I suppose the days of VHS (no licensing fees) winning out over Beta (licensing fees) are over. After all, it's just an insignificant added cost. I'm certainly glad people/manufacturers didn't think that way in the past. :p

Regards,
SB
 
DailyTech stirs the debate further:
http://www.dailytech.com/article.aspx?newsid=6786
Today, it's looking like DisplayPort is pretty much going to happen. AMD and ATI were both big proponents of DisplayPort even before the merger, and that has only amplified with the merger.

Philips, the company behind the DisplayPort Content Protection scheme, has deep relations with virtually every major PC manufacturer. Not surprisingly, Philips also has the most to gain from DisplayPort as DPCP is not royalty free. HDCP, the de facto DRM standard on HDMI, DVI and UDI, is licensed by a subsidiary of Intel.

Several weeks ago I had a conference call with Intel to discuss its position on UDI, which solidified my feeling that the standard is dead in the water. In a nutshell, the company isn't pursuing UDI development anymore -- DisplayPort has taken the center stage.
http://www.dailytech.com/article.aspx?newsid=5359
During its Analyst Day today, AMD announced details that it will be backing DisplayPort as a standard near the end of 2007. DisplayPort will be fully supported in 2008, along with other technologies such as DirectX 10 and HyperTransport 3.0.
So it looks like you have AMD/ATI, Intel and Philips all moving forward with DisplayPort. What about Nvidia?

epic
Edit: quotes from a few companies: http://www.vesa.org/press/DP_CES_PR_Finalz.htm
 
I'd imagine Nvidia will do as Nvidia has always done: support whatever technology is in use when their chips ship, especially if both AMD and Intel back it. Of course, that's also dependent on monitor makers supporting DisplayPort.

Regards,
SB
 
First LCD to have DisplayPort 1.1 announced:
http://www.engadget.com/2007/07/25/samsungs-30-inch-lcd-with-worlds-first-displayport-game-on/
It's on, HDMI fans: the first LCD panel sporting a VESA-approved DisplayPort 1.1 jack was just announced by Samsung -- a world's first. The 30-inch LCD pumps out 2,560 x 1,600 pixels with a 10-bit color depth at a smokin' data rate of 10.8Gbps over a single port.

Looks like we still have about 6 months before it's shipped, but now we just need a graphics card to support it too. :)
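As a sanity check on those numbers (assuming a 60 Hz refresh rate, which the announcement doesn't state): the raw 10-bit-per-channel pixel payload at that resolution comes out well under 10.8 Gbps, which is simply the aggregate rate of four DisplayPort 1.1 lanes at 2.7 Gbps each. A quick back-of-the-envelope calculation:

```cpp
// Back-of-the-envelope check of the quoted figures. The 60 Hz refresh rate is
// an assumption; blanking intervals and 8b/10b coding overhead are ignored.
#include <cstdio>

int main() {
    const double h_active       = 2560.0;   // horizontal pixels
    const double v_active       = 1600.0;   // vertical pixels
    const double bits_per_pixel = 30.0;     // 10 bits per channel, RGB
    const double refresh_hz     = 60.0;     // assumed refresh rate

    // Raw pixel payload in Gbps.
    double payload_gbps = h_active * v_active * bits_per_pixel * refresh_hz / 1e9;

    // DisplayPort 1.1 at the high bit rate: 4 lanes x 2.7 Gbps.
    double link_gbps = 4 * 2.7;

    printf("Pixel payload : %.2f Gbps\n", payload_gbps);  // ~7.37 Gbps
    printf("Link capacity : %.2f Gbps\n", link_gbps);     // 10.8 Gbps
    return 0;
}
```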
 
Well, AMD has already announced that its HD 3000 (R700) series cards will have support for DisplayPort in the FireGL line. Supposedly it will launch about the same time as that Samsung 30" monitor. Not sure if the consumer line will also have DisplayPort.

I'd imagine Nvidia will do something similar.

So in theory, there will be DisplayPort cards available around the time DisplayPort monitors are available.

It'd be interesting to have a card with 4 DisplayPorts on the back, or a half-height card with 2, without having to use the ungainly split-port adapters found on many Quadro and FireGL cards.

Regards,
SB
 
with a 10-bit color depth

From the ATI web site's 9800 Pro spec details (I assume later cards also support it):
Dual integrated 10-bit per channel 400 MHz DACs

Now the question is: when will they actually support it in the drivers? Because when I had a 9800 Pro + a CRT, I could never enable 10-bit per channel colour.
 
with a 10-bit color depth

From the ATI web site's 9800 Pro spec details (I assume later cards also support it):
Dual integrated 10-bit per channel 400 MHz DACs

Now the question is: when will they actually support it in the drivers? Because when I had a 9800 Pro + a CRT, I could never enable 10-bit per channel colour.

Maybe it was enabled all along? It says 'DACs' explicitly. This could mean that only the final steps are 10-bit. E.g. there could be a digital filter in there (scaling filter?), or some color conversion tables.
 
with a 10-bit color depth

From the ATI web site's 9800 Pro spec details (I assume later cards also support it):
Dual integrated 10-bit per channel 400 MHz DACs

Now the question is: when will they actually support it in the drivers? Because when I had a 9800 Pro + a CRT, I could never enable 10-bit per channel colour.
A 10-bit DAC doesn't mean the card can output a 10-bit per channel framebuffer. It uses an 8-bit to 10-bit gamma/color correction lookup table, then feeds that 10-bit value into the DAC (roughly like the sketch below).

The Radeon X1k series can output 10bpc framebuffers AFAIK.
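For illustration, a minimal sketch of the kind of 8-bit-to-10-bit gamma LUT described above; the gamma value (2.2) and table layout are assumptions for the example, not the actual hardware programming:

```cpp
// 8-bit-in, 10-bit-out gamma correction lookup table. The framebuffer stays
// 8 bpc; only the values handed to the DAC have 10 bits of precision.
#include <cmath>
#include <cstdint>
#include <cstdio>

int main() {
    uint16_t lut[256];
    const double gamma = 2.2;  // illustrative gamma curve

    for (int i = 0; i < 256; ++i) {
        double normalized = i / 255.0;                // 8-bit framebuffer value
        double corrected  = std::pow(normalized, gamma);
        lut[i] = static_cast<uint16_t>(corrected * 1023.0 + 0.5);  // 10-bit DAC value
    }

    // The DAC sees 10-bit values even though the source was only 8 bpc.
    printf("lut[128] = %u (of 1023)\n", lut[128]);
    return 0;
}
```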
 
But will the drivers allow you to select 40-bit colour (or 38-bit if the alpha is still 8-bit)?
It's 2-bit alpha and 30-bit color. And yes, you can use it as a framebuffer format (though 10bpc output only works in fullscreen mode).
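For reference, here is how 2-bit alpha plus 30-bit colour pack into a single 32-bit pixel. The channel order shown follows the common A2R10G10B10 layout and is meant as an illustration, not a statement about any particular driver:

```cpp
// Packing 2-bit alpha + three 10-bit colour channels into one 32-bit pixel.
#include <cstdint>
#include <cstdio>

uint32_t pack_a2r10g10b10(uint32_t a, uint32_t r, uint32_t g, uint32_t b) {
    // a: 0..3, r/g/b: 0..1023
    return (a << 30) | (r << 20) | (g << 10) | b;
}

int main() {
    uint32_t pixel = pack_a2r10g10b10(3, 1023, 512, 0);  // opaque, orange-ish
    printf("0x%08X\n", pixel);
    return 0;
}
```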
 
It's 2-bit alpha and 30-bit color. And yes, you can use it as a framebuffer format (though 10bpc output only works in fullscreen mode).

1. Does that mean you can set your desktop to 10bpc, or just a 3D app?
2. Does "you can use it as a framebuffer format" mean I can't actually run anything at that depth (e.g. Quake 4), but I could program something to run at that depth?
3. If the answer to 2 is yes, does that mean that if I did make a game run in 10bpc I would be limited to 4 levels of transparency?
 
1. Does that mean you can set your desktop to 10bpc, or just a 3D app?
Just 3D applications. Matrox has a somewhat hackish way of supporting "Gigacolor" on the desktop as well, but applications need to be patched to take advantage of it.

2. does "you can use it as framebuffer format " mean i cant actually run anything in that depth(eg: quake4) but i could program something to run at that depth
AFAIK no game uses it, but it would be easy to patch into most games, just as it would be easy to be forced by the driver. You can actually force it with the Present Changer plugin of DXTweaker.

3. If the answer to 2 is yes, does that mean that if I did make a game run in 10bpc I would be limited to 4 levels of transparency?
No, the destination alpha channel of the back buffer is usually irrelevant for transparency. Most games don't use it at all.
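As promised above, a minimal Direct3D 9 sketch of what requesting a 10bpc back buffer amounts to, roughly what a patched game or DXTweaker's Present Changer ends up doing. Window creation, cleanup and error handling are omitted, and the resolution/refresh values are just placeholders:

```cpp
// Requesting a 2-bit alpha / 10-bit per channel back buffer in Direct3D 9.
// 'hwnd' is assumed to be a valid fullscreen-capable window handle.
#include <d3d9.h>

IDirect3DDevice9* create_10bpc_device(HWND hwnd) {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);

    // 10bpc output only works in exclusive fullscreen mode, as noted above.
    D3DPRESENT_PARAMETERS pp = {};
    pp.BackBufferWidth            = 2560;
    pp.BackBufferHeight           = 1600;
    pp.BackBufferFormat           = D3DFMT_A2R10G10B10;  // 2-bit alpha, 10 bpc colour
    pp.BackBufferCount            = 1;
    pp.SwapEffect                 = D3DSWAPEFFECT_DISCARD;
    pp.Windowed                   = FALSE;               // fullscreen only
    pp.FullScreen_RefreshRateInHz = 60;
    pp.hDeviceWindow              = hwnd;

    IDirect3DDevice9* device = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
    return device;  // NULL if the mode/format is not supported
}
```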
 