DaveBaumann said:
    DemoCoder said:
        DaveBaumann said:
            plug an HDMI output into a high-def display and without DRM you'll get nothing.
        This is not true. At worst, you'll get 480p.
    So you're saying an HD source will downscale to 480 without DRM?
That's the reality today. No one would be stupid enough to ship a device that wouldn't work AT ALL without HDCP, since the vast majority of deployed HD sets have neither DVI+HDCP nor HDMI. As a practical matter, there are two primary sources of recorded content today: DVD and cable/satellite. DVD players are inherently 480p; if they output more, they must have a built-in scaler (many do). Cable/satellite/TV tuner devices typically have built-in scalers as well. In fact, so many of my HT devices have scalers that I find it annoying to pay extra for all of them and then manually switch them off (the PJ has a scaler, the receiver has a scaler, the DVD player has a scaler, etc.). Faroudja scalers are practically a commodity now.
What I am saying is this: in order to avoid being sued by irate customers, today's HD devices fall back to 480p when DRM isn't available. My receiver already does this today for Macrovision. And since no true HD sources exist except for broadcast/cable/satellite, and existing tuners have such fallbacks for a fact, I feel confident in stating that HDCP isn't needed, especially on an HTPC.
The fact of the matter is, no video card will be *rejected* by the HD display device because it lacks HDCP. That is not how HDCP works: the source decides what to output, not the display.
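Put another way, here's roughly the source-side decision I'm describing (just a pseudocode sketch under my reading of how these devices behave; the function and parameter names are made up, not from any real player or driver):

[code]
# Rough sketch of the source-side fallback logic, for illustration only.
def choose_output(content_is_protected_hd: bool, display_has_hdcp: bool) -> str:
    """Decide what the *source* sends; the display never rejects the card."""
    if not content_is_protected_hd:
        # Unprotected content (plain 480p DVD output, etc.) goes out as-is.
        return "native output, no HDCP"
    if display_has_hdcp:
        # Handshake succeeds: full-resolution HD with HDCP enabled.
        return "full HD over HDCP"
    # Fallback: rather than blanking the screen (and getting sued by irate
    # customers), downscale to 480p so the huge installed base of non-HDCP
    # HD sets still gets a picture.
    return "downscaled to 480p"

print(choose_output(content_is_protected_hd=True, display_has_hdcp=False))
# -> downscaled to 480p
[/code]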
The only reason to put HDCP into your video card is so that someone selling a software DVD player can be "more legal". To me it's worthless, and I'd rather not have it. There's no *consumer* reason to have it; it's only a selling point for people who want to sell software that outputs DVDs and other content scaled up.
Is there anything to suggest it does "switch on" when those sources aren't there?
Many HD devices were simply manufactured before HDCP existed. Currently, HDCP isn't used.