HDCP Support In Next-Gen Hardware

By Justin Mann on August 22, 2005, 8:02 PM
We have all heard of the new-age display devices that will be capable of showing incredibly detailed content, and we know that many of these newer devices are essentially a DRM interface: non-DRM content will be played at reduced quality, if at all. But did you know that the three-year-old standard these devices employ, HDCP, is still absent even from the highest of the high end of current display devices? In all likelihood, even high-end $3,000 displays will not be able to play newer content at full quality, if at all. I am not much one for big brother tales, but the more I hear about HDCP and the way Microsoft and Apple intend to implement it, the more it sounds to me like a set of virtual handcuffs. Although the article raises some good points on why this technology could be a good idea, restricting how I use content I paid for has always frustrated me.

If a monitor doesn't support HDCP, one of two things will happen, at the discretion of the content providers. A given studio may simply refuse to allow the content to be displayed at all. More likely, studios will allow playback on unauthenticated devices at purposely degraded quality. The thinking is that Joe Consumer will be more likely to pay for HD content than to seek out pirated content that isn't in HD.
Essentially, this is a sneaky way of forcing upgrades, whether or not you actually need one. Want to watch HDCP-protected video? Better get that newer screen!
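The block-or-degrade decision described above can be sketched in a few lines of code. This is purely illustrative: the function and policy names are made up for this sketch, and the degraded resolution is an assumed placeholder, not a figure from the HDCP specification.

```python
# Hypothetical sketch of the playback-policy decision described above.
# None of these names come from the actual HDCP spec; the studio's
# choice (block vs. degrade) and the resolutions are assumptions.

from enum import Enum

class StudioPolicy(Enum):
    BLOCK = "block"      # refuse playback on unauthenticated displays
    DEGRADE = "degrade"  # allow playback, but at reduced quality

def playback_resolution(display_supports_hdcp: bool,
                        policy: StudioPolicy,
                        full_res=(1920, 1080),
                        degraded_res=(960, 540)):
    """Return the resolution the player outputs, or None if playback is refused."""
    if display_supports_hdcp:
        return full_res        # authenticated display: full quality
    if policy is StudioPolicy.BLOCK:
        return None            # studio refuses playback entirely
    return degraded_res        # constrained, purposely degraded output

print(playback_resolution(True, StudioPolicy.BLOCK))     # (1920, 1080)
print(playback_resolution(False, StudioPolicy.DEGRADE))  # (960, 540)
print(playback_resolution(False, StudioPolicy.BLOCK))    # None
```

The point the sketch makes is that the display, not the content, is what gets authenticated: the same disc plays at full quality or half quality depending entirely on which monitor it is plugged into.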
