DRM for Privacy: Part 2

5 Responses

  1. Adam Shostack says:

    Hi Ryan,

    Some colleagues and I have a somewhat related paper from the first Workshop on Economics of Information Security, cited below. We considered the “traditional” proposed use of DRM protections for privacy, such as wrapping identifiers in crypto. Your posts strike an interesting middle ground. The technologist in me has a hard time considering these things as Technical Protections under the DMCA, but that statement applies to many things which, as you point out, are Technical Protections.

    Joan Feigenbaum, Michael J. Freedman, Tomas Sander and Adam Shostack, “Economic Barriers to the Deployment of Existing Privacy Technologies.” http://www.homeport.org/%7Eadam/econbar-wes02.pdf
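
    Purely as an illustration of the “wrapping identifiers in crypto” idea mentioned above (a hedged sketch, not code from the paper): one minimal way to wrap an identifier, assuming the third-party cryptography package, with the identifier and variable names invented throughout:

    ```python
    # Illustrative sketch only: "wrapping an identifier in crypto" so that only a
    # key holder can resolve it. Requires the third-party `cryptography` package
    # (pip install cryptography); identifier and variable names are invented.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()        # held by the party authorized to resolve the identifier
    f = Fernet(key)

    user_id = b"customer-4711"         # hypothetical identifier
    wrapped_id = f.encrypt(user_id)    # opaque token that can travel with a transaction

    # Only someone holding the key recovers the original identifier.
    assert f.decrypt(wrapped_id) == user_id
    ```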

  2. Ryan Calo says:

    Thanks for your comment, Adam, and for the very interesting position paper. Also: great Twitter pic.

    Ryan

  3. Chris Soghoian says:

    Ryan,

    Instead of describing this as DRM, you may want to consider describing it as “anti-circumvention protections for privacy”. This is far more accurate, and it avoids the initial negative response from the many people who hate DRM with a passion.

  4. Bruce Boyden says:

    It’s not a case, but I don’t think a mere warning of copyright protection would qualify as a technological measure under Section 1201(c)(3), the “no-mandate” clause. I describe this to my students as a provision clarifying that there is no requirement in 1201 to look for and respond to what I call “flags”: bits of information that signify the protected status of the content but do not themselves block or restrict access to it. That provision was inserted because computer manufacturers did not want an obligation to scan all data coming into the computer to see whether it carried this or that flag somewhere; rather, all they would have to do is refrain from actively circumventing, without authorization, technological measures that block or restrict access. There is a possible caveat here for “flags” that are required by some sort of law or regulation, and thus mandated elsewhere; perhaps “Do Not Track” would fall into that category. But it’s an untested argument.
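
    By way of illustration (a hedged sketch, not statutory language): a “Do Not Track” header is exactly this kind of flag, a bit of information riding on a request that signals a preference but does not itself block or restrict anything, and honoring it is a choice the recipient makes. A minimal Python sketch, where headers stands in for whatever header mapping a web framework exposes and the helper name is invented:

    ```python
    # Minimal sketch: the DNT header is only a flag. Nothing here blocks or restricts
    # access; the recipient must choose to honor the signal. `headers` stands in for
    # whatever header mapping a web framework exposes; `should_track` is an invented name.
    def should_track(headers: dict) -> bool:
        """Return False when the client has set the Do Not Track flag (DNT: 1)."""
        return headers.get("DNT", "").strip() != "1"

    print(should_track({"DNT": "1"}))  # False - tracking declined, purely as a policy choice
    print(should_track({}))            # True  - no flag present, nothing signalled
    ```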

  5. Steve Mathews says:

    In the Standards Committee ISO/IEC JTC1/SC32 WG1 (a bit of a mouthful, but that’s labelling for you) we have always taken the approach that privacy can only exist where the sender of information is effectively able to constrain the use of the data they provide in furtherance of some transaction. Whatever else you may think about DRM, that is exactly the function it SHOULD provide (I’m not sure about the monitoring and tracking stuff). There is a whole standard, ISO/IEC 15944 Part 8, freely available, that addresses this requirement (a rough sketch of the idea follows at the end of this comment).

    But what you have to take on board is that no commercial party wants to have anything less than the total ability to do what they want with your data, howsoever they have obtained it. So far, what you do have in regulation is a ‘sort of’ requirement to encrypt data at rest because some folk in California had their vehicle licensing data stolen.

    Now you may not care for DRM – that’s not my problem. But absent the ability for a user to determine the onward use of their data, you have the free-for-all so ably demonstrated by all and sundry, not just the social networking sites.
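
    Returning to the first paragraph of this comment: a rough, purely illustrative sketch of the sender-constrains-use idea, not the ISO/IEC 15944 Part 8 format, with all field and function names invented:

    ```python
    # Illustration only, not ISO/IEC 15944-8: the sender ships data together with
    # machine-readable constraints on its use, and the receiving system is expected
    # to check those constraints before acting. All field names here are invented.
    from datetime import datetime, timezone

    envelope = {
        "payload": {"email": "alice@example.com"},
        "constraints": {
            "purpose": ["order-fulfilment"],          # uses the sender permits
            "onward_transfer": False,                 # may not be passed to third parties
            "expires": "2026-01-01T00:00:00+00:00",   # after this, the data must not be used
        },
    }

    def use_permitted(env: dict, purpose: str) -> bool:
        """Check the sender's declared constraints before using the payload."""
        c = env["constraints"]
        not_expired = datetime.now(timezone.utc) < datetime.fromisoformat(c["expires"])
        return purpose in c["purpose"] and not_expired

    print(use_permitted(envelope, "order-fulfilment"))  # True  - within the declared purpose
    print(use_permitted(envelope, "marketing"))         # False - outside the declared purpose
    ```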