Category: Web 2.0


The E.U. Data Protection Directive and Robot Chicken

The European Commission released a draft of its revised Data Protection Directive this morning, and Jane Yakowitz has a trenchant critique up at Forbes.com. In addition to the sharp legal analysis, her article has both a Star Wars and Robot Chicken reference, which makes it basically the perfect information law piece…


Cybersecurity Puzzles

Cybersecurity is in the news: a network intrusion allegedly interfered with railroad signals in the Northwest in December; the Obama administration refused to support the Stop Online Piracy Act due to worries about interfering with DNSSEC; and the GAO concluded that the Department of Homeland Security is making things worse by oversharing. So, I’m fortunate that the Minnesota Law Review has just published the final version of Conundrum (available on SSRN), in which I argue that we should take an information-based approach to cybersecurity:

Cybersecurity is a conundrum. Despite a decade of sustained attention from scholars, legislators, military officials, popular media, and successive presidential administrations, little if any progress has been made in augmenting Internet security. Current scholarship on cybersecurity is bound to ill-fitting doctrinal models. It addresses cybersecurity based upon identification of actors and intent, arguing that inherent defects in the Internet’s architecture must be remedied to enable attribution. These proposals, if adopted, would badly damage the Internet’s generative capacity for innovation. Drawing upon scholarship in economics, animal behavior, and mathematics, this Article takes a radical new path, offering a theoretical model oriented around information, in distinction to the near-obsession with technical infrastructure demonstrated by other models. It posits a regulatory focus on access and alteration of data, and on guaranteeing its integrity. Counterintuitively, it suggests that creating inefficient storage and connectivity best protects user capabilities to access and alter information, but this necessitates difficult tradeoffs with preventing unauthorized interaction with data. The Article outlines how to implement inefficient information storage and connectivity through legislation. Lastly, it describes the stakes in cybersecurity debates: adopting current scholarly approaches jeopardizes not only the Internet’s generative architecture, but also key normative commitments to free expression on-line.

Conundrum, 96 Minn. L. Rev. 584 (2011).

Cross-posted at Info/Law.


Goldilocks and Cybersecurity

It may seem strange in a week where Megaupload’s owners were arrested and SOPA / PROTECT IP went under, but cybersecurity is the most important Internet issue out there. Examples? Chinese corporate espionage. Cyberweapons like Stuxnet. Anonymous DDOSing everyone from the Department of Justice to the RIAA. The Net is full of holes, and there are a lot of folks expert in slipping through them.

I argue in a forthcoming paper, Conundrum, that cybersecurity can only be understood as an information problem. Conundrum posits that, if we’re worried about ensuring access to critical information on-line, we should make the Net less efficient – building in redundancy. But for cybersecurity, information is like the porridge in Goldilocks: you can’t have too much or too little. For example, there was recent panic that a water pump burnout in Illinois was the work of cyberterrorists. It turned out that it was actually the work of a contractor for the utility who happened to be vacationing in Russia. (This is what you get for actually answering your pager.)

The “too little” problem can be described via two examples. First, prior to the attacks of September 11, 2001, the government had information about some of the hijackers, but was impeded by lack of information-sharing and by IT systems that made such sharing difficult. Second, denial of service attacks prevent Internet users from reaching sites they seek – a tactic perfected by Anonymous. The problem is the same: needed information is unavailable. I think the solution, as described in Conundrum, is:

increasing the inefficiency with which information is stored. The positive aspects of both access to and alteration of data emphasize the need to ensure that authorized users can reach, and modify, information. This is more likely to occur when users can reach data at multiple locations, both because it increases attackers’ difficulty in blocking their attempts, and because it provides fallback options if a given copy is not available. In short, data should reside in many places.
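Conundrum's prescription is legal and architectural rather than computational, but the underlying intuition — redundant copies make blocking access harder and give authorized users fallback options — can be sketched in a few lines of code. This is my own illustrative model, not anything from the article; the class and key names are hypothetical.

```python
# Illustrative sketch of "data should reside in many places":
# every write goes to several independent stores (standing in for
# separate physical/network locations), and reads fall back across
# them, so knocking out one location does not block access.
class RedundantStore:
    """Keeps a full copy of each value in every backing store."""

    def __init__(self, backends):
        self.backends = backends  # list of dicts modeling locations

    def put(self, key, value):
        # Deliberately inefficient: every location gets a full copy.
        for backend in self.backends:
            backend[key] = value

    def get(self, key):
        # An attacker (or outage) blocking one location only forces
        # a fallback to a surviving copy.
        for backend in self.backends:
            if key in backend:
                return backend[key]
        raise KeyError(key)


store = RedundantStore([{}, {}, {}])
store.put("signal-plan", "route A")
store.backends[0].clear()        # simulate one location going down
print(store.get("signal-plan"))  # still retrievable elsewhere
```

The tradeoff the article flags is visible even here: tripling the copies triples the surface an unauthorized party could reach, which is why the redundancy prescription has to be balanced against preventing unauthorized access and alteration.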

But there is also the “too much” problem. This is exemplified by the water pump fiasco: after 9/11, the federal government, including the Department of Homeland Security, began a massive information-sharing effort, such as through Fusion Centers. The difficulty is that the Fusion Centers, and other DHS projects, are simply firehosing information onto companies who constitute “critical infrastructure.” Much of this information is repetitive or simply wrong – as with the water pump report. Bad information can be worse than none at all: it distracts critical infrastructure operators, breeds mistrust, and consumes scarce security resources. The pendulum has swung too far the other way: from undersharing to oversharing. Finding the “just right” solution is impossible; this is a dynamic environment with constantly changing threats. But the government hasn’t yet made the effort to synthesize and analyze information before sounding the alarm. It must, or we will pay the price of either false alarms, or missed ones.

(A side note: I don’t put much stock in which federal agency takes the lead on cybersecurity – there are proposals for the Department of Defense, or the Department of Energy, among others – but why has the Obama administration delegated responsibility to DHS? Having the TSA set Internet policy hardly seems sensible. Beware of Web-based snow globes!)

Cross-posted at Info/Law.


Censorship on the March

Today, you can’t get to The Oatmeal, or Dinosaur Comics, or XKCD, or (less importantly) Wikipedia. The sites have gone dark to protest the Stop Online Piracy Act (SOPA) and the PROTECT IP Act, America’s attempt to censor the Internet to reduce copyright infringement. This is part of a remarkable, distributed, coordinated protest effort, both online and in realspace (I saw my colleague and friend Jonathan Askin headed to protest outside the offices of Senators Charles Schumer and Kirsten Gillibrand). Many of the protesters argue that America is headed in the direction of authoritarian states such as China, Iran, and Bahrain in censoring the Net. The problem, though, is that America is not alone: most Western democracies are censoring the Internet. Britain does it for child pornography. France: hate speech. The EU is debating a proposal to allow “flagging” of objectionable content for ISPs to ban. Australia’s ISPs are engaging in pre-emptive censorship to prevent even worse legislation from passing. India wants Facebook, Google, and other online platforms to remove any content the government finds problematic.

Censorship is on the march, in democracies as well as dictatorships. With this movement we see, finally, the death of the American myth of free speech exceptionalism. We have viewed ourselves as qualitatively different – as defenders of unfettered expression. We are not. Even without SOPA and PROTECT IP, we are seizing domain names, filtering municipal wi-fi, and using funding to leverage colleges and universities to filter P2P. The reasons for American Internet censorship differ from those of France, South Korea, or China. The mechanism of restriction does not. It is time for us to be honest: America, too, censors. I think we can, and should, defend the legitimacy of our restrictions – the fight on-line and in Congress and in the media shows how we differ from China – but we need to stop pretending there is an easy line to be drawn between blocking human rights sites and blocking Rojadirecta or Dajaz1.

Cross-posted at Info/Law.


Supporting the Stop Online Piracy Act Protest Day

As my co-blogger Gerard notes, today is SOPA protest day.  Sites like Google and WordPress have censored their logos or offered up a way to contact your congressperson, though they remain live.  Other sites like Wikipedia, Reddit, and Craigslist have shut down, and more are set to shut down at some point today.  There’s lots of terrific commentary on SOPA, which is designed to tackle the problem of foreign-based websites that sell pirated movies, music, and other products–but with a heavy hand that threatens free expression and due process.  The Wall Street Journal’s Amy Schatz has this story and Politico has another helpful piece; The Hill’s Brendan Sasso’s Twitter feed has lots of terrific updates.  Mark Lemley, David Levine, and David Post carefully explain why we ought to reject SOPA and the PROTECT IP Act in “Don’t Break the Internet,” published by Stanford Law Review Online.  In the face of the protest, House Judiciary Committee Chairman Lamar Smith (R-TX) vowed to bring SOPA to a vote in his committee next month.  “I am committed to continuing to work with my colleagues in the House and Senate to send a bipartisan bill to the White House that saves American jobs and protects intellectual property,” he said.  So, too, Senator Patrick Leahy (D-VT) pushed back against websites planning to shut down today in protest of his bill.  “Much of what has been claimed about the Senate’s PROTECT IP Act is flatly wrong and seems intended more to stoke fear and concern than to shed light or foster workable solutions.  The PROTECT IP Act will not affect Wikipedia, will not affect reddit, and will not affect any website that has any legitimate use,” Chairman Leahy said.  Everyone’s abuzz on the issue, and rightly so.  I spoke at a panel on intermediary liability at the Congressional Internet Caucus’ State of the Net conference, and everyone wanted to talk about SOPA.
I’m hoping that the blackout and other shows of disapproval will convince our representatives in the House and Senate to back off the most troubling parts of the bill.  As fabulous guest blogger Derek Bambauer argues, we need to bring greater care and thought to the issue of Internet censorship.  Cybersecurity is at issue too, and we need to pay attention.  Derek may be right that both bills will go nowhere, especially given Silicon Valley’s concerted lobbying efforts against them.  But we will have to watch to see whether Representative Smith lives up to his promise to bring SOPA back to committee and whether Senator Leahy remains as committed to the PROTECT IP Act in a few weeks as he is today.


The Fight For Internet Censorship

Thanks to Danielle and the CoOp crew for having me! I’m excited.

Speaking of exciting developments, it appears that the Stop Online Piracy Act (SOPA) is dead, at least for now. House Majority Leader Eric Cantor has said that the bill will not move forward until there is a consensus position on it, which is to say, never. Media sources credit the Obama administration’s opposition to some of the more noxious parts of SOPA, such as its DNSSEC-killing filtering provisions, and also the tech community’s efforts to raise awareness. (Techdirt’s Mike Masnick has been working overtime in reporting on SOPA; Wikipedia and Reddit are adopting a blackout to draw attention; even the New York City techies are holding a demonstration in front of the offices of Senators Kirsten Gillibrand and Charles Schumer. Schumer has been bailing water on the SOPA front after one of his staffers told a local entrepreneur that the senator supports Internet censorship. Props for candor.) I think the Obama administration’s lack of enthusiasm for the bill is important, but I suspect that a crowded legislative calendar is also playing a significant role.

Of course, the PROTECT IP Act is still floating around the Senate. It’s less worse than SOPA, in the same way that Transformers 2 is less worse than Transformers 3. (You still might want to see what else Netflix has available.) And sponsor Senator Patrick Leahy has suggested that the DNS filtering provisions of the bill be studied – after the legislation is passed. It’s much more efficient, legislatively, to regulate first and then see if it will be effective. A more cynical view is that Senator Leahy’s move is a public relations tactic designed to undercut the opposition, but no one wants to say so to his face.

I am not opposed to Internet censorship in all situations, which means I am often lonely at tech-related events. But these bills have significant flaws. They threaten to badly weaken cybersecurity, an area that is purportedly a national priority (and has been for 15 years). They claim to address a major threat to IP rightsholders despite the complete lack of data that the threat is anything other than chimerical. They provide scant procedural protections for accused infringers, and confer extraordinary power on private rightsholders – power that will, inevitably, be abused. And they reflect a significant public choice imbalance in how IP and Internet policy is made in the United States.

Surprisingly, the Obama administration has it about right: we shouldn’t reject Internet censorship as a regulatory mechanism out of hand, but we should be wary of it. This isn’t the last stage of this debate – like Westley in The Princess Bride, SOPA-like legislation is only mostly dead. (And, if you don’t like the Obama administration’s position today, just wait a day or two.)

Cross-posted at Info/Law.


The idealization/practice nexus

Inspired by Orin Kerr’s question below (“is your work focused on the internal narratives and ideologies that people use to describe/justify what they do, or is it focused externally on the actual conduct of what people do?”), I will give a sense of how I walk the line between what we might call idealism and practice among the geeks and hackers I study.

One of the toughest parts about working with the type of technologists I focus on—intelligent, opinionated, online a lot of the time—is that many will unabashedly dissect my every word, statement, and media appearance. This attribute of my research, unsurprisingly, has been the source of considerable anxiety, only made worse in recent times with Anonymous, as I have to make “authoritative” statements about them in the midst of studying them—in other words, in the midst of having incomplete information.

All of this is to say that I am deliberate and diplomatic when it comes to word choice, framing, and arguments. But most of the time, examining practice in light of or up against idealism does not take the somewhat noxious form of “exposing” secrets, the implication being that people are so mystified and deluded that you, the outsider, are there to inform the world of what is really going on. (There is a long-standing tradition in the humanities and social sciences, loosely inspired by Karl Marx and especially Pierre Bourdieu, that takes this stance; it is not my favorite strain of analysis unless it is deployed only when truly needed, and done very well.)

Much of what I do is to unearth those dynamics which may not be natively theorized but are certainly in operation. Take, for instance, the following example at the nexus of law and politics: during fieldwork it was patently clear that many free software hackers were wholly uninterested in politics outside of software freedom, and those aligned with open source explicitly disavowed even this narrowly defined political agenda. Many were also repelled by the law (as one developer put it, “writing an algorithm in legalese should be punished with death…. a horrible one, by preference”), and yet weeks into research it was obvious that many developers are nimble legal thinkers, which helps explain how they have built, in a relatively short time period, a robust alternative body of legal theory and laws.

One reason for this facility is that the skills, mental dispositions, and forms of reasoning necessary to read and analyze a formal, rule-based system like the law parallel the operations necessary to code software. Both are logic-oriented, internally consistent textual practices that require great attention to detail. Small mistakes in both law and software—a missing comma in a contract or a missing semicolon in code—can jeopardize the integrity of the system and compromise the intention of the author of the text. Both lawyers and programmers develop mental habits for making, reading, and parsing what are primarily utilitarian texts, and this makes many free software hackers, who already must pay attention to the law in light of free software licenses, adept legal thinkers—although of course this does not necessarily mean they would make good lawyers.



BRIGHT IDEAS: Anita Allen’s Unpopular Privacy

Lucky for CoOp readers, I had a chance to talk to Professor Anita Allen about her new book Unpopular Privacy, which Oxford University Press recently published.  My co-blogger Dan Solove included Professor Allen’s new book on his must-read privacy books for the year.  And rightly so: the book is insightful, important, and engrossing.  Before I reproduce below my interview with Professor Allen, let me introduce her to you.  She is a true renaissance person; just see her Wikipedia page.  Professor Allen is the Henry R. Silverman Professor of Law and professor of philosophy at the University of Pennsylvania Law School.  She is also a senior fellow in the bioethics department of the University of Pennsylvania School of Medicine, a collaborating faculty member in African studies, and an affiliated faculty member in the women’s studies program.  In 2010, President Barack Obama named Professor Allen to the Presidential Commission for the Study of Bioethical Issues.  She is a Hastings Center Fellow.  Her publications are too numerous to list here: suffice it to say that she’s written several books, a casebook, and countless articles in law reviews and philosophy journals.  She also writes for the Daily Beast and other popular media.

Question: You began writing about privacy in the 1980s, long before the Internet and long before many of the federal privacy statutes we take for granted. What has changed? 

I started writing about privacy when I was a law student at Harvard in the early 1980s and have never stopped.  Unpopular Privacy: What Must We Hide? (Oxford University Press 2011) is my third book about privacy, in addition to a privacy law casebook, Privacy Law and Society (West Publishing 2011).  My original impetus was to understand and explore the relationships of power and control among governments, individuals, groups, and families.  In the 1970s and 1980s, the big privacy issues in the newspapers and the courts related to abortion, gay sex, and the right to die.  Surveillance, search and seizure, and database issues were on the table, as they had been since the early 1960s, but they often seemed the special province of criminal lawyers and technocrats.

To use a cliché, it’s a brave new world.  Since my early interest in privacy, times have indeed changed: the role of electronic communications and the pervasiveness of networked technologies in daily life have transformed how personal data flows and how we think about and prioritize our privacy.  Terms like “webcam,” “text messaging,” “social networking,” and “cloud computing” have entered the lexicon, along with devices like mobile phones, personal digital assistants, and iPads.

The public is just beginning to grasp ways in which genetics and neuroscience will impact privacy in daily life—I have begun to reflect, write, and speak more about these matters recently, including in connection with my work as a member of President Obama’s Presidential Commission for the Study of Bioethical Issues.

Question: Your book coins the phrase “unpopular privacy.”  In what way is privacy unpopular?  

First let me say that I think of “popular privacy” as the privacy that people in the United States and similar developed nations tend to want, believe they have a right to, and expect government to secure.  For example, typical adults very much want privacy protection for the content of their telephone calls, e-mail, tax filings, health records, academic transcripts, and bank transactions.

I wrote this book because I think we need to think more about “unpopular” privacy.  “Unpopular” privacy is the kind that people reject, despise, or are indifferent to.  My book focuses on the moral and political underpinnings of laws that promote, require, and enforce physical and informational privacy that is unpopular with the very people those laws are supposed to help or control.  (I call such people the beneficiaries and targets of privacy laws.)  “Don’t Ask, Don’t Tell,” for instance, was an unpopular government-mandated privacy for military service members.  My book suggests that some types of privacy that should be popular aren’t, and asks what, if anything, we should do about it.

Question: If people don’t want privacy or don’t care about it, why should we care?

We should care because privacy is important.  I urge that we think of it as a “foundational” good like freedom and equality.  Privacy is not a purely optional good like cookies and sports cars.  Since the 1960s, when scholars first began to analyze privacy in earnest, philosophers and other theorists have rightly linked the experience of privacy with dignity, autonomy, civility, and intimacy. They have linked it to repose, self-expression, creativity, and reflection. They have tied it to the preservation of unique preferences and distinct traditions.  I agree with moral, legal and political theorists who have argued that privacy is a right.

I go further to join a small group of theorists, including Jean L. Cohen, who have argued that privacy is also potentially a duty – and not only a duty to others, but a duty to one’s self.  I believe we each have a duty to take into account the way in which one’s own personality and life enterprises could be affected by decisions to dispense with foundational goods that are lost when one decides to flaunt, expose, and share rather than to reserve, conceal, and keep.

If people are completely morally and legally free to pick and choose the degrees of privacy they will enter, they are potentially deprived of highly valued states that promote their vital interests, and those of their fellow human beings. For me, this suggests that we need to restrain choice—if not by law, then by ethics and other social norms.  Respect for privacy rights and the ascription of privacy duties must comprise a part of a society’s formative project for shaping citizens.


Neil Richards on Why Video Privacy Matters

Our guest blogger Neil Richards, a Professor of Law at Washington University School of Law, turns his sights on video privacy in this guest blog post.  It whets our appetite for his forthcoming book on Intellectual Privacy.  So here is Professor Richards’s post:

The House of Representatives recently passed an amendment to a fairly obscure law known as the Video Privacy Protection Act.  This law protects the privacy of our video rental records.  It ensures that companies who have information about what videos we watch keep that information confidential, and it requires them to get meaningful consent from us before they publish it.  The House, at the urging of Netflix and Facebook, has passed an amendment that would allow these companies to share our movie watching habits much more easily.  The Video Privacy Act was passed after the Washington City Paper obtained the video rental records of Supreme Court nominee Robert Bork and published them in order to politically discredit him.  It worked.  The Video Privacy Act rests on the enduring wisdom that what we watch is our own business, regardless of our politics.  It allows us to share films we’ve watched on our own terms and not those of video stores or online video providers.

What’s at stake is something privacy scholars call “intellectual privacy” – the idea that records of our reading habits, movie watching habits, and private conversations deserve greater protection than other kinds of personal information.  The films we watch, the books we read, and the web sites we visit are essential to the ways we make sense of the world and make up our minds about political and non-political issues.  Intellectual privacy protects our ability to think for ourselves, without worrying that other people might judge us based on what we read.  It allows us to explore ideas that other people might not approve of, and to figure out our politics, sexuality, and personal values, among other things.  It lets us watch or read whatever we want without fear of embarrassment or being outed.  This is the case whether we’re reading communist or anti-globalization books; or visiting web sites about abortion, gun control, cancer, or coming out as gay; or watching videos of pornography, or documentaries by Michael Moore, or even “The Hangover 2.”

For generations, librarians have understood this.  Libraries were the Internet before computers – they presented the world of reading to us, and let us as patrons read (and watch) freely for ourselves.  But librarians understood that intellectual privacy matters.  A good library lets us read freely, but keeps our records confidential in order to safeguard our intellectual privacy.  But we are told by Netflix, Facebook, and other companies that the world has changed.  “Sharing,” as they call it, is the way of the future.  I disagree.  Sharing can be good, and sharing of what we watch and read is very important.  But the way we share is essential.  Telling our friends “hey – read this – it’s important” or “watch this movie – it’s really moving” is one of the great things that the Internet has made easier.  But sharing has to be done on our terms, not on those that are most profitable for business.  Sharing doesn’t mean a norm of publishing everything we read on the Internet.  It means giving us a conscious choice about when we are sharing our intellectual habits, and when we are not.

Industry groups are fond of saying that good privacy practices require consumer notice and consumer choice.  The current Video Privacy Act is one of the few laws that does give consumers meaningful choice about protecting their sensitive personal information.  Now is not the time to cut back on the VPPA’s protections.  Now is the time to extend its protections to the whole range of intellectual records – the books we buy, our internet search histories, and ISP logs of what we read on the Internet.  As a first step, we should reject this attempt to eviscerate our intellectual privacy.


Bigoted Harassment, Alive and Well Online

With the help of law and changing norms, invidious discrimination has become less prevalent in arenas like schools, workplaces, hotels, and public transportation.  In those settings, anti-discrimination law is fairly easy to enforce: leaders usually can figure out who is responsible for discriminatory conduct, and they ignore such behavior at their peril, so bigotry carries a real risk of social sanction.  So too, hate discourse in the public sphere is more muted.  A hundred years ago, Southern newspapers and leaders explicitly endorsed mob violence against blacks.  As late as 1940, a newspaper editor in Durham, North Carolina could state: “A Negro is different from other people in that he’s an unfortunate branch of the human family who hasn’t been able to make out of himself all he is capable of” due to his “background of the jungle.”  In the post-Civil Rights era, the public expression of bigoted epithets and slurs is infrequent.  One rarely hears racist, sexist, or homophobic speech in mainstream media outlets.  Some interpret this state of affairs optimistically, as a sign that we are moving beyond race, gender, and arguably even sexual orientation.  The election of the first black President provoked proclamations of our entry into a “post-racial” era.  Many contend that we no longer need feminism.  Prime-time television is filled with images of female power, from Brenda Leigh Johnson’s chief on The Closer to Dr. Miranda Bailey’s “take no prisoners” surgeon on Grey’s Anatomy.  Who needs feminism when its goals have been achieved?

But a new era is not upon us.  In some arenas, hate’s explicit form has repackaged itself in subtlety.  In public discourse, crude biological views of group inferiority are often replaced with a kinder, gentler “color-blind racism,” as sociologist Eduardo Bonilla-Silva calls it. The face of modern racism is, in journalist Touré’s estimation, “invisible or hard to discern, lurking in the shadows or hidden.”  The media has also better disguised sexism with its anxiety about female achievement, renewed and amplified objectification of young women’s bodies and faces, and the dual exploitation and punishment of female sexuality, as media scholar Susan Douglas explains.

Offline public discourse may now be on more neutral ground but its online counterpart is not.  While virulent bigotry continues behind closed doors, it increasingly appears in online spaces that blend public and private discourse.  Although televised sports commentary rarely features anti-gay rhetoric, online sports message boards are awash in in-your-face homophobic speech.  Racial epithets and slurs are common online, whether in Facebook profiles, Twitter posts, blog comments, or YouTube videos.  College students encounter more sexually inappropriate speech in online interactions than in face-to-face ones.

Matters have not improved since I started talking and writing about these issues in 2007, when we woke up, for a brief second, and paid attention to the sexualized, misogynistic attacks on Kathy Sierra on her blog and two others, and the targeting of female law students on AutoAdmit.  Then, technologist Tim O’Reilly and Wikipedia co-founder Jimmy Wales called for a Blogger’s Code of Conduct.  That effort failed to gain traction, and ever since, bigoted online abuse has continued, silencing victims, ruining their online reputations, costing them jobs, and interfering with their ability to engage with others online and offline.  Newsweek’s always insightful Jessica Bennett has published an important new piece on online misogyny, and the Guardian’s Vanessa Thorpe and Richard Rogers similarly explore the rape threats and abuse directed at female bloggers.  I will be blogging about bigoted online harassment, as I am in the midst of writing a book about it and serving on the Inter-Parliamentary Task Force on Online Hate, which recently held a hearing at the House of Commons.  This all has to stop, and now.