
“Brain Spyware”

As if we don’t have enough to worry about, now there’s spyware for your brain. Or, there could be. Researchers at Oxford, Geneva, and Berkeley have created a proof of concept for using commercially available brain-computer interfaces to discover private facts about today’s gamers.


Netflix, Facebook, and Social Sharing

Just as Neil Richards’s The Perils of Social Reading (101 Georgetown Law Journal 689 (2013)) is out in final form, Netflix has released its new social sharing features in partnership with that great privacy protector, Facebook. Not that working with Google, Apple, or Microsoft would be much better. There may be things I am missing, but I don’t see how turning on this feature is wise: it requires you to remember not to share, and it makes sharing a bit leakier than you may want.

Apparently you have to connect your Netflix account to Facebook to get the feature to work, and the way it works after that link is made poses problems.

According to SlashGear, two rows appear. One, called Friends’ Favorites, tells you just that. Now, consider that the recommendation algorithm works in part by having you rate movies. So if you want to signal that odd documentaries, disturbing art films, or guilty pleasures (these may range from The Hangover to Twilight) are of interest, you should rate them highly. If you turn this feature on, are all of your old ratings shared? And cool! Now everyone knows that you give March of the Penguins and Die Hard 5 stars. The other row:

is called “Watched By Your Friends,” and it consists of movies and shows that your friends have recently watched. It provides a list of all your Facebook friends who are on Netflix, and you can cycle through individual friends to see what they recently watched. This is an unfiltered list, meaning that it shows all the movies and TV shows that your friends have agreed to share.

Of course, you can control what you share and what you don’t want to share, so if there’s a movie or TV show that you watch, but you don’t want to share it with your friends, you can simply click on the “Don’t Share This” button under each item. Netflix is rolling out the feature over the next couple of days, and the company says that all US members will have access to Netflix social by the end of the week.

Right. So imagine you forget that your viewing habits are being broadcast. And what about Roku and other streaming devices? How does one ensure that the “Don’t Share” button is used before word goes out that you watched one, two, or three movies about drugs, sex, gay culture, how great guns are, and so on?

As Richards puts it, “the ways in which we set up the defaults for sharing matter a great deal. Our reader records implicate
our intellectual privacy—the protection of reading from surveillance and interference so that we can read freely, widely, and without inhibition.” So too for video and really any information consumption.


New Edition of Solove & Schwartz’s Privacy Law Fundamentals: Must-Read (and Check out the Video)

Privacy leading lights Dan Solove and Paul Schwartz have recently released the 2013 edition of Privacy Law Fundamentals, a must-have for privacy practitioners, scholars, students, and really anyone who cares about privacy.

Privacy Law Fundamentals is an essential primer on the state of privacy law, capturing up-to-date developments in legislation, FTC enforcement actions, and cases here and abroad. As Chief Privacy Officers like Intel’s David Hoffman and renowned privacy practitioners like Hogan’s Chris Wolf and Covington’s Kurt Wimmer agree, Privacy Law Fundamentals is an “essential” and “authoritative guide” to privacy law, compact and incredibly useful. For those of you who know Dan and Paul, their work is not only incredibly wise and helpful but also dispensed in person with serious humor. Check out their YouTube video, “Privacy Law in 60 Seconds,” to see what I mean. I think Psy may have a run for his money when it comes to making us smile.


In Honor of Alan Westin: Privacy Trailblazer, Seer, and Changemaker

Privacy leading light Alan Westin passed away this week. Almost fifty years ago, Westin started his trailblazing work helping us understand the dangers of surveillance technologies. Building on the work that Warren and Brandeis started in “The Right to Privacy” in 1890, Westin published Privacy and Freedom in 1967. A year later, he took his normative case for privacy to the trenches. As Director of the National Academy of Sciences’ Computer Science and Engineering Board, he and a team of researchers studied governmental, commercial, and private organizations using databases to amass, use, and share personal information. Westin’s team interviewed 55 organizations, ranging from local law enforcement and federal agencies like the Social Security Administration to direct-mail companies like R.L. Polk (a predecessor of today’s behavioral advertising industry).

The 1972 report, Databanks in a Free Society: Computers, Record-Keeping, and Privacy, is a masterpiece.  With 14 case studies, the report made clear the extent to which public and private entities had been building substantial computerized dossiers of people’s activities and the risks to economic livelihood, reputation, and self-determination.  It demonstrated the unrestrained nature of data collection and sharing, with driver’s license bureaus selling personal information to direct-mail companies and law enforcement sharing arrest records with local and state agencies for employment and licensing matters.  Surely influenced by Westin’s earlier work, some data collectors, like the Kansas City Police Department, talked to the team about privacy protections, suggesting the need for verification of source documents, audit logs, passwords, and discipline for improper use of data. Westin’s report called for data collectors to adopt ethical procedures for data collection and sharing, including procedural protections such as notice and chance to correct inaccurate or incomplete information, data minimization requirements, and sharing limits.

Westin’s work shaped the debate about the right to privacy at the dawn of our surveillance era. His change-making agenda was front and center in the Privacy Act of 1974. In the early 1970s, nearly fifty congressional hearings and reports investigated a range of data privacy issues, including the use of census records, access to criminal history records, employers’ use of lie detector tests, and the military’s and law enforcement’s monitoring of political dissidents. State and federal executives spearheaded investigations of surveillance technologies, including a proposed National Databank Center.

Just as public discourse was consumed with the “data-bank problem,” the courts began to pay attention. In Whalen v. Roe, a 1977 case involving New York’s mandatory collection of prescription drug records, the Supreme Court strongly suggested that the Constitution contains a right to information privacy based on substantive due process. Although it held that the state prescription drug database did not violate the constitutional right to information privacy because it was adequately secured, the Court recognized an individual’s interest in avoiding disclosure of certain kinds of personal information. Writing for the Court, Justice Stevens noted the “threat to privacy implicit in the accumulation of vast amounts of personal information in computerized data banks or other massive government files.”  In a concurring opinion, Justice Brennan warned that the “central storage and easy accessibility of computerized data vastly increase the potential for abuse of that information, and I am not prepared to say that future developments will not demonstrate the necessity of some curb on such technology.”

What Westin underscored so long ago, and what Whalen v. Roe signaled, is that technologies used for broad, indiscriminate, and intrusive public surveillance threaten liberty interests. Last term, in United States v. Jones, the Supreme Court signaled that these concerns have Fourth Amendment salience. Concurring opinions indicate that at least five justices have serious Fourth Amendment concerns about law enforcement’s growing surveillance capabilities. Those justices insisted that citizens have reasonable expectations of privacy in substantial quantities of personal information. In our article “The Right to Quantitative Privacy,” David Gray and I seek to carry forward Westin’s insights (and those of Brandeis and Warren before him) into the Fourth Amendment arena, as the five concurring justices in Jones suggested. More on that to come, but for now, let’s thank Alan Westin for his extraordinary work on the “computerized databanks” problem.


Data Brokers in the FTC’s Sights

The ethos of our age is the more data, the better, and nowhere is that more true than in the data-broker industry. Data-broker databases contain dossiers on hundreds of millions of individuals, including their Social Security numbers, property records, criminal-justice records, car rentals, credit reports, postal and shipping records, utility bills, gaming, insurance claims, divorce records, social network profiles, online activity, and drug- and food-store records. According to FTC Chairman Jon Leibowitz, companies like Acxiom are the “invisible cyberazzi” that follow us everywhere we go on- and offline, or, as Chris Hoofnagle has aptly called them, “Little Brothers” helping Big Brother and industry. Data brokers are largely unbridled by regulation.

The FTC’s enforcement authority over data brokers stems from the Fair Credit Reporting Act (FCRA), which was passed in 1970 to protect the privacy and accuracy of information included in credit reports. FCRA requires consumer reporting agencies to use reasonable procedures to ensure that entities to which they disclose sensitive consumer data have a permissible purpose for receiving it. Under FCRA, employers are required to inform individuals about intended adverse actions against them based on their credit reports. Individuals get a chance to explain inaccurate or incomplete information and to contact credit-reporting agencies to dispute the information in the hopes of getting it corrected.

During the past two years, the FTC has gone after a social media intelligence company and an online people search engine on the grounds that they constituted consumer reporting agencies subject to FCRA. In June 2012, the FTC settled charges against Spokeo, an online service that compiles and sells digital dossiers on consumers to human resources professionals, job recruiters, and other businesses. Spokeo assembles consumer data from on- and offline sources, including social media sites, to create searchable consumer profiles. The profiles include an individual’s full name, physical address, phone number, age range, email address, hobbies, photos, ethnicity, religion, and social network activity. The FTC alleged that Spokeo failed to adhere to FCRA, including its obligation to ensure the accuracy of consumer reports. Ultimately, it obtained an $800,000 settlement with the company. That’s helpful, to be sure, but given the FTC’s limited resources it may not lead to more accurate dossiers. (It also may mean that employers will keep online intelligence in-house, putting their use of unreliable online information outside the reach of FCRA, as my co-blogger Frank Pasquale wrote so ably about in The Offensive Internet: Speech, Privacy, and Reputation.)

More recently, the FTC issued orders requiring nine data brokerage companies to provide the agency with information about how they collect and use data about consumers. The agency will use the information to study privacy practices in the data-broker industry. The nine data brokers receiving orders from the FTC were (1) Acxiom, (2) Corelogic, (3) Datalogix, (4) eBureau, (5) ID Analytics, (6) Intelius, (7) Peekyou, (8) Rapleaf, and (9) Recorded Future. In its press release, the FTC explained that it is seeking details about “the nature and sources of the consumer information the data brokers collect; how they use, maintain, and disseminate the information; and the extent to which the data brokers allow consumers to access and correct their information or to opt out of having their personal information sold.” The FTC called on the data broker industry to improve the transparency of its practices as part of a Commission report, Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers.

FTC Commissioner Julie Brill has been a tireless advocate for greater oversight of data brokers. Here’s hoping that her efforts and those of her agency produce important reforms.



Identity Theft: Coming to Screens Near You (and Not Just the Movies)

Identity theft is now so common that we can joke about it.

Or, as Alan Alda’s character in Woody Allen’s Crimes and Misdemeanors says, “comedy is tragedy plus time.” Time to transform tragedy into comedy, indeed. Scanning the Privacy Rights Clearinghouse database demonstrates that reported data breaches are a daily occurrence. Since January 1, 2013, private and public entities have reported over 20 major data breaches. Included on the list were hospitals, universities, and businesses. Sometimes, the most vulnerable are targeted. For instance, on January 8, 2013, a dishonest employee of the Texas Department of Health and Human Services was arrested on suspicion of misusing client information to apply for credit cards and to receive medical care under clients’ names. Bad enough that automated systems erroneously take recipients of public benefits off the rolls, as my work on Technological Due Process explores. Now those designed to help them are destroying their medical and credit histories as well.

We have had over 600 million records breached since 2005, across approximately 3,500 reported data breaches. Of course, those figures represent only breaches officially reported, likely thanks to state data-breach laws, whose requirements vary and which leave much reporting discretion to entities that have little incentive to err on the side of disclosure unless legally required to do so. So the bad news is that identity theft is prevalent, but at least we can laugh about it.


The Importance of Section 230 Immunity for Most

Why leave the safe harbor provision intact for site operators, search engines, and other online service providers that do not attempt to block offensive, indecent, or illegal activity but that by no means encourage illicit material, or are principally used to host it, as cyber cesspools do? If we retain that immunity, some harassing and stalking material, including revenge porn, will remain online because the site operators hosting it cannot be legally required to take it down. Why countenance that possibility?

Because of the risk of collateral censorship: blocking or filtering speech to avoid potential liability even if the speech is legally protected. In what is often called the heckler’s veto, people may abuse their ability to complain, using the threat of liability to ensure that site operators block or remove posts for no good reason. They might complain because they disagree with the political views expressed or dislike a poster’s disparaging tone. Providers would be especially inclined to remove content in the face of frivolous complaints when they have little interest in keeping up the complained-about content. Take, as an illustration, the popular newsgathering site Digg. If faced with legal liability, it might automatically take down posts even though they involve protected speech. The site lacks a vested interest in keeping up any particular post, given its overall goal of crowdsourcing vast quantities of news that people like. And given the scale of its operation, it may lack the resources to hire enough people to cull through complaints and weed out frivolous ones.

Sites like Digg differ from revenge porn sites and other cyber cesspools, whose operators have an incentive to refrain from removing complained-about content. Cyber cesspools obtain economic benefits from hosting harassing material, which may make it worth the risk to continue doing so. Collateral censorship is far less likely because it is in their economic interest to keep up destructive material. As Slate reporter and cyberbullying expert Emily Bazelon has remarked, concerns about the heckler’s veto get more deference than they should in the context of revenge porn sites and other cyber cesspools. (Read Bazelon’s important new book Sticks and Stones: Defeating the Culture of Bullying and Rediscovering the Power of Character and Empathy.) Those concerns do not justify immunizing cyber cesspool operators from liability.

Let’s be clear about what this would mean. Dispensing with cyber cesspools’ immunity would not mean that they would be strictly liable for user-generated content. A legal theory would still need to sanction remedies against them.


Harvard Law Review Symposium on Privacy & Technology

This Friday, November 9th, I will be introducing and participating in the Harvard Law Review’s symposium on privacy and technology.  The symposium is open to the public, and is from 8:30 AM to 4:30 PM at Harvard Law School (Langdell South).

I have posted a draft of my symposium essay on SSRN, where it can be downloaded for free. The essay will be published in the Harvard Law Review in 2013. My essay is entitled Privacy Self-Management and the Consent Paradox. In it, I discuss what I call the “privacy self-management model,” the current regulatory approach for protecting privacy: the law provides people with a set of rights to enable them to decide for themselves how to weigh the costs and benefits of the collection, use, or disclosure of their data. I demonstrate how this model fails to adequately protect privacy, and I argue that privacy law and policy must confront a confounding paradox with consent. Currently, consent to the collection, use, and disclosure of personal data is often not meaningful, but the most apparent solution, paternalistic measures, even more directly denies people the freedom to make consensual choices about their data.

I welcome your comments on the draft, which will undergo considerable revision in the months to come. In future posts, I plan to discuss a few points that I raise in my essay, so I welcome your comments in these discussions as well.

The lineup of the symposium is as follows:

Symposium 2012:
Privacy & Technology

Daniel J. Solove
George Washington University
“Introduction: Privacy Self-Management and the Consent Paradox”

Jonathan Zittrain
Harvard Law School

Paul Schwartz
Berkeley Law School
“The E.U.-U.S. Privacy Collision”

Lior Strahilevitz
University of Chicago
“A Positive Theory of Privacy”

Julie Cohen
Georgetown University
“What Privacy is For”

Neil Richards
Washington University
“The Harms of Surveillance”

Danielle Citron
University of Maryland

Anita Allen
University of Pennsylvania

Orin Kerr
George Washington University

Alessandro Acquisti
Carnegie Mellon University

Latanya Sweeney
Harvard University

Joel Reidenberg
Fordham University

Paul Ohm
University of Colorado

Tim Wu
Columbia University

Thomas Crocker
University of South Carolina

Danny Weitzner
MIT


PETs, Law and Surveillance

In Europe, privacy is considered a fundamental human right. Article 8 of the European Convention on Human Rights (ECHR) limits the power of the state to interfere in citizens’ privacy, “except such as is in accordance with the law and is necessary in a democratic society.” Privacy is also granted constitutional protection in the Fourth Amendment to the United States Constitution. Both the ECHR and the US Constitution establish the right to privacy as freedom from government surveillance (I’ll call this “constitutional privacy”). Over the past 40 years, a specific framework has emerged to protect informational privacy (see here and here and here and here); yet this framework (“information privacy”) provides little protection against surveillance by either government or private-sector organizations. Indeed, the information privacy framework presumes that a data controller (i.e., a government or business organization collecting, storing, and using personal data) is a trusted party, essentially acting as a steward of individual rights. In doing so, it overlooks the fact that organizations often have strong incentives to subject individuals to persistent surveillance; to monetize individuals’ data; and to maximize information collection, storage, and use.
