Category: Privacy (Electronic Surveillance)


Over-Parenting Goes International

The thought of hiring a private detective in this age of relatively accessible electronic surveillance seems a bit retro, like a black-and-white scene from a smoky film noir. But the practice has been enjoying a surprising comeback in recent years, as parents hire private investigators to spy on their children.

In an article titled "Over-Parenting," my co-author Gaia Bernstein and I identified a trend of legal adoption of intensive parenting norms. We cautioned against society legally sanctioning a single parenting style – namely, intensive parenting – while deeming other parenting styles potentially neglectful even though they may be perfectly legitimate. We also pointed out that intensive parenting is class-biased: it is costly, and not all parents can afford the technology that enables it, such as GPS-enabled smartphones for their kids. We argued that when intensive parenting is applied to children who do not need it, it becomes over-parenting. Not all children need the same level of involvement in their lives; one of the most important roles of parents is to prepare their children for independent life, and over-parenting can thwart that role. Finally, we speculated that the cultural model for intensive parenting originates in media depictions of upper-middle-class families, and that the way these families are portrayed in movies and TV shows influences real-life parents.

Well, I’m sad to report that over-parenting is not a uniquely American phenomenon. Last year, for example, a Chinese newspaper reported that parents in China are becoming increasingly involved in their children’s lives by hiring private investigators to check whether their children use drugs, drink alcohol, or have sex. In Israel some parents are doing the same, especially during the long summer break, during which, many parents fear, bored teenagers are prone to engage in such activities (if you read Hebrew, you can read the story here). I am sure that some American parents do the same.

Leaving aside the class question (are parents who cannot afford a private eye neglectful?), what does this say about parents’ role as educators? Or about the level of trust (or distrust) between those parents and their children? It used to be that a spouse would hire a private investigator because they suspected their partner was having an affair. Nowadays, a growing chunk of a private investigator’s work involves parents spying on their children. Doesn’t the fact that parents feel they need to spy on their children already testify to their limited parenting skills?

NSA Penalty Proposed

Readers suggested potential penalties for improper gathering or misuse of surveillance data last month.  As revelations continue, Congressmen have recently proposed some new ideas:

Rep. Mike Fitzpatrick (R-Pa.) proposed legislation . . .  that would cut National Security Agency (NSA) funding if it violates new surveillance rules aimed at preventing broad data collection on millions of people.

Fitzpatrick has also offered language to restrict the term “relevant” when it comes to data collection.  On the one hand, it seems odd for Congress to micromanage a spy agency.  On the other hand, no one has adequately explained how present safeguards keep the integrated Information Sharing Environment from engaging in the harms catalogued here and here. So we’re likely to see many blunt efforts to cut off its ability to collect and analyze data, even if data misuse is really the core problem.


The FTC and the New Common Law of Privacy

I recently posted a draft of my new article, The FTC and the New Common Law of Privacy (with Professor Woodrow Hartzog).

One of the great ironies of information privacy law is that the primary means of regulating privacy in the United States has barely been studied in a scholarly way. Since the late 1990s, the Federal Trade Commission (FTC) has been enforcing companies’ privacy policies through its authority to police unfair and deceptive trade practices. Despite more than fifteen years of FTC enforcement, there is no meaningful body of judicial decisions to show for it; the cases have nearly all resulted in settlement agreements. Nevertheless, companies look to these agreements to guide their privacy practices. Thus, in practice, FTC privacy jurisprudence has become the broadest and most influential regulating force on information privacy in the United States – more so than nearly any privacy statute or common law tort.

In this article, we contend that the FTC’s privacy jurisprudence is the functional equivalent to a body of common law, and we examine it as such. The article explores the following issues:

  • Why did the FTC, and not contract law, come to dominate the enforcement of privacy policies?
  • Why, despite more than 15 years of FTC enforcement, have there been hardly any resulting judicial decisions?
  • Why has FTC enforcement had such a profound effect on company behavior given the very small penalties?
  • Can FTC jurisprudence evolve into a comprehensive regulatory regime for privacy?

The claims we make in this article include:

  • The common view of FTC jurisprudence as thin — as merely enforcing privacy promises — is misguided. The FTC’s privacy jurisprudence is actually quite thick, and it has come to serve as the functional equivalent to a body of common law.
  • The foundations exist in FTC jurisprudence to develop a robust privacy regulatory regime, one that focuses on consumer expectations of privacy, that extends far beyond privacy policies, and that involves substantive rules that exist independently from a company’s privacy representations.

You can download the article draft here on SSRN.

Focusing on the Core Harms of Surveillance

The “summer of NSA revelations” rolls along, with a blockbuster finale today. In June, Jennifer Granick and Christopher Sprigman flatly declared the NSA criminal. Now the agency’s own internal documents (leaked by Snowden) appear to confirm thousands of legal violations.

Legal scholars will not be surprised by the day’s revelations, just as few surveillance experts were all that shocked by the breadth and depth of PRISM, PINWALE, MARINA, and other programs. Ray Ku called warrantless surveillance unconstitutional in 2010. Civil liberties groups and legal scholars warned us repeatedly about where Bush-era executive power theories would lead. As anyone familiar with Bruce Ackerman’s work might guess, pliable attorneys have rubber-stamped the telephony metadata program with a “white paper” that “fails to confront counterarguments and address contrary caselaw” and “cites cases that [are] relatively weak authority for its position.” There are no meaningful penalties in sight (perhaps because the OLC has prepared documents that function as a “get out of jail free” card for those involved).


Brave New World of Biometric Identification

Professor Margaret Hu’s important new article, “Biometric ID Cybersurveillance” (Indiana Law Journal), carefully and chillingly lays out federal and state governments’ increasing use of biometrics for identification and other purposes. These efforts are poised to lead to a national biometric ID with centralized databases of our iris scans, faceprints, and fingerprints. Such multimodal biometric IDs ostensibly provide greater security against fraud than our current de facto identifier, the Social Security number. As Professor Hu lays out, biometrics are, or soon will be, gatekeepers to the right to vote, work, fly, drive, and cross our borders. Professor Hu explains that the FBI’s Next Generation Identification project will institute:

a comprehensive, centralized, and technologically interoperable biometric database that spans across military and national security agencies, as well as all other state and federal government agencies. Once complete, NGI will strive to centralize whatever biometric data is available on all citizens and noncitizens in the United States and abroad, including information on fingerprints, DNA, iris scans, voice recognition, and facial recognition data captured through digitalized photos, such as U.S. passport photos and REAL ID driver’s licenses. The NGI Interstate Photo System, for instance, aims to aggregate digital photos from not only federal, state, and local law enforcement, but also digital photos from private businesses, social networking sites, government agencies, and foreign and international entities, as well as acquaintances, friends, and family members.

Such a comprehensive biometric database would surely be accessed and used by our network of fusion centers and other hubs of our domestic surveillance apparatus that Frank Pasquale and I wrote about here.

Biometric ID cybersurveillance might be used to assign risk assessment scores and to take action based on those scores. In a chilling passage, Professor Hu describes one such proposed program:

FAST is currently under testing by DHS and has been described in press reports as a “precrime” program. If implemented, FAST will purportedly rely upon complex statistical algorithms that can aggregate data from multiple databases in an attempt to “predict” future criminal or terrorist acts, most likely through stealth cybersurveillance and covert data monitoring of ordinary citizens. The FAST program purports to assess whether an individual might pose a “precrime” threat through the capture of a range of data, including biometric data. In other words, FAST attempts to infer the security threat risk of future criminals and terrorists through data analysis.

Under FAST, biometric-based physiological and behavioral cues are captured through the following types of biometric data: body and eye movements, eye blink rate and pupil variation, body heat changes, and breathing patterns. Biometric-based linguistic cues include the capture of the following types of biometric data: voice pitch changes, alterations in rhythm, and changes in intonations of speech. Documents released by DHS indicate that individuals could be arrested and face other serious consequences based upon statistical algorithms and predictive analytical assessments. Specifically, projected consequences of FAST ‘can range from none to being temporarily detained to deportation, prison, or death.’

Data mining of our biometrics to predict criminal and terrorist activity, which is then used as a basis for government decision making about our liberty? If this comes to fruition, technological due process would certainly be required.
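
To make the kind of scoring Professor Hu describes a bit more concrete, here is a deliberately simplified sketch of a weighted cue-scoring routine with thresholded outcomes. It is purely hypothetical: the cue names, weights, and thresholds are invented for illustration and are not drawn from any DHS documents or from the actual FAST system.

```python
# Purely hypothetical sketch of threshold-based risk scoring of the kind
# described above. Cue names, weights, and thresholds are invented for
# illustration only; this is NOT the actual FAST system.

CUE_WEIGHTS = {
    "eye_blink_rate": 0.20,
    "pupil_variation": 0.15,
    "body_heat_change": 0.15,
    "breathing_irregularity": 0.20,
    "voice_pitch_change": 0.15,
    "speech_rhythm_change": 0.15,
}


def risk_score(cues):
    """Combine normalized cue readings (0.0-1.0) into a single weighted score."""
    return sum(CUE_WEIGHTS[name] * cues.get(name, 0.0) for name in CUE_WEIGHTS)


def triage(score):
    """Map a score to an action tier (thresholds are arbitrary assumptions)."""
    if score < 0.4:
        return "no action"
    if score < 0.7:
        return "secondary screening"
    return "refer to law enforcement"


if __name__ == "__main__":
    sample = {"eye_blink_rate": 0.8, "breathing_irregularity": 0.6}
    score = risk_score(sample)
    print(f"score={score:.2f} -> {triage(score)}")
```

Even in this toy form, the due process problem is plain: the outcome turns entirely on weights and thresholds that the person being scored never sees and cannot contest.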

Professor Hu calls for the Fourth Amendment to evolve to meet the challenge of 24/7 biometric surveillance technologies. David Gray and I hope we answer Professor Hu’s call in our article “The Right to Quantitative Privacy” (forthcoming in the Minnesota Law Review). Rather than asking how much information is gathered in a particular case, we argue that Fourth Amendment interests in quantitative privacy demand that we focus on how information is gathered. In our view, the threshold Fourth Amendment question should be whether a technology has the capacity to facilitate broad and indiscriminate surveillance that intrudes upon reasonable expectations of quantitative privacy by raising the specter of a surveillance state if deployment and use of that technology is left to the unfettered discretion of the government. If it does not, then the Fourth Amendment imposes no limitations on law enforcement’s use of that technology, regardless of how much information officers gather against a particular target in a particular case. By contrast, if it does threaten reasonable expectations of quantitative privacy, then the government’s use of that technology amounts to a “search” and must be subjected to the crucible of Fourth Amendment reasonableness, including judicially enforced constraints on law enforcement’s discretion.


Letting the Air Out

The NSA and the rest of our surveillance state apparatus are shrouded in secrecy. As captured in Frank Pasquale’s superb forthcoming book, governmental surveillance is a black box. Gag orders prevent Internet companies from talking about their participation in PRISM; nearly everything revealing is classified; the Executive Branch is telling us half truths or no truths. To counter massive governmental overreach, Bradley Manning, Edward Snowden, and others have shed some sunlight on our surveillance state. That sunlight isn’t coming from those who are betraying the country but from those who are trying to save it; at least, that’s what many registered voters think.

According to a Quinnipiac poll released today, American voters say “55 – 34 percent” that NSA consultant Edward Snowden is a “whistleblower rather than a traitor.” According to the assistant director of the Quinnipiac University Polling Institute, “Most American voters think positively of Edward Snowden,” at least they did before he accepted asylum in Russia. From July 28 to July 31, 1,468 registered voters were surveyed by phone. These sorts of leaks seem inevitable, at least culturally, given our so-called commitment to openness and transparency. The leakers/whistleblowers are trying to nudge the Executive Branch to honor its commitments to the Fourth Amendment, the sentiments of the Church Report, and the Administration’s 2009 Openness and Transparency memo. Let’s see if letting the air out moves us closer to the kind of country we say we are.

H/T: Yale ISP’s Christina Spiesel for the Quinnipiac Poll

What Should be the Penalties for Misuse of Surveillance Data?

The Privacy and Civil Liberties Oversight Board (PCLOB) is holding a “Workshop Regarding Surveillance Programs Operated Pursuant to Section 215 of the USA PATRIOT Act and Section 702 of the Foreign Intelligence Surveillance Act.” Many luminaries in the privacy community are participating. I’m sure they will have great ideas about rendering PRISM, PINWALE, MARINA, et al. more subject to oversight.

But I have heard very little on what the appropriate penalties should be for misuse of surveillance data. In the health care world, we have some pretty clear precedents. For instance, a researcher served four months in prison for snooping into medical records in 2003. Imagine a very similar incident happened in the NSA context—say, an analyst abused his or her access to the data to learn details about an acquaintance who exhibited no suspicious characteristics. What should be the penalty? Feel free to comment below, or to submit ideas directly to the PCLOB.


Prism and Its Relationship to Clouds, Security, Jurisdiction, and Privacy

In January I wrote a piece, “Beyond Data Location: Data Security in the 21st Century,” for Communications of the ACM. I went into the current facts about data security (basic point: data moving often helps security) and how they clash with jurisdiction needs and interests. As part of that essay I wrote:

A key hurdle is identifying when any government may demand data. Transparent policies and possibly treaties could help better identify and govern under what circumstances a country may demand data from another. Countries might work with local industry to create data security and data breach laws with real teeth as a way to signal that poor data security has consequences. Countries should also provide more room for companies to challenge requests and reveal them so the global market has a better sense of what is being sought, which countries respect data protection laws, and which do not. Such changes would allow companies to compete based not only on their security systems but their willingness to defend customer interests. In return companies and computer scientists will likely have to design systems with an eye toward the ability to respond to government requests when those requests are proper. Such solutions may involve ways to tag data as coming from a citizen of a particular country. Here, issues of privacy and freedom arise, because the more one can tag and trace data, the more one can use it for surveillance. This possibility shows why increased transparency is needed, for at the very least it would allow citizens to object to pacts between governments and companies that tread on individual rights.

Prism shows just how much a new balance is needed. There are many areas to sort out to reach that balance, too many to explore in a blog post. But as I argued in the essay, I think the way to proceed is to pull in engineers (not just industry ones), law enforcement, civil society groups, and, oh yes, lawyers to look at what can be done to address the current imbalance.


Employers and Schools that Demand Account Passwords and the Future of Cloud Privacy

In 2012, the media erupted with news about employers demanding employees provide them with their social media passwords so the employers could access their accounts. This news took many people by surprise, and it set off a firestorm of public outrage. It even sparked a significant legislative response in the states.

I thought that the practice of demanding passwords was so outrageous that it couldn’t be very common. What kind of company or organization would actually do this? I thought it was a fringe practice done by a few small companies without much awareness of privacy law.

But Bradley Shear, an attorney who has focused extensively on the issue, opened my eyes to the fact that the practice is much more prevalent than I had imagined, and it is an issue that has very important implications as we move more of our personal data to the Cloud.

The Widespread Hunger for Access

Employers are not the only ones demanding social media passwords – schools are doing so too, especially athletic departments in higher education, many of which engage in extensive monitoring of the online activities of student athletes. Some require students to turn over passwords, install special software and apps, or friend coaches on Facebook and other sites. According to an article in USA Today: “As a condition of participating in sports, the schools require athletes to agree to monitoring software being placed on their social media accounts. This software emails alerts to coaches whenever athletes use a word that could embarrass the student, the university or tarnish their images on services such as Twitter, Facebook, YouTube and MySpace.”

Not only are colleges and universities engaging in the practice, but K-12 schools are doing so as well. An MSNBC article discusses the case of a parent outraged by school officials demanding access to her 13-year-old daughter’s Facebook account. According to the mother, “The whole family is exposed in this. . . . Some families communicate through Facebook. What if her aunt was going through a divorce or had an illness? And now there’s these anonymous people reading through this information.”

In addition to private sector employers and schools, public sector employers such as state government agencies are demanding access to online accounts. According to another MSNBC article: “In Maryland, job seekers applying to the state’s Department of Corrections have been asked during interviews to log into their accounts and let an interviewer watch while the potential employee clicks through posts, friends, photos and anything else that might be found behind the privacy wall.”



Overturning the Third-Party Doctrine by Statute: Hard and Harder

Privacy advocates have disliked the third-party doctrine at least from the day in 1976 when the Supreme Court decided U.S. v. Miller.  Anyone who remembers the Privacy Protection Study Commission knows that its report was heavily influenced by Miller.  My first task in my long stint as a congressional staffer was to organize a hearing to receive the report of the Commission in 1977.  In the introduction to the report, the Commission called the date of the decision “a fateful day for personal privacy.”

Last year, privacy advocates cheered when Justice Sonia Sotomayor’s concurrence in U.S. v. Jones asked whether it was time to reconsider the third-party doctrine.  Yet it is likely to be a long time before the Supreme Court revisits and overturns the third-party doctrine, if it ever does.  Sotomayor’s opinion didn’t attract a single other Justice.

Can we draft a statute to overturn the third-party doctrine?  That is not an easy task, and it may be an unattainable goal politically.  Nevertheless, the discussion has to start somewhere.  I acknowledge that not everyone wants to overturn Miller.  See Orin Kerr’s The Case for the Third-Party Doctrine.  I’m certainly not the first person to ask the how-to-do-it question.  Dan Solove wrestled with the problem in Digital Dossiers and the Dissipation of Fourth Amendment Privacy.

I’m going at the problem as if I were still a congressional staffer tasked with drafting a bill.  I see right away that there is precedent.  Somewhat remarkably, Congress partly overturned the Miller decision in 1978 when it enacted the Right to Financial Privacy Act (RFPA), 12 U.S.C. § 3401 et seq.  The RFPA says that if the federal government wants to obtain the records of a bank customer, it must notify the customer and allow the customer to challenge the request.

The RFPA is remarkable too for its exemptions and weak standards.  The law only applies to the federal government and not to state and local governments.  (States may have their own laws applicable to state agencies.)  Bank supervisory agencies are largely exempt.  The IRS is exempt.  Disclosures required by federal law are exempt.  Disclosures for government loan programs are exempt.  Disclosures for grand jury subpoenas are exempt.  That effectively exempts a lot of criminal law enforcement activity.  Disclosures to GAO and the CFPB are exempt.  Disclosures for investigations of crimes against financial institutions by insiders are exempt.  Disclosures to intelligence agencies are exempt.  This long – and incomplete – list is the first hint that overturning the third-party doctrine won’t be easy.

We’re not done with the weaknesses in the RFPA.  A customer who receives notice of a government request has ten days to challenge the request in federal court.  The customer must argue that the records sought are not relevant to the legitimate law enforcement inquiry identified by the government in the notice.  The customer loses if there is a demonstrable reason to believe that the law enforcement inquiry is legitimate and a reasonable belief that the records sought are relevant to that inquiry.  Relevance and legitimacy are weak standards, to say the least.  Good luck winning your case.

Who should get the protection of our bill?  The RFPA gives rights to “customers” of a financial institution.  A customer is an individual or partnership of five or fewer individuals (how would anyone know?).  If legal persons also receive protection, a bill might actually attract corporate support, along with major opposition from every regulatory agency in town.  It will be hard enough to pass a bill limited to individuals.  The great advantage of playing staffer is that you can apply political criteria to solve knotty policy problems.  I’d be inclined to stick to individuals.
