Category: Privacy (Law Enforcement)


Privacy, Masks and Religion

Basking & masking: in China, where tanned skin carries a stigma, beachgoers wear face masks.

One of the most significant developments for privacy law over the past few years has been the rapid erosion of privacy in public. As recently as a decade ago, we benefitted from a fair degree of de facto privacy when walking the streets of a city or navigating a shopping mall. To be sure, we were in plain sight; someone could have seen and followed us; and we would certainly be noticed if we took off our clothes. After all, a public space was always less private than a home. Yet with the notable exception of celebrities, we would have generally benefitted from a fair degree of anonymity or obscurity. A great deal of effort, such as surveillance by a private investigator or team of FBI agents, was required to reverse that. [This, by the way, isn’t a post about US v. Jones, which I will write about later].

 

Now, with mobile tracking devices always on in our pockets; with GPS-enabled cars; surveillance cameras linked to facial recognition technologies; smart signage (billboards that target passersby based on their gender, age, or eventually identity); and devices with embedded RFID chips – privacy in public is becoming a remnant of the past.

 

Location tracking is already a powerful tool in the hands of both law enforcement and private businesses, offering a wide array of localized services from restaurant recommendations to traffic reports. Ambient social location apps, such as Glancee and Banjo, are increasingly popular, creating social contexts based on users’ location and enabling users to meet and interact.

 

Facial recognition is becoming more prevalent. This technology, too, can be used by law enforcement for surveillance or by businesses to analyze certain characteristics of their customers, such as their age, gender, or mood (facial detection), or to outright identify them (facial recognition). One such service, recently tested, allows individuals to check in to a location on Facebook through facial scanning.

 

Essentially, our face is becoming equivalent to a cookie, the ubiquitous online tracking device. Yet unlike cookies, faces are difficult to erase. And while cellular phones could in theory be left at home, we very rarely travel without them. How will individuals react to a world in which all traces of privacy in public are lost?



United States v. Skinner: Developments in the Surveillance State and a Response

It’s not news to CoOp readers that Fourth Amendment law is in a state of confusion over how to deal with the ever-expanding capacity of state agents to collect information about our movements and activities using a range of surveillance technologies.  My brilliant colleague David Gray, who has guest-blogged here in the past, and I have spent a lot of time thinking and writing about the fog surrounding this issue in light of United States v. Jones, so we write this post together.  Here is what is on our minds:

The Supreme Court avoided a four-square engagement with these issues last term in Jones by rehabilitating a long-forgotten, but not lost, property-based test of Fourth Amendment search.  For most of us, however, the real action in the opinion was in the concurrences, which make clear that five justices are ready to hold that we may have a reasonable expectation of privacy in massive aggregates of data, even if that is not true for the constituent parts.  The academic debate after Jones, including a really fascinating session at the Privacy Law Scholars Conference in June, has largely focused on the pros and cons of the “mosaic” theory, which would assess Fourth Amendment interests in quantitative privacy on a case-by-case basis by asking whether law enforcement had gathered too much information on their subject in the course of their investigation.  Justice Alito, writing for himself and three others, appeared to endorse the mosaic theory in Jones, and therefore would have held that law enforcement engaged in a Fourth Amendment search by using a GPS-enabled tracking device to monitor Jones’s movements over public streets for 28 days, generating over 2,000 pages of data along the way.

Before the ink was dry in Jones, Orin Kerr was out with a powerful critique.  Orin’s concerns, which Justice Scalia seems to share, are doctrinal and practical.  Christopher Slobogin has since offered a very thoughtful defense of the mosaic theory, which comes complete with a model statute and commentary (take notice, Chief Justice Roberts!).  Professor Gray and I just posted an article on SSRN arguing that, by focusing on the mosaic theory, much of the conversation about technology and the Fourth Amendment has gone badly wrong after Jones.  The Sixth Circuit’s opinion in United States v. Skinner confirms the worst of our concerns.  Another nod to Orin Kerr for putting a spotlight on this decision over at the Volokh Conspiracy.

The question put to the court in Skinner was whether the “use of the GPS location information emitted from [Skinner’s] cell phone was a warrantless search that violated the Fourth Amendment . . . .”  Writing for himself and Judge Clay, Judge Rogers held that “Skinner did not have a reasonable expectation of privacy in the data emanating from his cell phone that showed its location” in the same way that “the driver of a getaway car has no expectation of privacy in the particular combination of colors of his car’s paint.”  Because the officers tracking Skinner only did so for three days, Judge Rogers also saw no quantitative privacy interest at stake.

Skinner is confusing in many ways.  The court is not entirely clear on what tracking technology was used, how it was used, which line of Fourth Amendment doctrine it relied upon, or how its holding can be reconciled with Kyllo.  For now, let’s bypass those issues to focus on what we take to be a dangerous implication of Skinner and perhaps the mosaic theory as well.  According to Judge Rogers, none of us has “a reasonable expectation of privacy in the inherent external locatability of a tool that he or she bought.”  That is, there is absolutely no Fourth Amendment prohibition on law enforcement’s using the GPS devices installed in our phones, cars, and computers, or trilateration between cellular towers to track any of us at any time.  Because there are no real practical limitations on the scope of surveillance that these technologies can achieve, Judge Rogers’s holding licenses law enforcement to track us all of the time.  The mosaic theory might step in if the government tracks any one of us for too long, but it preserves the possibility that, at any given time, any of us or all of us may be subject to close government surveillance.
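For readers curious how tower-based tracking works mechanically, trilateration is just geometry: known distances to three towers pin down a position. A minimal sketch follows; the coordinates and distances are invented for the example, and real systems estimate distance from signal timing, so the measurements are far noisier than this:

```python
# Sketch of trilateration: recovering a phone's position from its distances
# to three cell towers. Tower positions and distances below are invented.
def trilaterate(t1, t2, t3):
    (x1, y1, r1), (x2, y2, r2), (x3, y3, r3) = t1, t2, t3
    # Subtracting the circle equations pairwise yields a 2x2 linear system.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 - x1**2 + x3**2 - y1**2 + y3**2
    det = a1 * b2 - a2 * b1
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# Phone at (3, 4); towers at the origin, (10, 0), and (0, 10).
d1 = (3**2 + 4**2) ** 0.5
d2 = ((3 - 10)**2 + 4**2) ** 0.5
d3 = (3**2 + (4 - 10)**2) ** 0.5
print(trilaterate((0, 0, d1), (10, 0, d2), (0, 10, d3)))  # recovers (3.0, 4.0)
```

The point of the exercise is the one in the text: nothing about the math imposes a practical limit on how many phones can be located this way, or how often.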

We think that something has gone terribly wrong if the Fourth Amendment is read as giving license to a surveillance state.  As we argue in our article, programs of broad and indiscriminate surveillance have deleterious effects on our individual development and our collective democratic processes.  These concerns are familiar in the information privacy law context, where we have spent nearly fifty years talking about  dataveillance and digital dossiers, but they have clear footing in the Fourth Amendment as well.  More precisely, we argue that a fundamental purpose of the Fourth Amendment is to serve as a bulwark against the rise of a surveillance state.  It should be read as denying law enforcement officers unfettered access to investigative technologies that are capable of facilitating broad programs of indiscriminate surveillance.  GPS-enabled tracking is pretty clearly one of these technologies, and therefore should be subject to the crucible of Fourth Amendment reasonableness—at least on our technology-centered approach to quantitative privacy.


Social Media and Chat Monitoring

Suppose a system could help alert people to online sexual predators. Many might like that. But suppose that same system could allow people to look for gun purchasers, government critics, or activists of any sort; what would we say then? The tension between these possibilities is before us. Mashable reports that Facebook and other platforms are now monitoring chats for suspected criminal activity. The article focuses on the child-predator use case. Words are scanned for danger signals. Then “The software pays more attention to chats between users who don’t already have a well-established connection on the site and whose profile data indicate something may be wrong, such as a wide age gap. The scanning program is also ‘smart’ — it’s taught to keep an eye out for certain phrases found in the previously obtained chat records from criminals including sexual predators.” After a flag is raised, a person decides whether to notify police. The other uses of such a system are not discussed in the article. Yet again, we smash our heads against the walls of speech, security, and privacy. I expect some protests and some support for the move. Blood may spill on old battlegrounds. Nonetheless, I think the problems the practice creates merit the fight. The privacy and speech harms mean that, even if “false positives” in the sexual-predator realm are rare, several questions should be sorted out as social platforms start to implement monitoring systems: why a company gets to decide to notify police, how the system might be co-opted for other uses, and the effect on people’s ability to talk online.
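The pipeline the article describes (keyword triggers, weighted by how weak the tie between the two accounts is, with a human making the final call) can be sketched in a few lines. Everything specific below is invented for illustration: the phrase list, weights, threshold, and profile fields are assumptions, not Facebook's actual signals.

```python
# Toy sketch of a chat-monitoring pipeline. All phrases, weights, and
# thresholds here are made up; the real system's signals are not public.
RISK_PHRASES = {"meet me alone": 3, "don't tell your parents": 4, "how old are you": 2}

def relationship_score(user_a, user_b):
    """Weak ties raise suspicion: no mutual friends, wide age gap."""
    score = 0
    if not user_a["friends"] & user_b["friends"]:
        score += 2                      # no established connection on the site
    if abs(user_a["age"] - user_b["age"]) > 10:
        score += 3                      # wide age gap
    return score

def flag_chat(user_a, user_b, messages, threshold=6):
    """Return True if the chat should be queued for human review."""
    score = relationship_score(user_a, user_b)
    text = " ".join(messages).lower()
    for phrase, weight in RISK_PHRASES.items():
        if phrase in text:
            score += weight
    return score >= threshold           # a person, not the software, decides what happens next

adult = {"age": 45, "friends": {"x"}}
minor = {"age": 13, "friends": {"y"}}
print(flag_chat(adult, minor, ["How old are you?", "Meet me alone later"]))  # True
```

Notice how little would need to change to repurpose this: swap the phrase dictionary for gun-purchase or protest vocabulary and the same machinery serves the other uses the post worries about.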


Lend me your ears, no really. I need them to ID you.

Researcher Mark Nixon at the University of Southampton “believes that using photos of individual ears matched against a comparative database could be as distinctive a form of identification as fingerprints.”

According to the University’s news site the claim is that: “Using ears for identification has clear advantages over other kinds of biometric identification, as, once developed, the ear changes little throughout a person’s life. This provides a cradle-to-grave method of identification.”

OK, so they are not taking ears. The method involves cameras, scans, and techniques you may know from facial recognition. This article has a little more detail. As an A.I. system it is probably pretty cool. Still, it sounds so odd that I wonder whether this work has considered the whole piercing and large-gauge trend. I can imagine security checkpoints that now require removing ear decorations, regardless of what they are made of. And if the method is really used for less invasive ID, will wearing earmuffs be cause to think someone is hiding, or should we remember that folks get cold? For the sci-fi inclined: bet that a movie will entail cutting off an ear for identification, just as past films have involved cutting off fingers and hands to fake an identity.
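The matching step Nixon describes (a photo compared against an enrolled database) is, at its core, nearest-neighbor search over feature vectors. A toy sketch follows, with made-up three-number vectors standing in for the features a real system would extract from ear images:

```python
import math

# Sketch of the database-matching step in ear biometrics: compare a probe
# feature vector against enrolled templates and return the closest identity.
# The vectors are invented; real features would come from a trained model.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def identify(probe, enrolled, min_similarity=0.9):
    """Return the best-matching enrolled identity, or None if nothing is close."""
    best_id, best_sim = None, min_similarity
    for person_id, template in enrolled.items():
        sim = cosine(probe, template)
        if sim > best_sim:
            best_id, best_sim = person_id, sim
    return best_id

enrolled = {"alice": [0.9, 0.1, 0.3], "bob": [0.2, 0.8, 0.5]}
print(identify([0.88, 0.12, 0.31], enrolled))  # alice
```

The `min_similarity` cutoff is where the earmuff problem lives: occlude enough of the ear and the probe vector drifts below the threshold, and the system simply returns no match.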


Big Data Brokers as Fiduciaries

In a piece entitled “You for Sale,” Sunday’s New York Times raised important concerns about the data broker industry.  Let us add some more perils and seek to reframe the debate about how to regulate Big Data.

Data brokers like Acxiom (and countless others) collect and mine a mind-boggling array of data about us, including Social Security numbers, property records, public-health data, criminal justice sources, car rentals, credit reports, postal and shipping records, utility bills, gaming, insurance claims, divorce records, online musings, browsing habits culled by behavioral advertisers, and the gold mine of drug- and food-store records.  They scrape our social network activity, which with a little mining can reveal our undisclosed sexual preferences, religious affiliations, political views, and other sensitive information.  They may integrate video footage of our offline shopping.  With the help of facial-recognition software, data mining algorithms factor into our dossiers the over-the-counter medicines we pick up, the books we browse, and the pesticides we contemplate buying for our backyards.  Our social media influence scores may make their way into the mix.  Companies, such as Klout, measure our social media influence, usually on a scale from one to 100.  They use variables like the number of our social media followers, frequency of updates, and number of likes, retweets, and shares.  What’s being tracked and analyzed about our online and offline behavior is accelerating – with no sign of slowing down and no assured way to find out.
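A score "on a scale from one to 100" built from followers, update frequency, likes, retweets, and shares can be approximated as a weighted sum squashed onto that range. The weights and the log squashing below are guesses for illustration; Klout's actual formula is proprietary:

```python
import math

# Toy influence score on a 1-100 scale, loosely in the spirit of services
# like Klout. The weights and log squashing are illustrative guesses only.
def influence_score(followers, updates_per_week, likes, retweets, shares):
    raw = (0.5 * followers
           + 5.0 * updates_per_week
           + 2.0 * likes
           + 4.0 * retweets
           + 3.0 * shares)
    # Log scale so heavy hitters don't blow past 100 immediately.
    return max(1, min(100, round(10 * math.log10(raw + 1))))

print(influence_score(followers=150, updates_per_week=5, likes=40, retweets=10, shares=6))
```

The unsettling part is not the arithmetic but the pipeline around it: a number like this, however crude, can flow into the same dossiers described above.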

As the Times piece notes, businesses buy data-broker dossiers to classify those consumers worth pursuing and those worth ignoring (so-called “waste”).  More often than not, those already in an advantaged position get better deals and gifts while the less advantaged get nothing.  The Times piece rightly raised concerns about the growing inequality that such use of Big Data produces.  But far more is at stake.

Government is a major client for data brokers.  More than 70 fusion centers mine data-broker dossiers to detect crimes, “threats,” and “hazards.”  Individuals are routinely flagged as “threats.”  Such classifications make their way into the “information-sharing environment,” with access provided to local, state, and federal agencies as well as private-sector partners.  Troublingly, data-broker dossiers have no quality assurance.  They may include incomplete, misleading, and false data.  Let’s suppose a data broker has amassed a profile on Leslie McCann.  Social media scraped, information compiled, and videos scanned about “Leslie McCann” might include information about jazz artist “Les McCann” as well as information about a criminal with a similar name and age.  Inaccurate Big Data has led to individuals’ erroneous inclusion on watch lists, denial of immigration applications, and loss of public benefits.
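The "Leslie McCann" failure mode is easy to reproduce: any linkage rule that merges records on approximate name similarity will conflate distinct people. A minimal sketch using Python's standard-library difflib (the 0.85 threshold is an assumed tuning, not any real broker's rule):

```python
from difflib import SequenceMatcher

# Sketch of a careless record-linkage rule that merges dossiers whose
# names look alike. The 0.85 threshold is an invented tuning parameter.
def name_similarity(a, b):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def would_merge(a, b, threshold=0.85):
    """True if a sloppy matcher would treat these as the same person."""
    return name_similarity(a, b) >= threshold

# Two distinct people whose records this rule conflates:
print(would_merge("Leslie McCann", "Les McCann"))  # True
```

With no quality assurance downstream, a false merge like this is exactly how a jazz artist's fan ends up sharing a dossier with a criminal.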


The Right to Be Forgotten: A Criminal’s Best Friend?

By now, you’ve likely heard about the proposed EU regulation concerning the right to be forgotten.  The drafters of the proposal expressed concern for social media users who have posted comments or photographs that they later regretted. Commissioner Reding explained: “If an individual no longer wants his personal data to be processed or stored by a data controller, and if there is no legitimate reason for keeping it, the data should be removed from their system.”

Proposed Article 17 provides:

[T]he data subject shall have the right to obtain from the controller the erasure of personal data relating to them and the abstention from further dissemination of such data, especially in relation to personal data which are made available by the data subject while he or she was a child, where one of the following grounds applies . . . .

Where the controller referred to in paragraph 1 has made the personal data public, it shall take all reasonable steps, including technical measures, in relation to data for the publication of which the controller is responsible, to inform third parties which are processing such data, that a data subject requests them to erase any links to, or copy or replication of that personal data. Where the controller has authorised a third party publication of personal data, the controller shall be considered responsible for that publication.

The controller shall carry out the erasure without delay, except to the extent that the retention of the personal data is necessary: (a) for exercising the right of freedom of expression in accordance with Article 80; (b) for reasons of public interest in the area of public health in accordance with Article 81; (c) for historical, statistical and scientific research purposes in accordance with Article 83; (d) for compliance with a legal obligation to retain the personal data by Union or Member State law to which the controller is subject . . . . Read More


BRIGHT IDEAS: Q&A with Bruce Schneier about Liars and Outliers

Bruce Schneier has recently published a new book, Liars and Outliers: Enabling the Trust that Society Needs to Thrive (Wiley 2012).  Bruce is a renowned security expert, having written several great and influential books including Secrets and Lies and Beyond Fear.

Liars and Outliers is a fantastic book, and a very ambitious one — an attempt to conceptualize trust and security.  The book is filled with great insights, and is a true achievement. And it’s a fun read too.  I recently conducted a brief interview with Bruce about the book:

Q (Solove): What is the key idea of your book?

A (Schneier): Liars and Outliers is about trust in society, and how we induce it. Society requires trust to function; without it, society collapses. In order for people to have that trust, other people must be trustworthy. Basically, they have to conform to the social norms; they have to cooperate. However, within any cooperative system there is an alternative strategy, called defection: to be a parasite and take advantage of others’ cooperation.

Too many parasites can kill the cooperative system, so it is vital for society to keep defectors down to a minimum. Society has a variety of mechanisms to do this. It all sounds theoretical, but this model applies to terrorism, the financial crisis of 2008, Internet crime, the Mafia code of silence, market regulation…everything involving people, really.

Understanding the processes by which society induces trust, and how those processes fail, is essential to solving the major social and political problems of today. And that’s what the book is about. If I could tie policymakers to a chair and make them read my book, I would.

Okay, maybe I wouldn’t.

Q: What are a few of the conclusions from Liars and Outliers that you believe are the most important and/or provocative?

A: That 100% cooperation in society is impossible; there will always be defectors. Moreover, that more security isn’t always worth it. There are diminishing returns — spending twice as much on security doesn’t halve the risk — and the more security you have, the more innocents it accidentally ensnares. Also, society needs to trust those we entrust with enforcing trust; and the more power they have, the more easily they can abuse it. No one wants to live in a totalitarian society, even if it means there is no street crime.

More importantly, defectors — those who break social norms — are not always in the wrong. Sometimes they’re morally right, only it takes a generation before people realize it. Defectors are the vanguards of social change, and a society with too much security and too much cooperation is a stagnant one.



Stanford Law Review Online: How the War on Drugs Distorts Privacy Law


The Stanford Law Review Online has just published an Essay by Jane Yakowitz Bambauer entitled How the War on Drugs Distorts Privacy Law. Professor Yakowitz analyzes the opportunity the Supreme Court has to rewrite certain privacy standards in Florida v. Jardines:

The U.S. Supreme Court will soon determine whether a trained narcotics dog’s sniff at the front door of a home constitutes a Fourth Amendment search. The case, Florida v. Jardines, has privacy scholars abuzz because it presents two possible shifts in Fourth Amendment jurisprudence. First, the Court might expand the physical spaces rationale from Justice Scalia’s majority opinion in United States v. Jones. A favorable outcome for Mr. Jardines could reinforce that the home is a formidable privacy fortress, protecting all information from government detection unless that information is visible to the human eye.

Alternatively, and more sensibly, the Court may choose to revisit its previous dog sniff cases, United States v. Place and Illinois v. Caballes. This precedent has shielded dog sniffs from constitutional scrutiny by finding that sniffs of luggage and a car, respectively, did not constitute searches. Their logic is straightforward: since a sniff “discloses only the presence or absence of narcotics, a contraband item,” a search incident to a dog’s alert cannot offend reasonable expectations of privacy. Of course, the logical flaw is equally obvious: police dogs often alert when drugs are not present, resulting in unnecessary suspicionless searches.

She concludes:

Jardines offers the Court an opportunity to carefully assess a mode of policing that subjects all constituents to the burdens of investigation and punishment, not just the “suspicious.” Today, drug-sniffing dogs are unique law enforcement tools that can be used without either individualized suspicion or a “special needs” checkpoint. Given their haphazard deployment and erratic performance, police dogs deserve the skepticism many scholars and courts have expressed. But the wrong reasoning in Jardines could fix indefinitely an assumption that police technologies and civil liberties are always at odds. This would be unfortunate. New technologies have the potential to be what dogs never were—accurate and fair. Explosive detecting systems may eventually meet the standards for this test, and DNA-matching and pattern-based data mining offer more than mere hypothetical promise. Responsible use of these emerging techniques requires more transparency and even application than police departments are accustomed to, but decrease in law enforcement discretion is its own achievement. With luck, the Court will find a search in Jardines while avoiding a rule that reflexively hampers the use of new technologies.

Read the full article, How the War on Drugs Distorts Privacy Law by Jane Yakowitz Bambauer, at the Stanford Law Review Online.


Cybersecurity Legislation and the Privacy and Civil Liberties Oversight Board

Along with a lot of other privacy folks, I have a lot of concerns about the cybersecurity legislation moving through Congress.  I had an op-ed in The Hill yesterday going through some of the concerns, notably the problems with the overbroad “information sharing” provisions.

Writing the op-ed, though, prompted me to highlight one positive step that should happen in the course of the cybersecurity debate.  The Privacy and Civil Liberties Oversight Board was designed in large part to address information sharing.  This past Wednesday, the Senate Judiciary Committee held a hearing to consider the bipartisan slate of five nominees.

Here’s the point.  The debate on CISPA and other cybersecurity legislation has highlighted all the information sharing that is going on already and that may be going on in the near future.  The PCLOB is the institution designed to oversee problems with information sharing.  So let’s confirm the nominees and get the PCLOB up and running as soon as possible.

The quality of the nominees is very high.  David Medine, nominated to be Chair, helped develop the FTC’s privacy approach in the 1990s and has worked on privacy compliance since, so he knows what should be done and what is doable.  Jim Dempsey has been at the Center for Democracy and Technology for over 15 years, and is a world-class expert on government, privacy, and civil liberties.  Pat Wald is the former Chief Judge of the DC Circuit.  Her remarkably distinguished career includes major experience on international human rights issues.  I don’t have experience with the other two nominees, but the hearing exposed no red flags for any of them.

The debates about cybersecurity legislation show the centrality of information sharing to how government will respond to cyber-threats.  So we should have the institution in place to make sure that the information sharing is done in a lawful and sensible way, to be effective and also to protect privacy and civil liberties.


Stanford Law Review Online: The Dead Past


The Stanford Law Review Online has just published Chief Judge Alex Kozinski’s Keynote from our 2012 Symposium, The Dead Past. Chief Judge Kozinski discusses the privacy implications of our increasingly digitized world and our role as a society in shaping the law:

I must start out with a confession: When it comes to technology, I’m what you might call a troglodyte. I don’t own a Kindle or an iPad or an iPhone or a BlackBerry. I don’t have an avatar or even voicemail. I don’t text.

I don’t reject technology altogether: I do have a typewriter—an electric one, with a ball. But I do think that technology can be a dangerous thing because it changes the way we do things and the way we think about things; and sometimes it changes our own perception of who we are and what we’re about. And by the time we realize it, we find we’re living in a different world with different assumptions about such fundamental things as property and privacy and dignity. And by then, it’s too late to turn back the clock.

He concludes:

Judges, legislators and law enforcement officials live in the real world. The opinions they write, the legislation they pass, the intrusions they dare engage in—all of these reflect an explicit or implicit judgment about the degree of privacy we can reasonably expect by living in our society. In a world where employers monitor the computer communications of their employees, law enforcement officers find it easy to demand that internet service providers give up information on the web-browsing habits of their subscribers. In a world where people post up-to-the-minute location information through Facebook Places or Foursquare, the police may feel justified in attaching a GPS to your car. In a world where people tweet about their sexual experiences and eager thousands read about them the morning after, it may well be reasonable for law enforcement, in pursuit of terrorists and criminals, to spy with high-powered binoculars through people’s bedroom windows or put concealed cameras in public restrooms. In a world where you can listen to people shouting lurid descriptions of their gall-bladder operations into their cell phones, it may well be reasonable to ask telephone companies or even doctors for access to their customer records. If we the people don’t consider our own privacy terribly valuable, we cannot count on government—with its many legitimate worries about law-breaking and security—to guard it for us.

Which is to say that the concerns that have been raised about the erosion of our right to privacy are, indeed, legitimate, but misdirected. The danger here is not Big Brother; the government, and especially Congress, have been commendably restrained, all things considered. The danger comes from a different source altogether. In the immortal words of Pogo: “We have met the enemy and he is us.”

Read the full article, The Dead Past by Alex Kozinski, at the Stanford Law Review Online.