Category: Privacy (ID Theft)

The FTC and the New Common Law of Privacy

I’m pleased to announce that my article with Professor Woodrow Hartzog, The FTC and the New Common Law of Privacy, 114 Colum. L. Rev. 583 (2014), is now out in print.  You can download the final published version at SSRN.  Here’s the abstract:

One of the great ironies about information privacy law is that the primary regulation of privacy in the United States has barely been studied in a scholarly way. Since the late 1990s, the Federal Trade Commission (FTC) has been enforcing companies’ privacy policies through its authority to police unfair and deceptive trade practices. Despite over fifteen years of FTC enforcement, there is no meaningful body of judicial decisions to show for it. The cases have nearly all resulted in settlement agreements. Nevertheless, companies look to these agreements to guide their privacy practices. Thus, in practice, FTC privacy jurisprudence has become the broadest and most influential regulating force on information privacy in the United States — more so than nearly any privacy statute or any common law tort.

In this Article, we contend that the FTC’s privacy jurisprudence is functionally equivalent to a body of common law, and we examine it as such. We explore how and why the FTC, and not contract law, came to dominate the enforcement of privacy policies. A common view of the FTC’s privacy jurisprudence is that it is thin, merely focusing on enforcing privacy promises. In contrast, a deeper look at the principles that emerge from FTC privacy “common law” demonstrates that the FTC’s privacy jurisprudence is quite thick. The FTC has codified certain norms and best practices and has developed some baseline privacy protections. Standards have become so specific they resemble rules. We contend that the foundations exist to develop this “common law” into a robust privacy regulatory regime, one that focuses on consumer expectations of privacy, extends far beyond privacy policies, and involves a full suite of substantive rules that exist independently from a company’s privacy representations.

4 Points About the Target Breach and Data Security

There seems to be a surge in data security attacks lately. First came news of the Target attack. Then Neiman Marcus. Then the U.S. Courts. Then Michaels. Here are four points to consider about data security:

1. Beware of fraudsters engaging in post-breach fraud.

After the Target breach, fraudsters sent out fake emails purporting to be from Target about the breach, trying to trick people into providing personal data. It can be hard to distinguish a genuine breach-notification email from a fake one sent by fraudsters. People are more likely to fall prey to a phishing scheme after a breach because they are anxious and want to take steps to protect themselves. Post-breach trickery is now a growing technique of fraudsters, and people must be educated about it and be on guard.

2. Credit card fraud and identity theft are not the same.

The news media often conflates credit card fraud with identity theft. Although there is one point of overlap, for the most part they are very different. Credit card fraud involving the improper use of credit card data can be stopped when the card is cancelled and replaced. Identity theft differs because it involves the use of personal information such as Social Security number, birth date, and other data that cannot readily be changed. It is thus much harder to stop identity theft. The point of overlap is when an identity thief uses a person’s data to obtain a credit card. But when a credit card is lost or stolen, or when credit card data is leaked or improperly accessed, this is credit card fraud, not identity theft.

3. Data breaches cause harm.

What’s the harm when data is leaked? This question has confounded courts, which often don’t recognize a harm. If your credit card is just cancelled and replaced, and you don’t pay anything, are you harmed? If your data is leaked, but you don’t suffer from identity theft, are you harmed? I believe that there is a harm. The harm of credit card fraud is that it can take a long time to replace all the credit card information in various accounts. People have card data on file with countless businesses and organizations for automatic charges and other transactions. Replacing all this data can be a major chore. People’s time has a price. That price will vary, but it rarely is zero.

10 Reasons Why Privacy Matters

Why does privacy matter? Courts and commentators often struggle to articulate why privacy is valuable. They tend to see privacy violations as slight annoyances. But privacy matters a lot more than that. Here are 10 reasons why privacy matters.

1. Limit on Power

Privacy is a limit on government power, as well as the power of private sector companies. The more someone knows about us, the more power they can have over us. Personal data is used to make very important decisions in our lives. Personal data can be used to affect our reputations, and it can be used to influence our decisions and shape our behavior. It can be used as a tool to exercise control over us. And in the wrong hands, personal data can be used to cause us great harm.

2. Respect for Individuals

Privacy is about respecting individuals. If a person has a reasonable desire to keep something private, it is disrespectful to ignore that person’s wishes without a compelling reason to do so. Of course, the desire for privacy can conflict with important values, so privacy may not always win out in the balance. Sometimes people’s desires for privacy are just brushed aside because of a view that the harm in doing so is trivial. Even if this doesn’t cause major injury, it demonstrates a lack of respect for that person. In a sense it is saying: “I care about my interests, but I don’t care about yours.”

3. Reputation Management

Privacy enables people to manage their reputations. How we are judged by others affects our opportunities, friendships, and overall well-being. Although we can’t have complete control over our reputations, we must have some ability to protect our reputations from being unfairly harmed. Protecting reputation depends on protecting against not only falsehoods but also certain truths. Knowing private details about people’s lives doesn’t necessarily lead to more accurate judgments about them. People judge badly, they judge in haste, they judge out of context, they judge without hearing the whole story, and they judge with hypocrisy. Privacy helps people protect themselves from these troublesome judgments.


The FTC and the New Common Law of Privacy

I recently posted a draft of my new article, The FTC and the New Common Law of Privacy (with Professor Woodrow Hartzog).

One of the great ironies about information privacy law is that the primary regulation of privacy in the United States has barely been studied in a scholarly way. Since the late 1990s, the Federal Trade Commission (FTC) has been enforcing companies’ privacy policies through its authority to police unfair and deceptive trade practices. Despite more than fifteen years of FTC enforcement, there is no meaningful body of judicial decisions to show for it. The cases have nearly all resulted in settlement agreements. Nevertheless, companies look to these agreements to guide their privacy practices. Thus, in practice, FTC privacy jurisprudence has become the broadest and most influential regulating force on information privacy in the United States – more so than nearly any privacy statute or any common law tort.

In this article, we contend that the FTC’s privacy jurisprudence is the functional equivalent of a body of common law, and we examine it as such. The article explores the following issues:

  • Why did the FTC, and not contract law, come to dominate the enforcement of privacy policies?
  • Why, despite more than 15 years of FTC enforcement, have there been hardly any resulting judicial decisions?
  • Why has FTC enforcement had such a profound effect on company behavior given the very small penalties?
  • Can FTC jurisprudence evolve into a comprehensive regulatory regime for privacy?


The claims we make in this article include:

  • The common view of FTC jurisprudence as thin — as merely enforcing privacy promises — is misguided. The FTC’s privacy jurisprudence is actually quite thick, and it has come to serve as the functional equivalent of a body of common law.
  • The foundations exist in FTC jurisprudence to develop a robust privacy regulatory regime, one that focuses on consumer expectations of privacy, that extends far beyond privacy policies, and that involves substantive rules that exist independently from a company’s privacy representations.


You can download the article draft here on SSRN.


Brave New World of Biometric Identification

Professor Margaret Hu’s important new article, “Biometric ID Cybersurveillance” (Indiana Law Journal), carefully and chillingly lays out federal and state governments’ increasing use of biometrics for identification and other purposes. These efforts are poised to lead to a national biometric ID with centralized databases of our iris, facial, and fingerprint data. Such multimodal biometric IDs ostensibly provide greater security against fraud than our current de facto identifier, the Social Security number. As Professor Hu lays out, biometrics are, or soon will be, gatekeepers to the right to vote, work, fly, drive, and cross our borders. Professor Hu explains that the FBI’s Next Generation Identification (NGI) project will institute:

a comprehensive, centralized, and technologically interoperable biometric database that spans across military and national security agencies, as well as all other state and federal government agencies. Once complete, NGI will strive to centralize whatever biometric data is available on all citizens and noncitizens in the United States and abroad, including information on fingerprints, DNA, iris scans, voice recognition, and facial recognition data captured through digitalized photos, such as U.S. passport photos and REAL ID driver’s licenses. The NGI Interstate Photo System, for instance, aims to aggregate digital photos from not only federal, state, and local law enforcement, but also digital photos from private businesses, social networking sites, government agencies, and foreign and international entities, as well as acquaintances, friends, and family members.

Such a comprehensive biometric database would surely be accessed and used by our network of fusion centers and other hubs of our domestic surveillance apparatus that Frank Pasquale and I wrote about here.

Biometric ID cybersurveillance might be used to assign risk assessment scores and to take action based on those scores. In a chilling passage, Professor Hu describes one such proposed program, DHS’s Future Attribute Screening Technology (FAST):

FAST is currently under testing by DHS and has been described in press reports as a “precrime” program. If implemented, FAST will purportedly rely upon complex statistical algorithms that can aggregate data from multiple databases in an attempt to “predict” future criminal or terrorist acts, most likely through stealth cybersurveillance and covert data monitoring of ordinary citizens. The FAST program purports to assess whether an individual might pose a “precrime” threat through the capture of a range of data, including biometric data. In other words, FAST attempts to infer the security threat risk of future criminals and terrorists through data analysis.

Under FAST, biometric-based physiological and behavioral cues are captured through the following types of biometric data: body and eye movements, eye blink rate and pupil variation, body heat changes, and breathing patterns. Biometric-based linguistic cues include the capture of the following types of biometric data: voice pitch changes, alterations in rhythm, and changes in intonations of speech. Documents released by DHS indicate that individuals could be arrested and face other serious consequences based upon statistical algorithms and predictive analytical assessments. Specifically, projected consequences of FAST ‘can range from none to being temporarily detained to deportation, prison, or death.’

Data mining of our biometrics to predict criminal and terrorist activity, which is then used as a basis for government decision making about our liberty? If this comes to fruition, technological due process would certainly be required.

Professor Hu calls for the Fourth Amendment to evolve to meet the challenge of 24/7 biometric surveillance technologies. David Gray and I hope to answer Professor Hu’s request in our article “The Right to Quantitative Privacy” (forthcoming Minnesota Law Review). Rather than asking how much information is gathered in a particular case, we argue that Fourth Amendment interests in quantitative privacy demand that we focus on how information is gathered.  In our view, the threshold Fourth Amendment question should be whether a technology has the capacity to facilitate broad and indiscriminate surveillance that intrudes upon reasonable expectations of quantitative privacy by raising the specter of a surveillance state if deployment and use of that technology are left to the unfettered discretion of government. If it does not, then the Fourth Amendment imposes no limitations on law enforcement’s use of that technology, regardless of how much information officers gather against a particular target in a particular case. By contrast, if it does threaten reasonable expectations of quantitative privacy, then the government’s use of that technology amounts to a “search,” and must be subjected to the crucible of Fourth Amendment reasonableness, including judicially enforced constraints on law enforcement’s discretion.


“Brain Spyware”

As if we don’t have enough to worry about, now there’s spyware for your brain.  Or, there could be.  Researchers at Oxford, Geneva, and Berkeley have created a proof of concept for using commercially available brain-computer interfaces to discover private facts about today’s gamers.


New Edition of Solove & Schwartz’s Privacy Law Fundamentals: Must-Read (and Check out the Video)

Privacy leading lights Dan Solove and Paul Schwartz have recently released the 2013 edition of Privacy Law Fundamentals, a must-have for privacy practitioners, scholars, students, and really anyone who cares about privacy.

Privacy Law Fundamentals is an essential primer on the state of privacy law, capturing up-to-date developments in legislation, FTC enforcement actions, and cases here and abroad.  As Chief Privacy Officers like Intel’s David Hoffman and renowned privacy practitioners like Hogan’s Chris Wolf and Covington’s Kurt Wimmer agree, Privacy Law Fundamentals is an “essential” and “authoritative guide” on privacy law, compact and incredibly useful.  For those of you who know Dan and Paul, their work is not only incredibly wise and helpful but also dispensed in person with serious humor.  Check out this YouTube video, “Privacy Law in 60 Seconds,” to see what I mean.  I think that Psy may have a run for his money on making us smile.


Laws Regulating PII

My co-author Sasha Romanosky asks me to post the following:

I am involved in a research project that examines state laws affecting the flow of personal information in some way. This information could relate to patients, employees, financial or retail customers, or even just individuals. And by “flow” we are interested in laws that affect the collection, use, storage, sale, sharing, disclosure, or even destruction of this information.

For example, some state laws require that companies notify you when your personal information has been hacked, while other state laws require notice if the firm plans to sell your information. In addition, laws in other states restrict the sale of personal health information; enable law enforcement to track cell phone usage without a warrant; or prohibit the collection of a customer’s zip code during a credit card purchase.

Given the huge variation among states in their information laws, we would like to ask readers of Concurring Opinions to help us collect examples of such laws. You are welcome to either post a response to this blog entry or reply to me directly at sromanos at cmu dot edu.

Thank you!

Sasha is a good guy, and a really careful researcher. Let’s help him!


Lend me your ears, no really. I need them to ID you.

Researcher Mark Nixon at the University of Southampton “believes that using photos of individual ears matched against a comparative database could be as distinctive a form of identification as fingerprints.”

According to the University’s news site, the claim is that: “Using ears for identification has clear advantages over other kinds of biometric identification, as, once developed, the ear changes little throughout a person’s life. This provides a cradle-to-grave method of identification.”

OK, so they are not actually taking ears. The method involves cameras, scans, and techniques you may know about from facial recognition. This article has a little more detail. As an A.I. system it probably is pretty cool. Still, it sounds so odd that I wonder whether this work has considered the whole piercing and large-gauge trend. I can imagine security that now requires removing ear decorations regardless of what they are made of. Also, if ears are really used for less invasive ID, will wearing earmuffs be cause to think someone is hiding something, or should we remember that folks get cold? For the sci-fi inclined, I bet a movie will entail cutting off an ear for identification, just as past films have involved cutting off fingers and hands to fake an identity.
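
For readers curious what “matched against a comparative database” means in practice, here is a minimal sketch of the gallery-matching step common to biometric ID systems, whether the trait is a face, a fingerprint, or an ear: extract a feature vector from the probe image, compare it against every enrolled template, and accept the best match only if it clears a threshold. Everything in the sketch is a hypothetical stand-in (the toy feature extractor, the names, the 0.9 threshold); the Southampton work uses far more sophisticated ear-shape descriptors, but the compare-against-enrolled-templates idea is similar.

import numpy as np

def extract_features(image):
    # Hypothetical stand-in for a real ear-shape descriptor:
    # flatten the pixel array and scale it to unit length.
    vec = np.asarray(image, dtype=float).ravel()
    return vec / (np.linalg.norm(vec) + 1e-12)

def identify(probe_image, gallery, threshold=0.9):
    # Compare the probe against every enrolled template (cosine
    # similarity) and return the best match if it clears the threshold.
    probe = extract_features(probe_image)
    best_id, best_score = None, -1.0
    for person_id, template in gallery.items():
        score = float(np.dot(probe, template))
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= threshold else None

# Toy usage: enroll two "ear images" (random arrays standing in for
# real photos), then identify a slightly noisy copy of one of them.
rng = np.random.default_rng(0)
alice, bob = rng.random((64, 64)), rng.random((64, 64))
gallery = {"alice": extract_features(alice), "bob": extract_features(bob)}
print(identify(alice + 0.01 * rng.random((64, 64)), gallery))  # -> "alice"

In a real system the descriptor and threshold would be tuned to trade off false accepts against false rejects, which is exactly where things like piercings, gauges, and earmuffs would complicate matters.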


Big Data Brokers as Fiduciaries

In a piece entitled “You for Sale,” Sunday’s New York Times raised important concerns about the data broker industry.  Let us add some more perils and seek to reframe the debate about how to regulate Big Data.

Data brokers like Acxiom (and countless others) collect and mine a mind-boggling array of data about us, including Social Security numbers, property records, public-health data, criminal justice sources, car rentals, credit reports, postal and shipping records, utility bills, gaming, insurance claims, divorce records, online musings, browsing habits culled by behavioral advertisers, and the gold mine of drug- and food-store records.  They scrape our social network activity, which with a little mining can reveal our undisclosed sexual preferences, religious affiliations, political views, and other sensitive information.  They may integrate video footage of our offline shopping.  With the help of facial-recognition software, data mining algorithms factor into our dossiers the over-the-counter medicines we pick up, the books we browse, and the pesticides we contemplate buying for our backyards.  Our social media influence scores may make their way into the mix.  Companies, such as Klout, measure our social media influence, usually on a scale from one to 100.  They use variables like the number of our social media followers, frequency of updates, and number of likes, retweets, and shares.  What’s being tracked and analyzed about our online and offline behavior is accelerating – with no sign of slowing down and no assured way to find out.

As the Times piece notes, businesses buy data-broker dossiers to classify those consumers worth pursuing and those worth ignoring (so-called “waste”).  More often than not, those already in an advantaged position get better deals and gifts, while the less advantaged get nothing.  The Times piece rightly raised concerns about the growing inequality that such use of Big Data produces.  But far more is at stake.

Government is a major client for data brokers.  More than 70 fusion centers mine data-broker dossiers to detect crimes, “threats,” and “hazards.”  Individuals are routinely flagged as “threats.”  Such classifications make their way into the “information-sharing environment,” with access provided to local, state, and federal agencies as well as private-sector partners.  Troublingly, data-broker dossiers have no quality assurance.  They may include incomplete, misleading, and false data.  Let’s suppose a data broker has amassed a profile on Leslie McCann.  Social media scraped, information compiled, and videos scanned about “Leslie McCann” might include information about jazz artist “Les McCann” as well as information about criminal with a similar name and age.  Inaccurate Big Data has led to individuals’ erroneous inclusion on watch lists, denial of immigration applications, and loss of public benefits.  Read More