On Privacy, Free Speech, & Related Matters – Richard Posner vs David Cole & Others

I’m exaggerating a little, but I think privacy is primarily wanted by people because they want to conceal information to fool others. — Richard Posner

Privacy is overrated. — Richard Posner (2013)

Much of what passes for the name of privacy is really just trying to conceal the disreputable parts of your conduct. Privacy is mainly about trying to improve your social and business opportunities by concealing the sorts of bad activities that would cause other people not to want to deal with you. — Richard Posner (2014)

This is the seventh installment in the “Posner on Posner” series of posts on Seventh Circuit Judge Richard Posner. The first installment can be found here, the second here, the third here, the fourth here, the fifth here, and the sixth one here.

Privacy has been on Richard Posner’s mind for more than three and a half decades. His views, as evidenced by the epigraph quotes above, have sparked debate in a variety of quarters, both academic and policy. In some ways those views seem oddly consistent with his persona – on the one hand, he is a very public man, as revealed by his many writings; on the other hand, he is a very private man about whose life outside the law we know little, save for a New Yorker profile of him thirteen years ago.

On the scholarly side of the privacy divide, his writings include:

  1. “The Right of Privacy,” 12 Georgia Law Review 393 (1978)
  2. “Privacy, Secrecy, and Reputation,” 28 Buffalo Law Review 1 (1979)
  3. “The Uncertain Protection of Privacy by the Supreme Court,” 1979 Supreme Court Review 173
  4. “The Economics of Privacy,” 71 The American Economic Review 405 (1981)
  5. “Privacy,” Big Think (video clip, n.d.)
  6. “Privacy is Overrated,” New York Daily News, April 28, 2014

For a sampling of Judge Posner’s opinions on privacy, go here (and search “Privacy”).

(Note: Some links will only open in Firefox or Chrome.)

_____________________

Privacy – “What’s the big deal?”

Privacy interests should really have very little weight when you’re talking about national security. The world is in an extremely turbulent state – very dangerous. — Richard Posner (2014)

Recently, Georgetown Law Center held a conference entitled “Cybercrime 2020: The Future of Online Crime and Investigations” (full C-SPAN video here). In the course of that event, Judge Posner joined with others in government, private industry, and in the legal academy to discuss privacy, the Fourth Amendment, and free speech, among other things. A portion of the exchange between Judge Posner and Georgetown law professor David Cole was captured on video.

Scene: The Judge is sitting in his office, speaking into a video conference camera. As he rubbed his fingers across the page and looked down, Posner began: “I was thinking, listening to Professor Cole, what exactly is the information that he’s worried about?” Posner paused, as if to set up his next point: “I have a cell phone – iPhone 6 – so if someone drained my cell phone, they would find a picture of my cat [laughter], some phone numbers, some e-mail addresses, some e-mail texts – so what’s the big deal?”

He then glanced up from the text he appeared to be reading and spoke with a grin: “Other people must have really exciting stuff. [laughter] Could they narrate their adulteries or something like that?” [laughter] He then waved his hands in the air before posing a question to the Georgetown Professor.

“What is it that you’re worrying about?” Posner asked as if truly puzzled.

At that point, Cole leaned into his microphone and looked up at the video screen bearing the Judge’s image next to case reports on his left and the American flag on his right.

Cole: “That’s a great question, Judge Posner.”

Professor Cole continued, adding his own humor to the mix: “And I, like you, have only pictures of cats on my phone. [laughter] And I’m not worried about anything for myself, but I’m worried for others.”

On a more substantive note, Cole added: “Your question, which goes back to your original statement, . . . value[s] . . . privacy unless you have something to hide. That is a very, very shortsighted way of thinking about the value [of privacy]. I agree with Michael Dreeben: Privacy is critical to a democracy; it is critical to political freedom; [and] it is critical to intimacy.”

The sex video hypothetical

And then with a sparkle in his spectacled eye, Cole stated: “Your question brings to mind a cartoon that was in the New Yorker, just in the last couple of issues, where a couple is sitting in bed and they have video surveillance cameras over each one of them trained down on the bed [Cole holds his hands above his head to illustrate the peering cameras]. And the wife says to the husband: ‘What are you worried about? If you’ve got nothing to hide, you’ve got nothing to fear.’”

Using the cartoon as his conceptual springboard, Cole moved on to his main point: “It seems to me that all of us, whether we are engaged in entirely cat-loving behavior, or whether we are going to psychiatrists, or abortion providers, or rape crisis centers, or Alcoholics Anonymous, or have an affair – all of us have something to hide. Even if you don’t have anything to hide, if you live a life that could be entirely transparent to the rest of the world, I still think the value of that life would be significantly diminished if it had to be transparent.”

Without missing a beat, Cole circled back to his video theme: “Again you could say, ‘if you’ve got nothing to hide, and you’re not engaged in criminal activity, let’s put video cameras in every person’s bedroom. And let’s just record the video, 24/7, in their bedroom. And we won’t look at it until we have reason to look at it. You shouldn’t be concerned because . . .’”

At this point, Posner interrupted: “Look, that’s a silly argument.”

Cole: “But it’s based on a New Yorker cartoon.”

The Judge was a tad miffed; he waved his right hand up and down in a dismissive way: “The sex video, that’s silly!” Waving his index finger to emphasize his point, he added: “What you should be saying, [what] you should be worried about [are] the types of revelation[s] of private conduct [that] discourage people from doing constructive things. You mentioned Alcoholics Anonymous . . .”

Cole: “I find sex to be a constructive thing.”

Obviously frustrated, Posner raised his palms up high in protest: “Let me finish, will you please?”

Cole: “Sure.”

Posner: “Look, that was a good example, right? Because you can have a person who has an alcohol problem, and so he goes to Alcoholics Anonymous, but he doesn’t want this to be known. If he can’t protect that secret,” Posner continued while pointing, “then he’s not going to go to Alcoholics Anonymous. That’s gonna be bad. That’s the sort of thing you should be concerned about rather than with sex videos. . . . [The Alcoholics Anonymous example] is a good example of the kind of privacy that should be protected.”

Privacy & Politics 

Meanwhile, the audience looked on, its attention now fixed on the Georgetown professor.

Cole: “Well, let me give you an example of sex privacy. I think we all have an interest in keeping our sex lives private. That’s why we close doors into our bedroom, etc. I think that’s a legitimate interest, and it’s a legitimate concern. And it’s not because you have something wrong you want to hide, but because intimacy requires privacy, number one. And number two: think about the government’s use of sex information with respect to Dr. Martin Luther King. They investigated him, intruded on his privacy by bugging his hotel rooms to learn [about his] affair, and then sought to use that – and the threat of disclosing that affair – to change his behavior. Why? Because he was an active, political, dissident fighting for justice.”

“We have a history of that,” he added. “Our country has a history of that; most countries have a history of that; and that’s another reason the government will use information – that doesn’t necessarily concern [it] – to target people who [it is] concerned about . . . – not just because of their alcohol problem [or] not just because of their sexual proclivities – but because they have political views and political ideas that the government doesn’t approve of.”

At this point the moderator invited the Judge to respond.

Posner: “What happened to cell phones? Do you have sex photos on your cell phones?”

Cole: “I imagine if Dr. Martin Luther King was having an affair in 2014, as opposed to the 1960s, his cell phone, his smart phone, would have quite a bit of evidence that would lead the government to that affair. He’d have call logs; he might have texts; he might have e-mails – all of that would be on the phone.”

The discussion then moved on to the other panelists.

Afterward, writing on the Volokh Conspiracy blog, Professor Orin Kerr, one of the participants in the conference, summed up his view of the exchange this way:

“I score this Cole 1, Posner 0.”

The First Amendment — Enter Glenn Greenwald

Read More

European Parliament Resolution on Google

The European Parliament voted 384 – 174 today in favor of a “resolution on Supporting Consumer Rights in the Digital Single Market.” The text of the resolution, in relevant part:

Stresses that all internet traffic should be treated equally, without discrimination, restriction or interference, independently of its sender, receiver, type, content, device, service or application;

Notes that the online search market is of particular importance in ensuring competitive conditions within the Digital Single Market, given the potential development of search engines into gatekeepers and their possibility of commercialising secondary exploitation of obtained information; therefore calls on the Commission to enforce EU competition rules decisively, based on input from all relevant stakeholders and taking into account the entire structure of the Digital Single Market in order to ensure remedies that truly benefit consumers, internet users and online businesses; furthermore calls on the Commission to consider proposals with the aim of unbundling search engines from other commercial services as one potential long-term solution to achieve the previously mentioned aims;

Stresses that when using search engines, the search process and results should be unbiased in order to keep internet search non-discriminatory, to ensure more competition and choice for users and consumers and to maintain the diversity of sources of information; therefore notes that indexation, evaluation, presentation and ranking by search engines must be unbiased and transparent, while for interlinked services, search engines must guarantee full transparency when showing search results; calls on Commission to prevent any abuse in the marketing of interlinked services by operators of search engines;

Some in the US tech press have played this up as an incipient effort to “break up” Google, with predictable derision at “technopanic.” (Few tend to reflect on whether the 173 former firms listed here really need to be part of one big company.) But the resolution’s linking of net and search neutrality suggests other regulatory approaches (prefigured in my 2008 paper Internet Nondiscrimination Principles: Commercial Ethics for Carriers and Search Engines). I’ve developed these ideas over the years, and I hope my recently released book’s chapters on search and digital regulation will be of some use to policymakers. Without some regulatory oversight and supervision, our black box society will only get more opaque.

Should the FTC Be Regulating Privacy and Data Security?

This post was co-authored with Professor Woodrow Hartzog.

This past Tuesday the Federal Trade Commission (FTC) filed a complaint against AT&T for allegedly throttling the Internet speeds of customers who had paid for unlimited data plans. The complaint surprised many who thought the Federal Communications Commission (FCC) was the agency that handled such telecommunications issues. Is the FTC supposed to be involved here?

This is a question that has recently been posed in the privacy and data security arenas, where the FTC has been involved since the late 1990s. Today, the FTC is the most active federal agency enforcing privacy and data security, and it has the broadest reach. Its fingers seem to be everywhere, in all industries, even those regulated by other agencies, such as in the AT&T case. Is the FTC going too far? Is it even the FTC’s role to police privacy and data security?

The Fount of FTC Authority

The FTC’s authority over privacy and data security comes in part from specific statutes that give it regulatory power. Examples include the Children’s Online Privacy Protection Act (COPPA), under which the FTC regulates websites that collect data about children under 13, and the Gramm-Leach-Bliley Act (GLBA), which governs financial institutions.

But the biggest source of the FTC’s authority is Section 5 of the FTC Act, which empowers the FTC to regulate “unfair or deceptive acts or practices in or affecting commerce.” This is how the FTC has achieved its dominant position.

Enter the Drama

Until recently, the FTC built its privacy and security platform with little pushback. All of the complaints brought by the FTC for unfair data security practices quickly settled. Recently, however, two companies have put on their armor, drawn their swords, and raised the battle cry. Wyndham Hotels and LabMD have challenged the FTC’s authority to regulate data security. These are more than just case-specific challenges that the FTC got the facts wrong or that the FTC is wrong about certain data security practices. Instead, these challenges go to whether the FTC should be regulating data security under Section 5 in the first place. And the logic of these challenges could extend to privacy as well.

The first dispute, involving Wyndham Hotels, has already resulted in a district court opinion affirming the FTC’s data protection jurisprudence. The second dispute over FTC regulatory authority, involving LabMD, is awaiting trial.

LabMD contends that the U.S. Department of Health and Human Services (HHS) — not the FTC — has the authority to regulate data security practices affecting patient data regulated by HIPAA.

With Wyndham, and especially LabMD, the drama surrounding the FTC’s activities in data protection has gone from 2 to 11. The LabMD case has involved the probable shuttering of a business, a controversial commissioner recusal, a defamation lawsuit, a House Oversight Committee investigation into the FTC’s actions, and an entire book written by LabMD’s CEO chronicling his view of the conflict. And the case hasn’t even been tried yet!

The FTC Becomes a Centenarian

And so, it couldn’t be more appropriate that this year, the FTC celebrates its 100th birthday.

To commemorate the event, the George Washington Law Review is hosting a symposium titled “The FTC at 100: Centennial Commemorations and Proposals for Progress,” which will be held on Saturday, November 8, 2014, in Washington, DC.

The lineup for this event is really terrific, including U.S. Supreme Court Justice Stephen Breyer, FTC Chairwoman Edith Ramirez, FTC Commissioner Joshua Wright, FTC Commissioner Maureen Ohlhausen, as well as many former FTC officials.

Some of the participating professors include Richard Pierce, William Kovacic, David Vladeck, Howard Beales, Timothy Muris, and Tim Wu, just to name a few.

At the event, we will be presenting our forthcoming article:

The Scope and Potential of FTC Data Protection
83 George Washington Law Review (forthcoming 2015)

So Is the FTC Overreaching?

Short answer: No. In our paper, The Scope and Potential of FTC Data Protection, we argue that the FTC not only has the authority to regulate data protection to the extent it has been doing, but it also has the authority to expand its reach much more. Here are some of our key points:

* The FTC has a lot of power. Congress gave the FTC very broad and general regulatory authority by design to allow for a more nimble and evolutionary approach to the regulation of consumer protection.

* Overlap in agency authority is inevitable. The FTC’s regulation of data protection will inevitably overlap with other agencies and state law given the very broad jurisdiction in Section 5, which spans nearly all industries. If the FTC’s Section 5 power were to stop at any overlapping regulatory domain, the result would be a confusing, contentious, and unworkable regulatory system with boundaries constantly in dispute.

* The FTC’s use of a “reasonable” standard for data security is quite reasonable. Critics have attacked the FTC’s data security jurisprudence as too vague and open-ended, arguing that the agency should instead create a specific list of requirements. However, there is a benefit to mandating reasonable data security instead of a specific, itemized checklist. When determining what is reasonable, the FTC has often looked to industry standards. Such an approach allows for greater flexibility in the face of technological change than a set of rigid rules.

* The FTC performs an essential role in US data protection. The FTC’s current scope of data protection authority is essential to the United States data protection regime and should be fully embraced. The FTC’s regulation of data protection gives the U.S. system of privacy law needed legitimacy and heft. Without the FTC’s data protection enforcement authority, the E.U. Safe Harbor agreement and other arrangements that govern the international exchange of personal information would be in jeopardy. The FTC can also harmonize discordant privacy-related laws and obviate the need for new laws.

* Contrary to the critics, the FTC has used its powers very conservatively. Thus far, the FTC has been quite modest in its enforcement, focusing on the most egregious offenders and enforcing the most widespread industry norms. The FTC should push the development of the norms a little more (though not in an extreme or aggressive way).

* The FTC can and should expand its enforcement, and there are areas in need of improvement. The FTC now sits atop an impressive body of jurisprudence. We applaud its efforts and believe it can and should do even more. But as it grows into this role of being the data protection authority for the United States, some gaps in its power need to be addressed and it can improve its processes and transparency.

The FTC currently serves as the primary regulator of privacy and data security in the United States. It reached this position in part because Congress never enacted comprehensive privacy regulation and because some kind of regulator was greatly needed to fill the void. The FTC has done a lot so far, and we believe it can and should do more.

If you want more detail, please see our paper, The Scope and Potential of FTC Data Protection. And with all the drama about the FTC these days, please contact us if you want to option the movie rights.

Cross-posted on LinkedIn

Reining in the Data Brokers

I’ve been alarmed by data brokers’ ever-expanding troves of personal information for some time. My book outlines the problem, explaining how misuse of data undermines equal opportunity. I think extant legal approaches – focusing on notice and consent – put too much of a burden on consumers. This NYT opinion piece sketches an alternate approach:

[D]ata miners, brokers and resellers have now taken creepy classification to a whole new level. They have created lists of victims of sexual assault, and lists of people with sexually transmitted diseases. Lists of people who have Alzheimer’s, dementia and AIDS. Lists of the impotent and the depressed.

***

Privacy protections in other areas of the law can and should be extended to cover consumer data. The Health Insurance Portability and Accountability Act, or Hipaa, obliges doctors and hospitals to give patients access to their records. The Fair Credit Reporting Act gives loan and job applicants, among others, a right to access, correct and annotate files maintained by credit reporting agencies.

It is time to modernize these laws by applying them to all companies that peddle sensitive personal information. If the laws cover only a narrow range of entities, they may as well be dead letters. For example, protections in Hipaa don’t govern the “health profiles” that are compiled and traded by data brokers, which can learn a great deal about our health even without access to medical records.

There’s more online, but given the space constraints, I couldn’t go into all the details that the book discloses. I hope everyone enjoys the opinion piece, and that it whets appetites for the book!

The Right to be Forgotten: Not an Easy Question

I’ve previously written on regulation of European data processing here. I’ll be presenting on the “right to be forgotten” (RtbF) in Chicago this Spring. I’ll be writing a series of posts here to prepare for that lecture.

Julia Powles offers an excellent summary of the right in question. As she explains, the European Court of Justice (ECJ) has ruled that, “in some circumstances—notably, where personal information online is inaccurate, inadequate, irrelevant, or excessive in relation to data-processing purposes—links should be removed from Google’s search index.” The Costeja case, which led to this ruling, involved Google’s prominent display of results relating to the plaintiff’s financial history.

Unfortunately, some US commentators’ views are rapidly congealing toward a reflexively rejectionist position when it comes to such regulation of search engine results – despite the Fair Credit Reporting Act’s extensive regulation of consumer reporting agencies in very similar situations. Jeffrey Toobin’s recent article mentions some of these positions. For example, Jules Polonetsky says, “The decision will go down in history as one of the most significant mistakes that Court has ever made.” I disagree, and I think the opposite result would itself have been far more troubling.

Internet regulation must recognize the power of certain dominant firms to shape impressions of individuals. Their reputational impact can be extraordinarily misleading and malicious, and the potential for harm is only growing as hacking becomes more widespread. Consider the following possibility: What if a massive theft of medical records occurs, the records are made public, and then shared virally among different websites? Are the critics of the RtbF really willing to just shrug and say, “Well, they’re true facts and the later-publishing websites weren’t in on the hack, so leave them up”? And in the case of future intimate photo hacks, do we simply let firms keep the photos available in perpetuity?
Read More

Advice on How to Enter the Privacy Profession

Over at LinkedIn, I have a long post with advice on how law students can enter the privacy profession. I hope that this post can serve as a useful guide to students who want to pursue careers in privacy.

The privacy law field is growing dramatically, and demand for privacy lawyers is high.  I think that many in the academy who don’t follow privacy law, cyberlaw, or law and technology might not realize what’s going on in the field.  The field is booming.

The International Association of Privacy Professionals (IAPP), the field’s primary association, has been growing by about 30% each year.  It now has more than 17,000 members.  And this is only a subset of privacy professionals, as many privacy officials in healthcare aren’t members of IAPP and instead are members of the American Health Information Management Association (AHIMA) or the Health Care Compliance Association (HCCA).

There remains a bottleneck at the entry point to the field, but it can be overcome. Once in the club, newcomers find plentiful opportunities and can rise quickly. I’ve been trying to push for solutions to make entry into the field easier, and this is an ongoing project of mine.

If you have students who are interested in entering the privacy law profession, please share my post with them.  I hope it will help.

How We’ll Know the Wikimedia Foundation is Serious About a Right to Remember

The “right to be forgotten” ruling in Europe has provoked a firestorm of protest from internet behemoths and some civil libertarians.* Few seem very familiar with classic privacy laws that govern automated data systems. Characteristic rhetoric comes from the Wikimedia Foundation:

The foundation which operates Wikipedia has issued new criticism of the “right to be forgotten” ruling, calling it “unforgivable censorship.” Speaking at the announcement of the Wikimedia Foundation’s first-ever transparency report in London, Wikipedia founder Jimmy Wales said the public had the “right to remember”.

I’m skeptical of this line of reasoning. But let’s take it at face value for now. How far should the right to remember extend? Consider the importance of automated ranking and rating systems in daily life: in contexts ranging from credit scores to terrorism risk assessments to Google search rankings. Do we have a “right to remember” all of these – to, say, fully review the record of automated processing years (or even decades) after it happens?

If the Wikimedia Foundation is serious about advocating a right to remember, it will apply the right to the key internet companies organizing online life for us. I’m not saying “open up all the algorithms now” – I respect the commercial rationale for trade secrecy. But years or decades after the key decisions are made, the value of the algorithms fades. Data involved could be anonymized. And just as Assange’s and Snowden’s revelations have been filtered through trusted intermediaries to protect vital interests, so too could an archive of Google or Facebook or Amazon ranking and rating decisions be limited to qualified researchers or journalists. Surely public knowledge about how exactly Google ranked and annotated Holocaust denial sites is at least as important as the right of a search engine to, say, distribute hacked medical records or credit card numbers.

So here’s my invitation to Lila Tretikov, Jimmy Wales, and Geoff Brigham: join me in calling for Google to commit to releasing a record of its decisions and data processing to an archive run by a third party, so future historians can understand how one of the most important companies in the world made decisions about how it ordered information. This is simply a bid to assure the preservation of (and access to) critical parts of our cultural, political, and economic history. Indeed, one of the first items I’d like to explore is exactly how Wikipedia itself was ranked so highly by Google at critical points in its history. Historians of Wikipedia deserve to know details about that part of its story. Don’t they have a right to remember?

*For more background, please note: we’ve recently hosted several excellent posts on the European Court of Justice’s interpretation of relevant directives. Though often called a “right to be forgotten,” the ruling in the Google Spain case might better be characterized as the application of due process, privacy, and anti-discrimination norms to automated data processing.

Privacy and Data Security Harms

I recently wrote a series of posts on LinkedIn exploring privacy and data security harms.  I thought I’d share them here, so I am re-posting all four of these posts together in one rather long post.

I. PRIVACY AND DATA SECURITY VIOLATIONS: WHAT’S THE HARM?

“It’s just a flesh wound.”

— Monty Python and the Holy Grail

Suppose your personal data is lost, stolen, improperly disclosed, or improperly used. Are you harmed?

Suppose a company violates its privacy policy and improperly shares your data with another company. Does this cause a harm?

In most cases, courts say no. This is the case even when a company is acting negligently or recklessly. No harm, no foul.

Strong Arguments on Both Sides

Some argue that courts are ignoring serious harms caused when data is not properly protected and used.

Yet others view the harm as trivial or non-existent. For example, given the vast number of records compromised in data breaches, the odds that any one instance will result in identity theft or fraud are quite low.

Read More

What’s ailing the right to be forgotten (and some thoughts on how to fix it)

The European Court of Justice’s recent “right to be forgotten” ruling is going through growing pains. “A politician, a pedophile and a would-be killer are among the people who have already asked Google to remove links to information about their pasts.” Add to that list former Merrill Lynch executive Stan O’Neal, who requested that Google hide links to an unflattering BBC News article about him.

All told, Google “has removed tens of thousands of links—possibly more than 100,000—from its European search results,” encompassing removal requests from 91,000 individuals (apparently about 50% of all requests are granted). The company has been pulled into discussions with EU regulators about its implementation of the rules, with one regulator opining that the current system “undermines the right to be forgotten.”

The list of questions EU officials recently sent Google suggests they are more or less in the dark about the way providers are applying the ECJ’s ruling. Meanwhile, European companies like forget.me are looking to reap a profit from the uncertainty surrounding the application of these new rules. The quote at the end of the Times article sums up the current state of affairs:

“No one really knows what the criteria is,” he said, in reference to Google’s response to people’s online requests. “So far, we’re getting a lot of noes. It’s a complete no man’s land.”

What (if anything) went wrong? As I’ll argue* below, a major flaw in the current implementation is that it puts the initial adjudication of right to be forgotten decisions in the hands of search engine providers, rather than representatives of the public interest.  This process leads to a lack of transparency and potential conflicts of interest in implementing what may otherwise be sound policy.

The EU could address these problems by reforming the current procedures to limit search engine providers’ discretion in day-to-day right to be forgotten determinations.  Inspiration for such an alternative can be found in other areas of law regulating the conduct of third party service providers, including the procedures for takedown of copyright-infringing content under the DMCA and those governing law enforcement requests for online emails.

I’ll get into more detail about the current implementation of the right to be forgotten and some possible alternatives after the jump.

Read More

Carrie Goldberg: IT’S CLEAR: CREATING AMATEUR PORN WITHOUT A PARTICIPANT’S KNOWLEDGE IS ILLEGAL IN NY

This post is by Carrie Goldberg, the founding attorney at C. A. Goldberg, PLLC in Brooklyn, New York, whose practice focuses on litigation relating to electronic sexual privacy invasions. She is a volunteer attorney at The Cyber Civil Rights Initiative and its End Revenge Porn campaign.

Earlier this year, the New York City tabloids and “Saturday Night Live” poked fun at a story about a handsome former Wall Street financial advisor who, after being indicted for recording himself having sex without the women’s permission, blamed the taping on his hyper-vigilant “doggie cam.”

Last week the story re-emerged with an interview with two of the three 30-something-year-old victims, who complained that they’d been wrongly portrayed by the media and the defendant’s high-profile criminal defense team as jealous stalkers, when in reality their energetic efforts to reach him came only after they discovered the videos and centered on begging him to destroy them. The humiliation sustained during the ongoing criminal process, such as being forced to view the sex videos alongside the jurors, is palpable.

Many New Yorkers may be unaware that recording yourself having sex without the other person’s knowledge constitutes a sex crime in the state (NY Penal § 250.45) and also breaches our federal video voyeurism laws (18 USCA § 1801). With the proliferation of smart phones and tablets enabling people to secretly videotape sexual encounters – including apps that allow for stealth recording – this law is increasingly violated. The harm to victims is palpable and real. It’s deeply humiliating to be turned into an object of pornography without consent.

In 2003, then-Governor George E. Pataki signed New York’s unlawful surveillance statute, known as Stephanie’s Law, making it illegal to use a device to secretly record or broadcast a person undressing or having sex when that person has a reasonable expectation of privacy. The statute is named for Stephanie Fuller, whose landlord taped her using a camera hidden in the smoke detector above her bed.

Read More