
Need an alternative to the third party doctrine? Look backwards, not forward. (Part I)


In light of the renewed discussion on the future of the third party doctrine on this blog and elsewhere (much of it attributable to Riley), I’d like to focus my next couple of posts on the oft-criticized rule, with the aim of exploring a few questions that will hopefully be interesting* to readers. For the purpose of these posts, I’m assuming readers are familiar with the third party doctrine and the arguments for and against it.

I’ll start with the following question: Let’s assume the Supreme Court decides to scale back the third party doctrine.  Where in the Court’s Fourth Amendment jurisprudence should the Justices look for an alternative approach?  I think this is an interesting and important question in light of the serious debate, both in academia and on the Supreme Court, about the third party doctrine’s effect on privacy in the information age.

One answer, which may represent the conventional wisdom, is that there simply is nothing in the Supreme Court’s existing precedent that supports a departure from the Court’s all-or-nothing approach to Fourth Amendment rights in Smith and Miller. According to this answer, the Court’s only choice if it wishes to “reconsider” the third party doctrine is to create new, technology-specific rules that address the problems of the day. (I’ve argued elsewhere that existing Fourth Amendment doctrine doesn’t bind the Court to rigid applications of its existing rules in the face of new technologies.)

A closer look at the Court’s Fourth Amendment jurisprudence suggests another option, however. The Supreme Court has not applied the underlying rationale from its third party doctrine cases to all forms of government intrusion. Indeed, for almost a century the Supreme Court has been willing to depart from the all-or-nothing approach in another Fourth Amendment context: government searches of dwellings and homes. As I’ll discuss below, the Supreme Court has used various tools—including the implied license rule in last year’s Jardines, the standard of “common understandings,” and the scope-of-consent rules in cohabitant cases—to allow homeowners, cohabitants, tenants, hotel guests, overnight guests, and the like to maintain Fourth Amendment rights against the government even though they have given third parties access to the same space.

In other words, it is both common sense and black letter law that a person can provide third parties access to his home for a particular purpose without losing all Fourth Amendment rights against government intrusion. Letting the landlord or the maid into your home for a limited purpose doesn’t necessarily give the police a license to enter without a warrant—even if the police persuade the landlord or the maid to let them in. Yet the Court has abandoned that type of nuance in the context of informational privacy, holding that sharing information with a third party means forgoing all Fourth Amendment rights against government access to that information (a principle that has eloquently been described as the “secrecy paradigm”). As many have noted, this rule has had a corrosive effect on Fourth Amendment rights in a world where sensitive information is regularly shared with third parties as a matter of course.

Why has the Court applied such a nuanced approach to Fourth Amendment rights when it comes to real property and the home, but not when it comes to informational privacy?  And have changes in technology undermined some of the rationale justifying this divergence? These are questions I’ll explore further in Part II of this post; in the meantime I’d love to hear what readers think about them. I’ll spend the rest of this post providing some additional background on the Court’s approach to privacy in the context of real property searches.

More after the jump.



The data retention judgment, the Irish Facebook case, and the future of EU data transfer regulation

On April 8 the Court of Justice of the European Union (CJEU) announced its judgment in Joined Cases C-293/12 and C-594/12 Digital Rights Ireland. Based on EU fundamental rights law, the Court invalidated the EU Data Retention Directive, which obliged telecommunications service providers and Internet service providers in the EU to retain telecommunications metadata and make it available to European law enforcement authorities under certain circumstances. The case illustrates both the key role that the EU Charter of Fundamental Rights plays in EU data protection law, and the CJEU’s apparent indifference to the impact of its recent data protection rulings on other fundamental rights. In addition, the recent referral to the CJEU by an Irish court of a case involving data transfers by Facebook under the EU-US Safe Harbor holds the potential to further tighten EU rules for data transfers, and to reduce the possibility of EU-wide harmonization in this area.

In considering the implications of Digital Rights Ireland for the regulation of international data transfers, I would like to focus on a passage occurring towards the end of the judgment, where the Court criticizes the Data Retention Directive as follows (paragraph 68):

“[I]t should be added that that directive does not require the data in question to be retained within the European Union, with the result that it cannot be held that the control, explicitly required by Article 8(3) of the Charter, by an independent authority of compliance with the requirements of protection and security, as referred to in the two previous paragraphs, is fully ensured. Such a control, carried out on the basis of EU law, is an essential component of the protection of individuals with regard to the processing of personal data…”

This statement caught many observers by surprise. The CJEU is famous for the concise and self-referential style of its opinions, and the case revolved around the legality of the Directive in general, not around whether data stored under it could be transferred outside the EU. This issue was also not raised in the submission of the case to the Court, and first surfaced in the advisory opinion issued by one of the Court’s advocates-general prior to the judgment (see paragraph 78 of that Opinion).

In US constitutional law, the question “does the Constitution follow the flag?” generally arises in the context of whether the Fourth Amendment to the US Constitution applies to government activity overseas (e.g., when US law enforcement abducts a fugitive abroad and brings him back to the US). In the context discussed here, the question is rather whether EU data protection law applies to personal data as they are transferred outside the EU, i.e., “whether the EU flag follows EU data”. As I explained in my book on the regulation of transborder data flows that was published last year by Oxford University Press, in many cases EU data protection law remains applicable to personal data transferred to other regions. For example, in introducing its proposed reform of EU data protection law, the European Commission stated in 2012 that one of its key purposes is to “ensure a level of protection for data transferred out of the EU similar to that within the EU”.

EU data protection law is based on constitutional provisions protecting fundamental rights (e.g., Article 8 of the EU Charter of Fundamental Rights), and the CJEU has emphasized in cases involving the independence of the data protection authorities (DPAs) in Austria, Germany, and Hungary that control of data processing by an independent DPA is an essential element of the fundamental right to data protection (without ever discussing independent supervision in the context of data processing outside the EU). In light of those previous cases, the logical consequence of the Court’s statement in Digital Rights Ireland would seem to be that fundamental rights law requires oversight of data processing by the DPAs also with regard to the data of EU individuals that are transferred to other regions.

This conclusion raises a number of questions. For example, how can it be reconciled with the fact that the enforcement jurisdiction of the DPAs ends at the borders of their respective EU Member States (see Article 28 of the EU Data Protection Directive 95/46)? If supervision by the EU DPAs already extends by operation of law to the storage of EU data in other regions, then why do certain EU legal mechanisms additionally require the parties to data transfers to explicitly accept the extraterritorial regulatory authority of the DPAs (e.g., Clause 5(e) of the EU standard contractual clauses of 2010)? And how does the Court’s statement fit with its 2003 Lindqvist judgment, where it held that EU data protection law should not be interpreted to apply to the entire Internet (see paragraph 69 of that judgment)? The offhand way in which the Court referred to DPA supervision over data processing outside the EU in the Digital Rights Ireland judgment gives the impression that it was unaware of, or uninterested in, such questions.

On June 18 the Irish High Court referred a case to the CJEU that may develop further its line of thought in the Digital Rights Ireland judgment. The High Court’s judgment in Schrems v. Data Protection Commissioner involved a challenge by Austrian student Max Schrems to the transfer of personal data to the US by Facebook under the Safe Harbor. The High Court announced that it would refer to the CJEU the questions of whether the European Commission’s adequacy decision of 2000 creating the Safe Harbor should be re-evaluated in light of the Charter of Fundamental Rights and widespread access to data by US law enforcement, and of whether the individual DPAs should be allowed to determine whether the Safe Harbor provides adequate protection (see paragraphs 71 and 84). The linkage between the two cases is evidenced by the Irish High Court’s frequent citation of Digital Rights Ireland, and by the CJEU’s conclusion that interference with the right to data protection caused by widespread data retention for law enforcement purposes without notice being given to individuals was “particularly serious” (see paragraph 37 of Digital Rights Ireland and paragraph 44 of Schrems v. Data Protection Commissioner). The High Court also criticized the Safe Harbor and the system of oversight of law enforcement data access in the US as failing to provide oversight “carried out on European soil” (paragraph 62), which seems inspired by paragraph 68 of the Digital Rights Ireland judgment.

The Irish referral to the CJEU also holds implications for the possibility of harmonized EU rules regarding international data transfers. If each DPA is allowed to override Commission adequacy decisions based on its individual view of what the Charter of Fundamental Rights requires, then there would be no point to such decisions in the first place (and the current disagreement over the “one stop shop” in the context of the proposed EU General Data Protection Regulation shows the difficulty of reaching agreement on pan-European rules where fundamental rights are at stake). Also, one wonders if other data transfer mechanisms beyond the Safe Harbor could also be at risk (e.g., standard contractual clauses, binding corporate rules, etc.), given that they also allow data to be turned over to non-EU law enforcement authorities. The proposed EU General Data Protection Regulation could eliminate some of these risks, but its passage is still uncertain, and the interpretation by the Court of the role of the Charter of Fundamental Rights would still be relevant under it. Whatever the CJEU eventually decides, it seems inevitable that the case will result in a tightening of EU rules on international data transfers.

The referral by the Irish High Court also raises the question (which the High Court did not address) of how other important fundamental rights, such as freedom of expression and the right to communicate internationally (meaning, in essence, the freedom to communicate on the Internet), should be balanced with the right to data protection. In its recent jurisprudence, the CJEU seems to regard data protection as a “super right” that takes precedence over others; thus, in its recent judgment in the case C-131/12 Google Spain v. AEPD and Mario Costeja Gonzalez involving the “right to be forgotten”, the Court never even refers to Article 11 of the Charter of Fundamental Rights, which protects freedom of expression and the right to “receive and impart information and ideas without interference by public authority and regardless of frontiers”. In its zeal to protect personal data transferred outside the EU, the CJEU should not forget that, as it has stated in the past, data protection is not an absolute right and must be considered in relation to its function in society (see, for example, Joined Cases C-92/09 and C-93/09 Volker und Markus Schecke, paragraph 48), and that there must be some territorial limit to EU data protection law if it is not to become a system of universal application covering the entire world (as the Court held in Lindqvist). Thus, there is an urgent need for an authoritative and dispassionate analysis of the territorial limits of EU data protection law, and of how a balance can be struck between data protection and other fundamental rights, guidance which unfortunately the CJEU seems unwilling to provide.


EU and US data privacy rights: six degrees of separation

The EU and the US have often engaged in a “tit for tat” exchange with regard to their respective systems of privacy protection. For example, EU academics have criticized US law as reflecting a “civil rights” approach that only affords data privacy rights to its own citizens, whereas US commentators have argued that privacy protection in the EU is less effective than its status as a fundamental right would suggest.

I am convinced that neither the EU nor the US properly understands the other’s approach to data privacy. This is not surprising, given that a sophisticated understanding of the two legal systems requires language skills and comparative legal knowledge that few people have on either side of the Atlantic. The close cultural and historical ties between the EU and the US may also make mutual understanding more difficult, since concepts that seem similar on the surface may actually be quite different in reality.

I like to think of the difference between the EU and US concepts of data privacy rights as reflecting the differing epistemological views of the rationalist philosophers (e.g., Descartes) and the empiricists (e.g., Hume and Locke) who influenced the development of the legal systems in Europe and the US. EU data protection law derives normative rules mainly from reason and deduction (as did the rationalists), while US privacy law bases legal rules more on evidence drawn from experience (like the empiricists). It is thus no surprise that the law and economics approach that is so influential in US jurisprudence is largely unknown in EU data protection law, while the more dogmatic, conceptual approach of EU law would seem strange to many US lawyers. An illustration is provided by the recent judgment of the Court of Justice of the European Union dealing with the “right to be forgotten” (C-131/12 Google Spain v AEPD and Mario Costeja Gonzalez), where the Court’s argumentation was largely self-referential and took little notice of the practical implications of its judgment.

Here is a brief discussion of six important areas of difference between data privacy law in the EU and US, with a particular focus on their systems of constitutional rights:

Omnibus vs sectoral approach: The EU has an overarching legal framework for data privacy that covers all areas of data processing, based on EU constitutional law (e.g., the EU Charter of Fundamental Rights), the European Convention on Human Rights, the EU Data Protection Directive, national law, and other sources. In the US, there is no single legal source protecting data privacy at all levels, and legal regulation operates more at a sectoral level (e.g., focusing on specific areas such as children’s privacy, bank data, etc.).

Constitutional rights as the preferred method of protection: The US Supreme Court has interpreted the US Constitution to create a constitutional right to privacy in certain circumstances. However, from a US viewpoint, constitutional rights are only one vehicle to protect data privacy. Commentators have described the strengths of the US system for privacy protection as comprising a myriad of factors, including “an emergent privacy profession replete with a rapidly expanding body of knowledge, training, certification, conferences, publications, web-tools and professional development; self regulatory initiatives; civil society engagement; academic programs with rich, multidisciplinary research agendas; formidable privacy practices in leading law and accounting firms; privacy seals; peaking interest by the national press; robust enforcement by Federal and State regulators, and individual and class litigation”. In contrast, in the EU the key factor underlying data protection is its status as a fundamental right (see, e.g., Article 1 of the EU General Data Protection Regulation proposed by the European Commission in 2012).

Different conceptions of rights: In the US, a constitutional right must by definition derive from the US Constitution, while in the EU, fundamental rights are considered “general principles of law” that apply to all human beings within EU jurisdiction even if they do not derive from a specific constitutional source. The concept of fundamental rights in the EU is thus broader and more universal than that of constitutional rights in the US.

Positive and negative rights: In the US, privacy is generally protected as a “negative” right that obliges the government to refrain from taking actions that would violate constitutional rights. In the EU the state also has a constitutional obligation to affirmatively protect privacy rights (see the next point below).

Requirement of state action: US law protects constitutional rights only against government action, while in the EU the state also has a duty under certain circumstances to protect the privacy of individuals against violations by nongovernmental actors. An example from outside the area of privacy is provided by the decisions of the European Court of Human Rights (ECtHR) in Case of Z and Others v. United Kingdom (2001) and the US Supreme Court in DeShaney v. Winnebago County (1989). Both cases involved the issue of whether the state has a duty under constitutional law to protect a child against abuse by its parents; in essence, the ECtHR answered “yes” and the US Supreme Court answered “no”.

Requirement of “harm”: In the EU, the processing of personal data is generally prohibited absent a legal basis, and the CJEU has ruled that a data protection violation does not depend on “whether the information communicated is of a sensitive character or whether the persons concerned have been inconvenienced in any way” (para. 75 of the Rechnungshof case of 2003). In the US data processing is generally allowed unless it causes some harm or is otherwise restricted by law.

The EU and US systems of privacy rights have each developed in a democratic system of government based on the rule of law, and have been shaped by unique cultural and historical factors, so there is little point in debating which one is “better”. However, the fact that the two systems are anchored in their constitutional frameworks does not mean that practical measures cannot be found to bridge some of the differences between them; I am part of a group (the EU-US “Privacy Bridges” project) that is trying to do just that. The two systems may also influence each other and grow closer together over time. For example, the call for enactment of a “consumer privacy bill of rights” in the framework for protection of consumer privacy released by the White House in February 2012 seems to have been inspired in part by the status of data protection as a fundamental right in EU law.

The central role played by constitutional factors in the EU and US systems of data privacy rights means it is essential that more attention be given to the study of privacy law from a comparative constitutional perspective. For example, why is there so little opportunity in US law schools to study EU data protection law, and vice versa? Efforts must be increased on both sides of the Atlantic to better understand each other’s systems for protecting data privacy rights.


The right to be forgotten and the global reach of EU data protection law

It is a pleasure to be a guest blogger on Concurring Opinions during the month of June. I will be discussing issues and developments relating to European data protection and privacy law, from an international perspective.

Let me begin with a recent case of the Court of Justice of the European Union (CJEU) that has received a great deal of attention. In its judgment of May 13 in the case C-131/12 Google Spain v AEPD and Mario Costeja Gonzalez, the Court recognized a “right to be forgotten” with regard to Internet search engine results based on the EU Data Protection Directive 95/46. This judgment by the highest court in the EU demonstrates that, while it is understandable that data protection law be construed broadly so that individuals are not deprived of protection, it is also necessary to specify some boundaries to define when it does not apply, if EU data protection law is not to become a kind of global law applicable to the entire Internet.

I have already summarized the case elsewhere, and here will only deal with its international jurisdictional aspects. It involved a claim brought by an individual in Spain against both the US parent company Google Inc. and its subsidiary Google Spain. The latter company, which has separate legal personality in Spain, acts as a commercial agent for the Google group in that country, in particular with regard to the sale of online advertising on the search engine web site www.google.com operated by Google Inc. via its servers in California.

The CJEU applied EU data protection law to the Google search engine under Article 4(1)(a) of the Directive, based on its finding that Google Spain was “inextricably linked” to the activities of Google Inc. by virtue of its sale of advertising space on the search engine site provided by Google Inc, even though Google Spain had no direct involvement in running the search engine. In short, the Court found that data processing by the search engine was “carried out in the context of the activities of an establishment of the controller” (i.e., Google Spain).

Since the Court applied EU law based on the activities of Google Spain, it did not discuss the circumstances under which EU data protection law can be applied to processing by data controllers established outside the EU under Article 4(1)(c) of the Directive (see paragraph 61 of the judgment), though the Court did emphasize the broad territorial applicability of EU data protection law (paragraph 54). Since the right to be forgotten has effect on search engines operated from computers located outside the EU, I consider this to be a case of extraterritorial jurisdiction (or extraterritorial application of EU law: I am aware of the distinction between applicable law and jurisdiction, but will use “jurisdiction” here as a shorthand to refer to both).

The Court did not limit its holding to claims brought by EU individuals, or to search engines operated under specific domains. An individual seeking to assert a right under the Directive need not be a citizen of an EU Member State, or have any particular connection with the EU, as long as the act of data processing on which his or her claim is based is subject to EU data protection law under Article 4. The Directive states that EU data protection law applies regardless of an individual’s nationality or residence (see Recital 2), and it is widely recognized that it may apply to entities outside the EU.

Thus, it seems that there would be no impediment under EU law, for example, to a Chinese citizen in China who uses a US-based Internet search engine with a subsidiary in the EU asserting the right to be forgotten against the EU subsidiary with regard to results generated by the search engine (note that Article 3(2) of the proposed EU General Data Protection Regulation would limit the possibility of asserting the right to be forgotten by individuals without any connection to the EU, since the application of EU data protection law would be limited to “data subjects residing in the Union”). Since only the US entity running the search engine would have the power to amend the search results, in effect the Chinese individual would be using EU data protection law as a vehicle to bring a claim against the US entity. The judgment therefore potentially applies EU data protection law to the entire Internet, a situation that was not foreseen when the Directive was enacted (as noted by the Court in paragraphs 69-70 of its 2003 Lindqvist judgment). It could lead to forum shopping and “right to be forgotten tourism” by individuals from around the world (much as UK libel laws have led to criticisms of “libel tourism”).

It is likely that the judgment will be interpreted more restrictively than this. For example, the UK Information Commissioner’s Office has announced that it will focus on “concerns linked to clear evidence of damage and distress to individuals” in enforcing the right to be forgotten. However, if one takes the position that Article 16 of the Treaty on the Functioning of the European Union (TFEU) has direct effect, then the ability of individual DPAs to limit the judgment to situations where some “damage or distress” has occurred seems legally doubtful (see paragraph 96, where the Court remarked that the right to be forgotten applies regardless of whether inclusion of an individual’s name in search results “causes prejudice”). Google has also recently announced a procedure for individuals to remove their names from search results under certain circumstances, and the way that online services deal with implementation of the judgment will be crucial in determining its territorial scope in practice.

In any event, the Court’s lack of concern with the territorial application of the judgment demonstrates an inward-looking attitude that fails to take into account the global nature of the Internet. It also increases the need for enactment of the proposed Regulation, in order to provide some territorial limits to the right to be forgotten.


Over-Parenting Goes International

The thought of hiring a private detective in this age of relatively accessible electronic surveillance seems a bit retro, like a black-and-white scene from a smoky film noir. But the practice has been enjoying a surprising comeback in recent years, with parents hiring private investigators to spy on their children.

In an article titled Over-Parenting, my co-author Gaia Bernstein and I identified a trend of legal adoption of intensive parenting norms. We cautioned against society legally sanctioning a single parenting style – namely, intensive parenting – while deeming other, perfectly legitimate parenting styles potentially neglectful. We also pointed out that involved parenting is class-biased, since it is costly and not all parents can afford the technology that would enable them to be intensive parents, such as GPS-enabled smartphones for their kids. We argued that when intensive parenting is used for children who do not need it, it becomes over-parenting. Not all children need the same level of involvement in their lives; one of the most important roles of parents is to prepare their children for independent life, and over-parenting might thwart that role. Finally, we speculated that the cultural model for intensive parenting originates in media depictions of upper-middle-class families, and that how these families are portrayed in movies and TV shows influences real-life parents.

Well, I’m sad to report that over-parenting is not a uniquely American phenomenon. Last year, for example, a Chinese newspaper reported that parents in China are increasingly becoming more involved in their children’s lives by hiring private investigators to check whether the children use drugs, drink alcohol or have sex. In Israel some parents are doing the same, especially during the long summer break, when, many parents fear, bored teenagers are prone to engage in such activities (if you read Hebrew, you can read the story here). I am sure that some American parents do the same.

Leaving aside the class question (are parents who cannot afford a private eye neglectful?), what does this say about parents’ role as educators? Or about the level of trust (or distrust) between those parents and their children? It used to be that a spouse would hire a private investigator because they thought their partner was having an affair. Nowadays, a growing chunk of a private investigator’s work involves parents spying on their children. Can’t we say that the fact that parents feel they need to spy on their children already testifies to their limited parental skills?


PETs, Law and Surveillance

In Europe, privacy is considered a fundamental human right. Article 8 of the European Convention on Human Rights (ECHR) limits the power of the state to interfere in citizens’ privacy, “except such as is in accordance with the law and is necessary in a democratic society”. Privacy is also granted constitutional protection in the Fourth Amendment to the United States Constitution. Both the ECHR and the US Constitution establish the right to privacy as freedom from government surveillance (I’ll call this “constitutional privacy”). Over the past 40 years, a specific framework has emerged to protect informational privacy (see here and here and here and here); yet this framework (“information privacy”) provides little protection against surveillance by either government or private sector organizations. Indeed, the information privacy framework presumes that a data controller (i.e., a government or business organization collecting, storing and using personal data) is a trusted party, essentially acting as a steward of individual rights. In doing so, it overlooks the fact that organizations often have strong incentives to subject individuals to persistent surveillance; to monetize individuals’ data; and to maximize information collection, storage and use.



More on government access to private sector data

Last week I blogged here about a comprehensive survey on systematic government access to private sector data, which will be published in the next issue of International Data Privacy Law, an Oxford University Press law journal edited by Christopher Kuner. Several readers have asked whether the results of the survey are available online. Well, now they are – even before publication of the special issue. The project, which was organized by Fred Cate and Jim Dempsey and supported by The Privacy Projects, covered government access laws in Australia, Canada, China, Germany, Israel, Japan, the United Kingdom, and the United States.

Peter Swire’s thought-provoking piece on the increased importance of government access to the cloud in an age of encrypted communications appears here. Also see the special issue’s editorial, by Fred, Jim and Ira Rubinstein.

 


On systematic government access to private sector data

The Sixth Circuit Court of Appeals has recently decided in United States v. Skinner that the police do not need a warrant to obtain GPS location data for mobile phones. The decision, based on the holding of the Supreme Court in US v. Jones, highlights the need for a comprehensive reform of rules on government access to non-content communications information (“communications data”). Once consisting of only a list of phone numbers dialed by a customer (a “pen register”), communications data have become rife with personal information, including location, clickstream, social contacts and more.

To a non-American, the US v. Jones ruling is truly astounding in its narrow scope. Clearly, the Justices aimed to sidestep the obvious question of expectation of privacy in public spaces. The Court did hold that the attachment of a GPS tracking device to a vehicle and its use to monitor the vehicle’s movements constitutes a Fourth Amendment “search”. But it based its holding not on the persistent surveillance of the suspect’s movements but rather on a “trespass to chattels” inflicted when a government agent ever-so-slightly touched the suspect’s vehicle to attach the tracking device. In the opinion of the Court, it was the clearly insignificant “occupation of property” (touching a car!) rather than the obviously weighty location tracking that triggered constitutional protection.

Suffice it to say that, to an outside observer, the property infringement appears to have been a side issue in both Jones and Skinner. The main issue, of course, is government power to remotely access information about an individual’s life, which is increasingly stored by third parties in the cloud. In most cases past – and certainly present and future – there is little need to trespass on an individual’s property in order to monitor her every move. Our lives are increasingly mediated by technology. Numerous third parties possess volumes of information about our finances, health, online endeavors, geographical movements, etc. For effective surveillance, the government typically just needs to ask.

This is why an upcoming issue of International Data Privacy Law (IDPL) (an Oxford University Press law journal), which is devoted to systematic government access to private sector data, is so timely and important. The special issue covers rules on government access in multiple jurisdictions, including the US, UK, Germany, Israel, Japan, China, India, Australia and Canada.



Big Data for All

Much has been written over the past couple of years about “big data” (see, for example, here and here and here). In a new article, Big Data for All: Privacy and User Control in the Age of Analytics, which will be published in the Northwestern Journal of Technology and Intellectual Property, Jules Polonetsky and I try to reconcile the inherent tension between big data business models and individual privacy rights. We argue that going forward, organizations should provide individuals with practical, easy-to-use access to their information, so they can become active participants in the data economy. In addition, organizations should be required to be transparent about the decisional criteria underlying their data processing activities.

The term “big data” refers to advances in data mining and the massive increase in computing power and data storage capacity, which have expanded by orders of magnitude the scope of information available for organizations. Data are now available for analysis in raw form, escaping the confines of structured databases and enhancing researchers’ abilities to identify correlations and conceive of new, unanticipated uses for existing information. In addition, the increasing number of people, devices, and sensors that are now connected by digital networks has revolutionized the ability to generate, communicate, share, and access data.

Data creates enormous value for the world economy, driving innovation, productivity, efficiency and growth. In the article, we flesh out some compelling use cases for big data analysis. Consider, for example, a group of medical researchers who were able to parse out a harmful side effect of a combination of medications, which were used daily by millions of Americans, by analyzing massive amounts of online search queries. Or scientists who analyze mobile phone communications to better understand the needs of people who live in settlements or slums in developing countries.



On Reverse Engineering Privacy Law

Michael Birnhack, a professor at Tel Aviv University Faculty of Law, is one of the leading thinkers about privacy and data protection today (for some of his previous work see here and here and here; he’s also written a deep, thoughtful, innovative book in Hebrew about the theory of privacy. See here). In a new article, Reverse Engineering Informational Privacy Law, which is about to be published in the Yale Journal of Law & Technology, Birnhack sets out to unearth the technological underpinnings of the EU Data Protection Directive (DPD). The DPD, enacted in 1995 and currently undergoing a thorough review, is surely the world’s most influential legal instrument concerning data privacy. It has been heralded by proponents as “technology neutral” – a recipe for longevity in a world marked by rapid technological change. Alas, Birnhack unveils the highly technology-specific fundamentals of the DPD, thereby putting into doubt its continued relevance.

 

The first part of Birnhack’s article analyzes what technological neutrality of a legal framework means and why it is a sought-after trait. He posits that the idea behind it is simple: “the law should not name, specify or describe a particular technology, but rather speak in broader terms that can encompass more than one technology and hopefully, would cover future technologies that are not yet known at the time of legislation.” One big advantage is flexibility (the law can apply to a broad, continuously shifting set of technologies); consider the continued viability of the tech-neutral Fourth Amendment versus the obviously archaic nature of the tech-specific ECPA. Another advantage is the promotion of innovation; tech-specific legislation can lock in a specific technology, thereby stifling innovation.

 

Birnhack continues by creating a typology of tech-related legislation. He examines factors such as whether the law regulates technology as a means or as an end; whether it actively promotes, passively permits or directly restricts technology; at which level of abstraction it relates to technology; and who is put in charge of regulation. Throughout the discussion, Birnhack’s broad, rich expertise in everything law and technology is evident; his examples range from copyright and patent law to nuclear non-proliferation.
