Archives

Kate Crawford and Jason Schultz, The Due Process Dilemma: Big Data and Predictive Privacy Harms

Comment by: Bryan Cunningham

PLSC 2013

Workshop draft abstract:

The rise of “big data” analytics poses new challenges for privacy advocates. Unlike previous computational models that exploit personally identifiable information (PII) directly, such as behavioral targeting, big data often exploits PII indirectly: by analyzing primarily metadata and producing predictive or aggregated findings without displaying or distributing the underlying PII, big data approaches often operate outside of current privacy protections (Rubinstein 2013; Tene and Polonetsky 2012). This does not mean, however, that big data is without substantial privacy risks. For example, the risk of bias or discrimination based on the inappropriate inclusion or exclusion of personal data about an individual still persists — a risk we call “predictive privacy harm.” Last year, the transnational retailer Target was shown to be using data-mining techniques to predict which female customers were pregnant, even if they had not announced it publicly (Duhigg 2012). Such predictive analysis and categorization poses a threat to the individuals who are labeled, especially when it is based on underlying PII and performed without their knowledge or consent.

Currently, individuals have a right to see and review records pertaining to them in areas such as health and credit information. But these existing systems are inadequate to meet current “big data” challenges: FIPS and other notice-and-choice regimes fail to protect against data analytics in part because individuals are rarely aware of how their data are being used to their detriment. We therefore propose a new approach to predictive privacy harms: a right to “data due process.” In the Anglo-American legal tradition, due process prohibits the deprivation of an individual’s rights without affording her access to certain basic procedural components of the adjudication process, including the rights to see and contest the evidence at issue, to know the allegations presented and be heard on the issues they raise, and to appeal any adverse decision. While some current privacy regimes offer nominal due-process-like mechanisms, such as the right to audit one’s personal data record, these rarely include all of the components necessary to guarantee fair outcomes, and arguably many do not even apply to big data systems. A more rigorous system is needed, particularly given the analytical assumptions and methodological biases built into many big data systems (boyd and Crawford 2012). Applying the concept of due process to big data and its associated predictive privacy harms, we assert that individuals who are “judged” by big data should have rights similar to those of individuals judged by the courts with respect to how their personal data has played a role in such adjudications.


boyd, d. and Crawford, K. 2012. “Critical Questions for Big Data,” Information, Communication and Society, vol. 15, no. 5, pp. 662–679.

Duhigg, Charles. 2012. “How Companies Learn Your Secrets,” New York Times, Feb. 16, 2012.

Rubinstein, Ira. (Forthcoming). “Big Data: The End of Privacy or a New Beginning?,” International Data Privacy Law.

Tene, Omer and Polonetsky, Jules. (Forthcoming). “Big Data for All: Privacy and User Control in the Age of Analytics,” Northwestern Journal of Technology and Intellectual Property 11.

Scott Mulligan and Alexandra Grossman, SOPA, PIPA, HADOPI and Privacy, the Alphabet Soup Experience: What America Might (or Might Not) Learn from the Europeans About Protecting Consumers’ Privacy and Internet Freedom from Intrusive Monitoring by Third Parties (and the Government).

Comment by: Jason Schultz

PLSC 2012

Workshop draft abstract:

In early 2012, the United States Congress seemed determined to move forward with two controversial copyright and trademark enforcement bills, the “Stop Online Piracy Act” (SOPA, H.R. 3261) and the “Protect IP Act” (PIPA, S. 968). Though those bills have largely been set aside in the face of a considerable backlash, Congress has more recently considered slightly watered-down versions of similar legislation, including the “Online Protection and Enforcement of Digital Trade Act” (OPEN, S. 2029). Much like the previously enacted counterpart law in France (the Creation and Internet Rights Law, or “Haute autorité pour la diffusion des œuvres et la protection des droits sur Internet,” HADOPI), SOPA, PIPA and OPEN each attempt to address the problem of Internet-based intellectual property (IP) piracy, particularly from overseas sources. However, because France has a tradition of strong information privacy protection while the United States has traditionally had one of strong IP protection, these approaches would presumably yield different results, since each country’s context and understanding of information privacy is so different.

While laudable in their effort to address this wide-ranging and complex 21st-century problem, each of these laws nevertheless presents unique challenges to individual privacy. In their current form, they require ISPs, social networking sites and other content platforms to proactively monitor and screen individual users’ content and traffic, and then to actively censor their users to prevent them from posting, sharing or linking to words, images or other content that might violate another’s IP rights. This dramatically shifts the enforcement burden and commensurate liabilities: website operators and ISPs who fail to act promptly could be blacklisted and prosecuted, and the proposed legislation would even empower the U.S. Attorney General to block infringing websites or users based anywhere in the world.

Unfortunately, the legal and technical solutions in these bills and laws closely resemble mechanisms that authoritarian regimes use to censor and spy on their citizens, to repress “undesirable” voices, and to enable private interests, acting under government authority, to suppress speech, comment, criticism and public debate about those private interests or the government. Ironically, it is often copyright enforcers who, wielding privacy as a weapon, attempt to make privacy claims for themselves when protecting corporate or personal interests, typically when the owner could not sustain an unlawful interception, trade secrets misappropriation, or invasion of privacy claim. Further, in terms of the third-party doctrine, these new laws offer governments unparalleled and unprecedented opportunities to collect private information by and about individual citizens, outsourcing this mandatory data collection to third-party providers who, acting under color of these laws, amass vast quantities of personal information that may be of interest to the government for law enforcement or counterterrorism purposes.

This article will examine the proposed bills in the United States, including their specific implementation and enforcement mechanisms, and compare them to the previously enacted laws in France. The changes required before those French laws could be enacted briefly revived a philosophy of individual privacy and freedom, especially on the Internet. In so doing, the French further committed themselves to viewing privacy and freedom of expression as fundamental human rights, while the American approach remains bluntly oriented toward corporate and government interests. With subsequent decisions by various French courts, however, the French legal system once again swung in favor of intellectual property rights, at the expense of the information privacy and freedom of individual Internet users. This article will examine the similarities and differences between the two legal systems, reveal the differing approaches to intellectual property and privacy on opposite sides of the Atlantic, and suggest a new approach that would better protect personal privacy and Internet freedom in a democratic society, on either continent.

Mary D. Fan, Regulating Sex and Privacy in a Casual Encounters Culture

Comment by: Jason Schultz

PLSC 2011

Published version available here: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1796534

Workshop draft abstract:

The regulation of sex and disease is a cultural and political flashpoint, and a persistent problem that law’s antiquated arsenal has been hard-pressed, and clumsily adapted, to address effectively. The need for attention is demonstrated by arresting data: for example, one in four women aged 14 to 19 has been infected with at least one sexually transmitted disease (STD); managing STDs costs an estimated $15.9 billion a year; and syphilis, once near eradication, is on the rise again, as are HIV and other STDs. Public health officials on the front lines have called for paradigm changes to tackle this enormous challenge. Controversial proposals have circulated, such as mass HIV screening for everyone aged 13 to 64, STD testing in high schools, mandatory HIV screening, strict liability in tort for transmission, and criminalizing first-time sex without a condom. The article argues that a less intrusive, more narrowly tailored and more efficient avenue of regulation has been overlooked because of the blinders of old paradigms of sex and privacy.

The article contends that we should shift our focus away from the cumbersome, costly hammers and perverse incentives of criminal and tort law, and instead focus on adapting privacy law and culture to changes in how we meet and mate today. Information-sharing innovations would deter more effectively with less intrusion, avoiding perverse victim-chilling effects and disincentives against acquiring knowledge or seeking help. Turning to adjusting privacy law rather than criminal and tort law would better safeguard sexual autonomy and ameliorate the information deficit in the increasingly prevalent casual sex culture and the Internet-mediated marketplace for sex and love.