Kate Crawford and Jason Schultz, The Due Process Dilemma: Big Data and Predictive Privacy Harms

Comment by: Bryan Cunningham

PLSC 2013

Workshop draft abstract:

The rise of “big data” analytics poses new challenges for privacy advocates. Unlike previous computational models that exploit personally identifiable information (PII) directly, such as behavioral targeting, big data often exploits PII indirectly. By analyzing primarily metadata and producing predictive or aggregated findings without displaying or distributing the underlying PII, big data approaches often operate outside of current privacy protections (Rubinstein 2013; Tene and Polonetsky 2012). However, this does not mean big data is without substantial privacy risks. For example, the risk of bias or discrimination based on the inappropriate inclusion or exclusion of personal data about an individual still persists, a risk we call “predictive privacy harm.” Last year, the transnational retailer Target was shown to be using data mining techniques to predict which female customers were pregnant, even if they had not announced it publicly (Duhigg 2012). Such predictive analysis and categorization poses a threat to those individuals who are labeled, especially when it is based on underlying PII and performed without their knowledge or consent.

Currently, individuals have a right to see and review records pertaining to them in areas such as health and credit information. But these existing systems are inadequate to meet current “big data” challenges: FIPs and other notice-and-choice regimes fail to protect against data analytics, in part because individuals are rarely aware of how their personal data is being used to their detriment. We therefore propose a new approach to predictive privacy harms: a right to “data due process.” In the Anglo-American legal tradition, due process prohibits the deprivation of an individual’s rights without affording her access to certain basic procedural components of the adjudication process, including the right to see and contest the evidence at issue, the right to know the allegations presented and be heard on the issues they raise, and the right to appeal any adverse decision. While some current privacy regimes offer nominal due process-like mechanisms, such as the right to audit one’s personal data record, these rarely include all of the components necessary to guarantee fair outcomes, and arguably many do not even apply to big data systems. A more rigorous system is needed, particularly given the analytical assumptions and methodological biases built into many big data systems (boyd and Crawford 2012). Applying the concept of due process to big data and its associated predictive privacy harms, we assert that individuals who are “judged” by big data should have rights similar to those afforded individuals judged by the courts with respect to how their personal data has played a role in such adjudications.


boyd, danah and Kate Crawford. 2012. “Critical Questions for Big Data,” Information, Communication & Society 15(5): 662–679.

Duhigg, Charles. 2012. “How Companies Learn Your Secrets,” New York Times, February 16, 2012.

Rubinstein, Ira. Forthcoming. “Big Data: The End of Privacy or a New Beginning?,” International Data Privacy Law.

Tene, Omer and Jules Polonetsky. Forthcoming. “Big Data for All: Privacy and User Control in the Age of Analytics,” Northwestern Journal of Technology and Intellectual Property 11.