Archives

Christopher Slobogin, Making the Most of United States v. Jones in a Surveillance Society: A Statutory Implementation of Mosaic Theory

Comment by: Susan Freiwald

PLSC 2013

Published version available here: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2098002

Workshop draft abstract:

In the Supreme Court’s recent decision in United States v. Jones, a majority of the Justices appeared to recognize that under some circumstances aggregation of information about an individual through governmental surveillance can amount to a Fourth Amendment search. If adopted by the Court, this notion—sometimes called “mosaic theory”—could bring about a radical change to Fourth Amendment jurisprudence, not just in connection with surveillance of public movements—the issue raised in Jones—but also with respect to the government’s increasingly pervasive record-mining efforts. One reason the Court might avoid the mosaic theory is the perceived difficulty of implementing it. This article provides, in the guise of a model statute, a means of doing so. More specifically, this article explains how proportionality reasoning and political process theory can provide concrete guidance for the courts and police in connection with physical and data surveillance.

Susan Freiwald & Sylvain Métille, Simply More Privacy Protective: Law Enforcement Surveillance in Switzerland as compared to in the United States

Comment by: Stephen Henderson

PLSC 2012

Workshop draft abstract:

Calls for reform of the American laws governing electronic surveillance have heightened as the principal federal law, the Electronic Communications Privacy Act (“ECPA”), has approached its twenty-fifth birthday this year. Passed in 1986 to bring communications surveillance into the electronic age, the ECPA has not been meaningfully updated since the advent of the World Wide Web. Courts currently disagree over whether the statute even applies to surveillance using mobile technology, years after cell phones have become ubiquitous in Americans’ lives. Switzerland, by contrast, has recently updated its laws to cover surveillance technology. In January of 2011, the Swiss enacted a brand new statute, the Swiss Criminal Procedure Code (CrimPC). Substantively, CrimPC imposes largely uniform procedural requirements on law enforcement agents’ use of a wide variety of investigatory techniques. That nearly uniform treatment stands in stark contrast to the ECPA, whose complicated set of categories and rules makes surveillance law in the United States exceedingly difficult to understand and apply. More importantly, Swiss law precludes the use of surveillance techniques not authorized and regulated by CrimPC, while in the United States, a tremendous amount of what the Swiss consider to be surveillance takes place outside the confines of the applicable surveillance laws.

Even if Congress were to amend the ECPA by passing the most privacy-protective of the current bills proposed, the resulting U.S. law would achieve neither the uniformity nor all of the privacy-protective features of CrimPC. In short, Swiss law involves judges in many more types of surveillance, and in a much more active way, than any of the current proposals in this country would. Swiss law also requires, as U.S. law does not and would not even after amendment, that clear notice be given in almost all cases to those targeted by surveillance once it has concluded.

This paper describes the passage of CrimPC and its key provisions, which govern the surveillance of mail and telecommunications, the collection of user identification data, the use of technical surveillance equipment, the surveillance of contacts with a bank, the use of undercover agents, and surveillance through physical observation of people and places accessible to the general public. It contrasts those provisions with current U.S. law. The discussion puts the proposals for U.S. law reform in perspective and sheds light on two radically different approaches to regulating law enforcement surveillance of communications technologies.

Paul Ohm, Big Data & Privacy

Comment by: Susan Freiwald

PLSC 2011

Workshop draft abstract:

We are witnessing a sea change in the way we threaten and protect information privacy. The rise of Big Data—meaning powerful new methods of data analytics directed at massive, highly interconnected databases of information—will exacerbate privacy problems and put particular pressure on privacy regulation. The laws, regulations, and enforcement mechanisms we have developed in the first century of information privacy law are fundamentally hampered by the special features of Big Data. Big Data will force us to rethink how we regulate privacy.

To do that, we first need to understand what has changed, by surveying Big Data and cataloging what is new. Big Data includes powerful techniques for reidentification, the focus of my last Article, but it encompasses much more. Two features of Big Data, in particular, interfere with the way we regulate privacy. First, Big Data produces results that defy human intuition and resist prediction. The paradigmatic output of Big Data is the surprising correlation. Second, the underlying mechanisms that make Big Data work are often inscrutable to human understanding. Big Data reveals patterns and correlations, not mental models. B is correlated with A, Big Data reveals, but it cannot tell us why, and given the counter-intuitiveness of the result, we are sometimes left unable even to guess.

Big Data’s surprising correlations and inscrutability will break the two predominant methods we use to regulate privacy today, what I call the “bad data list” approach and the Fair Information Practice Principles approach. Both approaches rely on transparency and predictability, two things that Big Data fundamentally resists. Neither regulatory method can survive Big Data, and we cannot salvage either using only small tweaks and extensions. We need to start over.

Susan Freiwald, Fourth Amendment Protection for Stored Cell Site Location Information

Comment by: Katherine Strandburg

PLSC 2010

Workshop draft abstract:

Lower courts have split on whether agents need to obtain a warrant prior to obtaining real-time or prospective information from cell phone service providers about the cell phone towers used by a targeted subscriber. Such Cell Site Location Information (“CSLI”) may divulge detailed information about a person’s whereabouts and travels throughout the day, because cell phones may register frequently with nearby cell towers to direct incoming and outgoing calls, text, and data. While courts have analogized between real-time access to CSLI and electronic surveillance, only recently did a Magistrate Judge in Pennsylvania recognize that access to historical CSLI poses the same risk of abuse as real-time access and requires the same meaningful judicial oversight to satisfy the Fourth Amendment. (534 F. Supp. 2d 585) She denied, in an opinion joined by three other magistrate judges, the government’s request for historical CSLI without a warrant based on probable cause, and her order was upheld, without opinion, by the District Court. (2008 WL 4191511)

My paper would elaborate on the arguments that I made as an amicus curiae in two briefs: one in the District Court of Pennsylvania in favor of affirming the Magistrate Judge’s decision, and one in the Third Circuit in favor of affirming the District Court’s order. Briefly, the distinction between historical data and real-time or prospective data is practically arbitrary, because agents may regularly request records of immediately past use and thereby use “historical” orders effectively to obtain real-time information. As a substantive matter, methods to obtain historical CSLI may be just as hidden, indiscriminate, and effectively continuous (in that they cover a period of time) as the methods used to wiretap. CSLI should be subject to a reasonable expectation of privacy (your recent great work supports this), and its acquisition is quite intrusive. (I have argued elsewhere that the Supreme Court and lower courts have found that the Fourth Amendment requires the highest level of judicial oversight when the government uses a surveillance method that is hidden, continuous, indiscriminate, and intrusive.) Doctrinally, the beeper cases do not shed much light on the question, but to the extent they do, they support requiring at least a warrant. The same may be said for the Miller case, which I argue does not support the broad “third-party rule” that is claimed for it and does not support access to historical CSLI on less than a warrant either. Depending on what happens in the Third Circuit and when, the paper can either discuss the oral argument or the actual decision and assess it against my own views of what the law is and should be.

Jerry Kang, Self-Analytic Privacy

Comment by: Susan Freiwald

PLSC 2009

Published version available here: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1729332

Workshop draft abstract:

[1] Recent technological innovations present a new problem in the information privacy space: the privacy of self-analytics. By “self-analytics,” we mean the collection and processing of data by an individual about that individual in order to increase her self-knowledge for diagnostics, self-improvement, and self-awareness. Think Google Analytics, but as applied to the self and not to one’s website. In this Article, we describe this new problem space and engage in a transdisciplinary analysis, focusing on the case study of locational traces.

[2] In undertaking this analysis, we are mindful of what has become the standard script for privacy analyses in the law reviews: (i) identify some new threatening technology; (ii) trot out a parade of horribles; (iii) explain why the “market” has not already solved the problem; (iv) recommend some changes in code and law that accord with the author’s values. This script is standard for sensible reasons, but we aim to go further.

[3] In particular, we make two theoretical contributions. In addition to defining a new category of personal data called “self-analytics,” we distinguish between micro and macro definitions of privacy: the former focused on individual choice regarding or consent to personal data processing, and the latter using instead a system-wide measure of the “speed” of personal data flow. The macro “system-speed” definition is offered to supplement, not replace, the traditional micro “individual-control” definition. Still, this supplemental conception of information privacy has substantial consequences. Indeed, we go so far as to suggest that the nearly exclusively micro approach to privacy has been a fundamental privacy error.

[4] In addition to these theoretical interventions, we aim to be concrete in our recommendations. In particular, we provide the design specifications, both technical and legal, of a new intermediary called the “data vault,” which we believe is best suited to solve the privacy problem of self-analytics. As we make this case, we hope to exhibit the value of a genuinely transdisciplinary engagement across law, engineering, computer science, and technology studies when focusing on solving a concrete problem.