Archives

Babak Siavoshy, Fourth Amendment Regulation of Information Processing

Comment by: Stephen Henderson

PLSC 2013

Workshop draft abstract:

When (if ever) should the government’s processing, analysis, or manipulation of evidence—rather than its mere collection—trigger the Fourth Amendment?  This essay addresses some of the difficult line-drawing problems that arise from this question.

The Fourth Amendment protects the people from unreasonable government searches and seizures of persons, houses, papers, and effects.  Increasingly, however, government entities gather information not by rummaging through people’s things, but rather by using technology to process, analyze, or manipulate evidence or information that is already in the government’s hands or otherwise exposed.  For instance, the government may uncover information about a person by analyzing DNA he “abandoned” on the sidewalk or on a discarded coffee cup; it might learn what happens in his house by processing the heat signatures emanating from its walls; or it might learn his habits by stringing together the pattern of his “public” movements using thousands of data points from cameras, government weather satellites, or automatic license plate readers. In each of these cases, the physical form of what is collected—DNA, heat, or visual information exposed to the public—is either exposed or already in the government’s hands.  It is the government’s use of technology to process, analyze, and enhance what is collected that makes the evidence useful, and that raises potential privacy concerns.

One response to these developments—perhaps representing the conventional wisdom—is that there are few, if any, constitutional limits on the government’s ability to manipulate evidence it could otherwise legally obtain.  Advocates of this position correctly note that judicially imposed limitations on information processing create difficult line-drawing problems (how do we distinguish between acceptable information processing and unacceptable information processing?) and risk tying the hands of law enforcement by arbitrarily restricting the use of technology in investigations.  Accordingly, the conventional wisdom makes a strong argument that the government’s use of technology to manipulate, process, or analyze evidence—where there is no obvious collection problem—does not and should not trigger the Fourth Amendment.

This essay argues that the conventional wisdom on information processing under the Fourth Amendment is both misplaced and overstated.  It is misplaced because it adopts a wooden construction and application of the Fourth Amendment (an otherwise flexible provision), one that risks significantly undermining the Amendment’s effectiveness and purpose, particularly in light of advancements in technology that allow the government to get the information it wants without engaging in conduct that looks like a Fourth Amendment search or seizure.  The conventional wisdom on information processing is also overstated, because it assumes that courts have heretofore been unwilling to impose constitutional limitations on information-processing conduct by the government.  In fact, the issue is not new to the courts.  The judges and justices who shape Fourth Amendment law have grappled with what is essentially technologically enhanced information processing in cases as varied as Kyllo v. United States, Skinner v. Railway Labor Executives’ Association, Walter v. United States, United States v. Jones, and even Katz v. United States.  An overview of these and other cases suggests, first, that courts are willing to impose Fourth Amendment limitations on some information-processing conduct—or at the very least, that courts acknowledge that such conduct raises a Fourth Amendment question.  Second, it suggests a number of different solutions to the legitimate line-drawing and other concerns raised by advocates of the view that information processing should not, by itself, trigger the Fourth Amendment. While there are no perfect solutions, the essay suggests a theoretical framework and a path forward for evaluating the Fourth Amendment implications of the government’s increasing use of technologically enhanced information processing.

Stephen Henderson and Kelly Sorensen, Search, Seizure, and Immunity: Second-Order Normative Authority, Kentucky v. King, and Police-Created Exigent Circumstances

Comment by: Marcia Hofmann

PLSC 2013

Workshop draft abstract:

A paradigmatic aspect of a paradigmatic kind of right is that the person holding the right is the only one who can alienate it.  Rights are constraints that protect individuals, and while individuals can consent to waive many or even all rights, the normative source of that waiving is normally taken to be the individual herself.

This moral feature – immunity – is usually in the background of discussions about rights.  We want to bring it into the foreground here.  This foregrounding is especially timely in light of a recent U.S. Supreme Court decision, Kentucky v. King (2011), concerning search and seizure rights.  An entailment of the Court’s decision is that, at least in some cases, a right can be removed by the intentional actions of the very party against whom the right supposedly protects the rights holder.  We will argue that the Court’s decision is mistaken.  The police officers in the case before the Court were not morally permitted, and should not be legally permitted, to intentionally create the very circumstances that result in the removal of an individual’s right against forced, warrantless search and seizure.  In Fourth Amendment terms, the Court was wrong to reject the doctrine of police-created exigency.

An embedded concern is this.  Law enforcement officers and others are able to create circumstances that transform, or in some cases seem to transform, a person into a kind of wrongdoer who was not one before.  There are moral constraints against creating the circumstances that transform persons in certain ways.  We will note some of these constraints as well.

Susan Freiwald & Sylvain Métille, Simply More Privacy Protective: Law Enforcement Surveillance in Switzerland as compared to in the United States

Comment by: Stephen Henderson

PLSC 2012

Workshop draft abstract:

Calls for reform of the American laws governing electronic surveillance have heightened as the principal federal law, the Electronic Communications Privacy Act (“ECPA”), has approached its twenty-fifth birthday this year.  Passed in 1986 to bring communications surveillance into the electronic age, the ECPA has not been meaningfully updated since the advent of the World Wide Web. Courts currently disagree over whether the statute even applies to surveillance using mobile technology, years after cell phones have become ubiquitous in Americans’ lives.  Switzerland, by contrast, has recently updated its laws to cover surveillance technology. In January of 2011, the Swiss enacted a brand-new statute, the Swiss Criminal Procedure Code (CrimPC). Substantively, CrimPC imposes nearly uniform procedural requirements on law enforcement agents’ use of a variety of investigatory techniques. That nearly uniform treatment stands in stark contrast to the ECPA, which uses a complicated set of categories and rules that make surveillance law in the United States exceedingly difficult to understand and apply. More importantly, Swiss law precludes the use of surveillance techniques not authorized and regulated by CrimPC, while in the United States, a tremendous amount of what the Swiss consider to be surveillance takes place outside the confines of the applicable surveillance laws.

Even if Congress were to amend the ECPA by passing the most privacy-protective of the current bills proposed, the resulting U.S. law would achieve neither the uniformity nor all of the privacy-protective features of CrimPC. In short, Swiss law involves judges in many more types of surveillance, and in a much more active way, than any of the current proposals in this country would. Swiss law also requires, as U.S. law does not and would not even after amendment, that clear notice be given in almost all cases to those targeted by surveillance once the surveillance has concluded.

This paper describes the passage of CrimPC and its key provisions, which govern the surveillance of mail and telecommunications, the collection of user identification data, the use of technical surveillance equipment, the surveillance of contacts with a bank, the use of undercover agents, and surveillance through physical observation of people and places accessible to the general public. It contrasts those provisions with current U.S. law. The discussion puts the proposals for U.S. law reform in perspective and sheds light on two radically different approaches to regulating law enforcement surveillance of communications technologies.

Matthew Tokson, Automation and the Fourth Amendment

Comment by: Stephen Henderson

PLSC 2010

Published version available here: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1471517

Workshop draft abstract:

Most Internet users are not aware that many Internet Service Providers collect data about their customers’ online activities and sell it to third-party marketers.   Yet, remarkably, many users who are aware of their providers’ invasive practices remain unconcerned, and very few users change their behavior in order to protect their privacy.  This presents several problems for scholars who propose that users have a reasonable expectation of privacy in personal online data.

This article posits that Internet users are largely unconcerned that their ISPs have access to intimate forms of online communications data (from emails to web surfing data to associated subscriber information) because in virtually every case no other human being will ever use or even see such data.  Instead, all of the operations involving data that can be traced to an individual user are carried out by computers performing automated tasks on databases of customer information.  Because the information is never viewed by a person, the user never perceives a privacy harm or privacy risk.

However, the Supreme Court has held that voluntary disclosure of one’s personal information to either an employee or the automated equipment of a third-party corporation eliminates a reasonable expectation of privacy in that information.  This article examines how this aspect of the Court’s third-party doctrine threatens to eviscerate criminal and civil privacy protections for online content.  It discusses the failure of many courts and scholars to distinguish between disclosure to automated systems and disclosure to human beings when determining the legal protection that electronic data should receive.  The article proposes that the automated equipment rationale can and must be limited to the context of telephone number switching, and challenges the misconception of privacy that lies behind the Court’s over-aggressive application of the third-party doctrine.  It concludes by analyzing whether the reasonable expectation of privacy test, as developed in Katz and its progeny, is destined to be dramatically underprotective of privacy whenever it is applied to the complex and ever-changing technological framework of Internet communications and personal data.

Stephen Henderson, Government Access to Private Records

Comment by: Chris Slobogin

PLSC 2009

Workshop draft abstract:

Although there is room for debate regarding whether the rule is truly monolithic, so far as the provider of information is concerned, there is little to no Fourth Amendment protection for information provided to a third party.  But of course there remain significant legal protections for certain types of third-party information.  A good number of states have constitutionally rejected the federal doctrine and are working out a more protective constitutional jurisprudence.  And all fifty states and the federal government provide statutory restrictions on government access to certain information in the hands of third parties.  So, the question is not whether the law should provide such restriction, but instead when and how it should do so.  The Standards discussed here seek to bring needed uniformity and clarity to the law by providing aspirational “best practices” standards regulating government access to private information in the hands of institutional third parties.  Although very significant decisions are still being made, the work includes creating a “privacy hierarchy” of third-party information, articulating how to populate that hierarchy, and then assigning restraints on access to the various types of information.  While “more private” information is generally deserving of greater restriction upon access, there are difficult decisions to be made regarding how best to enable effective investigations: if there is no administrable way to differentiate among stages of law enforcement activity, then only relatively light restrictions will be possible.  Moreover, given that law enforcement is increasingly creating databases of the information it obtains, it is necessary to craft restrictions on the dissemination and use of third-party information previously gathered.  The Standards will address these, and possibly other, concerns.