Archives

Kirsty Hughes, A Behavioural Understanding of Privacy as a Right to Respect for Barriers

Comment by: Jens Grossklags

PLSC 2010

Workshop draft abstract:

Existing scholarship has tended to focus upon the identification of privacy interests and problems. However, when one examines human behaviour it is apparent that privacy is highly subjective and that it is experienced in various forms. Drawing upon theories of privacy developed in the behavioural sciences, the paper argues that we need a theory of privacy that reflects the way privacy is experienced. Privacy experiences are mutually created: they require an individual to successfully mobilise privacy barriers to prevent others from accessing him or her, and they depend upon others respecting those barriers. Privacy barriers thus play a fundamental role in privacy experiences.

Samuel Rickless developed the original barrier theory in an article published in the San Diego Law Review in 2007.   Rickless’s theory is based upon the idea that we should respect those barriers that individuals use to prevent us from discovering personal facts about them.  The idea that privacy is concerned with the preservation of barriers is similar to accounts of privacy developed in the behavioural sciences, but this is not explored in Rickless’s account.  Moreover, Rickless’s theory is restricted to the preservation of private information.

The paper harnesses the insights of the behavioural sciences and builds upon Rickless’s work to develop a theory of privacy that reflects the way privacy is experienced. It argues that the right to privacy can be explained as a right to respect for those barriers that individuals use to prevent others from accessing them. Three types of privacy barriers are identified and analysed: (i) physical; (ii) behavioural; and (iii) normative. The paper argues that an invasion of privacy occurs when these barriers are penetrated.

Andrea Matwyshyn, Information Paradoxes

Comment by: Fred Cate

PLSC 2010

Workshop draft abstract:

One of the long-standing conundrums in privacy law is the “privacy paradox”: consumers claim to value privacy and data security, yet they are happy to share their personally identifiable information in exchange for convenience or low-value consideration. Meanwhile, the law regarding who “owns” this shared information presents a paradox of its own: while companies that generate databases of consumer information assert a protectable intellectual property interest in these databases, they simultaneously assert that the data subjects have no protectable interest in the shared data. This is the information ownership paradox. This article explores the tensions among copyright, trade secret, contract law, and data privacy/security law inherent in these two paradoxes.

Borrowing ideas from the work of Pierre Bourdieu and from copyright and contract law, this article argues that no paradox necessarily exists in either scenario: each side’s position is rooted in the same desire to control use. The rights of both companies and individuals with respect to information can be recharacterized as rights to selectively embed data into economic contexts. As such, this article crafts an approach to resolving the privacy paradox and the information ownership paradox, and it proposes a legal regime for redress of information harms. It argues that the two dominant legal approaches to categorizing aggregated information bundles about humans, as fully alienable property on the one hand and as an absolute dignitary right of control on the other, need a theoretical middle ground focused on control of context. This new approach recognizes that the value of information is inherently socially embedded, not individual. Without causing any upheaval to existing intellectual property rights in databases, a strong data protection regime can exist through blending legal approaches found in copyright and contract. Concretely, the proposed approach involves three elements. First, state legislatures should provide consumers and licensors with a right of deletion in instances of a data steward’s information loss. Second, breaches of privacy policies should be allowed to proceed as breach of contract actions, with the burden of proof in cases of harms arising from information loss shifted to the information steward, while affording that steward an affirmative defense of reasonable data care. Finally, this new approach calls for states to assign a minimum statutory value for information harms, modeled on copyright law. Such an approach would not only assist consumers in defending their right to embed data but also offer companies a right of recourse when they are forced to internalize costs imposed by third parties’ failed data stewardship.

Carol M. Bast & Cynthia A. Brown, A Contagion of Fear: Post-9/11 Alarm Expands Executive Branch Authority and Sanctions Prosecutorial Exploitation of America’s Privacy

Carol M. Bast & Cynthia A. Brown, A Contagion of Fear:  Post-9/11 Alarm Expands Executive Branch Authority and Sanctions Prosecutorial Exploitation of America’s Privacy

Comment by: Laura Donohue

PLSC 2010

Workshop draft abstract:

Following the attacks of September 11, 2001, the United States launched what has become known as the “War on Terrorism” or “War on Terror.” According to the Bush Administration, the phrase encompassed the nation’s military, political, legal, and ideological conflict with Islamic extremism and with extremists’ use of terrorism to propel their agenda. Ironically, Al-Qaeda’s weapon, the use of fear as a means of coercion, now in some respects also serves as a tool for some of our nation’s leaders. The very same fear that served as the terrorists’ objective is used by government leaders to secure expanded government power at the expense of individual liberties. In essence, the fear created by the terrorists has become a contagion, with the accompanying contagion-like effects. That fear, particularly of further terrorist attacks, is used by American leaders to justify subverting the nation’s constitutional freedoms and guarantees, the very freedoms and guarantees that Operation Enduring Freedom is fighting to protect. As a result, many Americans are quick to support any government action that combats this threat and ensures national security.

Since September 11, this contagion of fear has generated the public support government officials need to further political and legal agendas that would otherwise be significantly more difficult, if not impossible, to achieve. One such agenda concerns privacy and government surveillance, specifically as they are affected by the Foreign Intelligence Surveillance Act (FISA). Individuals seem willing to cede communication privacy to the government in exchange for national security without realizing the ramifications of doing so. This paper examines the need for balance between privacy and national security under FISA and policy considerations for the future.

Frank Pasquale, Reputation Regulation: Disclosure and the Challenge of Clandestinely Commensurating Computing

Comment by: Tal Zarsky

PLSC 2010

Workshop draft abstract:

Reputational systems can never be rendered completely just, but legislators can take two steps toward fairness. The first is relatively straightforward: to ensure that key decision makers reveal the full range of online sources they consult as they approve or deny applications for credit, insurance, employment, and college and graduate school admissions. Such disclosure will at least serve to warn applicants of the dynamic digital dossier they are accumulating in cyberspace. Effective disclosure requirements need to cover more than the users of reputational information; they should also apply to some aggregators. Just as banks have moved from consideration of a long-form credit report to use of a single commensurating credit score, employers and educators in an age of reputation regulation may turn to intermediaries that combine extant indicators of reputation into a single score for a person. Since such scoring can be characterized as a trade secret, it may be even less accountable than the sorts of rumors and innuendo discussed above. Any proposed legislation will need to address the use of such reputation scores, lest black-box evaluations defeat its broader purposes of accountability and transparency.

Woodrow Hartzog, Privacy in an Age of Contracts

Comment by: William McGeveran

PLSC 2010

Workshop draft abstract:

Traditionally, contracts have been most relevant in transactional contexts. Yet as websites became ubiquitous, so did terms of use, bringing communication and information into an age of contracts. How have these digital agreements affected our privacy? Contracts require direct interactions constituting privity of contract, yet many technologies can be used to violate an individual’s privacy without any such relationship. Under this logic, Warren and Brandeis dismissed contracts as a viable remedy for harms to privacy: contracts provided no remedies against strangers. With each advance in communication technology, the potential distance between the individual and those who would violate his or her privacy has grown, seemingly culminating in the Internet’s obliteration of the ability to negotiate privacy. This article attempts to organize and analyze the contemporary impact of contracts on privacy, both as binding agents and as evidence in extra-contractual contexts. I argue that the reciprocal communication of the participatory web could actually alleviate the problem Warren and Brandeis identified: the web allows us greater control over self-disclosed information and gives us the ability to “dicker” for confidentiality. If courts would legitimize explicit attempts by an individual to protect her privacy as part of an online agreement, such as taking advantage of offered website features like untagging photos, deleting personal information, and increasing privacy settings, they would move one step closer to reclaiming contracts as a “meeting of the minds.” This recognition would also revitalize contracts as a method for protecting privacy.

Jacqueline D. Lipton, Righting Cyber Wrongs

Comment by: Rafi Cohen-Almagor

PLSC 2010

Workshop draft abstract:

In light of recent instances of cyberstalking and cyberharassment, particularly those involving female victims, some commentators have argued that the law should do more to protect individual autonomy and privacy online. Others reject these views, suggesting that such developments would be undesirable, unnecessary, and potentially unconstitutional. The proposed paper would argue for greater protections for individual privacy and autonomy online. The author suggests protecting victims of reputational damage by providing new, more affordable avenues for redress than are currently available. Existing literature has focused on judicial remedies and some market solutions for remedying reputational damage, and has noted the limitations of each. Problems identified include the fact that judicial remedies are time- and cost-intensive, and that it can be notoriously difficult to identify individual defendants or, in the alternative, to proceed against operators of online services that host damaging content. Current market-based solutions are also problematic because of the costs to victims.

This paper would advocate a multi-pronged approach to give victims of cyber-harassment better access to meaningful remedies. The author advocates a combination of: (a) developing pro bono reputation defense services for victims of online harassment; (b) developing public education programs to empower victims to combat such harassment themselves; and (c) encouraging existing pro bono legal services to take on more cyber-harassment cases. An advantage of this multi-pronged approach is that pro bono reputation services can employ the reputation-protection tools currently used by for-profit services, but at low or no cost to victims. Public education can likewise empower victims to use these strategies themselves at little to no cost. Supplementing these approaches with a focus on reputation defense by pro bono legal services would capitalize on law’s expressive functions to send messages to the wider community about conduct that should not be tolerated online.

Alessandro Acquisti and Catherine E. Tucker, Guns, Privacy, and Crime

Comment by: Aaron Burstein

PLSC 2010

Workshop draft abstract:

In December 2008, a Memphis newspaper made publicly available an online, searchable database of all gun permit holders in Tennessee. The database included information such as the permit holder’s name, ZIP code, and the permit’s start and expiration dates. It received little attention until February, when an article about a parking argument that ended in a deadly shooting referred to it. The fierce debate that arose thereafter, with the NRA accusing the newspaper of a “hateful, shameful form of public irresponsibility” and the newspaper standing by a “right to know” argument, exemplifies the complex interactions, and sometimes collisions, between privacy and other rights and needs. In this case, individual privacy rights collided with the collective right to know and, arguably, with both individual and communal concerns about security.

By preventing the release of personal data, individuals often hope to prevent harm to themselves. The publication of the gun permit data, however, highlights a case where privacy and personal security may appear to be in conflict. Whereas gun rights advocates suggested that the publication exposed gun owners to risk (for instance, of criminals targeting houses known to hold guns in order to steal them), those defending it argued that gun owners may be less likely to be targeted precisely because the information was made publicly available. In this manuscript we attempt to quantify the actual impact that the publication of the Tennessee gun permit data had on 1) crime rates and 2) gun permit requests in the city of Memphis. Combining gun, crime, demographic, and location data from an array of sources and databases, we measured how the rates of different classes of crime changed, as a function of the local density of gun ownership made public by the newspaper, before and after the publication of the database. Our results suggest that the publication of the database reduced the occurrence of violent crimes (such as robberies and burglaries) more significantly in ZIP codes with higher gun ownership density. At the same time, the publication was accompanied by a larger percentage increase in gun permit requests in areas with pre-existing higher rates of gun ownership. To address concerns about unobserved heterogeneity, we also performed a falsification test by studying crime trends in a similar town (Jackson) in a neighboring state; we found no similar trends in crime there during the same period.
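For readers unfamiliar with this kind of before-and-after comparison, the sketch below illustrates, on entirely synthetic data, a difference-in-differences-style regression of the general sort the abstract describes. The column names, the pandas/statsmodels tooling, and the simulated effect size are illustrative assumptions, not the authors’ actual code, data, or specification.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: monthly violent-crime counts for 30 ZIP codes over
# two years, with the newspaper's database published at month 12.
rng = np.random.default_rng(0)
n_zip, n_months = 30, 24
df = pd.DataFrame({
    "zip_code": np.repeat(np.arange(n_zip), n_months),
    "month": np.tile(np.arange(n_months), n_zip),
    "permit_density": np.repeat(rng.uniform(0.0, 1.0, n_zip), n_months),
})
df["post_publication"] = (df["month"] >= 12).astype(int)

# Simulated outcome: crime falls after publication, and falls more where the
# published gun-permit density is higher (the pattern the abstract reports).
df["violent_crimes"] = rng.poisson(
    20 - 4 * df["post_publication"] * df["permit_density"]
)

# ZIP fixed effects absorb time-invariant differences between areas (and the
# permit-density main effect, which is constant within a ZIP code); the
# coefficient on the interaction term is the difference-in-differences-style
# estimate of how the post-publication change varies with permit density.
fit = smf.ols(
    "violent_crimes ~ post_publication + post_publication:permit_density"
    " + C(zip_code)",
    data=df,
).fit()
print(fit.params["post_publication:permit_density"])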

This paper contributes not just to the policy debate on the openness or secrecy of gun data (19 states allow the public to access gun permit information; other states either have no laws addressing the issue or keep the information outside the public domain), but to the broader discourse on the boundaries and connections between privacy and security.

Helen Nissenbaum, Privacy in Context: Technology, Policy, and the Integrity of Social Life

Comment by: Anita Allen

PLSC 2010

Workshop draft abstract:

Newly emerging socio-technical systems and practices, undergirded by digital media and information science and technology, have enabled massive transformations in the capacity to monitor behavior, amass and analyze personal information, and distribute, publish, communicate, and disseminate it. These have spawned great social anxiety and, in turn, laws, policies, public interest advocacy, and technologies framed as efforts to protect a right to privacy. A settled definition of this right, however, remains elusive. The book argues that common definitions of privacy as control over personal information, or as secrecy, that is, minimization of access to personal information, do not capture what people care about when they complain and protest that their right to privacy is under threat. Suggesting that this right may be circumscribed as applying only to special categories of “sensitive” information or PII does not avoid these pitfalls.

What matters to people is not the sharing of information (this is often highly valued) but inappropriate sharing. Characterizing appropriate sharing is the heart of the book’s mission, which turns to the wisdom embodied in entrenched norms governing flows of personal information in society. The theory of contextual integrity offers “context-relative informational norms” as a model for these social norms of information flow. It claims that in assessing whether a particular act, system, or practice violates privacy, people are sensitive to the context in which it occurs (e.g. healthcare, politics, religious practice, education, commerce), to what types of information are in question, to whom that information is about, from whom it flows, and to what recipients. Also relevant are the terms of flow, the “transmission principles” that govern these flows: for example, whether information is shared with the consent of the subject, whether it flows in one direction or reciprocally, and whether it is forced, freely given, or bought and sold. Radical transformations in information flows are protested when they violate entrenched informational norms, that is, when they violate contextual integrity.

But not all technologies that induce novel flows are resisted, and contextual integrity would be unhelpfully conservative if it sweepingly deemed them morally wrong. The theory therefore also distinguishes flows that are acceptable, even laudable, from those that are problematic. Inspired by the great work of other privacy theorists, past and contemporary, the theory suggests we examine how well novel flows serve diverse interests and important moral and political values compared with entrenched flows. The crux, however, lies in establishing how well novel flows serve the internal values, ends, and purposes of the relevant background contexts, that is, ultimately, how well they serve the integrity of society’s key structures and institutions.