Daniel J. Solove and Woodrow Hartzog, The FTC and the New Common Law of Privacy
Comment by: Gerald Stegmaier & Chris Jay Hoofnagle
Published version available here: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2312913
Workshop draft comment:
One of the great ironies about information privacy law is that the primary regulation of privacy in the United States is not really law and has barely been studied in a scholarly way. Since the late 1990s, the Federal Trade Commission (FTC) has been enforcing companies’ privacy policies through its authority to police unfair and deceptive trade practices. Despite over fifteen years of FTC enforcement, there is no meaningful body of case law to show for it. The cases have nearly all resulted in settlement agreements. Nevertheless, companies look to these agreements to guide their decisions regarding privacy practices. Those involved with helping businesses comply with privacy law – from chief privacy officers to inside counsel to outside counsel – parse and analyze the FTC’s settlement agreements, reports, and activities as if they were pronouncements by the High Court.
In this article, we contend that the FTC’s privacy jurisprudence is the functional equivalent of a body of common law, and we examine it as such. The FTC has said quite a lot through its actions and settlement agreements. And FTC privacy jurisprudence is the broadest and most influential regulating force on information privacy in the United States – more so than nearly any privacy statute or common law tort. The statutory law regulating privacy is diffuse and discordant, and the common law torts fail to regulate the majority of activities concerning privacy. Yet despite its central governing role, the FTC’s privacy activity has received little scholarly attention.
In Part I of this article, we discuss how the FTC’s actions function practically as a body of common law for privacy. In the late 1990s, it was far from clear that the body of law regulating privacy policies would come from the FTC rather than from traditional contract law and promissory estoppel. Though privacy policies often have all the indicia of enforceable promises, they have rarely been enforced as contracts. On the few occasions when contract law has been invoked for privacy policies, it has usually failed. We explore how and why the current state of affairs developed. In Part II, we examine the principles that emerge from this body of law. These principles extend far beyond merely honoring promises. We discuss how these principles compare to principles in other legal domains, such as contract law. In Part III, we explore the implications of these developments and the ways in which this body of law could develop.
Fred Stutzman and Woodrow Hartzog, Obscurity by Design
Comment by: Travis Breaux
Workshop draft abstract:
Currently, the most pressing issue for privacy regulators is the accumulation and use of consumer data by companies, including social media providers. Post-hoc responses by regulators to privacy violations, including violations by social media providers, do not sufficiently protect consumer privacy. To enhance consumer privacy, regulators recommend that privacy protections be built into all phases of the technology development lifecycle. This approach, known as privacy-by-design (PbD), requires companies to proactively address privacy concerns so as to produce positive privacy outcomes for users. Although well intentioned, PbD faces a number of challenges in implementation, including a lack of specificity and weak market forces motivating adoption. To date, applied PbD work has largely focused on back-end implementation principles, such as data minimization and security. Very little work has focused on integrating PbD into the design of interfaces or interaction. Additionally, although regulators have paid much attention to the potential harms committed by companies that hold personal information, the threat posed by other users has been largely neglected. In the context of social media, PbD has not yet addressed the “social.” In this work, we argue that the design of privacy in social media user interaction is an integral concern that necessitates policy coordination between site designers and administrators. In social media sites, developing PbD practices for interaction is as important as developing them for data storage and security. Of course, the design of PbD practices for interaction is challenging. Interaction varies by site, culture, and context, and is not necessarily amenable to formal engineering requirements. To address this challenge, we propose a novel, empirically grounded approach to PbD for social media interaction.
Drawing on an established framework for online obscurity, which identifies a set of practices for how individuals shield their identity in online social interaction, we propose the four factors of online obscurity as a set of design and policy criteria for approaching PbD for user interaction in social media. We then illustrate how designers and administrators of sites can address these factors through a range of technical, policy, and behavioral “nudge” solutions. In doing so, our work improves PbD discourse by providing actionable, empirically grounded specifications that are both flexible and feasible to implement.
Woodrow Hartzog, The Life, Death, and Revival of Implied Confidentiality
Comment by: Patricia Abril
Workshop draft abstract:
Confidence is implied in many of our face-to-face relationships. Those seeking to disclose in confidence can close doors, speak in hushed tones, and rely on context and other signals to convey a trust in the recipient that has not been explicitly articulated. Yet, according to courts, the same usually cannot be said for our relationships on the Internet. Online relationships are frequently perceived by courts as lacking the implicit cues of confidentiality that are present in face-to-face relationships. Indeed, implied confidentiality is absent in the judicial analysis of Internet-related cases except in the most obvious scenarios. Yet it is clear that Internet users often have implicitly shared expectations of confidentiality. This article posits that the diminished legal relevance of implied confidentiality on the Internet is not solely attributable to the inherent differences between online and offline interaction. Rather, this article argues that implied confidentiality has not been refined enough to be a workable concept in online disputes. The absence of online implied confidentiality as a legal concept is a problem because courts are tasked with ascertaining the actual agreement or relationship between Internet users. Although courts have regularly found implied confidences between parties offline, their analyses have left insufficient direction for future courts to apply the doctrine consistently across myriad factual scenarios. As a result, the concept of implied confidentiality has, as a practical matter, been rendered too flimsy to play a significant role in Internet jurisprudence. The purpose of this article is to mine the rich history of implied confidentiality doctrine in an attempt to refine the concept with a unifying decision-making framework.
This article proposes a technology-neutral framework based on a review of case law to help courts ascertain the two most common and important judicial considerations in implied obligations of confidentiality – party perception and party inequality. A more nuanced framework will better enable the application of implied confidentiality in online disputes than the currently vague articulation of the concept. This framework is offered to demonstrate that the Internet need not spell the end of implied agreements and relationships of trust.
Woodrow Hartzog & Frederic Stutzman, The Case for Practical Obscurity
Comment by: Gaia Bernstein
Workshop draft abstract:
Courts have consistently misapplied the concept of practical obscurity online. Practical obscurity holds that information that is practically hidden, but generally accessible, should be treated as functionally private. Critics of practical obscurity argue that publicly accessible information cannot be classified as private. Courts mistakenly agree, holding that the unfettered ability of any hypothetical individual to find and access information on the Internet renders that information public, or ineligible for privacy protection. This article attempts to correct these misconceptions.
We propose using practical obscurity as a reliable metric for describing the privacy of online information. Obscurity of online information is not the exception – all information online is, to some extent, hidden. Therefore, a court’s analysis of what is “public” and “private” under various legal standards should not hinge on a dichotomous question of accessibility, but rather on a determination of the degree of obscurity. Courts have created an arbitrary definition of “public information” by relying on easily identified lines drawn when users employ passwords. This understanding of privacy is out of line with normative expectations, as demonstrated by empirical research.
The mistreatment of practical obscurity is a result of courts’ reliance on technology to define what information is public. For example, courts’ reliance on passwords as the test of privacy entrenches a technologically defined understanding of privacy, in which password-restricted disclosures are private, and all other disclosures online are public. This assumption is untenable for the Internet. Our lives are increasingly mediated through communication technologies; this interweaving of our online and offline lives has important implications for the ways we communicate, negotiate trust, and gain confidence. Individuals employ a number of obfuscating techniques to create practical obscurity, such as name variants, multiple profiles or identities, search invisibility, and simple contextual separation when posting content. A huge portion of the public Internet, the so-called “Dark Web,” is completely hidden from search engines and only accessible to those with the right search terms, URL, or insider knowledge. Is this functionally clandestine information any different in practice from information protected by a password? This article aims to answer that question by providing a framework for a nuanced analysis of practical obscurity online.
Woodrow Hartzog, Privacy in an Age of Contracts
Comment by: William McGeveran
Workshop draft abstract:
Woodrow Hartzog, A Promissory Estoppel Theory for Confidential Disclosure in Online Communities
Comment by: Allyson Haynes
Published version available here: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1473561
Workshop draft abstract: