Archives

Andrea M. Matwyshyn, Talking Data

Comment by: Andrew Selbst

PLSC 2013

Workshop draft abstract:

In the wake of Sorrell v. IMS Health, open questions remain regarding the limitations the First Amendment imposes on privacy regulation. A conceptual classification problem, one also visible in other bodies of law, has crept into the intersection of privacy and the First Amendment: confusion over when (or whether) a data aggregator’s code (and its attached information) is a type of expressive, socially embedded act of communication or a type of free-standing, communicative yet regulable “product.” This article argues that although the statute at issue in Sorrell failed First Amendment scrutiny, privacy regulation that restricts the onward transfer of databases of consumer information, even transfers of anonymized data, can pass First Amendment scrutiny if carefully crafted. Blending doctrinal First Amendment principles with the fundamental tenets of human subjects research protection imposed by Congress and the Department of Health and Human Services, this article explains the doctrinal limits the First Amendment places on future consumer privacy laws and offers an example of a possible First Amendment-sensitive approach to protecting consumer privacy in commercial databases.

Andrea M. Matwyshyn, Repossessing the Disembodied Self: Rolling Privacy and Secured Transactions

Comment by: Diane Zimmerman

PLSC 2012

Workshop draft abstract:

Consumer privacy in commercial contexts is governed primarily by rolling form contracts that are usually amendable at the sole discretion of the drafter. As these contracts have become longer and less readable over time, and as consumers have grown progressively more comfortable with the informational disembodiment of the self, concerns over fairness in privacy contracting have grown. These concerns loom particularly large when a company enters bankruptcy: privacy contracts/terms of use may include a provision that allows for the disposition of consumer data in bankruptcy unfettered by privacy promises. Alternatively, in the absence of FTC intervention, a bankruptcy court may attempt to facilitate the sale of database assets by simply setting aside the privacy contracts with consumers. Engaging with the contract, secured transactions, and bankruptcy literatures, this article argues that as progressively more sensitive consumer information comes under the control of private companies, a fundamental tension arises: databases frequently become the primary assets of companies, yet their collateralization, repossession, and disposition processes are uncertain as a matter of law. Equally uncertain is the extent of companies’ continuing privacy obligations to consumers in bankruptcy. This dynamic pushes borrowers, lenders, courts, consumers, and the FTC into an unsustainable relationship in the innovation lifecycle. The article concludes by proposing an amendment to the current law of secured transactions in databases that balances the privacy interests of consumers against enabling information entrepreneurship and capital formation.

Andrea Matwyshyn, Digital Childhood

Comment by: Joel Reidenberg

PLSC 2011

Workshop draft abstract:

The Children’s Online Privacy Protection Act suffers from numerous shortcomings. Perhaps the most notable of these deficiencies is the lack of statutory coverage for children over the age of thirteen but below the legal age of contractual capacity. This article argues that, as a matter of contract theory and doctrine, children under the legal age of contractual capacity retain the right to ask that all contracts relating to their conduct online be deemed voidable. As such, when a minor asks that an agreement (for a non-necessity) be set aside on the basis of lack of capacity, the other party can no longer derive benefit from the consideration paid by the minor, including her information. A duty of deletion then attaches to the holders of the minor’s information as a matter of contract law.

Andrea Matwyshyn, Information Paradoxes

Comment by: Fred Cate

PLSC 2010

Workshop draft abstract:

One of the long-standing conundrums in privacy law is the “privacy paradox”: consumers profess to value privacy and data security, yet happily share their personally identifiable information in exchange for convenience or low-value consideration. Meanwhile, the law regarding who “owns” this shared information presents a paradox of its own: while companies that generate databases of consumer information assert a protectable intellectual property interest in those databases, they simultaneously assert that the data subjects have no protectable interest in the shared data. This is the information ownership paradox. This article explores the tensions among copyright, trade secret, contract law, and data privacy/security law inherent in these two paradoxes.

Borrowing ideas from the work of Pierre Bourdieu, copyright, and contract, this article argues that no paradox necessarily exists in either scenario: each side’s position is rooted in the same desire to control use. The rights of both companies and individuals with respect to information can be recharacterized as rights to selectively embed data into economic contexts. As such, this article crafts an approach to resolving the privacy paradox and the information ownership paradox, and it proposes a legal regime for redress of information harms. It argues that the two dominant legal approaches to categorizing aggregated information bundles about humans, as fully alienable property on the one hand and as an absolute dignitary right of control on the other, need a theoretical middle ground focused on control of context. This new approach recognizes that the value of information is inherently socially embedded, not individual. Without causing any upheaval to existing intellectual property rights in databases, a strong data protection regime can exist through blending legal approaches found in copyright and contract. Concretely, the proposed approach involves three elements. First, state legislatures should provide consumers and licensors with a right of deletion in instances of a data steward’s information loss. Second, breaches of privacy policies should be allowed to proceed as breach of contract actions; the burden of proof in cases of harms arising from information loss should be shifted to the information steward, while affording that steward an affirmative defense of reasonable data care. Finally, this new approach calls for states to assign a minimum statutory value for information harms, modeled on copyright law. Such an approach would not only assist consumers in defending their right to embed data but also offer companies a right of recourse when they are forced to internalize costs imposed on them through third parties’ failed data stewardship.

Ryan Calo, People Can Be So Fake: On the Limitations of Privacy and Technology Scholarship

Comment by: Andrea Matwyshyn

PLSC 2009

Published version available here: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1458637

Workshop draft abstract:

Scholarship around privacy often reacts to and contextualizes advances in technology.  Scholars disagree about whether and how particular technologies implicate privacy, however, and certain technologies with potentially serious ramifications for privacy can avoid scrutiny entirely.

Consider a growing trend in user interface design, that of building “social” interfaces with natural language capabilities and other anthropomorphic signifiers.  An extensive literature in communications and psychology demonstrates that artificial social agents elicit strong subconscious reactions, including the feeling of being observed or evaluated. Adding a social layer to the technologies we use to investigate or navigate the world, or introducing apparent agents into spaces historically reserved for solitude, has important implications for privacy.  These techniques need not entail any collection, processing, or dissemination of information, however, and hence fall outside even the broadest and most meticulous contemporary accounts of privacy harm.

This paper argues for a new master test for privacy-invasive technology. Specifically, I argue that for any given technology, we should look to three interrelated factors: perception of observation, actual observation, and independent consequence. Dissecting the effects of technology along these three lines will help clarify why, and to what extent, a given technology or technique implicates privacy. This approach differs from the standard discussion of privacy-invasive technology in terms of the collection, processing, and dissemination of information. It has the advantage of capturing certain conservative intuitions espoused by courts and commentators, such as the view that the mere collection or processing of data by a computer can at most “threaten” privacy, and it uncovers situations wherein notice itself triggers a violation. Yet the approach is not reductionist overall: the proposed test elevates the importance of the victim’s perspective and captures a previously undertheorized category of privacy harm.