Archives

Jules Polonetsky & Omer Tene, Privacy in the Age of Big Data: A Time for Big Decisions
Comment by: Ed Felten

PLSC 2012

Published version available here: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2149364

Workshop draft abstract:

We live in an age of “big data”. Data has become the raw material of production, a new source of immense economic and social value. Advances in data mining and analytics, together with massive increases in computing power and data storage capacity, have expanded by orders of magnitude the scope of information available to businesses, governments, and individuals.[1] In addition, the growing number of people, devices, and sensors connected by digital networks has revolutionized the ability to generate, communicate, share, and access data. Data creates enormous value for the world economy, driving innovation, productivity, efficiency, and growth.[2] At the same time, the “data deluge” presents privacy concerns that could stir a regulatory backlash, dampening the data economy and stifling innovation.

Privacy advocates and data regulators increasingly decry the era of big data as they observe the growing ubiquity of data collection and the increasingly robust uses of data enabled by powerful processors and virtually unlimited storage. Researchers, businesses, and entrepreneurs just as vehemently point to concrete or anticipated innovations that may depend on the default collection of large data sets. To strike a balance between beneficial uses of data and individual privacy, policymakers must address some of the most fundamental concepts of privacy law, including the definition of “personally identifiable information”, the role of consent, and the principles of purpose limitation and data minimization.

In our paper we intend to develop a model where the benefits of data for businesses and researchers are balanced with individual privacy rights. Such a model would help determine whether processing could be based on legitimate business interest or subject to individual consent and whether consent must be structured as opt-in or opt-out. In doing so, we will address questions such as: Is informed consent always the right standard for data collection? How should law deal with uses of data that may be beneficial to society or to individuals when individuals may decline to consent to those uses? Are there uses that provide high value and minimal risk where the legitimacy of processing may be assumed? What formula determines whether data value trumps individual consent?

Our paper draws on literature on behavioral economics, de-identification techniques, and consent models to seek a solution to the big data quandary. Such a solution must enable privacy law to adapt to changing market and technological realities without dampening innovation or economic efficiency.


[1] Kenneth Cukier, Data, data everywhere, The Economist, February 25, 2010, http://www.economist.com/node/15557443.

[2] McKinsey, Big data: The next frontier for innovation, competition, and productivity, June 2011, http://www.mckinsey.com/Insights/MGI/Research/Technology_and_Innovation/Big_data_The_next_frontier_for_innovation.

Jane Winn, Technical Standards as Information Privacy Regulation
Comment by: Ed Felten

PLSC 2010

Published version available here: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1118542

Workshop draft abstract:

Most information privacy laws are based on 20th-century administrative law models, taking human conduct rather than information architecture as the subject of regulation. Such regulations are clearly inadequate to control how computer systems process information, and that inadequacy will become more acute as pervasive computing grows. Technical standards may serve as a form of administrative law capable of directly targeting information architecture as the subject of regulation. A technical standard is defined by ISO as a “document, established by consensus and approved by a recognized body, that provides for common and repeated use, rules, guidelines or characteristics for activities or their results, aimed at the achievement of the optimum degree of order in a given context.” The authority of technical standards as regulation has been both obscured and legitimated by the role of science and technocratic professionalism in standard-setting processes. More explicit systems for coordinating the work of conventional legal institutions and technical standard-setting processes are needed to increase the effectiveness of information privacy laws. As part of a more general movement away from state regulation and toward enforced self-regulation by the private sector, such explicit systems have already been developed in areas such as product and food safety, and are emerging in information technology arenas. The Payment Card Industry Data Security Standard is part of a private self-regulatory system based on both legal rules and technical standards. Standardization of privacy impact assessments represents progress toward incorporating technical standards into the framework of information privacy laws.