Archives

Frederik Zuiderveen Borgesius, A New Regulatory Approach for Behavioral Targeting

Comment by: Omer Tene

PLSC 2013

Workshop draft abstract:

Behavioral targeting forms the core of many privacy-related questions on the Internet. It is an early example of ambient intelligence: technology that senses and anticipates people’s behavior in order to adapt the environment to their needs. This makes behavioral targeting a good case study for examining some of the difficulties that privacy law faces in the twenty-first century.

The paper addresses the following question: in the context of behavioral targeting, how could regulation be improved to protect privacy without unduly restricting the freedom of choice of Internet users?

The paper explores two approaches to privacy protection. The first focuses on empowering the individual, for example by requiring companies to obtain the individual’s informed consent before data processing takes place. In Europe, for instance, personal data “must be processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law” (Article 8 of the Charter of Fundamental Rights of the European Union). The phrase “on the basis of the consent of the person” seems to be a loophole in the regime, as many Internet users click “I agree” to any statement that is presented to them. Insights from behavioral economics cast doubt on the effectiveness of the empowerment approach as a privacy protection measure.

The second approach focuses on protecting rather than empowering the individual. If aiming to empower people is not the right tactic to protect privacy, maybe specific prohibitions could be introduced. Some might say that the tracking of Internet users is not proportionate to the purposes of marketers, and should therefore be prohibited altogether. But less extreme measures can be envisaged. Perhaps different rules could apply to different circumstances. Are data gathered while Internet users are looking to buy shoes less sensitive than data that reveal which books they consider buying or which online newspapers they read? Do truly innocent data exist?

One of the most difficult issues would be how to balance such prohibitions with personal autonomy, since prohibitions appear to limit people’s freedom of choice. The paper draws inspiration from consumer law, where similar problems arise. When should the law protect rather than empower the individual? A careful balance would have to be struck between protecting people and respecting their freedom of choice. The paper concludes with recommendations to improve privacy protection without unduly restricting people’s freedom of choice.

Jules Polonetsky and Omer Tene, A Theory of Creepy: Technology, Privacy and Shifting Social Norms

Comment by: Felix Wu

PLSC 2013

Published version available here: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2326830

Workshop draft abstract:

The rapid evolution of digital technologies has hurled to the forefront of public and legal discourse dense social and ethical dilemmas that we have hardly begun to map and understand. In the near past, general community norms helped guide a clear sense of ethical boundaries with respect to privacy. One does not peek into the window of a house even if it is left open. One does not hire a private detective to investigate a casual date or the social life of a prospective employee. Yet with technological innovation rapidly driving new models for business and socialization, we often have nothing more than a fleeting intuition as to what is right or wrong. Our intuition may suggest that it is responsible to investigate the driving record of the nanny who drives our child to school, since such tools are now readily available. But is it also acceptable to seek out the records of other parents in our child’s car pool or of a date who picks us up by car? Alas, intuitions and perceptions of “creepiness” are highly subjective and difficult to generalize as social norms are being strained by new technologies and capabilities. And businesses that seek to create revenue opportunities by leveraging newly available data sources face huge challenges trying to operationalize such subjective notions into coherent business and policy strategies.

This article presents a set of legal and social considerations to help individuals, businesses and policymakers navigate a world of new technologies and evolving norms. These considerations revolve around concepts that we have explored in prior work, including transparency; accessibility to information in usable format; and the elusive principle of context.

Jules Polonetsky & Omer Tene, Privacy in the Age of Big Data: A Time for Big Decisions

Comment by: Ed Felten

PLSC 2012

Published version available here: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2149364

Workshop draft abstract:

We live in an age of “big data”. Data has become the raw material of production, a new source of immense economic and social value. Advances in data mining and analytics and the massive increase in computing power and data storage capacity have expanded by orders of magnitude the scope of information available to businesses, governments and individuals.[1] In addition, the increasing number of people, devices, and sensors that are now connected by digital networks has revolutionized the ability to generate, communicate, share, and access data. Data creates enormous value for the world economy, driving innovation, productivity, efficiency and growth.[2] At the same time, the “data deluge” presents privacy concerns that could stir a regulatory backlash, dampening the data economy and stifling innovation.

Privacy advocates and data regulators increasingly decry the era of big data as they observe the growing ubiquity of data collection and increasingly robust uses of data enabled by powerful processors and unlimited storage. Researchers, businesses and entrepreneurs equally vehemently point to concrete or anticipated innovations that may be dependent on the default collection of large data sets. In order to craft a balance between beneficial uses of data and individual privacy, policymakers must address some of the most fundamental concepts of privacy law, including the definition of “personally identifiable information”, the role of consent, and the principles of purpose limitation and data minimization.

In our paper we intend to develop a model in which the benefits of data for businesses and researchers are balanced against individual privacy rights. Such a model would help determine whether processing could be based on legitimate business interest or should be subject to individual consent, and whether consent must be structured as opt-in or opt-out. In doing so, we will address questions such as: Is informed consent always the right standard for data collection? How should the law deal with uses of data that may be beneficial to society or to individuals when individuals may decline to consent to those uses? Are there uses that provide high value and minimal risk where the legitimacy of processing may be assumed? What formula determines whether data value trumps individual consent?

Our paper draws on literature discussing behavioral economics, de-identification techniques, and consent models to seek a solution to the big data quandary. Such a solution must enable privacy law to adapt to the changing market and technological realities without dampening innovation or economic efficiency.


[1] Kenneth Cukier, Data, data everywhere, The Economist, February 25, 2010, http://www.economist.com/node/15557443.

[2] McKinsey, Big data: The next frontier for innovation, competition, and productivity, June 2011, http://www.mckinsey.com/Insights/MGI/Research/Technology_and_Innovation/Big_data_The_next_frontier_for_innovation.

 

Jules Polonetsky & Omer Tene, Advancing Transparency and Individual Control in the Use of Online Tracking Devices: A Response to Transatlantic Legal and Policy Developments

Comment by: Catherine Dwyer

PLSC 2011

Published version available here: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1920505

Workshop draft abstract:

For over a decade, behavioral advertising has been a focus of privacy debates on both sides of the Atlantic. Industry actors maintain that targeted ads are essential to supporting the main online business model, whereby users benefit from free content and services in return for being subjected to various advertisements. They assert that these practices do not cause any harm to users, given that any data collected and used are anonymous and handled in compliance with data protection standards. Regulators and consumer advocates insist that many advertising or analytics companies are collecting and using personal data in a manner that does not comply with the principles of privacy laws. They maintain that the dignity of users is impacted by these hidden practices, and that questions about harm from the use of data for purposes adverse to users remain unanswered.

The recent publication of the much-anticipated FTC Staff Report on reform of the legal framework for privacy protection of consumers and the Department of Commerce Green Paper on privacy and innovation in the Internet economy has raised the stakes for both proponents and opponents of behavioral advertising and challenged the market to find solutions that are both privacy protective and commercially feasible. Moreover, the FTC’s proposal to implement a “do-not-track” mechanism echoes voices on the other side of the Atlantic calling for application of the e-Privacy Directive’s consent requirements through a browser-based opt-out. Such similarities reinforce our conviction that user expectations, business requirements, and privacy regimes are converging all over the world.

Our paper will draw on literature discussing behavioral economics, privacy-enhancing technologies and user-centric identity management to seek a solution to the behavioral advertising quandary. Such a solution must be acceptable to businesses, users and regulators on both sides of the Atlantic and be based on the premise that privacy regulation needs to adapt to the changing market and technological realities without dampening innovation or damaging the business model that makes the Internet thrive.

We will provide a taxonomy of the various mechanisms used by the online industry to track users (e.g., first- and third-party cookies; Flash cookies (stored Flash objects); beacons; browser fingerprinting; deep packet inspection; and more); assess, under legal, technical and business criteria, the feasibility of existing and new proposals for compliance with the latest FTC and EU regulatory requirements; and explore various strategies for solutions, such as browser defaults and add-ons, special marking of targeted ads, and short privacy policies.
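To make the first item in this taxonomy concrete, the sketch below models in plain Python how a third-party cookie identifier, set on a tracker's own domain, can link a single browser's visits across unrelated first-party sites. The class name, example URLs, and cookie IDs are purely hypothetical; this is a conceptual toy under simplifying assumptions, not the paper's analysis or any actual ad server's implementation.

# Minimal, illustrative sketch of cross-site tracking via a third-party cookie.
# All names (Tracker, the example URLs, the cookie IDs) are hypothetical; this is
# a conceptual model, not any vendor's implementation.

import uuid
from collections import defaultdict


class Tracker:
    """A simplified third-party ad/analytics server embedded on many websites."""

    def __init__(self):
        # Maps a cookie ID (set on the tracker's own domain) to the list of
        # first-party pages on which that browser loaded the tracker's content.
        self.profiles = defaultdict(list)

    def handle_request(self, page_url, cookie_id=None):
        """Record a page view whenever a page embedding the tracker is loaded.

        If the browser presents no tracker cookie, a new ID is issued
        (conceptually, via a Set-Cookie header on the tracker's domain).
        Either way, the visit is added to that ID's browsing profile.
        """
        if cookie_id is None:
            cookie_id = str(uuid.uuid4())
        self.profiles[cookie_id].append(page_url)
        return cookie_id  # the browser stores and replays this ID on later visits


if __name__ == "__main__":
    tracker = Tracker()

    # First visit: no cookie yet, so the tracker issues one.
    cid = tracker.handle_request("https://news.example/politics")

    # Later visits to unrelated sites replay the same third-party cookie,
    # letting the tracker link them into a single cross-site profile.
    tracker.handle_request("https://shoes.example/checkout", cookie_id=cid)
    tracker.handle_request("https://health.example/symptoms", cookie_id=cid)

    print("Cross-site profile for one browser:", tracker.profiles[cid])

In these simplified terms, the proposals discussed above, such as a browser-based opt-out or a "do-not-track" mechanism, turn largely on whether such an identifier may be set and replayed in the first place.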