Ira Rubinstein, Regulating Privacy by Design
Comment by: Marilyn Prosch & Ken Anderson
Published version available here: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1837862
Workshop draft abstract:
Privacy officials in Europe and North America are embracing Privacy by Design (PbD) as never before. PbD is the idea that “building in” privacy throughout the design and development of products and services achieves better results than “bolting it on” as an afterthought. However enticing this idea may be, what does it mean? In the US, a recent FTC Staff Report makes PbD one of three main components of a new privacy framework. According to the FTC, firms should adopt PbD by incorporating substantive protections into their development practices (such as data security, reasonable collection limitations, sound retention practices, and data accuracy) and implementing comprehensive data management procedures; the latter may also require a privacy impact assessment (PIA) where appropriate. In contrast, European privacy officials view PbD as also requiring the broad adoption of Privacy Enhancing Technologies (PETs), especially PETs that shield or reduce identification or minimize the collection of personal data. Despite the enthusiasm of privacy regulators, neither PbD, PIAs, nor PETs has yet achieved widespread acceptance in the marketplace.
There are many reasons for this, not the least of which is a lack of clarity about the meaning of these terms, how they relate to one another, and what rules apply when a firm adopts the PbD approach. In addition, Internet firms derive much of their profit from the collection and use of personally identifiable information (PII), and PbD may therefore disrupt profitable activities or new business ventures. Although the European Commission sponsored a study of the economic costs and benefits of PETs, and the UK is looking at how to improve the business case for investing in PbD, the available evidence does not support the view that PbD pays for itself (except for a small group of firms that must protect privacy to maintain highly valued brands and avoid reputational damage). In the meantime, the regulatory implications of PbD are murky at best, not only for firms that might adopt this approach but for free riders as well. Indeed, discussion of the economic or regulatory incentives for PbD is sorely lacking in the FTC report.
This Article seeks to clarify the meaning of PbD and thereby suggest how privacy officials might develop appropriate regulatory incentives that offset the certain economic costs and uncertain privacy benefits of this new approach. It begins by developing an analytic framework around two sets of distinctions. First, it classifies PETs as substitutes or complements depending on their interaction with data protection or privacy law: substitute PETs aim for zero disclosure of PII, whereas complementary PETs enable greater user control over personal data through enhanced notice and choice. Second, it distinguishes two forms of PbD: one in which firms seek to build in privacy protections by using PETs, and another in which they rely on engineering approaches and related tools that implement fair information practice principles (FIPPs) throughout both the product development and data management lifecycles. Building on these distinctions, and using targeted advertising as its primary illustration, it then suggests how regulators might achieve better success in promoting the use of PbD by 1) identifying best practices in privacy design and development, including prohibited practices, required practices, and recommended practices; and 2) situating best practices within an innovative regulatory framework that a) promotes experimentation with new technologies and engineering practices; b) encourages regulatory agreements through stakeholder representation, face-to-face negotiations, and consensus-based decision making; and c) supports flexible, incentive-driven safe harbor mechanisms as defined by (newly enacted) privacy legislation.