Archives

Dawn E. Schrader, Dipayan Ghosh, William Schulze, and Stephen B. Wicker, Civilization and Its Privacy Discontents: The Personal and Public Price of Privacy

Comment by: Heather Patterson

PLSC 2013

Workshop draft abstract:

Privacy awareness and privacy law ought to be built proactively and grounded in specific moral principles that protect our fundamental rights to live together, yet autonomously, in civilized society. In reality, people are willing to compromise their individual liberty in favor of peaceful societal co-existence. In this paper, we examine both the psychological need for the self to be connected to the outside world and the simultaneous wish of the self to have autonomy separate from that world, building loosely on Freud’s thesis, from which this paper draws its title. We explore how people both want and fear the opportunities that public (utility) collection of consumer data provides, even though they may know that control over, and regulation of, thought and behavior can ensue (e.g., Thaler & Sunstein, 2008; Wicker & Schrader, 2011). What price, or value, is placed on private information? Does the value of privacy shift as risks and benefits shift? Is this valuation influenced by media?

Power consumption metering offers a real-world context in which a price is paid for private information. For real-time prices to be broadcast to consumers, who then decide on their energy consumption, advanced technology is required for billing. Temporally precise consumption levels are needed to charge consumers properly for their usage, so advanced technological monitoring records usage at short intervals and reports fine-granularity usage data. Because these temporally precise data are reported directly to the utility, private details of the consumer’s life are effectively revealed, posing a risk of privacy violation. Cellular technology creates a similar context, one in which location information is given to a service provider in return for mobile communication services. We therefore designed and conducted two national surveys to ascertain the value of personal privacy and the comparative social and economic costs of the privacy impacts of these two exemplary technologies.

Our paper examines consumers’ use of these technologies, whether they are aware of the privacy and security risks, what prices they are willing to pay to keep those risks at bay, and what they are willing to accept to give up their private information. What is the balance between convenience, cost savings, and privacy protection? We experimentally manipulated whether people were exposed to a media presentation designed to increase their awareness of privacy and security. We further examined economic cost-benefit and risk ratios and decisions either to adopt new privacy-aware measures and technologies or to change behavior. In essence, this paper examines the “tipping point” between personal privacy value and public offering cost. We conclude by examining the price people are willing to pay or accept for privacy in relation to privacy law and policy, and we make recommendations for limiting corporate practices that create, and individual choices that accept, tempting risky behaviors that erode privacy rights.


Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving decisions about health, wealth, and happiness. New Haven, CT: Yale University Press.

Wicker, S. B., & Schrader, D. E. (2011). Privacy-Aware Design Principles for Information Networks. Proceedings of the IEEE, 99, 1–21. DOI: 10.1109/JPROC.2010.2073670.

Heather Patterson and Helen Nissenbaum, Context-Dependent Expectations of Privacy in Self-Generated Mobile Health Data

Comment by: Katie Shilton

PLSC 2013

Workshop draft abstract:

Rapid developments in health self-quantification via ubiquitous computing point to a future in which individuals will collect health-relevant information using smart phone apps and health sensors, and share that data online for purposes of self-experimentation, community building, and research. However, online disclosures of intimate bodily details coupled with growing contemporary practices of data mining and profiling may lead to radically inappropriate flows of fitness, personal habit, and mental health information, potentially jeopardizing individuals’ social status, insurability, and employment opportunities. In the absence of clear statutory or regulatory protections for self-generated health information, its privacy and security rest heavily on robust individual data management practices, which in turn rest on users’ understandings of information flows, legal protections, and commercial terms of service. Currently, little is known about how individuals understand their privacy rights in self-generated health data under existing laws or commercial policies, or how their beliefs guide their information management practices. In this qualitative research study, we interview users of popular self-quantification fitness and wellness services, such as Fitbit, to learn (1) how self-tracking individuals understand their privacy rights in self-generated health information versus clinically generated medical information; (2) how user beliefs about perceived privacy protections and information flows guide their data management practices; and (3) whether commercial and clinical data distribution practices violate users’ context-dependent informational norms regarding access to intimate details about health and personal well-being. 
Understanding information sharing attitudes, behaviors, and practices among self-quantifying individuals will extend current conceptions of context-dependent information flows to a new and developing health-related environment, and may promote appropriately privacy-protective health IT tools, practices, and policies among sensor and app developers and policy makers.