Archives

Alessandro Acquisti, Laura Brandimarte, and Jeff Hancock, Are There Evolutionary Roots To Privacy Concerns?

Comment by: Dawn Schrader

PLSC 2013

Workshop draft abstract:

We present a series of experiments aimed at investigating potential evolutionary roots of privacy concerns.

Numerous factors determine our different reactions to offline and online threats. An act that appears inappropriate in one context (watching somebody undressing in their bedroom) is natural in another (on the beach); the physical threat of a stranger following us in the street is more ominous than the worst consequences of an advertiser knowing what we do online; common sense and social conventions tell us that genuine Rolexes are not sold at street corners – but fake Bank of America websites are found at what seem like the right URLs. There is, however, one crucial parallel that connects the scenarios we just described: our responses to threats in the physical world are sensitive to stimuli that we have evolved to recognize as signals of danger. Those signals are absent, subdued, or manipulated in cyberspace. The “evolutionary” conjecture we posit and experimentally investigate is that privacy (as well as security) decision making in cyberspace may be inherently more difficult than privacy and security decision making in the physical world because – among other reasons – online we lack, or are less exposed to, the stimuli we have evolved to employ offline as means of detecting potential threats.

Through a series of lab experiments, we are investigating this conjecture indirectly, by measuring the impact that the presence, absence, or alteration of an array of mostly unconsciously processed stimuli in the physical world has on security and privacy behavior in cyberspace.

 

Our approach focuses on the mostly unconsciously processed stimuli that influence security and privacy behavior in the offline world, and is predicated on an evolutionary conjecture: Human beings have evolved sensorial systems selected to detect and recognize threats in their environment via physical, “external” stimuli. These stimuli, or cues, often carry information about the presence of others in one’s space or territory. The evolutionary advantages of being able to process and react to such stimuli are clear: by using these signals to assess threats in their physical proximity, humans reduce the chance of being preyed upon (Darwin, 1859; Schaller, Faulkner, Park, Neuberg & Kenrick, 2005). Under this conjecture, the modern, pre-information-age notion of privacy may be an evolutionary by-product of the search for security. Such an evolutionary explanation for privacy concerns may help explain why – despite the wide and diverse array of privacy attitudes and behaviors across time and geography – evidence of a desire for privacy, broadly construed, can be found across most cultures. Furthermore, since in cyberspace those signals are absent, subdued, or manipulated, generating an evolutionary “deficit,” such an evolutionary story may explain why privacy concerns that would normally be activated in the offline world are suppressed online, and defense behaviors are hampered.

The research we are conducting, therefore, combines lessons from disciplines that have recently been applied to privacy and security (such as usability, economics, and behavioral decision research) with lessons and methodologies from evolutionary psychology (Buss, 1991, 1995). While this gendered, evolutionary perspective is not without its critics, it can explain several patterns in online dating behavior. Women, for example, are more likely than men to include dated and otherwise deceptive photos in their profiles (Hancock & Toma, 2009). Physical attractiveness also plays a role, with attractive daters lying less in their profiles and judging those who do lie more harshly than unattractive daters do (Toma & Hancock, 2010). Indeed, extant cyber-research has been criticized for ignoring the evolutionary pressures that may shape online behaviors (see Kock, 2004), such as humans’ ability to cognitively adapt to new media and their evolutionary preferences for certain media characteristics (e.g., synchronicity, collocation).

While we cannot directly test the evolutionary conjecture that the absence of stimuli which humans have evolved to detect for assessing threats (including cues to the presence of other humans) contributes to our propensity to fall for cyberattacks or online privacy violations, we can test, through a series of human-subjects experiments we have started piloting, how the presence, absence, or modification of an array of stimuli in the physical world affects security and privacy behavior in cyberspace. The term “stimuli,” in the parlance of this proposal, is akin to the term “cues” as used in psychology and cognitive science. Our experiments focus on three types of such stimuli:

S1) sensorial stimuli: auditory, visual, and olfactory cues of the physical proximity of other human beings;
S2) environmental stimuli: cues that signal to an individual certain characteristics of the physical environment in which the individual is located, such as crowdedness or familiarity;
S3) observability stimuli: cues that signal whether the individual is possibly being surveilled.

The three categories are not meant to be mutually exclusive (for instance, it is through our senses that we receive cues about the environment). Our experiments capture how manipulations of stimuli in the physical environment of the subject influence her privacy and security behavior in cyberspace. Privacy behavior is operationalized in terms of individuals’ propensity to disclose personal or sensitive information, as in previous experiments by the authors.
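
For illustration only, the following minimal sketch (in Python, with purely hypothetical counts and condition labels that do not come from the study) shows one way disclosure propensity could be compared between a condition where a physical cue is present and one where it is absent:

```python
# Hypothetical sketch: comparing disclosure rates across two stimulus conditions.
# The counts below are illustrative placeholders, not data from the experiments.
from statsmodels.stats.proportion import proportions_ztest

disclosed = [34, 52]   # participants who answered a sensitive question, per condition
totals = [80, 80]      # participants per condition (cue present, cue absent)

z_stat, p_value = proportions_ztest(count=disclosed, nobs=totals)
print(f"disclosure rates: {disclosed[0]/totals[0]:.1%} vs. {disclosed[1]/totals[1]:.1%} "
      f"(z = {z_stat:.2f}, p = {p_value:.3f})")
```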

Yang Wang, Pedro Giovanni Leon, Kevin Scott, Xiaoxuan Chen, Alessandro Acquisti, and Lorrie Faith Cranor, Privacy Soft-paternalism: Facebook Users’ Reactions to Privacy Nudges

Comment by: Andrew Clearwater

PLSC 2013

Workshop draft abstract:

Anecdotal evidence and scholarly research have shown that a significant portion of Internet users experience regrets over disclosures they have made online. To explore ways to help individuals avoid or lessen regrets associated with online mistakes, we employed lessons from behavioral decision research and soft-paternalism to develop three Facebook interfaces that “nudge” users to consider the content and context of their online disclosures more carefully before posting. We implemented three nudging interfaces: profile picture, timer, and timer plus sentiment meter.

The picture nudge was designed to remind Facebook users of which individuals are in the audience for their posts. Depending on the privacy setting of the particular post, users were shown five profile pictures randomly selected from the pool of those who could see their posts. These profile pictures appeared under the status-update and comment text boxes when users started typing. The timer nudge was designed to encourage users to stop and think. The warning message "You will have 10 seconds to cancel after you post the update," with a yellow background, was displayed under the status-update and comment text boxes when users started typing. After clicking the "Post" button, users were given the options to "Cancel" or "Edit" their post before it was automatically published after 10 seconds. The third nudge added a sentiment meter to the timer nudge: the content of each post was analyzed by our sentiment algorithm. This nudge was designed to help make users more aware of how others might perceive their posts. For posts with a positive or negative score, the warning message "Other people may perceive your post as {Very Positive, Positive, Negative, Very Negative}" was displayed during the countdown.
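
The abstract does not specify the sentiment algorithm or its thresholds, so the sketch below only illustrates the warning logic described above; the scoring scale and cut-offs are assumptions, not the authors' implementation:

```python
from typing import Optional

def sentiment_warning(score: float) -> Optional[str]:
    """Map a post's sentiment score (assumed to lie in [-1, 1]) to the warning
    shown during the 10-second countdown; neutral posts trigger no warning."""
    if score >= 0.6:
        label = "Very Positive"
    elif score > 0.1:
        label = "Positive"
    elif score <= -0.6:
        label = "Very Negative"
    elif score < -0.1:
        label = "Negative"
    else:
        return None  # neutral post: no warning displayed
    return f"Other people may perceive your post as {label}"

print(sentiment_warning(0.7))  # Other people may perceive your post as Very Positive
```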

We tested these nudges in a 3-week field trial with 21 Facebook users and conducted 13 follow-up interviews. By triangulating system logs of participants’ behavioral data with results from the exit survey and interviews, we found evidence that the nudges had a positive influence on some users’ posting behavior, mitigating unintended disclosures. We also found limitations of the current nudge designs and identified future directions for improvement. Our results suggest that a soft-paternalistic approach to protecting people’s privacy on social network sites could be beneficial.

Alessandro Acquisti & Christina Fong, An Experiment in Hiring Discrimination via Online Social Networks

Comment by: Robert Sprague

PLSC 2012

Workshop draft abstract:

Self-report surveys and anecdotal evidence indicate that U.S. firms are using social networking sites to seek information about prospective hires. However, little is known about how the information they find online actually influences firms’ hiring decisions. We present the design and preliminary results of a series of controlled experiments on the impact that information posted on a popular social networking site by job applicants can have on employers’ hiring behavior. In two studies (a survey experiment and a field experiment) we measure the ratio of callbacks that different job applicants receive as a function of their personal traits. The experiments focus on traits that U.S. employers are not allowed to inquire about during interviews, but which can be inferred from perusing applicants’ online profiles: religious and sexual orientation, and family status.
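
As a rough illustration of the callback comparison (not the authors' code), one could tabulate callbacks by manipulated trait and test the difference in proportions; the counts below are placeholders:

```python
# Hypothetical sketch: comparing callback rates for two otherwise-identical
# applicant profiles that differ in a single trait revealed online.
from statsmodels.stats.proportion import proportions_ztest

callbacks = [39, 28]        # callbacks received by each profile (trait A vs. trait B)
applications = [400, 400]   # applications submitted per profile

z_stat, p_value = proportions_ztest(count=callbacks, nobs=applications)
print(f"callback rates: {callbacks[0]/applications[0]:.1%} vs. "
      f"{callbacks[1]/applications[1]:.1%} (z = {z_stat:.2f}, p = {p_value:.3f})")
```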

Sasha Romanosky, David Hoffman, & Alessandro Acquisti, Docket Analysis of Data Breach Litigation

Comment by: Kristen Mathews

PLSC 2011

Published version available here: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1986461

Workshop draft abstract:

The proliferation of data breach disclosure (security breach notification) laws has prompted a flurry of lawsuits filed by alleged victims of identity theft against corporations that suffer a breach. Using data collected from Westlaw and PACER, we perform docket analysis on a sample of data breach lawsuits over the period from 1999 to 2010. This method of empirical legal research involves collecting, mining, and coding relevant data from court documents (such as complaints and judicial rulings). While much economic and legal scholarship has been written about data breaches, breach disclosure legislation, and the difficulties that consumers face from breach litigation, to our knowledge this is the first research that attempts to empirically analyze the lawsuits themselves.

In this working paper, we present preliminary results showing that the trend in known lawsuits appears to generally follow (and lag) the trend in reported data breaches. Since about mid-2006, the time taken for plaintiffs to organize and file a complaint has been steadily increasing, though the time to dispose of these suits has been steadily decreasing. Moreover, the overall duration of a data breach lawsuit is 15 months, on average. We also find that the settlement rate of data breach lawsuits is substantially lower in our sample (26%) compared with estimates found in other legal scholarship (67%). Finally, the average number of records lost is significantly higher for breaches associated with known lawsuits than for the sample of all reported breaches (9.5 million compared with 340,000), and financial institutions are over-represented in breach litigation relative to the sample of known breaches, while government agencies and educational institutions are under-represented. Further, we use a probit regression to estimate the probability that a data breach will result in a lawsuit, and a multinomial logit model to examine the characteristics of lawsuits that affect particular outcomes of data breach lawsuits.
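
A rough sketch of the two estimation steps mentioned above, using statsmodels; the file name, covariates, and outcome coding are hypothetical stand-ins for the authors' Westlaw/PACER dataset:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical breach-level dataset: one row per reported breach.
breaches = pd.read_csv("breaches.csv")

# Probit: probability that a reported breach results in a lawsuit.
probit = smf.probit(
    "sued ~ log_records_lost + financial_firm + breach_type", data=breaches
).fit()
print(probit.summary())

# Multinomial logit: outcome among filed suits (e.g., 0 = dismissed, 1 = settled, 2 = ruling).
suits = breaches[breaches["sued"] == 1]
X = sm.add_constant(suits[["log_records_lost", "financial_firm", "class_action"]])
mnlogit = sm.MNLogit(suits["outcome"], X).fit()
print(mnlogit.summary())
```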

Alessandro Acquisti and Catherine E. Tucker, Guns, Privacy, and Crime

Comment by: Aaron Burstein

PLSC 2010

Workshop draft abstract:

In December 2008, a Memphis newspaper made publicly available an online, searchable database of all gun permit holders in Tennessee. The database included information such as the permit holder’s name, ZIP code, and his or her permit’s start and expiration dates. It did not receive much attention until February 2009, when an article about a parking argument that ended in a deadly shooting referred to it. The fierce debate that arose thereafter – with the NRA accusing the newspaper of a “hateful, shameful form of public irresponsibility,” and the newspaper standing by a “right to know” argument – exemplifies the complex interactions, and sometimes collisions, between privacy and other rights and needs. In this case, individual privacy rights collided with the collective right to know and, arguably, with both individual and communal issues of security.

By preventing the release of personal data, individuals often hope to prevent harm to themselves. However, the publication of the gun permit data highlights one case where privacy and personal security may appear to be in conflict. Whereas gun rights advocates suggested that the publication exposed gun owners to risk (for instance, of criminals targeting houses known to hold guns in order to steal them), those defending it argued that gun owners may be less likely to be targeted precisely because the information was made publicly available. In this manuscript we attempt to quantify the actual impact that the publication of the TN gun permit data had on 1) crime rates and 2) gun permit requests in the city of Memphis. Combining gun, crime, demographic, and location data from an array of sources and databases, we measured how rates of occurrence of different classes of crime changed, as a function of the local density of gun ownership made public by the newspaper, before and after the publication of the database. Our results suggest that the publication of the database reduced the occurrence of violent crimes (such as robberies and burglaries) more significantly in ZIP codes with higher gun ownership density. At the same time, the publication was accompanied by a more significant percentage increase in gun permit requests in areas with higher pre-existing rates of gun ownership. To address concerns about unobserved heterogeneity, we also performed a falsification test by studying crime trends in a similar town (Jackson) in a neighboring state. We found no similar trends in crime in that town during the same period.
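
A stylized sketch of the before/after comparison described above, framed as a difference-in-differences regression on a ZIP-code-by-month crime panel; the variable names and data file are hypothetical, not the authors' actual dataset or specification:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: crime rates by ZIP code and month, with gun permit density.
panel = pd.read_csv("memphis_crime_panel.csv")

# post = 1 for months after the database was published; gun_density = permits per capita.
model = smf.ols(
    "crime_rate ~ post * gun_density + C(zip_code) + C(month)", data=panel
).fit(cov_type="cluster", cov_kwds={"groups": panel["zip_code"]})

# The interaction term captures how crime changed after publication in
# higher-ownership ZIP codes relative to lower-ownership ones.
print(model.params["post:gun_density"])
```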

This paper contributes not just to the policy debate on the openness or secrecy of gun data (19 states allow the public to access gun permit information; other states either have no laws addressing the issue or keep the information outside the public domain), but also to the broader discourse on the boundaries and connections between privacy and security.

Alessandro Acquisti & Ralph Gross, Inferring Private Data from Publicly-Available Sources

PLSC 2008

Workshop draft abstract:

I will present results from a study of privacy risks associated with information sharing in online social networks. Online social networks such as Friendster, MySpace, or the Facebook have experienced exponential growth in membership in recent years. They are no longer niche phenomena: millions use them for communicating, networking, or dating. These networks are successful examples of computer-mediated social interaction. However, they also raise novel privacy concerns, which this research aims at quantifying. Specifically, I evaluate the risks that personal information (PI) publicly provided on a social networking site may be used to gather additional and more sensitive data about an individual, such as personally identifying information (PII), exploiting the online profile as a ‘breeding’ document. More broadly, these results highlight the unexpected consequences of the complex interaction of multiple data sources in modern information economies.

Alessandro Acquisti, The Impact of Relative Standards on Concern about Privacy

Comment by: Lauren Willis

PLSC 2009

Workshop draft abstract:

Consumers consistently rank privacy high among their concerns, yet their behaviors often reveal a remarkable lack of regard for the protection of personal information. We propose that one explanation for the discrepancy is that actual concern about privacy in a particular situation depends on comparative judgments. We present the results of two studies that illustrate the comparative nature of privacy-related behavior. The first study focuses on the impact that receiving information about self-revelations made by others has on an individual’s own self-revelatory behavior. The second study focuses on the impact of past privacy intrusions on current self-revelatory behavior. We find that admission to sensitive and even unethical behaviors by others elicits information disclosure, and that increasing the sensitivity of questions over the course of a survey inhibits information disclosure. Together, these studies can help explain why people profess great concern about privacy yet behave in a fashion that bespeaks a lack of concern.