Over the past year, data privacy has become a significant issue of public interest. In 2017, the Equifax breach compromised credit information for 143 million Americans. In 2018, Cambridge Analytica was revealed to have obtained data on up to 87 million Americans to psychographically target voters. The two incidents are often conflated, but they differ in an important way. The Equifax breach was a hack in the conventional sense: bad actors gained unauthorized access to data. Cambridge Analytica, by contrast, obtained its data through a feature, not a bug. It collected data on individuals and their friends through a tool Facebook provided to software developers.
Companies like Facebook and Google are experts in monitoring the everyday usage of their platforms and monetizing the data they collect. Shoshana Zuboff, professor emerita at Harvard Business School, calls this business model surveillance capitalism: companies monitor the everyday use of a product or service in order to predict and modify human behavior and generate revenue. The data collected from likes, searches, and views allows Facebook and Google to draw insights and categorize individuals by the characteristics third parties want to target. Facebook and Google have generated billions of dollars in revenue through this model. Zuboff further argues that surveillance capitalism has shifted the right to decide what remains private and what is monetized from the individual to the tech giants.
As the conversation on data privacy in the US continues, companies whose core business models depend on monetizing user data will face greater scrutiny. As wearables and smart home devices grow in popularity, the variety and quantity of data collected will grow as well. Third parties can potentially gain insights into individuals' homes and bodies that were not available before. Although most third parties will use this new data to market goods and services to consumers more effectively, the same data can allow bad actors to target individuals without their knowledge and modify behavior with greater precision for non-commercial ends, such as influencing voters.
Governments are starting to recognize this problem, though mainly through a commercial lens. Europe enacted the General Data Protection Regulation (GDPR), and California recently passed the California Consumer Privacy Act (CCPA). These laws grant individuals new rights, such as the right to have their data erased and the right to opt out of the sale of their data. However, the underlying problem of surveillance capitalism remains. As long as businesses rely on monetizing user data, bad actors will have opportunities to abuse platforms through legitimate features to predict and modify behavior. It remains to be seen at what point Americans will stop valuing tech products and services over the loss of data privacy and choice.