The Next Era of Tech Accountability? Maybe.

The last two months have been devastating for Facebook. In September, its stock price hit an all-time high; since then, it has fallen nearly 20%, driven in large part by revelations about how the company balances its commercial interests against user safety, both on and off the platform.

The Facebook Files, The Facebook Papers, and the subsequent hearings on the Hill suggest that a lack of transparency and accountability in how social media platforms in general, and Facebook in particular, operate has led to a wide variety of negative social outcomes. Worse, while many platforms appear to be aware of their part in amplifying existing social turmoil, they seem to have little incentive to change their commercial practices. But have these revelations been a watershed moment?

Even Facebook recognizes that social networks need regulation. While the platform has been a boon for global commerce, given a voice to the voiceless, and connected communities across the globe, the fact remains that it cannot reasonably be left on its own to mitigate the negative social experiences that result from this uninhibited, amplified engagement.

Facebook, of course, is not the only platform that deserves scrutiny. Snapchat, for example, is embroiled in litigation over its role in the deaths of three young people after encouraging (and rewarding) users for snapping while driving at dangerous speeds. Grindr, the dating app, has likewise been implicated in harming its users after it declined to remove an ex-lover’s impersonating profiles, which enabled harassment, stalking, and the fear of real-world violence.

The recent Facebook revelations are a brick, but not the wall. Congress has known about the real-world consequences of social media platforms for some time. Whether or not these revelations prove a watershed moment depends on lawmakers’ ability to re-imagine platform liability for the 21st century and to learn from past mistakes. Yet reforming platform liability is no simple feat. Congress is stymied by two roadblocks: 1) prior legislation and 2) a lack of subject-matter expertise.

Section 230 of the Communications Decency Act (CDA 230) governs liability for platform companies. CDA 230 provides that no “interactive computer service” shall be treated as the publisher or speaker of third-party content posted on its platform. Under this law, courts have generally granted platform companies broad protection against lawsuits seeking to hold them liable for the content their users post. However, in the wake of the Trump presidency, the insurrection at the Capitol, and testimony from tech whistleblowers, lawmakers have proposed measures to reform the legislation in light of its undesirable outcomes.

Proposals to reform 230 range from well-meaning but misguided to disingenuous and dangerous. The Platform Accountability and Consumer Transparency Act (PACT) sits in the well-meaning camp. It attempts to narrow the safe harbor platforms enjoy against claims arising out of alleged civil rights violations by pegging liability and remediation obligations to the size and scope of the company. Yet this proposal, like so many others, fails to address the harms of hateful speech and mis/disinformation.

A separate proposal, the Protect Speech Act, would simply force platforms’ internal operational guidelines into public view. This would let violators walk right up to the line of policy enforcement without ever being removed from the platform. Here, almost purposefully missing the issue, Congress’ effort to regulate platforms reads as a backlash against the false notion that conservatives are being censored; the proposal says nothing about speech that incites violence and, in fact, could make platforms liable for good-faith efforts to remove that speech.

Moreover, beyond the gaffes of misunderstanding Facebook’s business model or the harms of a finsta, Congress has a gap in its technological subject-matter expertise. Members of Congress need not be data scientists, software engineers, or product managers, but when those experts raise issues of technological governance, it is troubling to hear members respond, “I can understand about 50 percent of the things you say.” Without investing in an understanding of viral media, the real-world effects of dangerous speech, and now the metaverse, lawmakers will keep playing catch-up to where technology is today.

To mitigate these challenges, Congress should approach platform regulation with a set of principles that optimize for open access, smart transparency, and harm reduction.

  • Open Access: The uninhibited connection between Americans and the global community is a mainstay of social media platforms. The ability to share cat videos, new music, and personal triumphs with billions is what makes the social web so unique at this moment in history. Regulating away the harmful effects of social media while maintaining avenues for people to connect globally is a delicate act. Lawmakers should consider both how proposed legislation expands or contracts this access and what effects it might have on freedom of expression.
  • Smart Transparency: Calling for transparency into platform companies’ content policies and user reports is a good first step. To get the most out of it, however, lawmakers need to understand how data scientists, engineers, and product managers think about the real-world effects of social media. Commissioning a task force of experts who have worked in Online Trust & Safety is a more targeted approach to transparency and would better serve Congress’ desired outcomes for platform regulation.
  • Harm Reduction: As a general rule, platform companies should not profit from hate speech, disinformation, or otherwise illegal content. Rewarding this behavior through algorithmic boosts makes it all the more likely that harm will occur off-platform and lessens people’s ability to share openly online. Congress should go beyond slap-on-the-wrist penalties and, in concert with the FTC and FCC, devise effective enforcement mechanisms to hold platform companies accountable for maintaining a healthy social ecosystem on their platforms.

Are tech platforms finally heading for a reckoning? Maybe. But if lawmakers are looking for a solution to the negative externalities of social media, they will need more than just policies. They need expertise to guide them.