Regulating Social Media: The Fight Over Section 230 — And Beyond
September 2020
Who is liable for online content?
There are increasing calls to curtail or revoke Section 230, a 1996 law that protects internet companies from most lawsuits related to user-generated content. Critics of Section 230 say it discourages vigilant self-regulation. Proponents counter that the law has fostered free expression and innovation. Our report concludes that Section 230 ought to be preserved but significantly amended.
Creating a new internet regulator
Reform in this area should go beyond amending Section 230. We recommend the formation of a Digital Regulatory Agency, an independent body with internet expertise that would establish and enforce rules aimed at boosting platform transparency and accountability without interfering in content decisions. The agency could, for example, require internet companies to disclose how their algorithms rank news feeds and recommendations, which would help explain how harmful content spreads.