Who Moderates The Social Media Giants? A Call To End Outsourcing

June 2020
Content moderation and consequences

In our latest report on the social media industry, we look at how the major platforms handle the problem of content moderation: deciding what remains on their sites and what should be removed. Despite the centrality of this task—without it, the platforms would be inundated by harmful content—the social media companies outsource content moderation to third-party vendors. The report examines the consequences of this marginalization of a core element of the social media business model, including how content moderation has played out during the coronavirus pandemic.

Removing Harmful Content

One consequence of the outsourcing and marginalizing of content moderation has been the failure to provide adequate review of hate speech and other material that can spark animosity and violence in developing countries. Focusing on Facebook as a case study, the report describes the role of social media in sparking unrest in Myanmar, India, Sri Lanka, Indonesia, and Ethiopia—and explains how the company says it has recognized this shortcoming and begun to address it.
