Who Moderates The Social Media Giants? A Call To End Outsourcing

June 2020
Content moderation and consequences

In our latest report on the social media industry, we examine how the major platforms handle content moderation: deciding what remains on their sites and what should be removed. Despite the centrality of this task—without it, the platforms would be inundated by harmful content—the social media companies outsource content moderation to third-party vendors. The report examines the consequences of marginalizing a core element of the social media business model, including how content moderation has played out during the coronavirus pandemic.

Removing Harmful Content

One consequence of outsourcing and marginalizing content moderation has been the failure to provide adequate review of hate speech and other material that can spark animosity and violence in developing countries. Focusing on Facebook as a case study, the report describes the role of social media in sparking unrest in Myanmar, India, Sri Lanka, Indonesia, and Ethiopia—and explains how the company says it has recognized this shortcoming and begun to address it.

