A Platform ‘Weaponized’: How YouTube Spreads Harmful Content – And What Can Be Done About It
June 2022
We know less about YouTube than the other major social media platforms.
YouTube, with more than 2 billion users, is the most popular social media site not just in the United States, but in India and Russia as well. But because long-form video is harder to analyze than text or still images, YouTube has received less scrutiny from researchers and policymakers. This in-depth report addresses that knowledge gap.
Does YouTube send unwitting users down a ‘rabbit hole’ of extremism?
In response to reports that the platform’s own recommendations were “radicalizing” impressionable individuals, YouTube and its parent company, Google, altered the platform’s recommendation algorithm, apparently reducing how often it recommends misinformation and conspiratorial content. But platform recommendations aren’t the only way people find potentially harmful material. Some, like the white 18-year-old accused of shooting and killing 10 Black people in a Buffalo, N.Y., grocery store, seek out videos depicting violence and bigotry. These self-motivated extremists can find affirmation and encouragement to turn their resentments into dangerous action.
A social media venue with global reach.
Roughly 80% of YouTube traffic comes from outside the United States, and because of language and cultural barriers, the platform’s content moderation efforts are less successful abroad than at home. Our report explores how YouTube is exploited by Hindu nationalists persecuting Muslims in India, right-wing anti-vaccine advocates in Brazil, and supporters of the military junta in Myanmar.