Banning Political Content is Not a Smart Way to Police Disinformation

Photo: Meta's apps, including Facebook, Instagram, WhatsApp, Messenger, and Threads, displayed on a smartphone.
April 12, 2024

Too often, social media companies try to cope with the challenge of misinformation and other harmful content on their platforms by lurching from one simplistic “solution” to the next. The search for one-size-fits-all responses leads to counter-productive policies and crude enforcement.

In February, Meta announced that it would attempt to reduce controversy on Threads and Instagram by ceasing to recommend content about politics and social issues on the two popular platforms. The goal was to limit the reach of accounts posting “political content” in hopes that this would reduce the flow of false, hateful, and divisive material. The move followed similar restrictions on Meta’s larger Facebook platform.

But now hundreds of political and news content creators, activists, and journalists have joined forces in an open letter to Meta asking the company to reverse its decision on Threads and Instagram. The objecting parties argue that Meta is undermining the highest and best use of its virtual venues. "Meta's platforms have a responsibility to be an open and safe space for dialogue, conversation, and discussion," the letter states. It goes on to say that Meta is "undermining the reach of our content online by limiting suggested political content on the platform through a new default setting for accounts."

The letter continues: "With many of us providing authoritative and factual content on Instagram that helps people understand current events, civic engagement, and electoral participation, Instagram is thereby limiting our ability to reach people online to help foster more inclusive and participatory democracy and society during a critical inflection point for our country."

To their credit, the signatories propose a reasonable compromise:

"Rather than unilaterally changing the default settings of accounts to limit political content without transparency to users across platforms, Meta should instead empower users to opt-out of seeing suggested political content. As users of Meta's platforms, we did not choose to automatically opt-out of receiving suggested political content on civic activism and news updates. Removing political recommendations as a default setting, and consequently stopping people from seeing suggested political content, poses a serious threat to political engagement, education, and activism."

The Washington Post’s dispatch on the controversy reports that the restrictions are hitting certain users especially hard, including those who post about “LGBTQ rights, women’s rights, racial inequality and disability.” It adds that “independent journalists and content creators say they’ve struggled to reach their audiences in recent weeks since the change was rolled out.”

One can sympathize, to some extent, with the desire of social media companies to deal with the endless complications of content moderation by imposing blunt restrictions that are easy to understand and implement. But when such limits undercut the most valuable uses of their services, companies need to think more carefully about what they are doing. This appears to be one such instance.
