Snapchat Addresses the “fake news” Problem

December 11, 2017

Snapchat claims it has found at least a partial solution to “fake news.” Snap, the Los Angeles-based parent of the image messaging app, recently announced it would separate social material from media. On the left side of the app—the social side—users will see their private conversations with friends, as well as their friends’ “stories.” On the right—the media side—they’ll see content generated by publishers and celebrities, as well as material Snap curates from user-generated videos and photos.

In contrast, the previous version of the app, like Facebook, Snap’s much larger and more successful rival, mixed content produced by both friends and publishers.

Snap used the occasion of its redesign to take a jab at Facebook and Twitter, arguing that mixing social and media has led to “strange side-effects (like fake news).” Evan Spiegel, Snap’s chief executive officer, noted in an Axios op-ed that “social media fueled ‘fake news’ because content designed to be shared by friends is not necessarily content designed to deliver accurate information.”

So, has Snap hit upon a solution to “fake news”?

Not exactly. Snap’s redesign is not a game-changer. The company has never had the problems with misinformation that have plagued its competitors. (Instead, its main issue historically has been with sexually risqué content.) Snap has largely avoided the misinformation swamp by emphasizing human oversight of content delivered to users. For example, it has reviewed political and advocacy ads manually and included information about who paid for them. (In October, Twitter and Facebook announced they would follow suit on political ads.) The Snapchat redesign continues this trend. Snap said “our curators review and approve everything that gets promoted” on the media side of the app.

Snap’s approach is both novel and familiar. It appears to be rooted in what we at the NYU Stern Center for Business and Human Rights called, in our recent report on harmful content online, “the incorrect premise of a binary choice.” That false premise starkly divides the tech world into editorial publishers and neutral pipes. We argued in our report that social media platforms are neither publishers nor pipes; they fall somewhere in the middle and ought to follow a “third way” in governing themselves and dealing with harmful content. Snapchat seems to recognize this, although its proposed solution is to revert to the binary choice, splitting the app into two different communication models.

On the one hand, Snapchat will have a gated community of carefully curated content akin to Apple News. That’s the publisher side. On the other hand, it will offer a messaging service—the neutral pipe. There are problems with this approach, however. A gated community does not conform to the romantic notion of social media as a democratic space for free expression, the proverbial marketplace of ideas. Snapchat will vet and showcase up-and-coming content creators on the media side, but it will inevitably shut out more obscure voices, to the detriment of media diversity.

Still, there is reason to commend Snap’s approach. It explicitly recognizes the unique position the platforms occupy and attempts to navigate it. Snap’s solution has the benefit of clarity, and it gives users context: it’s immediately obvious where Snap is curating and in what way. Its emphasis on human oversight may not work at Facebook, because the larger platform’s model of explicitly encouraging the social sharing of published content means there is much more material to check. But Mark Zuckerberg should keep a close eye on the new Snapchat.
