German Companies Report on the Implementation of New Hate Speech Law
August 7, 2018
At the end of July, Facebook, Twitter, Google, and YouTube submitted their first reports under the Netzwerkdurchsetzungsgesetz (NetzDG), a new German law that requires Internet platforms with more than 2 million users to take down illegal content within 24 hours, or within seven days in complex cases. If companies fail to comply, they risk fines of up to 50 million euros.
NetzDG requires companies to provide aggregate data on their implementation of the law twice a year, including the number of complaints, the types of complaints, and the number of complaints that resulted in the deletion or blocking of content in Germany.
The reports reveal the following:
- The sheer number of take-down requests to social media platforms is daunting: during the first six months the law was in effect, Twitter received the most complaints (265,000), followed by YouTube (215,000). Facebook received surprisingly few requests (1,704), which the German Federal Ministry of Justice and Consumer Protection attributes to the platform’s requirement that people with complaints fill out a complicated form that is not easy to find.
- The complaints are assessed by company analysts, who then decide whether a complaint breaches German law (NetzDG refers to more than 20 legal statutes) or the platform’s community standards. The percentage of content that has either been deleted by the platforms or made inaccessible from Germany varies: Google followed through on 46% of all requests received; YouTube on 27%; Facebook on 21%; Twitter on 10%. These numbers show that in the majority of cases, the content remains untouched.
- The reports show that the vast majority of cases (more than 90% across all platforms) are indeed handled within 24 hours. To date, none of the platforms has had to pay any fines. This is surprising considering the volume and complexity of the issues the online platforms are dealing with.
- In most cases in which the platforms found complaints justified, the content involved hate speech or extremist material.
- Across all social media platforms, the companies’ community standards are stricter than the legal requirements. Most take-down requests were executed because of breaches of a platform’s community standards, not because the content violated German law (e.g., on pornography).
There is also a lot that these reports do not tell us:
- The NetzDG does not require companies to justify individual take-down decisions publicly. How the companies’ content analysts (most of whom lack specialized legal training) interpret and apply legal statutes across the several hundred cases they scan every day remains unclear. Examples of cases in which content has been improperly removed (e.g., in the context of satire) have come up in the media, but there is no systematic way to check whether and how legal and company rules are applied. The reports show that the platforms sought the advice of law firms in only a few cases.
- Community standards differ across these platforms, and those standards matter more for take-downs than the NetzDG itself. What platforms take down under their community rules without any complaint is not reported publicly, and it is quite possible that the platforms remove a great deal of content before it receives any public attention. For example, one could easily imagine posts that promote breastfeeding being deleted immediately under the platforms’ strict pornography rules.
- Since there is no comparable data from previous years, it is unclear whether the amount of harmful content on the Internet is decreasing under NetzDG. The law’s effectiveness in Germany may become clearer over the years, but that is not the only question that matters in a global context.
In our two recent reports on human rights in the tech industry, we argue that the type of content regulation Germany has put in place can have a number of unintended effects, some of which may be more harmful than the sought-after benefit. Definitions of “harmful” differ significantly, and repressive regimes may use the German model as an excuse to force online platforms to censor free speech. Among the countries already citing the German model are Russia, Singapore, and the Philippines. We therefore recommend refraining from adopting overbroad legislation regulating content. Instead, we urge governments to require increased transparency around the sponsorship of online political advertising, improve collaboration between industry and government, and strengthen coordination among government bodies. Taken together, such measures can help strike the delicate balance between eliminating harmful content and preserving free speech online.