A Clear-Eyed Look at the EU’s Digital Rules and Their Free Speech Implications for the US

September 10, 2025

Last Thursday, the House Judiciary Committee held a hearing titled “Europe’s Threat to American Speech and Innovation” to examine the European Union’s Digital Services Act (DSA) and Digital Markets Act (DMA). The proceedings, featuring four expert witnesses, quickly devolved into familiar political theater: posturing, interruptions, and excursions into unrelated but highly charged topics. As a result, the hearing did little to clarify the actual content of the European regulations or their potential implications for the United States.

In contrast, two letters submitted ahead of the hearing — one to Representative Jim Jordan, who chairs the Committee, and the other to Commissioner Henna Virkkunen, who oversees the European Commission’s work on tech sovereignty, security, and democracy — offer a more substantive assessment. Both letters, signed by more than 30 digital rights scholars from across the EU and US, raise important questions about the nature of EU legislation and its possible impact on Americans’ free speech rights. Below, I highlight key points from those letters and add my commentary.

  1. The central aim of the DSA is to enhance user autonomy and rights in the face of largely unchecked platform power. It does so by enshrining procedural rights, such as the ability to appeal content moderation decisions; giving users more control over their online experience, such as the option to opt out of personalized recommendations; and banning manipulative design practices (“dark patterns”).

  2. The DSA also requires platforms to remove illegal speech. Article 9 establishes a notice-and-action mechanism, obliging platforms to comply with orders from national administrative or judicial authorities “to act against one or more specific items of illegal content.” “Illegal content” refers to material deemed unlawful under EU law or the domestic laws of member states. Such obligations are not unusual: the US government similarly compels platforms to remove illegal content in areas such as copyright, fraud, defamation, and incitement.

  3. Generally, takedowns should apply only within the jurisdiction where the content is illegal. The European Commission has stated that “[w]here a content is illegal only in a given Member State, as a general rule it should only be removed in the territory where it is illegal.” The phrase “general rule” introduces some ambiguity, however. Recitals to the DSA (interpretative guidance) further state that “in a cross-border context, the effect of the order should in principle be limited to the territory of the issuing Member State, unless the illegality of the content derives directly from Union law or the issuing authority considers that the rights at stake require a wider territorial scope, in accordance with Union and international law, while taking into account the interests of international comity.”

    This ambiguity regarding possible extraterritorial effects could be resolved through a political agreement. The letter’s authors specifically propose that “the US and the EU conclude a political agreement outlining that neither will attempt extra-territorial application of its speech rules at the expense of the rights of its citizens.”

  4. The DSA contains a few provisions affecting lawful speech, but their enforcement is meant to be “content-agnostic,” consistent with EU law. Articles 34 and 35, in particular, require a subset of “very large” online platforms and search engines to assess and mitigate certain “systemic risks” to fundamental rights, civic discourse, electoral processes, and public health. Recital 84 specifies that services should “pay particular attention [to] how their services are used to disseminate or amplify misleading or deceptive content, including disinformation.”

    While this language suggests platforms are expected to address disinformation — even when it constitutes lawful speech — the letter’s authors, including European legal scholars, argue that enforcement of the systemic risk provisions must remain content-agnostic to align with EU law. They acknowledge that these provisions could be abused to suppress lawful speech, but note that such use would run counter to both foundational EU principles and the DSA’s explicit requirement that obligations be carried out in line with the right to freedom of expression. To minimize confusion and prevent abuse, they recommend codifying “content-agnosticity” more explicitly.

In short, characterizing the DSA as a “censorship law” is highly misleading. The legislation primarily aims to enhance user rights and establish clearer procedures for content moderation, while requiring platforms to address only illegal content and systemic risks in a content-agnostic manner. Still, the European Commission could do more to clarify remaining ambiguities.
