
The EU Parliament has adopted a new and highly debated Regulation on combating the online dissemination of terrorist content (TERREG, for short). The law introduces uniform and strict take-down and due diligence obligations for hosting providers, aimed at preventing the spread of content that incites terrorist offences, provides instructions for committing such offences, or threatens to do so. Yet the debate about the proportionality of the new obligations will likely continue.
Like other bad actors, terrorist groups abuse social media and other digital channels to spread propaganda, sow hatred, and recruit new members. This became painfully evident during the rise of ISIS, and public awareness of the internet as a global forum for spreading terrorist hate has grown ever since.
The law often lags behind in effectively addressing new Internet phenomena – ISIS started publishing its gruesome video messages in 2014. And while it is crystal clear that such content has no legitimate place anywhere, some boundaries may prove less obvious. In the fight against terrorist content, lawmakers are tasked with balancing public security concerns against constitutional freedoms. Depending on background and context, some content may not be manifestly illegal and will require a thorough factual and legal assessment. Yet the pace of the Internet typically does not allow for lengthy consideration when potentially harmful content can reach a global audience within seconds. Under these circumstances, it can be hard to strike the right balance and, in particular, to ensure that freedom of expression and information is duly preserved.
In the fight against any type of illegal content, including terrorist content, lawmakers around the world are requiring the operators of online platforms to take forceful action. Over the last few years, this has resulted in a patchwork of different approaches and due diligence obligations that can be challenging for platforms to comply with – a trend that we have also seen in various EU Member States:
Germany sought to address the dissemination of terrorist content as part of its much debated 2017 Network Enforcement Act (“Netzwerkdurchsetzungsgesetz”). The act, among other things, requires social media platforms to remove or disable “manifestly unlawful content” within 24 hours after receiving a complaint.
France followed in 2020 with the so-called “Loi Avia” (Loi n° 2020-766 visant à lutter contre les contenus haineux sur internet). This law went further than its German counterpart, requiring certain service providers to remove terrorist content within one hour of notification by the authorities. The Conseil Constitutionnel later held that this and other provisions were incompatible with freedom of expression under the French constitution.
Poland took a different direction and published a bill earlier this year that instead limits social media operators’ discretion to remove content (covered in this publication). Yet courts and authorities can still order the blocking of access to websites associated with terrorist activities under the Polish 2016 Act on Anti-Terrorist Activities.
In view of the fragmented individual approaches of the Member States, the EU Commission took harmonising action and published a draft proposal for a “Regulation on preventing the dissemination of terrorist content online” (TERREG) in 2018. Despite considerable criticism from Internet activists and NGOs, TERREG made its way through the legislative process and was adopted by the European Parliament on 28 April 2021 (text available here).
TERREG is wider in scope than its French and German counterparts. It will apply to all hosting service providers offering services in the EU, Art 1(2), including providers established outside the EU as long as their services have a “substantial connection” to the EU. By virtue of its nature as a Regulation, TERREG will be directly applicable in all EU Member States, without the need for national implementing legislation.
TERREG will introduce a comprehensive set of obligations tailored to combating terrorist offences and content as defined in the corresponding Directive (EU) 2017/541 on combating terrorism.
These rules and obligations will be enforced through penalties: a systematic or persistent failure to execute removal orders issued by the competent authorities within the one-hour deadline may lead to penalties of up to 4% of the provider’s global annual turnover of the preceding business year, Art 18(3).
TERREG and the Digital Services Act (DSA)
With the ambitious new online regime of the Digital Services Act (DSA) already on the horizon (see our related publication and our dedicated topic center and taskforce), TERREG adds a further layer of obligations that hosting providers need to observe to ensure they safely operate in the increasingly dense regulatory framework for online service providers. TERREG and the Digital Services Act do share several concepts and definitions, such as their applicability to non-EU service providers, complaint mechanisms, and information and transparency obligations.
But as always, the devil lies in the detail. For instance, Art 7(2) TERREG requires hosting service providers to publish transparency reports before 1 March each year, whereas the DSA does not spell out a specific due date for such reports. The two reporting obligations will overlap where removal orders target terrorist content. Conversely, Art 17(1) DSA prescribes that online platforms must accept complaints for at least six months after a removal, whereas Art 10 TERREG does not (directly) set any time limit.
It is to be hoped that amendments to the current draft of the Digital Services Act, published in December 2020, will smooth out potential friction and inconsistencies between the DSA and TERREG in the ongoing legislative process.
TERREG will shortly be published in the Official Journal of the EU. The law will enter into force 20 days later and become applicable after 12 months. Hosting service providers should therefore ensure compliance with the prescribed measures by mid-2022 – which means that now is the time for providers to review their existing content compliance processes and make adjustments where necessary. Providers are also well advised to already factor in the additional requirements expected under the DSA.
Once TERREG takes effect, providers will also need to closely monitor how individual Member States and their courts approach the new regime. This is particularly pertinent for France: as noted above, the Conseil Constitutionnel has already found that an obligation to comply with an administrative removal order within one hour, without judicial supervision, violates freedom of expression. Art 3(3) TERREG now introduces an almost identical provision, backed by even graver penalties. Further debate is bound to ensue – and may eventually make its way to the Court of Justice of the European Union.
In the meantime, follow our posts on Engage to keep up to date on regulation of online services, and reach out to our dedicated taskforce for detailed advice.
Authored by Anthonia Ghalamkarizadeh and Florian Richter