
Regulation (EU) 2021/784 on addressing the dissemination of terrorist content online

Jul 20, 2022, by Anais Caugant

With the emergence of new and complex cross-border and cross-sectoral security threats, the need for closer cooperation on security at all levels became apparent. For this reason, the Commission introduced the EU Security Union Strategy 2020-2025, together with its Counter-Terrorism Agenda, to focus on priority areas where the EU can bring value in supporting the Member States in fostering security for those living in Europe. This strategy builds on Commission Recommendation (EU) 2018/334 on measures to effectively tackle illegal content online, adopted in view of the continued presence of terrorist content on the web.

Indeed, as recently pointed out by the Commissioner for Home Affairs, Ylva Johansson, “with the recent mass shooting at a supermarket in Buffalo, US, which was live-streamed online, it is clear that our work on removing terrorist content online is crucial”. To that end, Regulation (EU) 2021/784 of the European Parliament and of the Council on addressing the dissemination of terrorist content online was adopted and has applied since 7 June 2022.

I. PURPOSE

The Terrorist Content Online Regulation responds to the need to tackle online content disseminated by terrorists to spread their message, to radicalise and recruit followers (as was the case when, a few days after the terrorist murder of the French teacher Samuel Paty, a similar attack took place in Nice), and to facilitate and direct terrorist activity.

Hosting service providers have a particular societal responsibility both to protect their services from misuse by terrorists (as such illegal activities have serious negative consequences for society at large, undermine the trust of their users, and damage their business models) and to help address terrorist content disseminated through their services online. However, this responsibility must be balanced against the fundamental importance of freedom of expression, since hosting service providers play an essential role in the digital economy by facilitating public debate and the distribution and receipt of information, opinions, and ideas.

Consequently, the Regulation, while providing a legal framework for removing or disabling access to terrorist content, ensures that this is done as transparently as possible and that both hosting service providers and content providers have access to an effective remedy against a removal order or decision.

II. MAIN POINTS OF THE REGULATION

The one-hour rule states that hosting service providers must remove terrorist content, or disable access to it in all Member States, within one hour of receipt of a removal order from a Member State’s competent authority. If the hosting service provider cannot comply with the removal order because of force majeure or de facto impossibility justifiable on technical or operational grounds, it shall inform the competent authority that issued the order. If it cannot comply because the order contains manifest errors or does not contain sufficient information for its execution, it shall inform the issuing authority and request the necessary clarification (article 3).

Duty of the hosting service provider to protect its services against the dissemination to the public of terrorist content in a diligent, proportionate, and non-discriminatory manner, having regard to users’ fundamental rights such as freedom of expression and information. Hence, material disseminated for educational, journalistic, or research purposes is exempted (article 5).

Transparency obligations for online platforms as well as for competent authorities, which must report annually and publicly on the amount of terrorist content removed, the outcomes of complaints and appeals, and the number and type of penalties imposed on online platforms (articles 7 and 8).

Right to information, which allows the content provider to obtain from the hosting service provider the reasons for the removal or disabling, to receive, upon request, a copy of the removal order, and to be informed of its right to challenge the removal order or decision (article 11).

Right to an effective remedy: a hosting service provider that has received a removal order or a decision shall have the right to challenge it before the competent court (article 9). Moreover, hosting service providers shall preserve the removed terrorist content and related data for six months from the removal or disabling, where necessary for administrative or judicial review proceedings or complaint handling against a decision to remove or disable access to terrorist content, and for the prevention, detection, investigation, and prosecution of terrorist offences (article 6).

In the same way, content providers whose content has been removed or whose access has been disabled shall have the right to challenge such a removal order or decision before the competent court (article 9) or to submit a complaint requesting the reinstatement of the content or of access to it. The hosting service provider shall inform the complainant of the outcome of the complaint within two weeks of its receipt (article 10).

Duty of cooperation between hosting service providers, competent authorities, and Europol: competent authorities shall exchange information, coordinate, and cooperate with each other and with Europol to enhance the coordination of information and actions. Likewise, when hosting service providers become aware of terrorist content involving an imminent threat to life, they shall promptly inform the authorities competent for the investigation and prosecution of criminal offences (article 14).

III. SANCTION OF NON-COMPLIANCE

It is for the Member States to lay down the rules on penalties applicable to infringements of this Regulation by hosting service providers, taking into account relevant circumstances such as previous infringements, whether the infringement was intentional or negligent, and the size of the platform, so as to ensure that the penalties are proportionate, in accordance with Article 49 of the EU Charter of Fundamental Rights. Financial penalties for a systematic or persistent failure to comply with removal orders may nevertheless reach up to 4% of the hosting service provider’s global turnover. Moreover, this Regulation complements the Code of Conduct to counter illegal racist and xenophobic hate speech online, launched in May 2016 with four major information technology companies (Facebook, Microsoft, Twitter, and YouTube), since such speech is often the source of subsequent terrorist action.
