Adopted on 24 August 2021, Article 42 of the law confirming compliance with the principles of the Republic requires certain online platform operators (in particular social networks, video-sharing platforms and search engines) to combat illegal hate content more effectively, and places their moderation of such content under supervision.

The text provides for two levels of obligations for these operators, depending on their audience on French territory.

First level of obligations for operators whose audience exceeds 10 million unique monthly visitors

Operators whose audience exceeds a threshold of 10 million unique monthly visitors will be subject to a set of obligations relating in particular to cooperation with law enforcement, to the establishment of mechanisms for notifying illegal hate content and handling such notifications, and to transparency regarding the moderation of such content.

Second level of obligations for “systemic” players, whose audience exceeds 15 million unique monthly visitors

“Systemic” players, exceeding a second threshold of 15 million unique monthly visitors, will be subject to additional obligations: they will have to assess the risks of the dissemination of illegal hate content on their services and take measures to combat this dissemination, while ensuring that freedom of expression is preserved.

Supervision of this new mechanism is entrusted to the Autorité de Régulation de la Communication Audiovisuelle et Numérique (Arcom), which may impose sanctions of up to 6% of an operator’s global turnover. Thanks to it, all the main social networks, video-sharing platforms and search engines will be even more strongly involved in the fight against online hate.

These new obligations will remain in force until the entry into force of the European regulation on digital services (Digital Services Act) and at the latest until 31 December 2023. The national provisions will then be replaced by those of the European regulation.