Announced AGCM investigation into TikTok over alleged failure to remove content classified as dangerous

On 21 March 2023, the Italian Competition Authority (Autorità Garante della Concorrenza e del Mercato, AGCM) opened an investigation into TikTok over its alleged failure to moderate content displayed on the platform. The AGCM claims that TikTok has not applied its own content moderation guidelines and has failed to remove harmful content, such as content inciting suicide, self-harm and improper nutrition. The investigation was opened following reports of a growing number of "French scar" videos depicting young individuals, including minors, engaging in "self-injurious behaviour." The AGCM also stated that it would investigate TikTok's content recommendation algorithm, alleging that the artificial intelligence techniques TikTok employs are capable of exerting "undue influence" on users. Finally, the AGCM will examine the measures TikTok has implemented to address harmful content displayed and advertised on its platform.

Policy Area
Content moderation
Policy Instrument
Content moderation regulation
Regulated Economic Activity
online advertising provider, platform intermediary: user-generated content
Implementation Level
Government Branch
Government Body
competition authority

Complete timeline of this policy change

under deliberation

On 21 March 2023, the Italian Competition Authority (Autorità Garante della Concorrenza e del Merca…

in force

On 6 March 2024, the Italian Competition Authority (AGCM) ruled in an investigation into TikTok ove…