On 11 May 2022, the European Commission released its Proposal for a Regulation laying down rules to prevent and combat child sexual abuse. The multifaceted regulatory framework would establish rules regarding online child sexual abuse material (CSAM) for a range of digital service providers, including user content hosting services, messaging services and app stores. The obligations include the use of automated technologies to detect and report CSAM and to identify and report child grooming activity.

The proposal does not impose a general monitoring obligation but instead implements a multi-step procedure that may culminate in a targeted CSAM detection order. The procedure begins with a mandatory self-assessment by providers of the risks of CSAM and grooming within their services, as well as of the measures they take to mitigate those risks. If the risks persist, a detection order limited in both time and scope can be issued by a court or an independent national authority. The detection order requires the use of technological detection measures, which the proposal does not yet specify; this has prompted debate over both their technical feasibility and their compatibility with privacy safeguards, especially end-to-end encryption. Detected CSAM must be reported to the authorities, and domestic removal orders can then be issued. Where removal is not possible, internet service providers could be tasked with making the content unviewable. In addition, app stores would be required to implement age verification and assessment procedures for applications with a high risk of CSAM and grooming.

Finally, the proposal would establish an independent "EU Centre on Child Sexual Abuse", which would determine the technologies to be used for CSAM detection and would receive reports of CSAM.