Australia: eSafety Commissioner adopted report on measures against child sexual exploitation and abuse implemented by Apple, Discord, Google, Meta, Microsoft, Snap, Skype, and WhatsApp

Description

On 1 August 2025, the eSafety Commissioner adopted a report on child sexual exploitation and abuse and sexual extortion under Australia’s Online Safety Act 2021. The report reviews how eight providers, namely Apple, Discord, Google, Meta, Microsoft, Snap, Skype, and WhatsApp, are tackling child sexual exploitation and abuse (CSEA) and sexual extortion on their services. Covering the period from 15 June to 15 December 2024, the report examines how the providers meet the Basic Online Safety Expectations, including user safety measures, reporting tools, and detection methods such as hash-matching and language analysis. The report highlighted that Apple did not detect CSEA livestreaming on FaceTime or new abuse material on iCloud email, and that Discord did not detect abuse in Go Live livestreams and analysed language for sexual extortion only in direct messages. It also highlighted that Google did not detect new abuse material on Gmail, Google Meet, or Messages and had limited reporting features, and that WhatsApp had slow response times and no specific CSEA reporting category. The report further warned about the risk of generative artificial intelligence being used to create synthetic abuse material.

Scope

Policy Area
Content moderation
Policy Instrument
Content moderation regulation
Regulated Economic Activity
cross-cutting
Implementation Level
national
Government Branch
executive
Government Body
other regulatory body

Complete timeline of this policy change

2025-08-01
under investigation