On 5 February 2026, the eSafety Commissioner published the second in a series of four transparency reports summarising how eight technology companies are addressing child sexual exploitation and abuse (CSEA) on their services. The report evaluates responses to legally enforceable periodic transparency notices issued on 22 July 2024 under the Online Safety Act 2021 to Apple, Discord, Google, Meta, Microsoft, Skype, Snap, and WhatsApp. It focuses on measures to address CSEA material and activity, including real and AI-generated child sexual abuse material, livestreamed abuse, online grooming, and sexual extortion. The report identifies safety gaps, including inadequate detection of live CSEA in video calling services and insufficient proactive detection of newly created CSEA material: for example, Meta did not use tools to detect live CSEA on Messenger, Google did not use tools to detect live CSEA on Google Meet, and Apple did not use tools to detect new CSEA material across its services, relying instead on user reporting. The report also notes improvements, including expanded detection of known CSEA material and faster handling of reports, such as Snap reducing the time taken to reach moderation outcomes after a report of CSEA material. The periodic notices require two further reporting rounds in March 2026 and August 2026, and failure to comply with a mandatory transparency notice may incur financial penalties of up to AUD 825,000 per day.