On 24 July 2024, Australia's eSafety Commissioner issued legal notices to online platforms including Apple, Google, Meta, Microsoft, Discord, Snap, Skype, and WhatsApp. The notices require the companies to report to the regulator every six months on their measures for addressing online child sexual abuse. They must detail their strategies for combating child abuse material, livestreamed abuse, online grooming, sexual extortion, and, where relevant, the creation of synthetic or deepfaked child abuse material generated by artificial intelligence (AI). The notices are issued under the Basic Online Safety Expectations set out in the Online Safety Act 2021, which requires online platforms to minimise illegal and harmful content and to protect children from inappropriate exposure. The companies have until 15 February 2025 to submit their initial reports and may face financial penalties of up to AUD 782,500 per day for non-compliance.