On 8 September 2025, the eSafety Commissioner opened an investigation into a technology company behind Artificial Intelligence (AI) "nudify" services used to create deepfake pornography. eSafety declined to name the company to avoid publicising its services. The company operates AI nude-image platforms that let users upload photos of real people, including minors, and its services have been used to create deepfake sexual images of Australian schoolchildren. The services attract about 100,000 Australian users each month. eSafety highlighted that the company failed to prevent the creation of child sexual abuse material and may face fines of up to AUD 49.5 million.