On 9 September 2025, the eSafety Commissioner registered the Relevant Electronic Services Online Safety Code (Class 1C and Class 2 Material) under the Online Safety Act 2021. The Code establishes a regulatory framework for communication and gaming platforms serving Australian users, setting out obligations to manage harmful content and protect children. Services are categorised into types such as closed communication services, general communication services, dating services, gaming services with varying communication features, enterprise services, and telephony services, with compliance obligations tailored to each type's functions and risk profile.

A three-tier risk assessment system classifies services as high, moderate, or low risk based on the likelihood that Australian children will encounter harmful content, including online pornography, self-harm material, and high-impact violence material (an illustrative sketch of this tiering logic appears at the end of this summary). Some service categories are pre-assessed and do not require formal risk assessments. The Code includes specific provisions for AI companion chatbot features, requiring separate risk assessments for restricted content categories, and services designed solely to generate harmful content are automatically classified as high risk.

Universal compliance measures include mandatory age assurance for services primarily used to share pornography or self-harm material, and age verification for gaming services offering R18+ content or simulated gambling.

Closed communication services face extensive obligations, including prohibitions on criminal activities such as non-consensual sharing of intimate images, grooming, and sexual extortion. They must maintain reporting mechanisms, safety tools, annual system reviews, and engagement with safety organisations. Other communication services face similar requirements, focused on preventing harmful content from being shared with children, including safety features such as message blocking, content filtering, and restrictive default settings.

Dating services must implement detection systems, reporting mechanisms, age assurance or notification measures, and user tools to limit unsolicited content, alongside continuous improvement programs. Gaming services with communication functions must maintain content moderation, user safety tools, and reporting systems, with regulatory responsibility resting on the entities that control the end-user versions of the services.

Enforcement mechanisms across all categories include mandatory reporting to eSafety, timely responses to regulator communications, referral of unresolved complaints, regular reviews of system effectiveness, and maintenance of qualified trust and safety personnel. The framework balances flexibility in implementation with consistent protection standards, acknowledging that technical feasibility and operational capacity vary across service types.
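For readers building compliance tooling, the tiering described above can be pictured as a simple decision procedure. The sketch below is purely illustrative and is not drawn from the Code's text: the category names, the numeric thresholds, and the `classify` function are all assumptions, and the Code itself defines its tiers through qualitative criteria rather than scores.

```python
from dataclasses import dataclass
from enum import Enum


class RiskTier(Enum):
    HIGH = "high"
    MODERATE = "moderate"
    LOW = "low"


# Hypothetical pre-assessed categories; the Code specifies which service
# categories are exempt from a formal risk assessment.
PRE_ASSESSED_TIERS = {
    "enterprise": RiskTier.LOW,
    "telephony": RiskTier.LOW,
}


@dataclass
class Service:
    category: str                       # e.g. "closed_communication", "dating"
    designed_to_generate_harmful: bool  # sole purpose is generating harmful material
    child_exposure_likelihood: float    # assumed 0.0-1.0 score from an assessment


def classify(service: Service) -> RiskTier:
    """Illustrative three-tier classification; thresholds are invented."""
    # Services built solely to generate harmful content are automatically high risk.
    if service.designed_to_generate_harmful:
        return RiskTier.HIGH
    # Pre-assessed categories skip the formal risk assessment.
    if service.category in PRE_ASSESSED_TIERS:
        return PRE_ASSESSED_TIERS[service.category]
    # Otherwise the tier follows the assessed likelihood that Australian
    # children will encounter harmful material on the service.
    if service.child_exposure_likelihood >= 0.66:
        return RiskTier.HIGH
    if service.child_exposure_likelihood >= 0.33:
        return RiskTier.MODERATE
    return RiskTier.LOW
```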