Australia: eSafety Commissioner registered Social Media Services (Messaging Features) Online Safety Code (Class 1C and Class 2 Material)

Description

On 9 September 2025, the eSafety Commissioner registered the Social Media Services (Messaging Features) Online Safety Code (Class 1C and Class 2 Material). The Code applies to the private messaging components of social media platforms, focusing on Class 1C and Class 2 harmful material. Unlike other schedules that cover entire platforms, it applies solely to instant messaging features enabling private communication between users; public posting functions such as comments and community posts are addressed under the main social media code. The framework establishes compliance requirements for all messaging features, reflecting the risks associated with private messaging.

Obligations include prohibitions on sharing online pornography with Australian children and a requirement to maintain clear, accessible terms and conditions. Providers are expected to take proportionate action when breaches occur, while retaining flexibility to respond appropriately to specific circumstances.

Safety infrastructure requirements include integrated reporting mechanisms that protect reporter anonymity and provide clear guidance. Providers must maintain trained personnel to handle reports and conduct annual reviews of system effectiveness.

Technical measures include user control tools such as message blocking, settings to prevent unwanted messages, group chat exit options and, for child accounts, restrictive default privacy settings. Additional protections may involve automated nudity detection and restrictions preventing unknown adults from contacting children directly. Providers are also expected to regularly review and improve safety tools through research, collaboration with safety organisations, industry engagement, and refinement of algorithms.

Transparency requirements include publishing information on available safety tools, eSafety's role, complaint processes, and guidance on safe platform use.

Scope

Policy Area
Content moderation
Policy Instrument
Content moderation regulation
Regulated Economic Activity
platform intermediary: user-generated content, messaging service provider
Implementation Level
national
Government Branch
executive
Government Body
other regulatory body

Complete timeline of this policy change

2025-09-09
adopted

On 9 September 2025, the eSafety Commissioner registered the Social Media Services (Messaging Featu…

2026-03-09
in force

On 9 March 2026, the Social Media Services (Messaging Features) Online Safety Code (Class 1C and Cl…