On 1 January 2026, the Online Safety Act 2025, including its content moderation provisions, entered into force. The Act applies to licensed service providers, namely Application Service Providers (ASPs), Content Applications Service Providers (CASPs), and Network Service Providers (NSPs), and requires them to implement risk-based safety measures, child protection safeguards, and reporting mechanisms.

The Act regulates the dissemination of harmful online content, including child sexual abuse material, financial fraud, harassment, incitement to violence or terrorism, and other harmful categories. Content related to child sexual abuse and financial fraud is designated as "priority harmful content" and is subject to additional regulation.

ASPs and CASPs must implement measures to reduce users' exposure to harmful content, including robust content moderation, child-specific safeguards, and tools that allow users to manage their own safety, such as communication restrictions, reporting mechanisms, and assistance channels. Platforms must also restrict children's access to harmful content by limiting their communication with adults, regulating recommendation systems, addressing addictive features, and protecting their personal data.

To ensure compliance, transparency, and accountability, providers must submit an Online Safety Plan to the Malaysian Communications and Multimedia Commission (MCMC) and make it publicly available.