On 17 March 2025, the Office of Communications (Ofcom)'s illegal content Codes of Practice for user-to-user services under the Online Safety Act came into force. Providers were required to complete an illegal harms risk assessment by 16 March 2025, after which they must implement suitable measures to remove illegal content from their platforms and reduce the risk of such content appearing. The codes introduce specific content moderation obligations for social media platforms, including removing illegal content of all types and appropriately resourcing and training content moderation teams. Platforms must also provide reporting and complaints functions that users can easily find and use. Furthermore, platforms are required to test and refine their algorithms to reduce the spread of illegal content and to implement automated hash matching to detect child sexual abuse material (CSAM). They must also take down abusive material, including intimate image abuse ("revenge porn"), sexually exploitative material, and cyberflashing content.