On 30 June 2025, the Office of Communications (Ofcom) opened a consultation, running until 20 October 2025, on the draft Illegal Content Codes of Practice for user-to-user services. These codes set out recommended content moderation measures to help providers comply with their illegal content safety duties.

All services must maintain a content moderation function capable of reviewing suspected illegal content (ICU C1) and ensuring its swift takedown (ICU C2), unless this is technically infeasible. Large or multi-risk services must also establish internal content policies (ICU C3), performance targets (ICU C4), prioritisation strategies (ICU C5), adequate resourcing (ICU C6), and training and materials for moderators (ICU C7–C8). Additional obligations apply to high-risk or large services, including the use of perceptual hash-matching to detect and remove child sexual abuse material (CSAM) (ICU C9), the detection of listed CSAM URLs (ICU C10), and proactive technology assessments (ICU C11–C12).

The proposed updates form part of broader changes, including new content moderation measures (ICU C11–C16), enhanced reporting requirements (ICU D15–D17), recommender system safeguards (ICU E2), settings (ICU F3), and user access (ICU H2–H3). Modifications affect existing provisions on content moderation (ICU C1, C4, C9), reporting (ICU D8–D11, D13), settings (ICU F1–F2), terms of service (ICU G1), and user controls (ICU J1–J2).