Description

Office of Communications released guidance on online gaming compliance under the Online Safety Act

On 14 October 2025, the Office of Communications (Ofcom) released guidance outlining the regulatory obligations of online video game providers under the Online Safety Act. The guidance specifies that the Act applies to online services with user-to-user functions that enable users to create, share or upload content that can be encountered by others, including features such as matchmaking, in-game chat, livestreaming, and user-generated avatars or environments. It emphasises Ofcom’s role in enforcing online safety duties relating to illegal content and content harmful to children. The guidance references Ofcom’s Register of Risks covering 17 categories of illegal content and 12 types of content harmful to children, including terrorism, grooming, harassment, and violent or bullying content. Providers are instructed to assess risks, implement safety measures, conduct children’s access and risk assessments, and maintain records in accordance with the Act’s safety and related duties.

Original source

Scope

Policy Area
Content moderation
Policy Instrument
Content moderation regulation
Regulated Economic Activity
platform intermediary: user-generated content, software provider: other software
Implementation Level
national
Government Branch
executive
Government Body
other regulatory body

Complete timeline of this policy change

2025-10-14
adopted
