On 20 April 2022, the UK Government opened a public consultation on the Online Safety Bill, which includes a range of content moderation requirements. The bill introduces a number of duties of care for providers of user-to-user services, including assessing the likelihood of children accessing the service, assessing the risk of illegal content, protecting users against privacy infringements, safeguarding freedom of expression, and meeting reporting duties. "Category 1" (user-to-user services) and "Category 2A" (search engines) companies, which are the largest online platforms (with threshold conditions to be set by further regulation made by the Secretary of State in consultation with Ofcom), are required to address content "harmful to adults" that falls below the threshold of a criminal offence. The version of the bill introduced into Parliament requires such platforms to address "legal but harmful" categories of content to be set out in secondary legislation. Modifications from the draft version include: criminalising "cyberflashing", bringing paid-for scam adverts into scope, requiring websites hosting pornography to verify users' ages, and requiring social media firms to give people the ability to control who can interact with them, to block anonymous trolls, and to control which posts are visible to them.