DPA Digital Digest: United Kingdom [2024 Edition]

A close-up of the United Kingdom’s regulatory approach to data governance, content moderation, competition, artificial intelligence, and more.

The “DPA Digital Digest” series provides concise summaries of each G20 nation’s digital policy. Based on the Digital Policy Alert database, we outline rules and enforcement cases in data governance, content moderation, competition, artificial intelligence, and domestic points of emphasis.

Authors

Tommaso Giardini, Nils Deeg, Anna Pagnacco

Date Published

21 Aug 2024

The United Kingdom (UK) is attempting to leverage post-Brexit regulatory autonomy to boost its digital economy. The UK’s ICT sector grew by over 11% in 2023, according to the OECD. The UK government estimates that the digital sector is growing almost three times faster than the total economy. 

To foster this growth, the UK’s five-point plan for digital trade focuses on open digital markets, free data flows, enhanced consumer safeguards, improved digital trading systems, and increased international cooperation. To facilitate international cooperation, in November 2023, the UK hosted the first AI Safety Summit.

But what do the UK’s domestic digital policies stand for? Our Digital Digest provides a succinct overview of the latest policy and enforcement developments in major policy areas and UK-specific points of emphasis.

  • Data governance: The UK is considering novel data protection rules and striving to harness data flows, including to the United States, while maintaining EU adequacy.

  • Content moderation: The UK is implementing its landmark Online Safety Act and deliberating new measures on media content and sexually explicit deepfakes.

  • Competition policy: The UK is implementing a novel digital competition regime and scrutinising digital firms with strategic market status, especially regarding their mergers.

  • Artificial intelligence: The UK has considered but not yet adopted rules, while various regulatory authorities are investigating AI companies across different policy areas.

  • Points of emphasis: The UK focuses on minor protection, the taxation of the digital economy, and digital currencies.

Discover the details of the United Kingdom's regulatory approach on our dedicated country page.

Remain up-to-date on new and upcoming developments with our free notification service.

Written by Tommaso Giardini, Anna Pagnacco and Nils Deeg. Edited by Johannes Fritz.


Data governance

Policy developments

The new government, established in July 2024 after the general election, has announced but not yet advanced the Digital Information and Smart Data Bill. The bill text is not yet publicly available.

The previous Parliament failed to adopt the Data Protection and Digital Information Bill prior to its dissolution in May 2024. The Bill would have amended the 2018 Data Protection Act and the UK General Data Protection Regulation (GDPR), which carried the European Union’s (EU) GDPR into domestic law after Brexit. The Bill aimed to reduce compliance burdens by introducing exemptions for cookies, reducing record-keeping obligations, enabling the refusal of data access requests, and removing the local representation requirement for non-UK controllers. The Bill would also have replaced the Information Commissioner’s Office (ICO) with an Information Commission.

In December 2023, the amended data protection regulations replaced references to fundamental rights under EU law in the UK GDPR and the 2018 Data Protection Act with references to the European Convention on Human Rights. 

The new government prioritises cybersecurity and has announced a Cyber Security and Resilience Bill. This prioritisation builds on the previous government, which adopted the 2022 National Cyber Strategy and Government Cyber Security Strategy. Under the UK GDPR, data processors and controllers must implement risk-appropriate cybersecurity measures and report data breaches to the ICO within 72 hours (as well as report high-risk breaches to data subjects).

In April 2024, security standards for manufacturers of connectable products entered into force. Regarding electronic communications, the Office of Communications (Ofcom) requires operators of essential services to report breaches that last over 15 minutes or cause network degradation of 25%. Ofcom’s code of practice, implementing the Electronic Communications Regulations 2022, imposes risk minimisation measures on public electronic communications networks. To facilitate cybersecurity compliance, the government developed a “check your cybersecurity” tool and outlined steps for responding to cyber threats. Currently, the government is deliberating regulations on software cybersecurity.

Data transfer/localisation developments

Currently, the UK enables data transfers through 1) adequacy decisions, granted by the Secretary of State to “essentially equivalent” regimes, 2) appropriate safeguards, such as standard data protection clauses and binding corporate rules, subject to risk assessment, or 3) specific exceptions, such as consent and contractual necessity. 

Regarding adequacy, the UK recently concluded adequacy arrangements with the United States (US), the EU, and South Korea. Regarding safeguards, the ICO issued guidance on the International Data Transfer Agreement (replacing EU Standard Contractual Clauses), binding corporate rules, and risk assessments.

Data flows are also part of the UK’s international trade agreements and digital partnerships. In December 2022, the UK concluded negotiations to accede to the Comprehensive and Progressive Agreement for Trans-Pacific Partnership, which demands free data flows. In addition, data flows are the subject of the UK-US Atlantic Declaration (establishing a “data bridge”), the UK-Singapore Digital Economy Agreement and the UK-Japan Digital Partnership, as well as negotiations to modernise free trade agreements with Canada and South Korea.

Guidelines and enforcement developments

The ICO regularly issues guidelines on data protection. In 2024, the ICO adopted guidance on data protection in online content moderation and cyber security incident reporting, as well as a code of practice for data protection in journalism. Currently, the ICO is considering guidelines on “consent or pay” business models and employee data sharing, as well as its draft enterprise data strategy. Furthermore, consultations on cybersecurity regulation are underway regarding a governance code of practice, resilience for communications providers, and critical infrastructure.

The ICO’s enforcement rarely results in fines.

  • In April 2023, TikTok was fined GBP 12.7 million for processing the data of children under 13 years without parental consent and for not adequately informing users about its data collection and sharing practices.

  • In May 2024, the ICO opened an investigation into Microsoft regarding safeguards to protect user privacy. In April 2024, the Upper Tribunal dismissed an appeal by the ICO against a ruling that found Experian had breached data protection law but that the initial penalty was too severe.

Other agencies also pursue data-related enforcement. The Financial Conduct Authority fined credit reporting agency Equifax for a data breach in October 2023. The Competition and Markets Authority (CMA) is currently investigating Meta's use of data obtained through digital display advertising, specifically evaluating proposed commitments by Meta.

Content moderation

Policy developments

The Online Safety Act, the UK’s comprehensive framework against harmful online content under the purview of the Office of Communications (Ofcom), entered into force in January 2024. The law demands content monitoring from user-to-user services and search service providers that have “links with the UK.” Basic obligations apply to all regulated services, while additional obligations apply to certain categories of providers based on user thresholds and conditions which are yet to be finalised by the government. 

All regulated services must comply with content management and age verification practices in relation to minors, who must be prevented from accessing harmful content. Further, all regulated services must carry out illegal content risk assessments, remove or prevent illegal content, and operate content reporting and complaint procedures. The Act also creates a number of criminal offences, such as the sharing of intimate images without consent. 

Categorised user-to-user services have extended content moderation obligations, including the removal of content that violates the platform’s own terms of service (beyond illegal content). Major platforms must provide content control functions for adult users, among other specific obligations. 

In April 2024, an amendment to the Criminal Justice Bill to criminalise the creation of sexually explicit deepfakes was announced but failed to pass before the end of the parliamentary session. 

Audiovisual and media content has been a policy priority since 2020, when the UK withdrew from the EU’s Audiovisual and Media Services Directive and video-sharing platform framework. The Media Act, which received Royal Assent in May 2024, requires Ofcom to assess whether “on-demand programme service providers” have adequate content warnings, parental controls, and age verification. The Act also applies to foreign providers that target UK audiences.

Guidelines and enforcement developments

The Online Safety Act empowers Ofcom to impose fines and draft regulations and codes of practice. Ofcom has conducted consultations on its enforcement approach, super-complaints, illegal content risk assessments, record keeping, and pornographic content. It is also deliberating codes of practice for user-to-user and search services, as well as for harmful content protection. In March 2024, Ofcom recommended a system to classify platforms under the Online Safety Act, to guide the government in determining the thresholds for user-to-user and search services which are subject to additional obligations.

Beyond the Online Safety Act, in January 2024, Ofcom issued a report finding that major search engines can act as gateways to harmful, self-injury-related content. In July 2023, the government announced a pornography review to tackle exploitative and abusive content online. Finally, Ofcom, as the statutory regulator of video-sharing platforms, published guidance on protective measures and notification requirements.

There are currently no public, official sources on the UK government’s general online content enforcement. Ofcom will enforce the Online Safety Act once it is fully implemented; to date, its enforcement has focused on minor protection (see the dedicated section below).

Competition

Policy developments

In May 2024, the UK adopted the Digital Markets, Competition and Consumers Act. The Act introduces an ex-ante regulatory regime for companies with strategic market status in digital markets. The Competition and Markets Authority (CMA) can designate companies with substantial market power and a position of strategic significance regarding a digital activity, if their annual turnover exceeds GBP 25 billion (global) or GBP 1 billion (local). Designated firms are subjected to conduct requirements and structural remedies. 

In addition, the Act imposes new merger rules. Designated firms must notify the CMA of intended UK acquisitions if the transaction value exceeds GBP 25 million and the acquired equity stake exceeds specified thresholds. For all acquiring firms, mandatory merger notification is triggered by a UK share of supply exceeding 33% and UK turnover exceeding GBP 350 million. In addition, the Act grants the CMA sanctioning powers, prohibits certain unfair commercial practices, such as fake reviews, and regulates anti-competitive agreements.

The Act will be implemented through secondary legislation. In January 2024, the CMA adopted its provisional approach for the implementation of Part I of the Act. Further, the CMA consulted on guidance on the digital competition regime and reporting requirements for designated firms. 

Guidelines and enforcement developments

The CMA, equipped with a Digital Markets Unit since 2021, rigorously enforces both unilateral conduct, focusing on large firms, and merger rules. 

  • In July 2024, the CMA opened a consultation on the revised approach to Google's "Privacy Sandbox," a user-choice prompt enabling users to select whether to retain or remove third-party cookies from Chrome. 

  • In May 2024, the CMA consulted on Meta’s revised commitments regarding the use of data obtained through digital display advertising for Facebook Marketplace. The CMA’s investigation into Amazon for disadvantaging third-party sellers in its marketplace was closed in November 2023 following the acceptance of commitments.

  • The CMA is also investigating Apple’s alleged abuse of market power in its App Store, while consulting on its intention to accept Google’s commitments in a similar case.

  • Google is currently under investigation regarding advertising and header bidding services. The case previously also scrutinised the Jedi Blue agreement with Meta to exclude Google’s competitors from the advertising market. In addition, Google and Amazon are under investigation for fake and misleading reviews.

  • In November 2023, the Court of Appeal confirmed that the CMA’s investigation into Apple and Google’s “effective duopoly” over mobile browsers and cloud gaming could proceed, after it had been appealed due to a procedural delay.

The CMA also scrutinises mergers in digital markets. 

  • In April 2023, the CMA first blocked the Microsoft/Activision Blizzard merger due to competition concerns in the cloud gaming market, but then approved the merger after Microsoft appealed and restructured the transaction. 

  • Previously, the CMA blocked and ordered divestment in the Dye & Durham/TM Group merger and the Facebook/Giphy merger. The latter blocking gained prominence because Giphy had no sales in the UK and the investigation was conducted ex-post, including a record fine of GBP 50 million for non-compliance with the initial enforcement order. In June 2023, the CMA approved Meta’s sale of Giphy to Shutterstock. 

The CMA’s investigations of the Qualcomm/Autotalks, Adobe/Figma, and NVIDIA/Arm transactions closed following the parties’ decisions to abandon the respective deals. The CMA has also approved mergers, including Broadcom/VMware, Amazon/iRobot, Facebook/Kustomer, Microsoft/Nuance, NortonLifeLock/Avast and Viasat/Inmarsat. Currently, the CMA is investigating the proposed Vodafone/Hutchison merger.

Artificial Intelligence

Policy developments

The UK is striving for a pro-innovation approach on AI regulation. Ensuring the right national and international governance of AI is one of three pillars of the UK AI strategy and its subsequent action plan, along with investments and support for the AI transition. 

Recently, the Department for Science, Innovation and Technology urged regulators to update their strategic approach to AI and consulted on a code of practice for AI cybersecurity. The Department also issued guidance on AI assurance and on implementing the AI regulatory principles. In May 2024, the newly established AI Safety Institute, part of the Department, released its safety evaluations platform.

Legislative proposals regarding AI have spanned several policy areas but have not advanced. The AI Regulation Bill, introduced in November 2023, lapsed with Parliament’s dissolution in May 2024; it included provisions on copyright, data transparency, AI-content labelling, and audits, as well as the establishment of an AI authority. Another proposal aimed to regulate AI use in the workplace.

Guidelines and enforcement developments

Several government agencies are focusing on AI through guidelines and enforcement. 

Regarding enforcement, several authorities have scrutinised AI, especially the CMA. 

  • The CMA opened investigations into AI partnerships between Microsoft and Mistral AI, between Amazon and Anthropic, and between Microsoft and OpenAI. The latter two investigations are still ongoing. 

  • The ICO cooperated with the Australian authority in an investigation into Clearview AI’s collection of images of people to train its facial recognition system (under appeal). The ICO’s investigation into Snap over its "My AI" feature was concluded after the company changed its policies concerning risk mitigation.

  • In December 2023, the UK Supreme Court held that the inventor of a patent must be a natural person, following the rejection of proposed guidelines on patent applications relating to AI inventions in November 2023. 

Points of emphasis

Minor protection

Minor protection is central to the UK’s digital policy and spans several policy areas. Since September 2021, the Age Appropriate Design Code (Children’s Code) has applied to providers of online services that are likely to be accessed by children under 18. The Code contains 15 standards that require reduced data collection from minors, privacy by default, and age verification, among others. The ICO published guidelines, including on the definition of “likely to be accessed” and compliance measures for game designers.

In 2024, the ICO published its priorities in the 2024-2025 Children's Code Strategy as well as its updated opinion regarding age assurance for the Children's Code, including guidance for online services and data protection compliance. In September 2023, the government adopted guidance on end-to-end encryption and child safety and on sharing information in a way that safeguards children.

In addition, Ofcom and the ICO consulted on children’s online risks and children’s online safety and privacy, in the context of the Online Safety Act. Regarding gaming, the Department for Digital, Culture, Media & Sport concluded that children should not have access to loot boxes in video games without parental supervision. Finally, the UK Gambling Commission fined online betting provider Betway for marketing on the children’s pages of a football club’s website.

Ofcom also focuses its enforcement on age verification. 

  • In May 2024, Ofcom opened an investigation into OnlyFans regarding age verification measures. 

  • In the same month, Ofcom decided not to open a formal investigation into Twitch as it determined the platform had complied with its obligation to protect minors from harmful video content. 

  • In December 2023, the same month as it published its report on minor protection in video-sharing platforms, Ofcom opened an investigation into TikTok for potential non-compliance with a formal information request related to its parental control systems.

Taxation

The UK has advanced rules on both direct and indirect taxation of the digital economy. In October 2021, the UK announced that it would repeal its Digital Services Tax (DST) in view of international negotiations on the OECD/G20 Inclusive Framework. Adopted in 2020, the DST of 2% applied to revenues derived from specific "digital services activities," e.g. social media, online marketplace platforms and search engines, that were attributable to UK users. The DST targeted companies that generated revenues exceeding GBP 500 million (global) and GBP 25 million (local) through such services.

The DST prompted an investigation by the United States (US) Trade Representative, which announced punitive tariffs. In 2021, the US and the UK reached a political agreement under which the UK will withdraw the DST upon entry into force of Pillar 1 of the Inclusive Framework, while the US defers its tariffs. In February 2024, the countries agreed to extend the terms of this agreement until the end of June 2024. In December 2023, the UK bill implementing the OECD/G20 Inclusive Framework for the taxation of the digital economy entered into force.

In November 2022, the government decided not to introduce an Online Sales Tax due to its elevated complexity and risks of evasion and market distortion. The government had previously consulted on a tax of 1-2% on online sales to decrease disparities with in-store retailers. Since January 2021, online marketplaces facilitating sales of overseas goods to UK customers have been responsible for collecting value-added tax.

Digital currencies

The UK is actively updating its regulatory framework for digital assets. In June 2023, the Financial Services and Markets Act was adopted, granting regulatory powers in relation to digital assets to the Treasury, the Financial Conduct Authority (FCA), and the Bank of England. 

Following the law’s adoption, digital securities sandboxes were established to facilitate the controlled adoption of innovative digital assets under the management of the Bank of England and the FCA. The Treasury has further issued plans for the regulation of fiat-backed stablecoins. Both Parliament and the Law Commission have inquired into regulatory reforms, while the government has consulted on approaches to taxing crypto assets. Finally, the Bank of England and the Treasury consulted on a central bank digital currency.

In October 2023, the Economic Crime and Corporate Transparency Act received Royal Assent, broadening law enforcement authorities’ powers to seize crypto assets related to criminal activities. The FCA, a central authority in the oversight of crypto asset marketing, has issued financial promotion rules and compliance guidance. The FCA further restricted a peer-to-peer lending platform’s ability to approve crypto asset financial promotions. In March 2024, the FCA stated that it approved the creation of a UK listed market segment for crypto asset Exchange Traded Notes for professional investors.

Finally, the ICO stated in July 2023 that WorldCoin, a cryptocurrency project based on biometric identification, is required to carry out data protection impact assessments.