A close-up of the European Union’s regulatory approach to data governance, content moderation, competition, artificial intelligence, and more.
The “DPA Digital Digest” series provides concise summaries of each G20 nation’s digital policy. Based on the Digital Policy Alert database, we outline rules and enforcement cases in data governance, content moderation, competition, artificial intelligence, and domestic points of emphasis.
The European Union (EU) is recognised as a leading force in digital policymaking. The EU has established global benchmarks for digital rules through its General Data Protection Regulation, Digital Markets Act, Digital Services Act, and Artificial Intelligence Act.
Beyond policymaking, the EU is the world’s largest exporter of digitally deliverable services, totalling USD 770 billion in 2022, according to the Jacques Delors Institute. In addition, EU member states employ approximately 10 million ICT specialists and invest over a quarter of their Recovery and Resilience Facility funds in digital transformation.
But what do the EU’s digital policies stand for? Our Digital Digest provides a succinct overview of the latest policy and enforcement developments in major policy areas and EU-specific points of emphasis.
Data governance: The EU has implemented the Data Governance Act, adopted the Data Act, established the EU-US Data Privacy Framework, and focused its enforcement on large firms.
Content moderation: The EU has started implementing and enforcing the Digital Services Act, and further adopted novel rules on media freedom and political advertising.
Competition policy: The EU is enforcing the Digital Markets Act, its new regime for digital “gatekeepers,” through designations and investigations by the European Commission.
Artificial intelligence: The EU has adopted the landmark Artificial Intelligence Act and is working on its implementation throughout the coming years, while member states are enforcing existing rules on AI providers.
The EU’s points of emphasis include the taxation of the digital economy, crypto assets, and telecommunications.
Discover the details of the European Union's regulatory approach on our dedicated country page.
Remain up-to-date on new and upcoming developments with our free notification service.
Written by Tommaso Giardini, Nils Deeg, and Maria Buza. Edited by Johannes Fritz.
The European Commission (“Commission”) presented the European Strategy for Data in 2020, striving to create a “single market for data” that builds on the flagship General Data Protection Regulation (GDPR).
The GDPR, effective since May 2018, sets out a comprehensive data protection framework that applies across EU member states. It establishes six legal bases for the processing of personal data, including consent and legitimate interest, lays down technical and organisational requirements to protect personal data, and mandates data breach notification. Further, the GDPR enshrines data subject rights, including the rights to access, rectify, and erase personal data.
The Data Governance Act, implemented in September 2023, aims to facilitate data sharing and increase data availability. The Act creates mechanisms for the reuse of public sector data, requires data intermediaries to enable data sharing, and establishes common “European data spaces,” including in health and finance. In January 2024, the Commission established the register of data intermediation service providers. In August 2023, the Commission adopted logos to identify data intermediation service providers and data altruism organisations.
The Data Act, to be implemented in September 2025, regulates the sharing of data generated by connected devices. It requires Internet of Things providers to enable data access, sharing, and portability. In addition, the Data Act aims to prevent contractual imbalances in data sharing contracts, enable public sector access to private sector data, and require interoperability for data-processing services.
In November 2022, the EU adopted the Network and Information Security (NIS2) Directive. NIS2 introduces cybersecurity requirements across sectors, including incident reporting. Special obligations and harsher fines apply to essential and important entities. EU member states must implement NIS2 by October 2024, including by designating national authorities and establishing incident response teams. The Commission issued guidance on the interplay between existing rules and NIS2 and consulted on an implementing regulation clarifying risk management measures.
In January 2025, the regulation on digital operational resilience for the financial sector (DORA) will become applicable. DORA introduces a range of cybersecurity requirements and requires ICT providers from non-EU countries that deliver critical services to financial entities to open a local subsidiary. EU authorities are currently developing a range of technical standards, including on risk management and incident classification.
The GDPR sets out several mechanisms for the transfer of personal data, the most salient being “adequacy decisions” by the Commission that designate non-EU countries with adequate levels of data protection.
The most salient adequacy decision, issued in July 2023, concerned data transfers to the United States (US) under the EU-US Data Privacy Framework. Negotiations on the framework began in 2020, when the Court of Justice of the European Union invalidated the EU-US Privacy Shield, mainly due to US intelligence services’ access to EU user data. In 2022, the two sides reached an agreement in principle and the US president issued an Executive Order to implement US commitments under the framework. To implement the framework, the European Data Protection Board (EDPB) adopted procedural rules on the framework’s redress mechanism and binding advice. The EDPB previously raised concerns regarding data subject rights and exemptions under the framework. Currently, the Commission is reviewing the framework’s functioning.
Beyond the US, the EU has adopted adequacy decisions for a number of countries since 2021, including South Korea and the United Kingdom. In January 2024, the Commission renewed adequacy decisions for 11 countries, including Argentina and Canada.
Further data transfer mechanisms include binding corporate rules, standard contractual clauses, approved codes of conduct, and certification, on which the EDPB has issued guidance.
GDPR rules are generally enforced by national supervisory authorities. To improve coordination in cross-border cases, the EU is drafting a regulation to revise the procedural framework. Additionally, EU institutions can get involved in two ways.
First, the EDPB directs the focus of national authorities through coordinated enforcement actions. The three actions to date have focused on data controllers’ implementation of the right of access (2024), data protection officers (2023), and the use of cloud-based services by the public sector (2022).
Second, the EDPB issues binding decisions on disputed cases with cross-border effects, often in cases led by the Irish Data Protection Commission (DPC).
In November 2023, the EDPB issued an urgent binding decision extending a ban on Meta’s behavioural advertising.
In August 2023, the EDPB adopted a binding decision in a DPC case concerning TikTok’s processing of children’s data. The DPC subsequently fined TikTok EUR 345 million for violating “data protection by design” requirements.
In April 2023, the EDPB issued a binding decision regarding Meta’s data transfers to the US, which the DPC followed up with a EUR 1.2 billion fine (under appeal). In January 2023, the DPC fined Meta EUR 5.5 million because WhatsApp forced users to consent by obliging them to accept its updated Terms of Service in order to keep using the service. In December 2022, the DPC fined Meta because its “contractual necessity” legal basis was insufficient for data processing on Instagram (EUR 180 million) and Facebook (EUR 210 million). Meta subsequently switched to the legal basis of “legitimate interest.”
The EDPB can also issue non-binding opinions, recently noting that consent or pay models likely violate the GDPR’s requirements for valid consent.
A common target of national authorities’ recent GDPR enforcement was Worldcoin. Authorities in Austria, Italy, Spain, and Portugal opened investigations into Worldcoin’s data processing practices, including iris scans, to create a global “World ID.” Spain and Portugal temporarily suspended Worldcoin. The authorities forwarded documents to the Bavarian data protection authority, which will issue the final ruling, since Worldcoin has its main European establishment there.
The Digital Services Act (DSA), in effect since February 2024, is the EU’s novel framework for content moderation and online safety. It establishes tiered obligations for intermediary services, hosting services, online platforms, and “very large” online providers with at least 45 million average monthly users in the EU.
Intermediary services are subject to baseline obligations, notably transparency requirements regarding their terms and conditions and content moderation reporting.
Hosting services must install a notification mechanism for users to report unlawful content, provide statements of reasons for restrictions in relation to illegal content, and establish appropriate complaint-handling systems.
Online platforms are further prohibited from using dark patterns to manipulate or nudge user choice through user interface design and may not present targeted advertising to minors.
Additional rules for “very large online platforms” (VLOPs) and “very large search engines” (VLOSEs) include risk assessment and mitigation, audit, and data access for vetted researchers, among others.
In terms of oversight, the DSA grants the Commission powers regarding infringements by VLOPs and VLOSEs and establishes the European Board for Digital Services. Member states must designate national Digital Services Coordinators.
In August 2025, the European Media Freedom Act will enter into effect. The Act introduces safeguards against the “unjustified removal” of media content. Platforms with over 45 million monthly active users in the EU must process complaints from media service providers and notify them of the reasons before removing their content.
In October 2025, the regulation on the transparency and targeting of political advertising will enter into force. The regulation requires online service providers to label paid online political advertising, limit third-country-sponsored advertisements prior to elections, and uphold data protection. Notably, rules on non-discrimination in the provision of political advertising entered into force in April 2024.
In November 2023, the European Parliament adopted a negotiating mandate on the proposed regulation to combat online child sexual abuse material (CSAM). Under the European Parliament’s proposal, providers must adopt mitigation measures against CSAM, complemented by judicially imposed temporary detection orders. In February 2024, the EDPB issued a statement voicing concerns regarding data protection implications of detection orders.
In a parallel attempt to curb CSAM, a partial derogation from the EU’s ePrivacy Directive enables the processing of certain personal data to detect, report, and remove CSAM. The derogation, previously set to end in August 2024, was extended until April 2026 in May 2024.
Since June 2021, the Directive on Copyright in the Digital Single Market has regulated the use of protected content by online content-sharing service providers. Providers must obtain authorisation to make protected works available, remove access to infringing content upon notification, and install a complaint and redress mechanism. Providers exceeding 5 million monthly unique visitors must further prevent infringing uploads. In addition, the Directive introduces the principle of appropriate and proportionate remuneration, entitling authors and performers who license or transfer their copyrights to remuneration and information.
The Commission is responsible for designating VLOPs and VLOSEs under the DSA.
In April 2023, the Commission issued its first designations, including e-commerce platforms such as Amazon and AliExpress, social media platforms such as TikTok, Instagram, and Twitter, and several Google applications.
The second designations, issued in December 2023, covered adult content platforms Pornhub, XVideos, and Stripchat.
In 2024, the Commission designated e-commerce platforms Shein and Temu as well as adult content platform XNXX.
Several firms have challenged their designation. Amazon’s challenge was rejected in March 2024, while challenges regarding Zalando and Pornhub are under deliberation.
The Commission can open formal investigations into VLOPs and VLOSEs. In July 2024, the Commission issued preliminary findings alleging that X breached the DSA regarding dark patterns, advertising transparency, and researcher access to data. Other formal DSA investigations concern minor protection on TikTok and Meta, illegal content and disinformation on Meta, and risk assessment and consumer protection by AliExpress.
Beyond formal proceedings, the Commission has issued numerous requests for information, asking six providers about their generative AI risk mitigation and 17 providers about data access. Other requests for information concerned targeted advertising on LinkedIn; minor protection on Snap, Stripchat, Pornhub, and XVideos; content reporting on Temu and Shein; app store rules by Apple and Google; the “pay or consent” model of Meta; and the transparency of Amazon’s recommendation systems.
Finally, national authorities can forward cases to the Commission. For example, the Danish Competition and Consumer Authority forwarded cases involving Meta’s alleged failure to remove fake advertisements and Temu’s alleged failure to delist illegal products.
The Digital Markets Act (DMA) is the EU’s novel digital competition rulebook. In force since May 2023, the DMA introduces the concept of gatekeepers¹ that provide core platform services². The DMA requires gatekeepers to notify the Commission of their intent to merge or acquire other companies providing core platform or other digital services, regardless of thresholds. Regarding unilateral conduct, the DMA prohibits the combination of personal data across different services without user consent and requires interpersonal communications services to be interoperable with other systems.
In February 2024, the Commission adopted the revised market definition notice to reflect digital ecosystems and multi-sided platforms when assessing "relevant markets." Since June 2022, the amended vertical block exemption regulation and vertical guidelines cover platforms and online intermediaries as “suppliers.” The amendment introduces calculations of digital market share, definitions of vertical agreements, and quality requirements for online sales and advertising.
The Commission enforces both the DMA, through designations and investigations, and previous EU competition rules.
In September 2023, the Commission designated the first six gatekeepers, namely Alphabet, Amazon, Apple, ByteDance, Meta, and Microsoft, with respect to various core platform services.
In the spring of 2024, the Commission designated Apple’s iPadOS as a core platform service and travel service provider Booking as a gatekeeper.
Companies are challenging DMA designations. The designations of Apple and Meta are under appeal, while the General Court dismissed ByteDance’s lawsuit against its designation in July 2024. Currently, the Commission is investigating whether to designate X, following a rebuttal.
Notably, the Commission has decided not to designate TikTok Ads, X Ads, iMessage and Microsoft’s Bing, Edge, and advertising service.
The Commission has initiated a number of investigations against gatekeepers under the DMA.
In July 2024, it issued the preliminary finding that Meta’s pay or consent model breaches the requirement to provide equivalent services to users who do not consent to data combination.
In June 2024, it issued preliminary findings that Apple’s App Store steering rules may breach the DMA.
Currently, the Commission is investigating the Google Play steering rules, potential self-preferencing on Google Search and the Amazon Store, as well as Apple’s new contractual terms for app developers, new fee structure and terms for alternative app stores, and provision of means to uninstall software and change default settings and services.
Beyond DMA enforcement, the Commission scrutinises unilateral conduct in digital markets under existing competition rules.
Currently, the Commission is investigating Meta’s use of data collected from its advertisement service for Facebook Marketplace, Microsoft’s tying of Teams with Office 365, and an agreement between Google and Meta on online display advertising.
Such investigations have led to substantial fines. In March 2024, the Commission fined Apple EUR 1.8 billion because the App Store rules allegedly prevent music streaming providers from informing users of cheaper alternatives (under appeal). Previously, the Commission fined Google EUR 2.4 billion for directing traffic from its search engine to its own comparison shopping service (under appeal) and EUR 4.1 billion for requiring pre-installation of Google apps for access to the Google Play Store.
In other cases, the Commission closed investigations by accepting firms’ commitments, including regarding Apple’s restriction of Near Field Communication (NFC) technology on its devices to Apple Pay and regarding Amazon’s use of non-public seller data and discrimination against sellers that do not use its logistics services.
In merger regulation, the Commission blocked Booking’s proposed acquisition of eTraveli, citing concerns of increased dominance in the hotel online travel agency market. The Commission also investigated the Adobe/Figma and Amazon/iRobot transactions, closing its investigations after both transactions were abandoned. Furthermore, the Commission approved, subject to various commitments, the Microsoft/Activision Blizzard, Orange/MasMovil, Broadcom/VMware, Meta/Kustomer, Google/Photomath, Amazon/MGM, and Orange/VOO/Brutélé transactions.
In May 2024, the EU adopted the Artificial Intelligence (AI) Act, to be implemented in the coming years. The Act introduces a risk-based regulatory framework with tiered obligations and implementation deadlines depending on AI systems’ risk classification.
AI systems posing unacceptable risk levels, for example social scoring, are prohibited starting February 2025.
High-risk AI systems, for example AI systems used in critical infrastructure, education, and employment, are subject to a range of obligations. Obligations cover impact assessment, risk management, testing, data governance, cybersecurity, conformity assessment, and post-market monitoring, among others.
In addition, the AI Act includes transparency obligations for AI systems that interact with natural persons and rules for general-purpose AI models.
In September 2022, the Commission proposed a directive adapting liability rules to AI. The directive eases victims’ claims for damages caused by AI systems, including by requiring AI providers to disclose information on their systems in civil court proceedings. In October 2023, the European Data Protection Supervisor (EDPS) issued an opinion proposing further measures, including procedural safeguards, understandable disclosure, and the easing of burdens of proof.
The General Product Safety Regulation, set to enter into force in December 2024, aims to counter risks stemming from AI-related products. The focus lies on consumer protection, especially safety challenges stemming from novel technologies sold online. In addition, the regulation establishes rules on cybersecurity and data protection.
The oversight of the AI Act is split between existing and new governing bodies.
The AI Office, formally established in February 2024, is responsible for coordinating the development of AI policy across the EU and for overseeing the implementation of the AI Act.
The European AI Board, composed of member state representatives, advises and assists the Commission and member states in applying the AI Act.
By August 2025, member states must designate national competent authorities. For example, Poland and Denmark have designated authorities, while Spain has launched the first AI regulatory sandbox.
AI-related enforcement spans policy areas, with a focus on generative AI.
In January 2024, the Commission initiated a competition investigation of Microsoft’s investment in OpenAI. The Hungarian competition authority is investigating Bing’s AI search feature.
Under the DSA, the Commission has issued requests for information to a number of VLOPs and VLOSEs regarding their mitigation measures in relation to risks posed by generative AI.
In 2021, the EU consulted on but then postponed its digital levy in view of negotiations on the OECD/G20 Inclusive Framework. From 2024, a directive requires member states to implement the Framework’s minimum taxation rules, namely the Income Inclusion Rule and the Undertaxed Payment Rule under Pillar 2 (Global Anti-Base Erosion Rules). The rules affect companies with a global annual turnover of over EUR 750 million.
In 2021, the EU subjected e-commerce providers to its value-added tax (VAT) regime. In addition, it abolished the import VAT exemption for small consignments (previously EUR 22) and lowered the threshold for intra-EU distance sales that triggers a business registration requirement to EUR 10,000. A new “VAT One Stop Shop” allows businesses to register in only one member state.
Starting January 2024, platform providers must adhere to new tax reporting rules. Providers must verify and send information to tax authorities, including an overview of amounts paid and platform commissions. A proposal currently under deliberation would standardise VAT reporting, establish a Single VAT Registration (SVR), and expand record-keeping obligations to providers of short-term accommodation rentals and business-to-business supplies.
The Markets in Crypto-Assets Regulation (MiCA), adopted in May 2023, requires issuers of crypto-assets to register with national authorities and obtain authorisation to operate in the EU. The registration requirement took effect in June 2024 for issuers of e-money and asset-referenced tokens, while crypto-asset issuers and service providers must register from December 2024. In addition, crypto-asset issuers and providers must follow transparency requirements regarding the characteristics of their crypto assets, their asset reserves, and consumer rights. Finally, MiCA obliges "significant" crypto-asset issuers³ to ensure effective risk management, implement liquidity management strategies, and hold own funds of at least 3% of their reserve assets.
The anti-money laundering and countering terrorism financing package contains two cryptocurrency regulations.
Effective from December 2024, the Regulation on information accompanying transfers of funds and certain crypto-assets requires crypto-asset service providers to disclose information on transactions, including the originator and beneficiary, verify the accuracy of payer and payee information for transactions over EUR 1,000, and check for entities at high risk of money laundering.
In May 2024, the Council adopted the Regulation on the prevention of the use of the financial system for the purposes of money laundering or terrorist financing, extending due diligence rules to crypto assets for transactions exceeding EUR 1,000, with stricter requirements for transactions above EUR 10,000.
Finally, tax transparency rules for crypto-asset transactions by customers residing in the EU will enter into effect in January 2026.
The Gigabit Infrastructure Act, adopted in April 2024, strives to facilitate the rollout of high-capacity networks. To this end, the Act introduces a mandatory conciliation mechanism between public sector bodies and telecom operators, measures to promote connectivity in rural and remote areas, as well as conditions for fair and reasonable network access. The Gigabit Recommendation, adopted in February 2024, guides national regulatory authorities’ regulation of network infrastructure access regarding operators with significant market power.
The Commission further consulted on the future of the electronic communications sector and its infrastructure, including the proposal of “fair contributions” from digital providers to telecommunication providers. Previously, the European Telecommunications Network Operators' Association suggested a direct compensation mechanism from large content and application providers to internet service providers. An assessment by the Body of European Regulators for Electronic Communications did not find evidence to justify a direct compensation mechanism.
¹ Gatekeepers include companies with an annual EU turnover of over EUR 7.5 billion (in each of the last three years) or a market capitalisation of over EUR 75 billion (in the last year) that operate a core platform service in at least three EU countries, reaching more than 45 million monthly “end users” and 10,000 yearly “business users” (in each of the last three years).
² Core platform services include: online intermediation services such as app stores, online search engines, social networking services, certain messaging services, video sharing platform services, virtual assistants, web browsers, cloud computing services, operating systems, online marketplaces, and advertising services.
³ The definition includes entities with 1) over 10 million asset-referenced token holders, 2) a market capitalisation of over EUR 5 billion, 3) over 2.5 million daily transactions with a value exceeding EUR 500 million, 4) "gatekeeper" status under the Digital Markets Act, or 5) designation as a "significant" issuer by the Commission, if the other criteria are not met.