A close-up of France’s regulatory approach to data governance, content moderation, competition, artificial intelligence, and more.
The “DPA Digital Digest” series provides concise summaries of each G20 nation’s digital policy. Based on the Digital Policy Alert database, we outline rules and enforcement cases in data governance, content moderation, competition, artificial intelligence, and domestic points of emphasis.
France, the European Union’s second-largest economy, boasts over 40 million e-consumers. In 2023, France’s digital economy sustained growth of over 6%, reaching a size of EUR 66 billion, according to Numeum. Still, the European Union’s “Digital Decade” report highlights the potential for the digitalisation of both small and large enterprises in France.
France further pursues international coordination, including by hosting the AI Action Summit in February 2025. Finally, France’s international digital strategy emphasises governance, the economy and security.
But what do France’s domestic digital policies entail? Our Digital Digest provides a succinct overview of the latest policy and enforcement developments in major policy areas and France-specific points of emphasis.
Data governance: France has implemented a cybersecurity audit requirement and strictly enforced data protection rules, especially regarding cookies and online advertising.
Content moderation: France is implementing new rules on illegal content and age verification, while focusing its enforcement on pornography websites.
Competition policy: France is implementing the EU Digital Markets Act and targeting large technology companies with enforcement action across numerous topics.
Artificial intelligence: France is considering rules on copyright and labelling, while various agencies have issued guidance on the implications of AI in their policy areas.
Discover the details of France's regulatory approach on our dedicated country page.
Remain up-to-date on new and upcoming developments with our free notification service.
Written by Tommaso Giardini, Nils Deeg, and Maria Buza. Edited by Johannes Fritz.
Since France is a member of the European Union (EU), the General Data Protection Regulation (GDPR) applies in France. The French data protection authority (CNIL) rigorously enforces the GDPR (see below) and regularly publishes compliance guidance. Several CNIL guidelines concern cookies, including guidelines on cookie walls, revised consent requirements for online cookies, and Google’s Privacy Sandbox. CNIL is currently conducting an inquiry into health data to determine its regulatory priorities.
Furthermore, CNIL provides guidance on cybersecurity and the establishment of the data ecosystem. CNIL issued a guide on basic personal data security and is holding two consultations on the security of critical personal data and multi-factor authentication. In addition, CNIL adopted guidance on open data and data sharing via application programming interfaces.
Since October 2023, the amended French Consumer Code has mandated cybersecurity audits. Providers of “online platforms” and interpersonal communications services must be audited by licensed auditors and present the results to consumers in a “comprehensible format.” A decree will specify the thresholds for the application of the requirement.
In July 2023, the European Commission issued its adequacy decision for the United States (US) in view of the EU-US Data Privacy Framework (DPF), enabling transatlantic data transfers following the invalidation of the EU-US Privacy Shield. A request by a member of the French National Assembly to suspend the adequacy decision was rejected by the EU General Court in October 2023.
Previously, CNIL led efforts to give effect to the Privacy Shield’s invalidation, promptly issuing a methodology for data controllers to define an action plan and conduct risk assessments on data transfers outside the EU. In 2022, CNIL ordered French websites to stop using Google Analytics due to the lack of sufficient additional safeguards to prevent US intelligence services from accessing transferred data. CNIL then published guidelines on preventing data transfers through Google Analytics, noting that contractual clauses are insufficient to justify US data transfers. Beyond the DPF, in January 2024, CNIL consulted on its guide for transfer impact assessments.
CNIL’s enforcement emphasises cookies, user consent, and employee protection.
In December 2023, CNIL fined Yahoo EUR 10 million for violations related to cookies. CNIL also conducted two dedicated investigation campaigns into cookies.
Also in December 2023, CNIL fined Amazon EUR 32 million for its continuous worker monitoring in warehouses.
In June 2023, CNIL fined CRITEO EUR 40 million for multiple GDPR violations in its advertising business.
In December 2022, CNIL fined Apple EUR 8 million for saving advertising identifiers in its App Store without user consent and complicating the deactivation of targeting. On the same day, CNIL fined gaming developer VOODOO EUR 3 million for monitoring users without consent.
In 2021, CNIL fined Google EUR 150 million for making it more difficult to refuse than to accept cookies. CNIL finally closed the case in July 2023, after Google added an “only allow essential cookies” button to the French Google and YouTube websites.
In 2020, CNIL fined Google EUR 100 million for placing cookies on “Google.fr” without obtaining user consent, informing users, or enabling users to remove cookies. CNIL also fined Facebook EUR 60 million, Microsoft (Bing) EUR 60 million, Amazon EUR 35 million, and TikTok EUR 5 million for similar infringements.
France upholds rigorous content moderation and minor protection rules. In January 2025, the law securing and regulating the digital space enters into full effect. The law aims to implement the EU Digital Services Act (DSA) and Digital Markets Act (DMA) and requires social media platforms to promptly delete illegal content and suspend users convicted of cyberbullying. Notably, the Constitutional Council invalidated a provision establishing “online insults” as a criminal offence because it disproportionately infringed on freedom of expression.
Previously, a 2022 law required online platforms with over 10 million users to inform authorities of content removals and designate “human and technological resources” to detect illegal content. The 2020 hate speech law required user-content, communication, and search platforms to remove “manifestly illegal” content within 24 hours of notification. Terrorist content and child sexual abuse material flagged by authorities were subject to a one-hour takedown deadline. The Constitutional Council struck down the requirement due to free speech concerns.
Several laws contribute to online minor protection.
The abovementioned law securing and regulating the digital space demands age verification for pornographic content.
A 2023 law regulating social networks requires user-content platforms to register only users over 15 or obtain explicit parental consent for younger users.
A 2022 law demands that device manufacturers and operating system providers install parental controls.
In 2021, a law limited the commercial exploitation of online images of children under 16, demanding approval by legal representatives for their distribution on video-sharing platforms. Users must also be able to report images that undermine children’s dignity or integrity.
Another 2021 law obliges platforms that display pornographic content to limit minors’ access or face restrictions.
Finally, beyond legislation, the government and several digital platforms signed a charter and launched a laboratory on online minor protection in 2022.
In February 2024, the Audiovisual and Digital Communication Regulatory Authority (ARCOM) published draft orders designating certain audiovisual media services as services of general interest and setting basic conditions of visibility and accessibility for such services.
Since 2021, France has established two new regulatory agencies with oversight of online content.
ARCOM was created by merging previous agencies to combat online piracy, disinformation and hatred, and improve minor protection. ARCOM has investigative powers, can publicly blacklist infringing sites, and can block mirror sites. ARCOM is also France’s designated “digital services coordinator” under the DSA.
Viginum, the Vigilance and Protection against Foreign Digital Interference Service, monitors foreign digital threats that interfere with national interests, including citizen information during elections.
In March 2024, the Council of State referred a lawsuit involving ARCOM and two Czech providers of pornographic websites to the Court of Justice of the EU. The providers argue that ARCOM’s referral power to enforce a prohibition on pornographic material likely to be seen by minors is contrary to EU law. In 2022, ARCOM had initiated court proceedings to block access to five pornography websites for failing to comply with the age verification requirements.
France emphasises the implementation of the European Union’s Digital Markets Act (DMA) rather than domestic policy development. The law securing and regulating the digital space, signed in May 2024, implements the DMA. The DMA imposes rules on “gatekeepers”, who must inform the European Commission of all their acquisitions, regardless of thresholds. The French competition authority’s roadmap for 2023-2024 highlights its intention to closely monitor digital platforms in online advertising and, more broadly, the role of data.
With the stated goal of increasing competition between brick-and-mortar and online vendors, in October 2023, France mandated a minimum fee on book deliveries. Shipping charges on new books must include a fee of at least EUR 3 for all orders valued below EUR 35.
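As a minimal, purely illustrative sketch of the threshold logic described above (the function name and the simplified treatment of orders above the threshold are assumptions, not part of the decree):

```python
def minimum_delivery_fee_eur(order_value_eur: float) -> float:
    """Illustrative sketch of the French minimum delivery fee for new books.

    Orders of new books valued below EUR 35 must carry a delivery fee of at
    least EUR 3; the minimum does not apply to larger orders. Simplified for
    illustration only.
    """
    THRESHOLD_EUR = 35.0
    MINIMUM_FEE_EUR = 3.0
    return MINIMUM_FEE_EUR if order_value_eur < THRESHOLD_EUR else 0.0

# A EUR 20 order must include at least EUR 3 in shipping charges.
assert minimum_delivery_fee_eur(20.0) == 3.0
# A EUR 40 order is not subject to the minimum fee.
assert minimum_delivery_fee_eur(40.0) == 0.0
```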
France’s competition authority rigorously enforces competition rules in digital markets, with a focus on app store providers and online advertising.
A joint probe into Apple and Google investigates whether the providers impose undue fees on French start-ups, excessively collect their data, and subject them to unilaterally modifiable contracts. The case is currently being litigated in courts whose proceedings are not documented in public, official sources. In 2022, Google was reportedly fined EUR 2 million and Apple was reportedly fined EUR 1 million.
In July 2023, the competition authority alleged that Apple was imposing discriminatory conditions on the use of user data for advertising purposes. The authority is currently investigating Apple’s App Tracking Transparency for self-preferencing.
In May 2023, following complaints that Facebook denied access to independent advertising verification services, the authority required Meta to publish objective, transparent, non-discriminatory, and proportionate criteria for entering partnerships.
In 2021, the authority fined Google EUR 220 million for discriminating against competitors in algorithmic auctions for online advertising. Google’s commitments to end the special treatment of its own services and improve interoperability with other advertisement providers became legally binding. In the same year, Google paid EUR 1.27 million in damages to Oxone, a directory enquiry service provider.
Other enforcement actions have addressed online content remuneration and anti-competitive agreements.
In March 2024, the authority fined Google EUR 250 million for failing to fulfil commitments from a previous case, in which Google was found to circumvent publishers’ and press agencies’ rights to appropriate remuneration for their content. The authority is currently inquiring into competition dynamics in the online content creation sector.
In October 2022, a court dropped one of the authority’s charges in a case concerning Apple’s product price-fixing, reducing fines for Apple (from EUR 1.1 billion to EUR 371.6 million) and its two wholesalers in France, Ingram Micro (from EUR 62.9 million to EUR 19.5 million) and Tech Data (from EUR 76.1 million to EUR 24.9 million).
In December 2023, the authority fined Rolex EUR 91.6 million for prohibiting authorised distributors from selling its watches on the internet.
The EU’s landmark AI Act will apply in France. At the domestic level, in September 2023, France introduced a bill to regulate the use of copyrighted works by AI. The bill would require AI developers to obtain right-holder authorisation to use copyrighted materials for training purposes. Furthermore, AI-generated content would have to be clearly labelled.
In March 2024, the government’s AI commission issued recommendations to incentivise AI development in France. The commission suggests establishing a World AI Organization and an International AI Fund, as well as reforming the mandate of the data protection authority (CNIL), among other recommendations.
CNIL spearheads the charge to apply data protection rules to AI providers.
CNIL is developing an AI action plan and establishing a unit dedicated to AI.
CNIL has published guidance on GDPR-compliant AI dataset creation and AI development, and is currently revising the latter.
In April 2023, CNIL fined facial recognition company Clearview AI EUR 5.2 million for violating a previous order to cease the processing of user data in France and respond to data subject requests. In October 2022, CNIL fined Clearview EUR 20 million for collecting data without an appropriate legal basis and failing to respond to data access and erasure requests.
The competition authority is also analysing the AI market. In June 2024, the authority published the findings of its market inquiry into competition in the generative AI sector. The inquiry evaluated the strategies of digital companies for consolidating market power in early stages of the generative AI value chain, focusing on issues such as cloud infrastructure, computing power, the role of skilled personnel, and partnerships with specialist companies.
The law securing and regulating the digital space, signed in May 2024, aims to secure cloud sovereignty through enhanced security measures and transparency obligations. This follows the 2021 National Cloud Strategy’s objective to achieve "technological sovereignty" through support for local cloud computing projects, increased government digitalisation, and a trust label. The strategy references two European sovereignty initiatives: GAIA-X, the European data infrastructure created by the French and German governments, and the “Next Generation Cloud Infrastructure and Services.”
France pursues a “cloud at the centre” approach in digitising governmental projects. Digital administration services are to be hosted on cloud infrastructure provided either by the government or by private providers that meet strict security criteria. To host projects involving personal or corporate data, cloud services must be certified with the new “trusted cloud” label. The label is awarded to services offering a high level of technical and legal security, based on the “SecNumCloud” certificate of the French cybersecurity agency (ANSSI).
In addition, authorities from various policy areas have focused on cloud computing. In January 2024, the data protection authority published guidelines on data encryption and security in cloud computing. In June 2023, the competition authority published a report on competition in the cloud computing sector, focusing on IT infrastructure and platform services. As part of the report, the competition authority identified Amazon Web Services, Google Cloud, and Microsoft Azure as major providers and listed examples of competition risks, including cloud credits and egress fees.
In 2019, France adopted a 3% Digital Services Tax on revenue generated from either digital intermediation services that enable direct user interaction (marketplaces and networking services) or targeted digital advertising services. The DST applies to companies exceeding annual revenue thresholds of EUR 750 million (global) and EUR 25 million (local).
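A minimal sketch of how the DST’s rate and scope thresholds interact, based on the figures above; the function, the assumption that both thresholds must be exceeded, and the simplified taxable base are illustrative, not a statement of French tax law:

```python
def french_dst_due_eur(global_revenue_eur: float,
                       french_in_scope_revenue_eur: float) -> float:
    """Illustrative sketch of the French Digital Services Tax (DST).

    A 3% rate on revenue from in-scope digital services (intermediation and
    targeted advertising), assumed here to apply only if a company exceeds
    both annual thresholds: EUR 750 million globally and EUR 25 million in
    France. Simplified for illustration; not tax advice.
    """
    GLOBAL_THRESHOLD_EUR = 750_000_000
    LOCAL_THRESHOLD_EUR = 25_000_000
    RATE = 0.03
    if (global_revenue_eur > GLOBAL_THRESHOLD_EUR
            and french_in_scope_revenue_eur > LOCAL_THRESHOLD_EUR):
        return round(french_in_scope_revenue_eur * RATE, 2)
    return 0.0

# EUR 100 million of in-scope French revenue within a group generating
# EUR 2 billion globally yields a DST of EUR 3 million.
assert french_dst_due_eur(2_000_000_000, 100_000_000) == 3_000_000.0
```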
The DST prompted an investigation by the United States Trade Representative into the DST’s potential discrimination, inconsistency with international taxation principles, and burden on US commerce, leading to tariffs on French products that were announced but then deferred. In 2021, the US reached a political agreement with France and other European countries in view of negotiations on the OECD/G20 Inclusive Framework. France agreed to withdraw the DST upon entry into force of Pillar 1 of the Inclusive Framework, while the US agreed to defer its tariffs. In February 2024, the countries agreed to extend the terms of this agreement until the end of June 2024.
Since 2022, France has imposed value-added tax (VAT) on the gig economy, namely ride-sharing and the delivery of goods (including food) by motorbike, scooter and bicycle. Digital platforms that facilitate such services must register and pay VAT of up to 0.5%. The tax rate is determined on a yearly basis by decree and is currently 0.46%.
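As a hedged illustration of the rate mechanics, assuming the tax applies to a platform’s revenue from facilitating in-scope services (the taxable base is not specified above, and the function name is invented for this sketch):

```python
def platform_gig_tax_eur(facilitation_revenue_eur: float,
                         annual_rate: float = 0.0046) -> float:
    """Illustrative sketch of the French tax on platforms facilitating
    ride-sharing and goods delivery by motorbike, scooter or bicycle.

    The rate is set each year by decree and capped at 0.5% (0.46% at the
    time of writing). The taxable base is assumed here to be the platform's
    facilitation revenue; this is a simplification, not legal guidance.
    """
    assert annual_rate <= 0.005, "the rate is capped at 0.5%"
    return round(facilitation_revenue_eur * annual_rate, 2)

# EUR 10 million of facilitation revenue at the current 0.46% rate
# corresponds to a tax of EUR 46,000.
assert platform_gig_tax_eur(10_000_000) == 46_000.0
```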
In December 2023, the National Assembly adopted the finance bill for 2024, which contains measures to implement Pillar Two of the OECD/G20 Global Anti-Base Erosion Model Rules. It sets up a framework for a global 15% minimum tax for certain multinational companies.
In March 2023, the president signed a law harmonising French and EU policy, including stricter licensing rules for crypto firms. The law aligns the domestic requirements for obtaining a digital asset service provider licence with the EU Regulation on Markets in Crypto-Assets (MiCA). Providers must uphold requirements regarding anti-money laundering, fund segregation, cybersecurity, and reporting, including on risks and conflicts of interest. France granted Binance such a licence in 2022.
In addition, the Financial Markets Authority (AMF) amended its general regulation to incorporate MiCA provisions and its doctrine to set out transitional provisions. In June 2024, the AMF further published an updated crypto-asset blacklist of fraudulent websites operating without the requisite authorisation.