A close-up of New Zealand’s regulatory approach to data governance, content moderation, competition, artificial intelligence, and more.
The “DPA Digital Digest” series provides concise summaries of each economy’s digital policy. Based on the Digital Policy Alert database, we outline rules and enforcement cases in data governance, content moderation, competition, artificial intelligence, and domestic points of emphasis.
New Zealand is a leading digital nation, and its Digital Strategy aims to secure this status through trust, inclusion and growth. The digital technologies sector contributed NZD 7 billion (approx. USD 4.27 billion) to GDP in 2021, according to the Ministry of Business, Innovation and Employment. Since 2016, the sector has expanded at a robust annual rate of 10.4%, significantly outpacing overall economic growth of 5.1%.
At the international level, New Zealand prioritises engagement with regional partners and supports multilateral approaches to trade. It is also actively involved in several key trade agreements, including the ASEAN-Australia-New Zealand Free Trade Area, the Comprehensive and Progressive Agreement for Trans-Pacific Partnership, the Pacific Agreement on Closer Economic Relations, and the Regional Comprehensive Economic Partnership.
But what do New Zealand’s domestic digital policies stand for? This Digest provides a succinct overview of the latest policy and enforcement developments in major policy areas and New Zealand-specific points of emphasis.
Data governance: New Zealand introduced bills concerning the Privacy Act, including one granting users the right to access their generated data, and is drafting a code of practice on biometric data.
Content moderation: New Zealand amended its content moderation framework, established voluntary frameworks to address harmful content and focused its enforcement on terrorist content in online streaming.
Competition policy: New Zealand amended the Commerce Act and focused its enforcement action on preventing anti-competitive practices and mergers.
Artificial intelligence: New Zealand focused on deploying AI systems securely, applying privacy principles to AI and increasing the use of AI in the public sector.
New Zealand’s points of emphasis include the taxation of the digital economy and online gambling.
Discover the details of New Zealand's regulatory approach on our dedicated country page.
Remain up-to-date on new and upcoming developments with our free notification service.
Written by Maria Buza and Svenja Bossard. Edited by Tommaso Giardini.
The protection of personal data in New Zealand is governed by the Privacy Act 2020. The Act establishes 13 privacy principles that regulate how agencies collect and process personal information. Collection and processing must be lawful and necessary, and information must be obtained directly from individuals, who must be informed of the purpose of its collection. Agencies that collect and process personal data must also report data breaches that are likely to cause serious harm within 72 hours. The Act further grants the Privacy Commissioner (OPC) the authority to issue binding codes of practice. The OPC is currently drafting a biometrics privacy code to define biometric information and set obligations for agencies, such as conducting proportionality assessments and ensuring transparency. Previously, the OPC adopted a privacy code for telecommunications providers.
The government is currently deliberating two Bills that concern the Privacy Act:
The Customer and Product Data Bill, introduced in May 2024, seeks to create a framework for accessing and sharing customer and product data among businesses. Under the Bill, businesses must provide designated data to customers and accredited third parties.
The Privacy Amendment Bill, on which the government held a consultation until June 2024, introduces a new privacy principle requiring agencies that collect personal information indirectly to notify the individuals concerned, as is already required for direct collection.
Regarding cybersecurity, the government is deliberating rules to strengthen infrastructure cybersecurity. Currently, there is no legal definition of critical infrastructure. A draft framework, released in June 2023, would classify telecommunications, financial, and digital services (such as data storage) as critical infrastructure. The framework proposes to improve government collection of data on ownership, control, and cyber incidents, and to establish enforceable minimum resilience standards. Previously, the Intelligence and Security Act established the National Cyber Security Centre to assist with cybersecurity incidents affecting nationally significant organisations.
New Zealand does not have a general data localisation requirement. The Tax Administration Act requires records related to any business or economic activity carried out in New Zealand to be kept and retained in the country. Offshore storage is allowed with approval from the Commissioner of Inland Revenue.
Agencies can transfer personal information overseas if the recipient is subject to privacy laws or contractual obligations providing safeguards comparable to the Privacy Act. If no such protections are in place, informed and explicit consent is required for data to be transferred. The OPC can prohibit trans-border data flows if the information received from another state is likely to be sent to a third state that lacks comparable safeguards or if it could violate the OECD guidelines on the protection of privacy and transborder flows of personal data or the basic principles of the Privacy Act. The government is currently deliberating the Privacy Amendment Bill, which would empower the OPC to assess the privacy protection level in third countries regarding data transfers.
The OPC has the authority to issue both binding and advisory guidelines and serves as the primary enforcer of the Privacy Act.
The OPC has issued several guidelines, including on sensitive information, privacy responsibilities, and privacy impact assessments. In April 2024, the OPC issued a report following consultations with stakeholders on young people's privacy. The OPC indicated that it is considering developing specific guidance and exploring regulatory options to address concerns raised during the consultation. Additionally, several agencies have released guidelines to address cybersecurity concerns, including the Reserve Bank's guidance on cyber resilience and the Computer Emergency Response Team's guide on using behavioural insights for online security.
In terms of enforcement, the OPC has opened two investigations since the adoption of the Privacy Act:
In February 2024, the OPC launched an investigation into Foodstuffs North Island’s facial recognition technology (FRT) trial across 25 stores. The OPC requested evidence justifying the use of FRT to reduce retail crime, given concerns about privacy, effectiveness, and potential bias against people of colour.
In collaboration with the Australian Information Commissioner, the OPC launched an investigation into Latitude following a data breach in March 2023. The breach exposed sensitive personal information, including driver’s licences, passports, and financial data.
New Zealand's content moderation framework builds on the Films, Videos, and Publications Classification Act 1993. The Classification Act criminalises the creation and distribution of "objectionable" material, which includes depictions of extreme violence, dehumanising behaviour, and the sexual exploitation of minors. The Classification Office determines whether a publication is objectionable.
In 2022, the Urgent Interim Classification of Publications and Prevention of Online Harm Bill expanded the Classification Act's scope in response to a terrorist attack in 2019:
The live streaming of objectionable content is considered a criminal offence with penalties of up to 14 years of imprisonment.
The Chief Censor, who heads the Classification Office, can issue "interim classifications" if content is deemed potentially harmful and objectionable.
The Department of Internal Affairs can issue take-down notices to online platforms to remove or block access to publications classified as objectionable. Failure to comply within 24 hours can lead to court-ordered enforcement and fines of up to NZD 200,000 (approx. USD 124,215).
The Bill’s provisions aiming to create government-backed systems to block or filter harmful online content were rejected by Parliament.
The Harmful Digital Communications Act (HDCA), in force since 2015, criminalises the posting of digital communications with the intent to cause harm to an individual. Netsafe, the approved agency under the Act, is responsible for handling complaints and resolving them through mediation, as well as liaising with both domestic and international platforms. Platforms are exempt from liability for user-generated content if they follow the Act's procedures, which include notifying the author of a complaint and, if necessary, removing or disabling the content within 48 hours. If complaints cannot be resolved, the District Court may issue enforceable orders, including content removal or apologies. An amendment adopted in March 2022 introduced stricter penalties for the unauthorised posting of intimate visual recordings.
In July 2023, the Department of Internal Affairs conducted a review on safer online services and media platforms to update the current regulatory framework. It proposed regulating social media platforms through a set of codes overseen by a newly established industry regulator to mitigate harmful content. In April 2024, the DIA released a summary report concluding the review. To date, no Bills have been submitted to Parliament, and the government has not proposed any rules.
New Zealand has been deliberating the Fair Digital News Bargaining Bill since August 2023. The Bill would require large digital platforms to negotiate compensation with local media companies for using their news content. It outlines a framework for negotiations, a bargaining code, and an arbitration process to ensure fair agreements.
New Zealand has advanced industry self-regulation and voluntary frameworks to address harmful and illegal content.
Under the HDCA, approved agencies can adopt codes of practice in collaboration with online platforms. In July 2022, Netsafe, in collaboration with NZTech, developed the Code of Practice for Online Safety and Harms. Its signatories, including Meta, Google, TikTok, Twitch, and X, agreed to reduce the prevalence and mitigate the risks of harmful content in areas such as child sexual exploitation, hate speech, incitement of violence, and disinformation.
In June 2022, the DIA adopted the voluntary Digital Child Exploitation Filtering System for internet service providers. The system blocks access to websites containing child sexual abuse material but does not remove the underlying content. It operates alongside enforcement activities, including online investigations and coordination with other agencies to take down objectionable websites.
In May 2019, New Zealand led international efforts through the Christchurch Call to Action, urging governments and platforms to address the dissemination of terrorist and violent extremist content. The signatory platforms pledge to implement transparent measures to prevent the upload and dissemination of such content and enforce community standards.
The Classification Office has classified several publications as objectionable.
In March 2019, the Christchurch mosque attack livestream and the perpetrator's manifesto, The Great Replacement, were classified as objectionable and banned on the basis that they promote criminal acts, including mass murder and terrorism.
In October 2019, the Halle attack livestream was banned for depicting and promoting extreme violence and terrorism. The distribution and possession of the game The Shitposter were prohibited after it was found to trivialise and promote criminal activity, specifically acts of terrorism and mass murder, through its gameplay and in-game commentary.
In May and July 2022, the livestream of the Buffalo mass shooting and the pseudo-documentary Three Faced Terrorist – Part Two were banned for depicting acts of extreme violence and cruelty.
In its guidance, the DIA noted that in most cases, the content classified as objectionable is already content that violates the online platform's terms of use. DIA uses trusted flagger programs and in-platform reporting systems to request the removal of such content. If these measures fail, a formal take-down notice is issued, requiring hosts to remove or restrict access. In its report, DIA specified that it issued 245 informal requests and 26 formal requests in 2023. In February 2024, the DIA reported on 47 investigations that uncovered nearly 3 million pieces of illegal material and led to the seizure of 209 devices. Additionally, more than 1.1 million websites hosting child sexual abuse material were blocked.
New Zealand has not adopted specific rules for digital competition and instead applies the Commerce Act of 1986, which prohibits certain practices and mergers that could substantially lessen competition. In April 2023, the Commerce Act was amended to expand the prohibition regarding abuses of market power. Businesses may now be in breach of the Act if their actions substantially lessen competition, regardless of the intent or direct use of market power. Anti-competitive conduct can include actions such as refusal to supply, exclusive deals, loyalty rebates, or predatory pricing. The amendment further prevents large businesses from enforcing intellectual property rights in circumstances that would reduce competition significantly.
The Commerce Commission's enforcement primarily concerns preventing anti-competitive mergers and enforcing consumer protection laws.
In August 2023, the Commerce Commission approved Microsoft's acquisition of Activision Blizzard. It concluded that the merger would not significantly reduce competition, as Activision games are not essential for rivals to compete with Microsoft.
In July 2024, the Commerce Commission blocked AlphaTheta's acquisition of Serato. AlphaTheta, which supplies DJ hardware and software, intended to merge with Serato, a leading DJ software provider. The Commission denied the merger, concluding it would significantly reduce competition in the DJ hardware and software markets.
In May 2024, the High Court ruled against Viagogo for misleading consumers in a case brought by the Commerce Commission. The court ordered Viagogo to correct its website information and amend terms to allow disputes to be settled in New Zealand courts. Viagogo has appealed the ruling.
New Zealand has not yet proposed or enacted specific legislation regarding artificial intelligence (AI).
Several ministries and government agencies issued guidelines addressing the use and security of AI systems:
In July 2024, the Ministry of Business, Innovation and Employment adopted a paper promoting responsible AI use across the economy and public sector. It addresses challenges such as mistrust, low uptake, and limited adoption of AI, which hinder innovation and productivity. The paper promotes a risk-based approach in line with the OECD AI Principles.
In April 2024, the National Cyber Security Centre issued a guide on deploying AI systems securely. It covers security measures for AI systems, including collaboration with IT, threat modelling, secure deployment, continuous protection, access controls, and regular audits. In November 2023, the agency also published guidelines for secure AI system development.
In September 2023, the Privacy Commissioner (OPC) issued guidance on AI and information privacy principles, outlining privacy considerations and legal requirements for AI use. The guidance recommends conducting and updating privacy impact assessments and addresses issues such as data relevance, purpose alignment, tracking, and AI accuracy. Previously, in May 2023, the OPC adopted guidelines on the use of generative AI.
In August 2023, the government proposed a 3% tax on gross digital services revenue for multinational groups that earn over EUR 750 million (approx. USD 820 million) annually from global digital services and more than NZD 3.5 million (approx. USD 2.13 million) annually from digital services provided to New Zealand users. If adopted, the tax would apply from January 2025, with a possible delay of up to five years. This flexibility allows the government to evaluate the effectiveness of the OECD/G20 Pillar Two approach to taxing the digital economy before implementing the tax.
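The proposal's two-threshold scope test and 3% levy can be sketched as follows. The thresholds and rate come from the proposal above; the function names, and the simplifying assumption that a group's "gross digital services revenue" is a single known figure, are illustrative rather than an official method:

```python
# Illustrative sketch of the proposed digital services tax (DST) scope test.
# Thresholds and the 3% rate are taken from the August 2023 proposal; the
# function names and structure are hypothetical, not an official calculation.

DST_RATE = 0.03
GLOBAL_THRESHOLD_EUR = 750_000_000  # annual global digital services revenue
NZ_THRESHOLD_NZD = 3_500_000        # annual revenue from New Zealand users

def in_scope(global_revenue_eur: float, nz_revenue_nzd: float) -> bool:
    """A multinational group is in scope only if it exceeds BOTH thresholds."""
    return (global_revenue_eur > GLOBAL_THRESHOLD_EUR
            and nz_revenue_nzd > NZ_THRESHOLD_NZD)

def dst_liability(global_revenue_eur: float, nz_revenue_nzd: float) -> float:
    """3% of gross NZ digital services revenue, zero if out of scope."""
    if not in_scope(global_revenue_eur, nz_revenue_nzd):
        return 0.0
    return nz_revenue_nzd * DST_RATE

# A group with EUR 1bn global and NZD 10m NZ revenue would owe NZD 300,000.
print(dst_liability(1_000_000_000, 10_000_000))  # 300000.0
```

Note that both thresholds must be exceeded: a group with large global revenue but negligible New Zealand revenue (or vice versa) would fall outside the proposed tax.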
Currently, non-resident suppliers of remote services must register for and charge goods and services tax (GST) on sales to New Zealand residents if annual sales exceed NZD 60,000 (approx. USD 36,564). Electronic marketplaces, acting as sales agents, are responsible for collecting GST on behalf of suppliers. Since April 2024, New Zealand requires gig and sharing economy platforms, such as Uber and Airbnb, to collect GST. This shifts the tax responsibility from individual service providers to platforms. Additionally, an offshore gaming duty of 12% is imposed on GST-registered entities outside New Zealand that offer remote gambling services to residents.
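The GST registration threshold and the offshore gaming duty described above both turn on simple numeric tests. A minimal sketch, with hypothetical function names and the simplifying assumption that the 12% duty is applied to a single gross gambling revenue figure:

```python
# Illustrative thresholds from the current rules; names and the flat
# application of the 12% duty to one revenue figure are assumptions.

GST_REGISTRATION_THRESHOLD_NZD = 60_000  # annual sales to NZ residents
OFFSHORE_GAMING_DUTY_RATE = 0.12         # duty on offshore remote gambling

def must_register_for_gst(annual_nz_sales_nzd: float) -> bool:
    """Non-resident remote-service suppliers must register once annual
    sales to New Zealand residents exceed NZD 60,000."""
    return annual_nz_sales_nzd > GST_REGISTRATION_THRESHOLD_NZD

def offshore_gaming_duty(gambling_revenue_nzd: float) -> float:
    """12% duty for GST-registered offshore remote gambling operators."""
    return gambling_revenue_nzd * OFFSHORE_GAMING_DUTY_RATE

print(must_register_for_gst(75_000))    # True
print(offshore_gaming_duty(1_000_000))  # 120000.0
```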
In November 2021, guidelines under the Gambling Act clarified which forms of remote interactive gambling are prohibited. This includes the use of devices such as computers and phones for gambling that involves payment and chance. The advertisement of overseas gambling is also prohibited. Specifically, foreign gambling operators are banned from promoting or publicising their services to prevent New Zealanders from being encouraged to gamble on international platforms.
Additionally, the Digital Instant Game Rules, effective since October 2017, regulate digital instant games. Providers may only offer their games to users over 18 years old and must comply with the conduct rules.