On 2 October 2025, the Dutch Data Protection Authority (AP) and the Netherlands Authority for Consumers and Markets (ACM) issued a joint statement requiring organisations that use chatbots in customer service to ensure that individuals can always reach a human representative. Organisations must clearly disclose when a chatbot is being used and guarantee that the chatbot does not provide incorrect, evasive, or misleading information.

The regulators emphasised that consumer law already obliges businesses to communicate directly, effectively, and accurately. For intermediary services such as social media, marketplaces, and online platforms, this requirement is reinforced by the Digital Services Act (DSA), which mandates that customers have access to a non-automated communication method. The statement noted that forthcoming amendments to consumer law under the Digital Fairness Act will provide additional clarity for other businesses, while new transparency obligations under the Artificial Intelligence Regulation (AI Act), applicable from 2 August 2026, will require organisations to inform users when they are interacting with an automated system.

Citing increasing complaints, the AP and ACM indicated they will intensify supervision, highlighting risks related to poor responses, lack of access to a human, unclear identification of chatbots, and information security and privacy vulnerabilities.