On 21 April 2026, the Dutch Data Protection Authority (AP) opened a consultation, running until 26 May 2026, on draft guidance on the right to explanation in automated decision-making under Article 22 of the General Data Protection Regulation (GDPR). The guidance is directed at all public and private organisations using automated decision-making that produces legal effects or similarly significantly affects individuals.

The guidance would require organisations to provide two forms of explanation: a general, proactive explanation, typically delivered through a privacy statement, covering the decision logic, the categories of data used, and the weighting of factors; and a specific, personal explanation on request, detailing the individual data used, the essential algorithmic steps, and the causal link between data and outcome. Organisations would additionally be required to notify data subjects of their rights to human intervention, to express their point of view, and to challenge the decision.

The guidance distinguishes three categories of model: insightful models, such as simple rule-based systems and small decision trees; models that can be made transparent through additional techniques, such as counterfactual explanations; and opaque models, such as large neural networks. For opaque models, the AP notes that an adequate explanation is not currently achievable, meaning organisations deploying such systems are unlikely to meet their GDPR explanation obligations. Organisations may limit explanations where a genuine trade secret or gaming risk exists, but may not withhold an explanation entirely on those grounds.
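The counterfactual technique mentioned for the middle category can be illustrated with a toy sketch: given a rejected applicant, search for the smallest change to one input that would have flipped the decision, yielding a statement of the form "the application would have been approved if income had been X". The decision rule, feature names, and thresholds below are invented for illustration and do not come from the AP guidance.

```python
# Toy counterfactual explanation for a simple score-based decision rule.
# All names, weights, and thresholds are hypothetical illustrations.

def approve(applicant: dict) -> bool:
    # Invented rule: weighted score over income and debt ratio.
    score = 0.6 * applicant["income"] / 1000 + 0.4 * (100 - applicant["debt_ratio"])
    return score >= 50

def counterfactual(applicant: dict, feature: str, step: float, limit: int = 1000):
    """Find the smallest increase to `feature` (in increments of `step`)
    that turns a rejection into an approval; None if not found."""
    candidate = dict(applicant)  # leave the original record untouched
    for _ in range(limit):
        if approve(candidate):
            return candidate[feature]
        candidate[feature] += step
    return None

rejected = {"income": 20_000, "debt_ratio": 80}
needed_income = counterfactual(rejected, "income", step=500)
# needed_income is the income level at which the decision would flip,
# i.e. the causal link between the data and the outcome for this factor.
```

Such an explanation conveys the decisive factor and the threshold without disclosing the full model internals, which is one way the "trade secret or gaming risk" limitation and the explanation duty can coexist.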