Description

Opened consultation on Canadian Guardrails for Generative AI (Code of Practice)

On 16 August 2023, the Government of Canada published the Canadian Guardrails for Generative AI – Code of Practice and opened a public consultation. The Code of Practice is voluntary and aims to give developers, deployers, and operators of generative AI systems clarity until the Artificial Intelligence and Data Act (AIDA) comes into force. The Code of Practice consists of six elements. First, generative AI systems must be safe: the ways in which a system may attract malicious use or harmful inappropriate use must be identified. Second, the output of these systems must be fair and equitable: training datasets are to be assessed and curated, and measures to assess and mitigate biased output must be in place. Third, the systems must be transparent; in particular, information about the system must be made available, and content generated by the system must be identifiable as such. Fourth, the deployment and operation of such systems must be subject to human oversight, and a mechanism to identify and report adverse impacts must be established. Fifth, the validity and robustness of the systems must be ensured through testing and appropriate cybersecurity measures. Finally, to ensure accountability, multiple lines of defence must be in place, and roles and responsibilities must be clearly defined.

Original source

Scope

Policy Area
Other operating conditions
Policy Instrument
Design requirement
Regulated Economic Activity
ML and AI development
Implementation Level
national
Government Branch
executive
Government Body
central government

Complete timeline of this policy change

2023-08-16
in consultation