On 9 February 2026, the Ministry of Science and Technology opened a consultation on Decree No. 2026/ND-CP implementing the Law on AI.

Article 14 mandates conformity assessments for high-risk AI systems through technical testing and dossier examination; systems on a certification list must be assessed by designated organisations. Suppliers are responsible for maintaining conformity and disclosing information.

Article 15 establishes a risk-based inspection and supervision framework under which state agencies monitor high-risk AI systems for compliance. Agencies can request documents and reassess certifications following serious incidents or operational deviations, and must address false declarations. The Ministry of Science and Technology provides guidance and publishes periodic reports.

Article 17 mandates that providers of medium- and high-risk AI systems ensure transparency and accountability by making basic system information publicly available or ready for disclosure upon request. Suppliers of high-risk systems must maintain technical dossiers, coordinate with authorities during incident investigations, and implement remedial measures. Where the supplier of an AI model is not its developer, the supplier is responsible for coordinating with the developer to meet these requirements. The Ministry of Science and Technology is responsible for guiding how transparency and accountability responsibilities are performed.

Article 21 requires suppliers and operators of high-risk AI systems to maintain post-deployment monitoring mechanisms that track performance, detect abnormalities, and collect user feedback. Such mechanisms must also assess risk-management measures, mandate system review, and document updates. Suppliers and operators must review and reclassify AI systems when significant changes occur.