Kazakhstan introduces mandatory audits for high-risk AI systems

AI systems will be assessed through audits and documentation checks, with approved systems added to publicly available lists maintained by government authorities.

Kazakhstan has introduced new rules requiring audits of high-risk AI systems before they are included in official government lists. The framework sets out procedures for identifying and publishing trusted AI systems across sectors.

Sectoral authorities will compile and update lists of high-risk AI systems based on applications submitted by system owners. These lists will be published on official government websites to promote transparency and trust.

Applicants must submit a formal request, documents confirming intellectual property rights, and a positive audit conclusion. Authorities will review submissions within ten working days, assessing the system's purpose, functionality, and required documentation.

Systems that meet all criteria will be added to the list and published within five working days. If inconsistencies are identified, applicants will be notified and may resubmit corrected documents for review within a shortened timeframe.

Updated versions of the lists will be released as revisions occur, ensuring ongoing oversight of AI systems. The measures aim to support structured monitoring and responsible use of AI technologies.
