Should IAEA inspire AI governance?

Nuclear energy and the International Atomic Energy Agency may not be an appropriate model for regulating artificial general intelligence.

Masayoshi Son predicts that artificial superintelligence could be 10,000 times smarter than the human brain by 2035.

OpenAI has proposed the International Atomic Energy Agency (IAEA) as an inspiration for regulating Artificial General Intelligence (AGI).

However, this article highlights several key differences between nuclear technology and AI, including the absence of established pathways through which AI could destroy the world and the dominant role of private enterprises in the AI space.

It also emphasises the intangible nature of AI, which makes it far harder to safeguard than nuclear materials.

The article suggests that, instead of adopting the IAEA model, the focus should be on identifying the specific pathways through which AI could pose an existential threat to humanity before devising approaches to control it.