Eric Schmidt: Leave AI regulation to Big Tech, governments lack expertise

Former Google CEO Eric Schmidt argues that AI regulation should be left to Big Tech, as governments lack the expertise. Schmidt believes premature government regulations are too restrictive and suggests an agreement among industry players to prevent a ‘race to the bottom’ in AI development. He emphasises the need for international consensus on defining ethical boundaries.


Former Google CEO Eric Schmidt believes that governments should leave the regulation of artificial intelligence (AI) to big tech companies, since governments currently lack the necessary expertise. Schmidt made these remarks in light of AI's potentially revolutionary impact on productivity, which could lead to significant job displacement in the knowledge sector.

During an interview with NBC’s Meet the Press, Schmidt argued that those outside the tech industry lack the knowledge required to define the ‘reasonable boundaries’ of AI or to determine where they should be set. He expressed concern about premature government regulation, which he believes tends to be overly restrictive. Instead, he suggested that the industry’s key players reach an agreement among themselves to prevent a ‘race to the bottom’ in AI development.

Schmidt’s perspective is informed by his role as chair of the National Security Commission on Artificial Intelligence. The commission’s final report, released in March 2021, recommended that the United States invest $40 billion in developing AI technology. Schmidt emphasised the need to establish guardrails against unethical AI practices and to promote international agreement on where those boundaries lie.

US officials are currently soliciting public comments on how best to regulate AI, with the consultation open until 10 June. Europe may also refine its own regulatory plans during this period. For Schmidt, the critical issue is finding ways to address the worst behaviours associated with AI while securing international consensus on how those behaviours are defined.