US authorities call on companies to ensure Artificial Intelligence (AI) products do not violate civil rights

US officials warn of AI risks for financial institutions and businesses. The increasing use of AI in areas such as lending, employment and housing could lead to bias and civil rights violations, particularly against marginalised groups. Companies must ensure their use of AI is not discriminatory.


US officials from the Consumer Financial Protection Bureau, the Justice Department’s civil rights unit, and the Federal Trade Commission have cautioned financial institutions and other businesses about the risks of using AI. The officials warned that the increasing use of AI in various sectors such as lending, employment, and housing could lead to bias and civil rights violations, particularly against marginalised groups. They emphasised that the responsibility lies with companies to ensure that their use of AI is not discriminatory, and that they understand the reasons behind the decisions made by their AI systems.

The Consumer Financial Protection Bureau is actively seeking whistleblowers from the tech sector to identify where new technologies breach civil rights laws. The bureau’s director, Rohit Chopra, explained that if companies cannot explain why their AI is making certain decisions, they cannot legally use it. The officials also signalled their intention to monitor marketplaces for instances of discrimination arising from the use of AI.

This warning comes at a time when the popularity of AI tools has prompted increased scrutiny from regulators in both the US and Europe. In particular, the use of AI in decision-making processes has raised concerns about accountability, transparency, and the potential for bias. The officials stressed that innovation should not be used as a cover for lawbreaking and called for responsible use of AI to ensure that the technology is not used to discriminate against vulnerable populations.