Ofcom targets search engines over harmful content following Online Safety Act enactment

UK regulator Ofcom is turning its attention to major search engines such as Google and Bing, citing concerns that they provide easy access to harmful content related to self-injury and suicide, particularly for underage users.

A commissioned report covering Google, Bing, DuckDuckGo, Yahoo, and AOL revealed alarming statistics: one in five search results for self-injury terms links to further harmful content. This underscores the risks posed by open-ended platforms like Google, which draws over 80 billion visits a month, compared with TikTok's 1.7 billion active users.

The report also identified flaws in moderation algorithms, and Ofcom is considering regulatory action, including fines and criminal liability for non-compliant tech firms, to address user safety. The regulator plans a spring consultation on the concerns raised in the report, emphasizing the need for search engines to improve protection, especially for children.

Google disputes some of the findings, pointing to its safety features, while Microsoft and DuckDuckGo have yet to respond.

Why does it matter?

As TechCrunch points out, the report does not address the potential impact of generative AI search, an emerging concern that could further exacerbate exposure to harmful content. While safeguards exist to prevent misuse of platforms like ChatGPT, uncertainty remains over users' attempts to circumvent these controls.