UK: Facebook, Google, Twitter and TikTok questioned over online child safety failures

Representatives of Facebook, Google, Twitter and TikTok were questioned for five hours by a parliamentary committee in the UK that is examining the draft Online Safety Bill. The platforms faced questions over concerns about the safety of children on their services, and most of the members of Parliament were highly sceptical of the platforms' ability to keep children safe.

The representatives were grilled on platform-specific issues: Twitter over racial abuse aimed at black football players after the England team lost the Euro 2020 final; TikTok over how it works to limit the spread of harmful viral content, such as the Tide Pod challenge, and how it prevents filter bubbles from forming; and Google over allegations that its algorithms spread and promote hate speech videos. However, Facebook faced the harshest criticism.

Earlier in the week, Facebook whistleblower Frances Haugen testified before the committee, alleging that Facebook's algorithms pushed extreme and divisive content to users. Refuting those claims during the questioning, Facebook's global head of safety, Antigone Davis, stated that the company's algorithms demote rather than promote polarising content and that many of the problems raised are societal issues. However, she declined to say to what extent Facebook's AI systems are able to detect dangerous content. She added that Facebook would welcome a regulator with "proportionate and effective enforcement powers" and that the company is largely supportive of the UK's safety legislation.

On the Online Safety Bill, Theo Bertram, TikTok’s European director of government relations and public policy, and Nick Pickles, Twitter’s senior director for public policy, expressed concern that bad actors may take advantage of the provisions built into the bill to protect politicians and journalists. Facebook’s Davis expressed concern about the need to do a risk assessment around every system or product change. Google’s vice president of public policy, Markham Erickson, suggested that the committee tighten the definitions around online harms, while Pickles pointed out that terms such as “other illegal content” could be too open to interpretation.

The UK's agenda differs from that of the US: the committee is examining the draft Online Safety Bill, which would allow the UK's regulator, Ofcom, to hold social media companies accountable and fine them up to 10% of their turnover if they fail to remove or limit the spread of illegal content on their platforms.