Meta’s Oversight Board to review handling of violent content in Israel-Hamas conflict cases

Meta’s Oversight Board will focus on a video depicting the aftermath of a Gaza hospital explosion and another featuring a kidnapped woman.


Meta’s independent Oversight Board is set to scrutinise the social media giant’s response to violent content related to the Israel-Hamas conflict. Two cases involving a hostage-taking and a bombing mark the inaugural use of a new expedited review mechanism, mandating decisions within 30 days.

Amidst a surge in violent and misleading content during the two-month-old conflict, Meta temporarily adjusted its content removal criteria but faced criticism for the alleged suppression of support for Palestinians. The board will review the removal and restoration of a video depicting the aftermath of an explosion at Al-Shifa Hospital and another showing a kidnapped woman.

Meta has expressed openness to the review and pledged to implement the board’s decisions.

Why does it matter?

In response to the 7 October attacks by Hamas on Israel, Meta removed over 700,000 pieces of content breaching its policies, including posts supporting or praising Hamas. This action aligns with the European Union's call for major social media platforms, including Meta, X, YouTube, and TikTok, to intensify their efforts to combat misinformation. Even with these efforts, and with the Oversight Board now involved, combating harmful content is becoming progressively more challenging. The difficulty is exacerbated by the unsettling trend of hate groups exploiting AI to generate content, which adds a new layer of complexity and urgency to the issue.