Oversight Board criticises Meta’s rules on faked videos as incoherent and narrow

The current policy only blocks fake videos of people saying things they did not say and does not cover portrayals of people doing things they did not do.


Meta’s Oversight Board has called on the company to revise its policies on manipulated media, ruling on the case of a faked video of President Joe Biden. The board upheld Meta’s decision not to remove the video, as it did not violate the company’s current manipulated media policy, which applies only to AI-altered videos that show people saying things they did not say.


However, the board denounced the policy as ‘incoherent’ and too narrow, falling short of addressing other forms of manipulated content, such as edited audio or videos portraying people doing things they did not do. Meta established the external advisory board in 2020 as an independent body charged with reviewing the firm’s content moderation decisions. The board has the authority to make binding decisions on individual pieces of content, but its policy recommendations are not binding on Meta.

Why does it matter?


The board’s comments and guidance aim to ensure that Meta’s policies are clear, justifiable, and effective in deterring the spread of manipulated media that could deceive users and damage the integrity of democratic processes. In the wake of the fake Biden video, the Oversight Board’s decision stresses the need for Meta to focus on the potential harms of doctored media rather than on the means by which the content is produced.

The board recommends that Meta urgently update its policies to cover a broader range of manipulated content, including audio and audiovisual material, regardless of the method used to create it. The recommendation comes in light of upcoming elections in the USA and elsewhere around the world, where such fake content could have consequential impacts. Scrutiny of the company has been mounting on other fronts as well: Mark Zuckerberg, Meta’s founder and CEO, appeared before a Senate hearing on 31 January, where he apologised directly to families of child victims of sexual exploitation on social media platforms.

Meta has acknowledged the Oversight Board’s guidance and committed to reviewing its recommendations, with a response expected within 60 days, as required by the company’s bylaws.