Meta ends fact-checking program in the US
Mark Zuckerberg shifts the company’s content moderation approach.
Meta Platforms has announced the termination of its US fact-checking program and eased restrictions on discussion of politically charged topics such as immigration and gender identity. The decision, which affects Facebook, Instagram, and Threads, marks a significant shift in the company’s content moderation strategy. CEO Mark Zuckerberg framed the move as a return to ‘free expression,’ citing the recent US elections as a cultural tipping point. The changes come as Meta seeks to build rapport with the incoming Trump administration.
In place of fact-checking, Meta plans to adopt a ‘Community Notes’ system, similar to that used by Elon Musk’s platform X. The company will also scale back proactive monitoring of hate speech, relying instead on user reports, while continuing to address high-severity violations like terrorism and scams. Meta is also relocating some policy teams from California to other states, signalling a broader operational shift. The decision follows the promotion of Republican policy executive Joel Kaplan to head of global affairs and the appointment of Trump ally Dana White to Meta’s board.
The move has sparked criticism from fact-checking organisations and free speech advocates. Angie Drobnic Holan, head of the International Fact-Checking Network, pushed back against Zuckerberg’s claims of bias, asserting that fact-checkers provide context rather than censorship. Critics, including the Centre for Information Resilience, warn that the policy rollback could exacerbate disinformation. For now, the changes will apply only to the US, with Meta maintaining its fact-checking operations in regions like the European Union, where stricter tech regulations are in place.
As Meta rolls out its ‘Community Notes’ system, global scrutiny is expected to intensify. The European Commission, already investigating Musk’s X over similar practices, noted Meta’s announcement and emphasised compliance with the EU’s Digital Services Act, which imposes strict content moderation obligations on large platforms. As Meta navigates a complex regulatory and political landscape, the impact of its new policies on disinformation and public trust remains uncertain.