Meta extends fact-checking to Threads amid concerns but allows opt-out in the US

As Threads introduces new features, including search and tagging functions, experts warn of potential challenges in curbing the dissemination of false information on the platform.


Meta has announced plans to extend its fact-checking system to its text-based platform, Threads, addressing previous content moderation gaps. The move follows a Media Matters report highlighting misinformation on Threads, where far-right figures have been testing the platform's policies.

However, Meta will allow Threads users in the US to opt out of part of the fact-checking programme, potentially undermining its effectiveness. Threads is also introducing features, such as a new tagging function, that may accelerate the spread of content, raising concerns about the reach of misinformation.

Moreover, researchers are concerned that Threads' search function, which reportedly blocks specific keywords deliberately, may contribute to the spread of inaccurate information. As Meta continues to add features to Threads, the platform faces growing challenges in enforcing its policies and preventing the spread of false and harmful content.

Why does it matter?

According to Media Matters, Meta is deprioritising content moderation and user safety. The organisation cites various examples since the beginning of 2022, such as the recent decision to allow Facebook users to create multiple personal profiles and switch between them without logging out. Meta has also cut about a quarter of its workforce since November 2022, including teams focused on site security, privacy, and integrity. These developments raise concerns, particularly ahead of the 2024 elections, as they could potentially disrupt electoral processes.