Meta to expand its child safety measures

Meta announced that it will expand its child safety measures across three areas: Recommendations and Discovery, Restricting Predators, and Strengthening Enforcement.

Meta's logo

Meta announced that it will expand its child safety measures to combat online predators. Meta's Child Safety Task Force aims to strengthen its child safety policies in three main areas. The first, 'Recommendations and Discovery', involves expanding the list of child safety-related terms it restricts and applying machine learning to detect harmful associations between terms.
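Meta has not published how this detection works, but the idea can be illustrated with a minimal sketch: given a seed list of restricted terms, flag other terms that frequently co-occur with them in search sessions as candidates for review. The seed terms, session data, and threshold below are all hypothetical placeholders.

```python
from collections import Counter

# Illustrative sketch only; not Meta's actual system. We assume a seed
# blocklist of restricted terms and flag terms that frequently co-occur
# with them in the same search session (a simple association signal).
SEED_BLOCKLIST = {"harmful_term_a", "harmful_term_b"}  # placeholder terms

def flag_associated_terms(sessions, min_cooccurrence=2):
    """Count how often each non-seed term appears in the same session
    as a seed term, and flag those at or above the threshold."""
    cooccur = Counter()
    for session in sessions:
        terms = set(session)
        if terms & SEED_BLOCKLIST:  # session touched a restricted term
            for term in terms - SEED_BLOCKLIST:
                cooccur[term] += 1
    return {t for t, n in cooccur.items() if n >= min_cooccurrence}

# "coded_term" appears alongside seed terms in two sessions,
# so it is flagged as a candidate for human review.
sessions = [
    ["harmful_term_a", "coded_term"],
    ["harmful_term_b", "coded_term", "benign_term"],
    ["benign_term", "weather"],
]
print(flag_associated_terms(sessions))  # {'coded_term'}
```

In practice such a signal would feed a larger model and human review rather than act as an automatic block.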

The second, 'Restricting Potential Predators and Removing their Networks', focuses on identifying suspicious adults to prevent them from finding, following, or interacting with teens on Instagram and Facebook. These efforts include removing Instagram Reels and Facebook Groups found to violate Meta's child safety policies. Law enforcement specialists will track evolving behaviors in online networks, including new coded language, to improve detection technology.

The third area is 'Strengthening Enforcement'. Meta claims that it has improved its systems and has been using technology designed to find child-exploitative imagery. Additionally, Meta announced that it has joined Lantern, a program by the Tech Coalition that enables companies to share signals about accounts violating child safety policies for internal investigations and actions.

Why does it matter?

These measures come in response to numerous accusations that the company has failed to prevent the spread of child sexual abuse material (CSAM). As The Verge reported, a June report in The Wall Street Journal revealed that Instagram's recommendations algorithm facilitates connections between accounts involved in buying and selling CSAM. A follow-up investigation revealed that the issue extends to Facebook Groups, where pedophile accounts and groups exist, some with as many as 800,000 members.

The US has been pressuring the company to enhance its child safety measures, and the EU has given Meta a deadline of 1 December 2023 to provide more information on how it protects minors. Meta CEO Mark Zuckerberg, along with other key executives from major tech companies, will testify before the Senate in January 2024 regarding online child exploitation.