Meta takes action against child sexual abuse material on Instagram
SIO found a network of social media profiles purportedly managed by minors promoting self-generated child sexual abuse material (SG-CSAM) and initiated efforts to locate and close these accounts.
According to a Meta spokeswoman, Meta has organised a task force to investigate growing allegations that Instagram facilitated the distribution and sale of child sexual abuse content.
The Stanford Internet Observatory (SIO) found a network of social media profiles purportedly managed by minors promoting self-generated child sexual abuse material (SG-CSAM). The report raised the alarm, and the SIO, in collaboration with social networks and anti-child-sexual-exploitation organisations, initiated efforts to locate and close these accounts and to pursue prosecution of individuals responsible for this criminal behaviour.
According to the SIO, Instagram enabled users to search for phrases that its own algorithms recognised as related to SG-CSAM. A pop-up box warned users that the results might contain images of child sexual abuse, yet still allowed them to proceed to view the results. After being alerted, Instagram removed the option to continue viewing that material.
EU industry chief Thierry Breton will meet with Mark Zuckerberg on 23 June. During the meeting, Breton will demand immediate action from Zuckerberg on content targeting children, as Meta's voluntary child protection code appears to be ineffective. Breton has stated that Meta will need to show, by 25 August, the steps it intends to take to comply with the European Union's Digital Services Act (DSA), which governs online content.
The SIO has identified networks of accounts advertising SG-CSAM for sale as a commercial concern. Meta has removed these accounts and updated its policies. Researchers have also uncovered active Instagram and Twitter accounts engaged in similar abuse. Meta took measures to block specific phrases and hashtags related to SG-CSAM, and it said that it shut down 27 abusive networks between 2020 and 2022.