Online platforms face new EU duties on child protection
Draft regulation sets fresh obligations for digital firms to curb child sexual abuse material across EU platforms.
EU member states have endorsed a negotiating position on new rules to counter child sexual abuse online. The plan introduces duties for digital services to prevent the spread of abusive material. It also creates an EU Centre to coordinate enforcement and support national authorities.
Service providers must assess how their platforms could be misused and apply mitigation measures. These may include reporting tools, stronger privacy defaults for minors, and controls over shared content. National authorities will review these steps and can order additional action where needed.
A three-tier risk system will categorise services as high, medium, or low risk. High-risk platforms may be required to help develop protective technologies. Providers that fail to comply with obligations could face financial penalties under the regulation.
Victims will be able to request the removal or disabling of abusive material depicting them. The EU Centre will verify provider responses and maintain a database to manage reports. It will also share relevant information with Europol and law enforcement bodies.
The Council also supports extending the current voluntary scanning regime for abusive content beyond its expiry date. Negotiations with the European Parliament on the final text will now begin. The Parliament adopted its position in 2023 and will help decide where the EU Centre is located.
