The expiry of the EU ePrivacy derogation, which allowed platforms to voluntarily detect child sexual abuse material (CSAM) online, has raised concerns over weaker child safeguards. The lapse creates legal uncertainty for platforms that rely on established detection tools to prevent ongoing harm.
For years, technology companies have voluntarily used hash-matching, a widely recognised tool for disrupting abuse and protecting victims, to detect and remove CSAM.
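In principle, hash-matching works by comparing a digest of uploaded content against a list of digests of already-identified material. The minimal sketch below uses a cryptographic hash (SHA-256) and placeholder byte strings purely for illustration; production systems typically rely on perceptual hashes (such as PhotoDNA) so that near-duplicates also match, and on curated hash lists from child-safety organisations.

```python
import hashlib

# Hypothetical blocklist: digests of placeholder byte strings,
# standing in for a curated list of known-content hashes.
KNOWN_HASHES = {
    hashlib.sha256(b"known-example-1").hexdigest(),
    hashlib.sha256(b"known-example-2").hexdigest(),
}

def matches_known_hash(content: bytes) -> bool:
    """Return True if the content's SHA-256 digest appears on the blocklist."""
    return hashlib.sha256(content).hexdigest() in KNOWN_HASHES

# An upload is hashed and checked; a match triggers removal and reporting.
print(matches_known_hash(b"known-example-1"))   # True
print(matches_known_hash(b"harmless upload"))   # False
```

Because only digests are compared, a match can be detected without a human inspecting every upload, which is why the technique is often framed as compatible with user privacy.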
Google, alongside nearly 250 child rights organisations, is calling on the EU institutions to urgently finalise a regulatory framework, warning that reduced detection capacity could affect child safety globally.
The EU institutions face criticism for failing to maintain an interim agreement, with stakeholders saying the lack of continuity undermines child online safety efforts.
Meta, Microsoft, and Snap have reaffirmed their commitment to continue voluntary detection and reporting measures while respecting user privacy. The companies also urge the EU institutions to urgently finalise a regulatory framework for consistent and effective child protection standards.
The absence of a clear framework has been described as creating instability for responsible platforms operating across Europe. Fragmented rules and legal uncertainty can slow detection and reporting systems, weakening coordinated protection efforts across platforms and borders.
