EU targets platforms over child safety and addictive design practices

A new coordination mechanism aims to standardise age verification solutions across member states and reduce fragmentation in digital identity systems.

The European Commission has intensified enforcement under the Digital Services Act (DSA), targeting online platforms over child-safety risks, addictive design features, and insufficient age-verification systems.

Executive Vice-President Virkkunen said the measures are intended to ensure platforms are held accountable when services expose minors to harmful or restricted content.

The Commission has taken action against several major platforms, including TikTok, Facebook, Instagram, Snapchat, and Shein, over design practices such as infinite scroll, autoplay, and highly personalised recommendation systems.

Enforcement has also been launched against pornographic platforms that failed to implement adequate age-verification tools.

Alongside enforcement, the EU has developed a digital age-verification app built on privacy-preserving zero-knowledge proofs, allowing users to prove they are of age without disclosing personal data to the services they visit.
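
The article does not describe the app's actual protocol. As a hedged illustration of the privacy goal only, the minimal Python sketch below uses a plain signed "over 18" attestation (Ed25519 via the cryptography library) rather than a real zero-knowledge proof: the verifying platform learns a single boolean claim, never the birthdate. All names here (AgeAttestation, issue_attestation, platform_verifies) are hypothetical.

```python
# Illustrative sketch only -- NOT the EU app's protocol and not a true ZKP.
# It shows the data-minimisation idea: the issuer (e.g. a national eID scheme)
# signs only a derived boolean claim, and the platform verifies that signature
# without ever seeing the underlying birthdate.
from dataclasses import dataclass
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519


@dataclass
class AgeAttestation:
    """What the user's wallet presents to a platform: a boolean claim plus
    the issuer's signature over it. No birthdate or identity is included."""
    claim: bytes       # e.g. b"over_18=true"
    signature: bytes   # issuer's Ed25519 signature over the claim


def issue_attestation(issuer_key: ed25519.Ed25519PrivateKey,
                      birth_year: int, current_year: int) -> AgeAttestation | None:
    """The trusted issuer checks the birthdate it already holds and signs
    only the derived boolean claim."""
    if current_year - birth_year < 18:
        return None  # no attestation issued for minors
    claim = b"over_18=true"
    return AgeAttestation(claim=claim, signature=issuer_key.sign(claim))


def platform_verifies(issuer_public_key: ed25519.Ed25519PublicKey,
                      attestation: AgeAttestation) -> bool:
    """The platform checks the issuer's signature and learns nothing beyond
    the single boolean claim."""
    try:
        issuer_public_key.verify(attestation.signature, attestation.claim)
    except InvalidSignature:
        return False
    return attestation.claim == b"over_18=true"


if __name__ == "__main__":
    issuer_key = ed25519.Ed25519PrivateKey.generate()
    att = issue_attestation(issuer_key, birth_year=2001, current_year=2025)
    print(platform_verifies(issuer_key.public_key(), att))  # True; birthdate never shared
```

A production zero-knowledge design would go further, for instance by preventing attestations from being linked across services, but the sketch conveys the basic principle of disclosing only the minimum necessary claim.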

The system is already technically ready and is being tested across several member states, either as a standalone tool or integrated into national digital wallets.

The Commission is also preparing an EU-wide coordination mechanism to standardise accreditation of national solutions and avoid fragmentation across member states. The initiative aims to establish a unified age-verification framework that upholds privacy standards and supports wider adoption across digital services.
