EU weighs social media age rules to protect children

Ursula von der Leyen said the EU could consider whether social media should have access to young people.


The European Commission has signalled that it may propose EU-level rules on delaying children’s access to social media, as concerns grow over addictive platform design, harmful content and AI-enabled risks for minors.

In a keynote address at the European Summit on Artificial Intelligence and Children in Copenhagen, European Commission President Ursula von der Leyen said the EU must consider whether young people should be given more time before using social media. She said the question was not whether young people should have access to social media, but ‘whether social media should have access to young people’.

Von der Leyen said almost all EU member states had called for an assessment of whether a minimum age is needed, while Denmark and nine other member states want to introduce one. She added that the Commission’s expert panel on child safety online is advising on the issue, and that a legal proposal could follow this summer, depending on its findings.

Von der Leyen linked the debate to wider concerns about platform business models. She argued that children’s attention was being treated as a commodity through addictive design, advertising, algorithmic recommendation systems and content that can harm mental health. She also pointed to risks linked to AI-generated sexualised images and child sexual abuse material.

The Commission President cited enforcement under the Digital Services Act, including actions involving TikTok, Meta and X, as well as investigations into platforms over whether children are being drawn into harmful content. She said the EU had created strong tools through the Digital Services Act and the Digital Markets Act, and that platforms breaking the rules would be held accountable.

Von der Leyen said that any age restriction model would depend on reliable age verification. She said the EU had developed an open-source age verification app that would soon be available, including a rollout in Denmark by summer, and that the Union was working with member states to integrate it into digital wallets.

The speech also framed child online safety as a matter of platform responsibility, not just parental control. Von der Leyen said social media companies should be responsible for product safety in the same way other industries are, adding that ‘safety by design’ protections should be strengthened and expanded. She also pointed to the forthcoming Digital Fairness Act, which is expected to address addictive and harmful design practices.

Why does it matter?

The speech suggests that EU child online safety policy may be shifting from holding platforms accountable after harm occurs towards more structural controls over access, design and age verification. A possible delay to social media access would mark a major change in how the EU approaches children’s participation online, raising questions about privacy-preserving age checks, children’s rights, parental responsibility, platform duties and the balance between protection and digital inclusion.
