Social media platforms face penalties over child safety
Teenagers praise social media’s benefits, but UK ministers push for tighter regulations to address safety concerns under the Online Safety Act. Further research into screen addiction is underway.

The UK government is intensifying efforts to safeguard children online, with new measures requiring social media platforms to implement robust age verification and protect young users from harmful content. Technology Secretary Peter Kyle highlighted the importance of ‘watertight’ systems, warning that companies failing to comply could face significant fines or even prison terms for executives.
The measures, part of the Online Safety Act passed in 2023, will see platforms penalised for failing to address issues such as bullying, violent content, and risky stunts. Ofcom, the UK’s communications regulator, is set to outline further obligations in January, including stricter ID verification for adult-only apps.
Debate continues over the balance between safety and accessibility. While some advocate for bans similar to Australia’s under-16 restrictions, teenagers consulted by Kyle emphasised the positive aspects of social media, including learning opportunities and community connections. Research into the impact of screen time on mental health is ongoing, with new findings expected next year.