UK regulator orders revised safety assessments under Online Safety Act

Online Safety Act enforcement expands as the UK’s Ofcom demands transparency and stronger safeguards from digital platforms, ordering major services to submit revised risk assessments to improve user protection.

Ofcom has ordered more than 40 online services to submit revised risk assessments under the UK’s Online Safety Act, increasing pressure on platforms to show how they identify and reduce illegal content and other user harms.

The move marks a tougher phase in the UK’s online safety regime, with the regulator signalling that incomplete or delayed submissions could trigger enforcement action.

Ofcom said earlier reviews had identified weaknesses in several assessments, and it is now requiring the companies concerned to strengthen their approach and improve safeguards.

The requirement is especially significant for services likely to be accessed by children, which must also examine the risk of exposure to harmful content and demonstrate what protective measures they have in place. In that sense, the regulator is pushing platforms to treat safety not as a reactive moderation issue, but as a design and compliance obligation.

Ofcom has also indicated that major platforms will eventually have to publish summaries of their risk assessments, adding a transparency layer to the regime.

The latest demands suggest that the UK is moving beyond setting out online safety expectations and into a more interventionist stage focused on supervision, accountability, and enforcement.
