UK children’s bill advances with new online safety powers
The latest House of Commons print of the UK Children’s Wellbeing and Schools Bill includes new child internet restrictions and changes to data-consent rules.
The UK’s Children’s Wellbeing and Schools Bill has moved forward with a substantial set of online safety amendments, showing how child protection policy is increasingly being folded into wider legislation beyond the Online Safety Act itself. The current printed version of the bill, published as it continues through consideration of amendments between the Commons and Lords, includes new powers that could allow ministers to require providers of specified internet services to prevent or restrict children’s access to certain services, features, or functionalities where there is a risk of harm.
At the centre of the package is a proposed new section 214A to be inserted into the Online Safety Act 2023. Under that provision, the Secretary of State would be able to make regulations requiring providers of specified internet services to block or limit access for children of a specified age. The text makes clear that those powers could apply not only to entire services but also to specific features or functions within them.
That matters because the bill goes well beyond a general statement of principle. The amendments envisage regulations that could address issues such as the amount of time children spend on services, the times of day they can access them, contact from strangers, live audio or video communications, and the ability of unknown users to identify a child’s actual or approximate location. In other words, the government is seeking flexible powers to target specific design features and risks rather than relying only on broad platform-wide restrictions.
The bill would also give Ofcom a formal role in the process. As drafted, the regulator would be expected to carry out research or provide advice at the Secretary of State’s request to support the making of regulations under the new power, and to publish that advice afterwards. A separate clause would require the Secretary of State, within six months of the Act being passed, to lay before Parliament a progress statement on the first regulations and a timetable for bringing them forward, unless those regulations have already been made.
Another part of the amendment package would give ministers the power to alter the age at which a child can consent to the processing of personal data in relation to information society services, within a range of 13 to 16. The text also allows for regulations on age verification for that consent, including provisions on compliance, monitoring, and enforcement. That means the bill is not only about access and harmful features, but also about the data governance rules that shape children’s use of digital services.
The bill also shows that Parliament has not fully settled the question of how far to go. The latest printed text includes Lords’ amendments to Commons Amendment 38J, which would require the Secretary of State to make regulations imposing highly effective age-assurance and anti-circumvention measures for under-16s on specified regulated user-to-user services. Those Lords’ changes sit within the continuing exchange between the two Houses, rather than representing a final agreed position. The bill remains at the ‘consideration of amendments’ stage and has not yet received Royal Assent.
The broader significance of the bill is that the UK is moving towards a more interventionist model of child online safety, one that reaches beyond content moderation into product design, age assurance, feature controls, and the governance of children’s data. But the legislative picture is still in flux. What is emerging is not yet a final settlement, but a live parliamentary struggle over how prescriptive ministers should be, how much discretion they should have, and how strongly the law should push platforms to redesign services for children.