Next steps for online safety in the UK: 5Rights sets the criteria for legislative action
In the coming weeks, Parliament will debate proposals on social media bans for under‑16s – a critical moment to strengthen the UK’s online safety framework. As children, parents, and civil society call for more ambitious protections, 5Rights Foundation is setting out clear criteria to ensure regulation delivers real‑world change.

Protecting children in the digital environment requires more than age-based platform restrictions. Any initiative should meet the following criteria:
- Tech neutral: Regulation must go beyond social media to address dangerous persuasive design practices across all services children use, including games, AI chatbots and EdTech. Parliament must protect children wherever they are, and confront design features that deliberately exploit their attention, manipulate behaviour, or encourage addictive use, such as endless scrolling, autoplay, targeted nudges and reward loops. It must stamp out bad practice and encourage all services to do better.
- Age appropriate: Regulation must support an internet that caters to children as they grow. Either products and services are safe for all ages, or children’s access to them must be restricted; in either case, service design must reflect children’s evolving needs and capacities. Regulation should make explicit, and enforce, a ban on personalised services – social media, but also personalised games and AI chatbots – for children under 13. It should also mandate tiered age-gating of risky features and functionalities to protect and empower all children up to 18 as they grow.
- Holistic: Regulation must not treat access restrictions in isolation, but as part of a holistic approach to age-appropriate design. As part of their duty of care, companies should enable children to access, and empower them to engage with, services, features and functionalities appropriate for their age. Safeguards beyond age-gating that should be considered include friction, warnings, positive nudges, safety filters, and enhanced reporting and redress systems. Statutory guidance should ensure that both age-tiered access restrictions and supportive safeguards are independently evidence-based and developmentally informed, catering to children up to the age of 18.
- Swiftly and robustly enforceable: Regulation must ensure action can be taken swiftly to restrict services in breach of the law (e.g. those not complying with the minimum age limit of 13) or, on the precautionary principle, where there is substantial evidence of high risk to children (including older children up to 18). Guidance on age-appropriate access restrictions, default settings and safety measures must be developed, and periodically reviewed, to a clear, ambitious timetable, with swift consultation processes that include children. Once guidance is issued, providers should be given a short deadline either to amend their services or to ensure children cannot access them. A single regulator or commissioner should be empowered to act for children with the full range of measures, including financial penalties, mandatory operational changes and, where necessary, business disruption or suspension.
We urge the government to act now, going beyond platform bans to tackle the root causes of harm in digital environments. Manipulative design must be prohibited, age-appropriate protections enforced, and children’s rights, wellbeing and development placed at the heart of digital policy. Companies that break the law or exploit children must be held to account – and if they refuse to comply, they should be blocked from serving child users entirely. Enforcement cannot be piecemeal: the entire online ecosystem must be safe by design, and we must move swiftly.
“We welcome the strong political consensus for more and stronger action to protect children from harmful tech services. To deliver for children, any new initiative must meet key criteria: it must go beyond social media to address harmful persuasive design practices which are found everywhere – from gaming to AI chatbots and EdTech; it must be age-appropriate, robustly protecting younger children while gradually empowering children as they grow; it must be holistic and go beyond blunt access restrictions; and it must be swiftly and robustly enforceable. The aim must be to have a radically different digital environment for children by this time next year, where all under-18s are both safe and able to thrive.”
Leanda Barrington-Leach, Executive Director at 5Rights