
UK Children’s Safety Code must hold tech companies to account

Two weeks ago, 5Rights welcomed the publication of Ofcom’s draft Children’s Safety Code of Practice under the Online Safety Act. However, the draft Code’s complete lack of consideration for safety by design means that, without a serious review, it will fail to protect children and effectively implement the new law.


The next government must prioritise robust implementation of the Online Safety Act and put children’s safety and wellbeing at the centre of any proposed tech strategy.

The draft proposals provide some detail on how children must be prevented from viewing very harmful content and the standards tech companies must uphold in complying with this duty. Age assurance is an important tool in keeping children safe from services that are not age-appropriate or that can be harmful to them, and the Online Safety Act (OSA) will be one of the first regulatory vehicles in the world to make it mandatory. Ofcom’s move to hold tech companies more accountable for how their algorithms and recommender systems can lead children to harmful content is also positive.

While this is welcome, the Code overall fails to deliver what is needed to create the online world for children that the OSA aspires to. As currently drafted, it codifies existing practice, is regressive in places, and contains several areas of concern that must be addressed before the final codes are published in 2025.

  • Age assurance: The application of age assurance measures is narrow, adopting a ‘one size fits all’ model which only proves a user is over 18 (V.5, para. 15.9). This will fail to provide children with age-appropriate experiences, despite this being a key requirement of this section of the Act. Services will also not be required to uphold the minimum age requirements on their platforms, despite the Act being clear that tech companies must uphold their terms of service (V.5, para. 15.314).
  • Weak consideration of safety by design: Ofcom’s draft Code focuses heavily on ex-post measures, such as content moderation (V.5, p. 16) and user reporting (V.5, p. 18), that seek to remedy harms once they have already occurred, at the expense of ex-ante measures which make services safer and prevent children from being harmed in the first place. This runs counter to the goal of the OSA to make services safe by design (Online Safety Act, s. 1(3)).
  • Risks to children not holistically addressed: The risks to children identified in the draft Risk Register (V.3) are not holistically addressed by the measures in the Code. There are significant gaps in the risky design features covered: for example, there are very few mitigation measures covering livestreaming or direct messaging, despite a plethora of research, including 5Rights’ Pathways and Risky by Design, illustrating that both are high-risk. Further, there are no measures to prevent children from accessing high-risk features and functionalities by default.

5Rights will continue to forensically analyse the draft Code, measuring it against the baseline set by the Children’s Coalition for Online Safety, led by 5Rights, which set out five core requirements for companies to:

  1. Give high standards of protection to children using high-risk services, irrespective of the size of the service.
  2. Prioritise children’s safety in product design and development.
  3. Take a comprehensive approach to risk mitigation that considers age-appropriate access to content, features and functionalities, safety and privacy settings, user reporting, media literacy, and the advice of external experts and children themselves.
  4. Give safety teams sufficient resources and autonomy to prioritise children’s best interests, even when these conflict with commercial interests.
  5. Consider the impact of their business model on safety and ensure that governance and accountability checks and balances are strong.