Enforcing the Online Safety Act: Children’s Coalition sets the bar for Ofcom

In May, the UK regulator Ofcom is expected to publish a draft Code of Practice for child online safety. To support Ofcom, 5Rights and the Children’s Coalition for Online Safety have published a report setting out a series of baseline requirements the Code will need to reflect if the Online Safety Act is to effectively deliver for children.

In October 2023, the Online Safety Act (OSA) passed into law. This new regulation empowers the regulator, Ofcom, to enforce strong duties requiring all online services to provide children with a high level of protection and to mitigate the risks of harm they face by being safe by design. Ofcom is now consulting on the Act’s provisions, including those specific to children, which are due to come into force in 2025.

Coming together as 21 organisations and led by 5Rights, the Children’s Coalition for Online Safety has set out a series of baseline requirements that we expect to see reflected in Ofcom’s guidance if the Code of Practice is to deliver and ensure that tech companies truly meet their obligations to children under the Act. The five core requirements are for companies to:

  • Give high standards of protection to children using high-risk services, irrespective of the size of the service.
  • Prioritise children’s safety in product design and development.
  • Take a comprehensive approach to risk mitigation that considers age-appropriate access to content, features and functionalities, safety and privacy settings, user reporting, media literacy, and the advice of external experts and children themselves.
  • Give safety teams sufficient resources and autonomy to prioritise children’s best interests, even when these conflict with commercial interests.
  • Consider the impact of their business model on safety and ensure governance and accountability checks and balances are strong.

The full report is available here.

5Rights and the children’s community have developed a strong body of evidence demonstrating the critical importance of this systemic, safety-by-design approach. 5Rights Foundation research illustrates how children are exposed to recommender systems that lead them down spirals of harmful content – including pornographic, violent, eating disorder and self-harm content – and how they are routinely recommended to, and contacted directly by, strangers. Children are also frequently dragged back into the online world through notifications, rewards and ‘popularity’ metrics. Sadly, the list is extensive.

The impact this is having on children is catastrophic. Research by the NSPCC has found that at least 1 in 20 children encounter sexual risk online – potentially up to a quarter – and further work by Barnardo’s has found that over half of children had experienced cyberbullying or had seen others being cyberbullied. The IWF, which works to prevent child sexual abuse imagery online, recorded an eight-fold increase in sexual extortion involving child sexual abuse material between 2022 and 2023.

5Rights Foundation has also developed practical frameworks to support regulators and companies in implementing children’s rights online. These include the Child Online Safety Toolkit, the IEEE 2089 Standard for an Age Appropriate Digital Services Framework, Child Rights by Design, and the Age Appropriate Design Code.

We look forward to seeing Ofcom publish a robust draft Code that not only protects children and keeps them safe, but that encourages them to thrive in the digital world. 5Rights stands ready to support Ofcom in making this a reality.