Ofcom must rethink Online Safety Act illegal harms Code of Practice
Seven years in the making, 5Rights celebrated the passage of the Online Safety Act into UK law in October 2023. Upon receiving royal assent, the Act signalled the opportunity to create systemic change for children in the online world. Ofcom quickly followed this momentous occasion with the first of its draft codes of practice on the illegal harms section of the Act.
However, while we acknowledge the scale of their task in implementing this new regulation, we regret that the first draft of Ofcom’s proposals falls short of what is needed to deliver on the promises of the Act.
Although Ofcom has clearly sought to understand many of the risks associated with illegal harms, our analysis found that the proposals demonstrate a highly concerning lack of consideration of, and alignment with, existing UK regulation and best practice concerning children.
In particular:
- The proposals give far greater consideration to the interests and costs of business than to the costs borne by the many victims who have come to harm because of the commercial imperatives of tech companies. While the Act requires that regulated services take a “proportionate” approach to fulfilling their duties, Ofcom is also required to look at the severity of harm.
- The proposals place undue focus on the size of services rather than their risk, creating a regime that will exempt many services from comprehensive duties. Small is not safe, and companies with 7 million users are not large – they are behemoths.
- Ofcom’s draft Risk Register posits that for the majority of illegal offences in scope of the legislation – including grooming, encouraging suicide, and harassment, stalking, threats and abuse offences – the business model is not a risk factor, but various functionalities including recommender systems are. There is a wealth of evidence that functionality designed to capture and retain attention is intrinsically connected to the business model.
- Considering the risk associated with a product, feature, or functionality before it has been introduced and mitigating harm ahead of its introduction is the norm in most other sectors and a fundamental principle of safety by design, as required for services already under the UK’s Age Appropriate Design Code. Yet, Ofcom has chosen to only require this kind of ex-ante assessment for the largest services or as a secondary measure in the risk assessment proposals.
- The draft Illegal Harms Code of Practice focuses overly on ex-post facto measures rather than outcomes-based standards, which would promote safety by design and encourage innovation in safety. 5Rights considers that measures should go as far as possible, expressed as processes that iterate until the intended outcome has been reached, thereby driving creative solutions and innovation while furthering investment in online safety.
- Ofcom’s proposals interpret online safety ‘measures’ as tools rather than systems and processes, which could mean that companies are judged as compliant with the regulation even when the desired outcome has not been achieved. In the context of established international legal frameworks, existing regulation on children’s privacy, best practice, as well as a substantial body of independent expert opinion, there is no justification for this interpretation.
- On child user measures, we are confused as to why these would be required only of services with a high risk of grooming. Default settings should apply to all offences, and there should be alignment with hard-won existing regulation in the Age Appropriate Design Code.
- We are deeply concerned that Ofcom’s fixation on technical evidence – which tech companies control – rather than evidence of outcomes could lead to online safety regressing in the UK.
Read our full response to Ofcom’s illegal harms consultation here.
Read the statement from the Online Safety Network on Ofcom’s approach signed by 5Rights here.
Read the letter from the Bereaved Families for Online Safety to Ofcom here.