Holes not fixed: UK regulator publishes final proposals for tackling illegal harm online
Ofcom’s Illegal Harms Code of Practice is the first part of the UK’s Online Safety Act framework to be laid before Parliament. Earlier this year, we set out our concerns that the Code did not go far enough and that its approach was not in line with the ambition or intent of the Act. From our initial assessment, these holes have not been fixed. The regulator must do more to future-proof this crucial framework.
This week, the UK Government laid Ofcom’s Illegal Harms Code of Practice – the first of the codes under the Online Safety Act – before Parliament. Earlier this year, we published our assessment of the draft Code, stating that it fell short of the ambition, spirit and intent of the Act, and called on Ofcom to drastically rethink its approach to implementation by refocusing on corporate responsibility and safety by design.
From our initial assessment, we are disappointed that many gaps in the Code persist – setting an unwelcome precedent for the future of the regulatory framework:
- Safety by Design: The legislation has a clear objective that services be made “safe by design”, but the majority of Ofcom’s proposed measures are not designed to prevent harm from occurring in the first place – instead focusing on content moderation and reporting tools. While greater requirements on governance and accountability are welcome, these alone will not ensure safety by design. The final Code will also not require the largest and riskiest companies to include results from product testing – common in most other industries – in their risk assessments in all circumstances. We are deeply concerned that some measures have been watered down from the proposals set out earlier this year. On content moderation, Ofcom will now only require companies to take down illegal content where it is “technically feasible” (Vol. 2, 2.45) – potentially creating a loophole for companies to absolve themselves of this duty and actively disincentivising innovation of more effective means to remove illegal content.
- Proportionality: The final Code continues to give greater weight to commercial interests and business costs than to the victims harmed by the failures of tech companies. In its final statement, Ofcom reaffirms this approach, stating it is “more concerned that measures may hinder providers of smaller services.” (Approach, 1.150) As we noted in our response, proportionality should reflect the severity and risk of harm to users – in particular children.
- Size of service: Despite concerns from civil society that Ofcom’s threshold for large services – seven million monthly UK users – has been set too high, the regulator has not changed its position. Concerningly, Ofcom has actively removed some measures for smaller, low-risk services, deeming them “not proportionate.” (Approach, Pg 1) The Government has also approved Ofcom’s advice on categorisation thresholds, which will mean high-risk services will not be subject to the most stringent duties of the Act.
- Child user measures: Measures in the Code requiring default settings for child users – particularly where functionalities allow children to be contacted by, or be searchable by, strangers – apply only to services where there is a high risk of grooming, despite the UK’s Age Appropriate Design Code stating this should apply to “all services likely to be accessed by children.” This measure remains unchanged in the final Code of Practice.
- ‘Safe Harbour’: Ofcom has broadly dismissed concerns raised by civil society organisations that the Code’s safe harbour status could allow tech companies to roll back safety measures, creating a ‘race to the bottom’ in which companies do no more than the minimum needed to comply with the regime.
“At 5Rights we have spent the last year working to provide the evidence to support Ofcom to fix the holes in its draft online safety plans to protect children.
“With the final proposals for how it will tackle illegal harm now published, we find that these holes have not been fixed and that protections have been watered down in key areas.
“The Online Safety Act gives Ofcom substantial powers to keep children safe and prevent them coming to harm. There is no excuse not to use them.”
Colette Collins-Walsh, Head of UK Affairs
In the coming weeks, we will continue to assess the Code in full. Ofcom has said it will consult on further measures for the next iteration of the Codes in the Spring, including applying certain measures to small single-risk services. While this consultation is welcome, it is evident that future iterations of the Code will require a wholly different approach that prioritises the protection of users upstream and ensures tech companies are held to account for all risks on their services.