
Online Safety Act is now law – what does this mean for children?

The Online Safety Act places a duty of care on tech companies to make their services safer for people in the UK. The Act sets out strict requirements for tech companies to make sure their services are not harming children or being used to harm children.

[Image: a teenager lies in bed in a dark room, her face lit only by the screen of the smartphone she holds above her.]

To protect children’s safety, the Act requires tech companies to:

1. Assess how their services might cause harm to children

The Online Safety Act requires tech companies to carry out children’s risk assessments that look at how the design of their services, their business models, functionality, algorithms and other features can contribute to or alleviate the risk of harm to children. For example, they must assess how functionality like direct messaging could present a risk to children if a stranger were to contact them.

Risk assessments must also look at certain functionalities that can affect how a child might use the service. For example, features like ‘autoplay’ of video or sound content have been found to promote addictive behaviours. Assessments must consider the risk of harm to children in different age groups to make sure the services children use or engage with are age-appropriate.

As the regulator, Ofcom will publish guidance for tech companies on what the risks are to children and how the companies should carry out these risk assessments.

2. Prevent children from viewing porn, suicide content and illegal content

Tech companies will have to ensure their services are safe, and that children are prevented from encountering illegal content.

They must ensure their services cannot be used for illegal activity, such as child sexual exploitation and abuse offences. For example, they must not allow child sexual abuse material to be uploaded to their site. Tech companies will be expected to use proactive technology to prevent this and have robust content moderation in place. Other content in this category includes content promoting suicide, self-harm and eating disorders.  

Ofcom will begin consulting on draft codes of practice – which will include the measures that tech companies will be expected to take to manage and mitigate the risk to children from illegal content – now that the Act has become law.

3. Protect children from viewing age-inappropriate content

Tech companies have a duty to ensure their services are made safe by design to minimise risk and protect children from encountering harmful content.

This type of content includes bullying, content which incites hatred based on race, religion, disability, gender reassignment or sexual orientation, and violent content. Tech companies could take measures to protect children from this kind of content, for example by designing their algorithms to offer children only age-appropriate content.

4. Where they need to verify someone’s age, the check must be accurate and respect their data

The Act sets out the occasions where tech companies will need to use age assurance technology (for example, age verification) to protect children, and what standards that technology must meet if they are using it.

Commercial porn sites

Commercial pornography sites or tech companies with services that host pornographic content must use strong age verification or age estimation to prevent children from accessing pornography.

Age assurance technology

The Act sets out standards or rules for the use of age assurance that ensure alignment with the ICO’s Age Appropriate Design Code. These standards include, among other things, that age assurance technology should be easy to use, proportionate to the risk and easy to understand (including for users with protected characteristics), as well as aiming to be interoperable. Information collected about a child through age assurance technology should not allow for excessive data gathering.

5. Give bereaved parents and coroners access to information where a child may have died due to online harm

In response to the stonewalling faced by families and coroners during the inquests into the deaths of children Molly Russell and Frankie Thomas, the Act includes measures that coroners and parents can use where a child is thought to have died due to online harms.

For inquests

To help coroners with their investigations, the Act gives Ofcom the power to request information from tech companies on their behalf where there is reason to believe the service may hold information relating to the death of a child. This can include information such as the content the child viewed or engaged with, how they encountered that content, how algorithms and other functionalities may have contributed, and how they interacted with it.

Ofcom has extensive enforcement powers to use if tech companies do not comply, including the ability to hold senior managers criminally liable.

For parents

To support parents and carers in the event of a death, tech companies must provide parents with a helpline or a dedicated section of their site that sets out how to obtain information in circumstances where a child has died.