Let’s make it easy for online services to protect children’s data

“This code provides practical guidance on how to design data protection safeguards into online services to ensure they are appropriate for use by, and meet the development needs of, children.” 

So says the Information Commissioner’s Office (ICO) in its draft Age Appropriate Design Code, published in April of this year. 

The need for safeguards to protect children is common to children’s health, education, labour, and entertainment - in fact, to virtually every sector. The same ought to be true for their data, as is recognised by Recital 38 of the GDPR (“Children merit specific protection with regard to their personal data”).

So far, so uncontroversial - until the ICO required the following of online services in the course of providing this ‘specific protection’:

“You must apply [the Code’s] standards to all users unless you have robust age checks in place to distinguish children from adults.”

This is neither nuanced nor flexible. “This code requires all of us to be treated like children”, cried one organisation. “Nothing less than a mandate to lock the entire British internet – every site, service, and app – behind an age-gated surveillance system”, warned another. 

These reactions perhaps failed to recognise that the Code requires online services to provide data protection for under-18s – not content moderation or restriction. In the context of the Code, therefore, ‘treating all of us like children’ effectively means ‘giving all of us greater protection for our data’.  

But the reaction wasn’t entirely unwarranted given the wording in the draft. Requiring every online service likely to be accessed by children to age-verify is not the kind of proportionate, risk-based regulation for which the ICO has a strong reputation. It also failed to reflect the level of upheaval that this would require, not least given the current state and sophistication of the age-verification market. 

The internet, conceived as a place in which all users are equal, has always failed to distinguish children from adults. This is a norm that can only be changed at a fundamental, systemic level, and even the most committed optimists, including the Information Commissioner herself, acknowledge that ‘age-verification tools are still a developing area’. 

With all that said, the question that the ICO has posed of industry is the right one: if you don’t know which of your users are children, how can you provide them with specific protection? If the solution to this problem is not age-verification (at least not for the time being), then a different one needs to be found. 

Ticking the box

Currently, most online services ask for your permission to use your data, and most of us just tick the box. We are all familiar with the cookie pop-up that greets us every time we visit a website. 

But what if instead of this…

The choice was this…

Meaning a child who says they are a child will be offered the data protection that the Code demands.

High-privacy by default to restrict the spread of a child’s data; restrictions on the commercial use of geolocation data; protection from nudge techniques that encourage children to give up their data or extend their use; a presumption against profiling for marketing purposes; child-friendly privacy information; and easy-to-use tools allowing children to exercise their data rights. A Code-compliant service, by design and default, for any child who wants it. 

And that is a key point. For all the worry about a child who may lie about their age, there are tens of millions of children who don’t. And when there is no restriction on content or access – just extra protection for their data – the incentive to lie withers away. Some sites will always need more robust age verification: it is against the law to sell knives, alcohol, tobacco, many pharmaceuticals and various other prohibited items to under-18s, and the proposal we’ve outlined here is clearly not appropriate to fulfil those obligations. But accessing a high bar of data privacy should not require a high bar of entry – quite the contrary.  

A generation of children has become accustomed to clicking ‘accept’ and surrendering their privacy. Imagine if clicking ‘accept’ meant just the opposite.

The images above are taken from 5Rights’ booklet for kids Demystifying the Age Appropriate Design Code
