Let’s make it easy for online services to protect children’s data

“This code provides practical guidance on how to design data protection safeguards into online services to ensure they are appropriate for use by, and meet the development needs of, children.” 

So says the Information Commissioner’s Office (ICO) in its draft Age Appropriate Design Code, published in April of this year. 

The need for safeguards to protect children is common to children’s health, education, labour, and entertainment – in fact, to virtually every sector. The same ought to be true for their data, as is recognised by Recital 38 of the GDPR (“Children merit specific protection with regard to their personal data”).

So far, so uncontroversial – until the ICO required the following of online services in the course of providing this ‘specific protection’:

“You must apply [the Code’s] standards to all users unless you have robust age checks in place to distinguish children from adults.”

This is neither nuanced nor flexible. “This code requires all of us to be treated like children”, cried one organisation. “Nothing less than a mandate to lock the entire British internet – every site, service, and app – behind an age-gated surveillance system”, warned another. 

These reactions perhaps failed to grasp that the Code requires online services to provide data protection for under 18s – not content moderation or restriction. In the context of the Code, therefore, ‘treating all of us like children’ effectively means ‘giving all of us greater protection for our data’. But the reaction wasn’t entirely unwarranted given the wording of the draft. Requiring every online service likely to be accessed by children to verify users’ ages is not the kind of proportionate, risk-based regulation for which the ICO has a strong reputation. Nor did the requirement reflect the level of upheaval it would cause, not least given the current state and sophistication of the age-verification market.

The internet, conceived as a place in which all users are equal, has always failed to distinguish children from adults. This is a norm that can only be changed at a fundamental, systemic level, and even the most committed optimists, including the Information Commissioner herself, acknowledge that ‘age-verification tools are still a developing area’.

With all that said, the question that the ICO has posed of industry is the right one: if you don’t know which of your users are children, how can you provide them with specific protection? If the solution to this problem is not age-verification (at least not for the time being), then a different one needs to be found. 

Ticking the box

Currently, most online services ask for your permission to use your data, and most of us just tick the box. We are all familiar with the cookie pop-up that greets us every time we visit a website.

But what if instead of this…

[Image: a tablet screen showing the pop-up message you’re probably used to seeing when you visit a website. It reads: “We use cookies and other technologies to collect user data from your device, so that: We can deliver content and advertising that’s relevant to you; We give you the best experience.” Below is a green button that says “Got it!”]

The choice was this…

[Image: the same pop-up, now with an additional line: “If you’re under 18, we give you greater protection for your data. Click Accept or read our Under 18s Privacy Policy.” There are two buttons: “Got it!” and “Accept (Under 18).” A note alongside highlights the aim: give the child or young person the opportunity to self-declare for the purposes of data protection.]

Meaning a child who says they are a child will be offered the data protection that the Code demands.

[Image: a final screen on a site named “FUNtube” reads: “GREAT! Because you are under 18 we have given you EXTRA protection for your data. You can now enjoy our site in PRIVACY,” with a green button asking “Do you want to know more?” A note alongside makes the point: when you self-declare, something actually happens.]

High-privacy by default to restrict the spread of a child’s data; restrictions on the commercial use of geolocation data; protection from nudge techniques that encourage children to give up their data or extend their use; a presumption against profiling for marketing purposes; child-friendly privacy information; and easy-to-use tools allowing children to exercise their data rights. A Code-compliant service, by design and default, for any child who wants it. 
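
To make that concrete, here is a minimal sketch, in TypeScript, of how a service might wire the “Accept (Under 18)” button to the stronger defaults described above. Everything in it – the setting names, the storage key, the function names – is a hypothetical illustration; the Code sets standards for outcomes, not an API.

```typescript
// Hypothetical privacy settings a Code-compliant service might toggle.
// The names are illustrative only; they mirror the Code's standards,
// not any real library or regulatory interface.
interface PrivacySettings {
  profilingForMarketing: boolean; // presumption against profiling
  commercialGeolocation: boolean; // restrict commercial use of location data
  nudgeTechniques: boolean;       // no nudges to give up data or extend use
  dataSharing: boolean;           // high-privacy: restrict spread of the child's data
}

const UNDER_18_DEFAULTS: PrivacySettings = {
  profilingForMarketing: false,
  commercialGeolocation: false,
  nudgeTechniques: false,
  dataSharing: false,
};

// Called when the visitor clicks "Accept (Under 18)" on the pop-up.
// No age verification, no content restriction: just stronger defaults.
function onSelfDeclareUnder18(): PrivacySettings {
  // Remember the declaration in this browser so the protection persists.
  localStorage.setItem("self-declared-under-18", "true");
  return UNDER_18_DEFAULTS;
}

// On every visit, apply the stronger defaults if the flag is set.
function currentSettings(adultDefaults: PrivacySettings): PrivacySettings {
  return localStorage.getItem("self-declared-under-18") === "true"
    ? UNDER_18_DEFAULTS
    : adultDefaults;
}
```

The design choice the sketch is meant to show is that self-declaring triggers the stronger settings immediately, with no verification step and nothing withheld from the user.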

And that is a key point. For all the worry about a child who may lie about their age, there are tens of millions of children who don’t. And when there’s no restriction on content or access, just extra protection for their data, the incentive to lie withers away. Some sites will always need more robust age verification: it is against the law to sell knives, alcohol, tobacco, many pharmaceuticals and various other prohibited items to under 18s, and the proposal we’ve outlined here is clearly not appropriate to fulfil those obligations. But accessing a high bar of data privacy should not require a high bar of entry – quite the contrary.

A generation of children has become accustomed to clicking ‘accept’ and surrendering their privacy. Imagine if clicking ‘accept’ meant just the opposite.