Five key changes introduced by the Age Appropriate Design Code
The Information Commissioner’s Age Appropriate Design Code, published this week, represents a sea-change in the way that children and young people experience the digital environment. Here are five of the ground-breaking changes that it makes to children and young people’s data protection.
The Code protects everyone under 18
‘A child is defined in the UNCRC and for the purposes of this code as a person under 18’.
Currently, child-specific data protections more or less vanish the moment children reach the age of ‘data consent’: 13 years old in the UK and US, and up to 16 in countries around Europe. In other words, as it stands, as soon as a child is deemed old enough to consent to the processing of their data, they are deemed old enough to go without any child-specific protection for that data.
The Code takes a different view. Defining a child as anyone under the age of 18 doesn’t raise the age of data consent; rather, it asserts that all minors are entitled to certain baseline protections for their data.
Crucially, the Code says that the nature and extent of the data protection provided should reflect the child’s age and developmental needs. So, while all under-18s are entitled to protection, that protection need not be identical.
‘Likely to be accessed’ by children
The Code applies to all services that are ‘likely to be accessed’ by children, which means it covers all the services children use in practice, not just those that are designated as ‘directed to children’.
This is a significant shift from the status quo. To date, children’s online data protection has largely been defined by the Children’s Online Privacy Protection Act of 1998 (‘COPPA’) in the USA. COPPA covers only online services ‘directed to children’; general audience services such as Facebook and YouTube fall outside its scope, provided that children (under-13s) are not their primary audience and the services have no actual knowledge that children are using them. In practice, this allows services to turn a blind eye to their child users.
The Code is different. In effect, it acknowledges the reality that children make up a third of internet users worldwide and spend the vast majority of their time on services that can’t be neatly categorised as ‘directed to children’. As a result, more of the onus now falls on service providers to demonstrate that they don’t have child users, rather than on regulators to demonstrate that they do.
Automated recommendation of content
This paragraph from the Code’s section on profiling is clear, concise, and ground-breaking:
“If you are using children’s personal data to automatically recommend content to them based on their past usage/browsing history then you have a responsibility for the recommendations you make. This applies even if the content itself is user generated. In data protection terms, you have a greater responsibility in this situation than if the child were to pro-actively search out such content themselves. This is because it is your processing of the personal data that services the content to the child. Data protection law doesn’t make you responsible for third party content but it does make you responsible for the content you serve to children who use your service, based on your use of their personal data.”
This would tackle cases such as that of Molly Russell, the British teenager who took her own life in 2017 and whose family subsequently discovered that she had been repeatedly auto-recommended graphic self-harm and suicide content on social media platforms like Instagram and Pinterest.
Upholding terms, conditions, and community guidelines
The Code makes services accountable for how well they uphold their community guidelines/user behaviour rules, and establishes this as a fundamental precondition of ‘fairness’ under the GDPR. In other words, if you don’t provide children with the environment you say you will, the processing of their data can’t be ‘fair’ or ‘transparent’.
“When children provide you with their personal data in order to join or access your service they should be able to expect the service to operate in the way that you say it will, and for you to do what you say you are going to do. If this doesn’t happen then your collection of their personal data may be unfair and in breach of Article 5(1)(a).”
5Rights Foundation proposed a similar ‘regulatory backstop for community guidelines’ in its January 2019 report Towards an Internet Safety Strategy. The report suggested that the measure ‘would allow companies the freedom to set their own rules as they wish, but routine failure to adhere to their own published rules would be subject to enforcement notices and penalties.’ The Code has effectively introduced this arrangement, making it clear that companies must say what they do and do what they say, or face the consequences.
Nudge techniques
‘Do not use nudge techniques to lead or encourage children to provide unnecessary personal data or weaken or turn off their privacy protections’.
The Code’s provision on the use of ‘nudge techniques’ has been a long time coming. In 2018, 5Rights published the report Disrupted Childhood: The Cost of Persuasive Design, which exposed the use of behavioural science and psychological theory to manipulate children into giving up more of their data, including by extending their time online. In addition to our report, the Center for Humane Technology, the Norwegian Consumer Council, the Australian Competition and Consumer Commission, and even the US Senate have all taken aim at the techniques online services use to nudge child users towards decisions and outcomes that serve commercial interests over users’ interests.
Now, for the first time anywhere in the world, there is regulation that says ‘no’ to manipulating children in this way. The prohibition extends to nudge techniques that might lead children to lie about their age when signing up to online services, and the Code goes further still, recommending that nudges instead be used positively, to support children’s privacy.
Final word: a new deal between children and the tech sector
Taken together, these provisions represent a sea-change in the way children and young people are treated in the digital environment. There will inevitably be areas of the Code that require clarification, re-interpretation, and even revision. The Information Commissioner must take care that start-ups and SMEs are not disproportionately impacted, and equally that children’s access to the digital environment is not unduly restricted. These are issues that all regulation must contend with, however, and with over a year until the Code takes effect, none of them are insurmountable.