Friday 2nd September 2022 marks the first anniversary of the Age Appropriate Design Code coming into force. The Code is the first statutory code of practice for children’s data anywhere in the world. It transforms the way companies collect, share and use children’s data, by requiring them to offer children a high level of privacy protection by default. The protections the Code offers are a significant and welcome change to how children are supported in the digital age.
Over the last 12 months, 5Rights has recorded the changes online services have made to comply with the regulation. To mark the anniversary, 5Rights is taking a bird’s eye view of the Code in its first year: has it achieved what it was intended to, and how has children’s experience of going online changed as a result of the Code’s introduction?
The Age Appropriate Design Code has had a transformative impact in two ways: first, in the design changes it has inspired across services children use for play, news, education, to express their creativity and to conduct their friendships; and second, in marking a decisive shift in the culture of digital services, ensuring children’s rights – and the concept of childhood itself – are respected online.
Underlying many of the ills the Code was introduced to address is the drive for ever greater ‘engagement’: designing features, products and services that maximise time spent on a service, draw in as many people as possible and encourage as much interaction as possible, all of which, in a circular fashion, both allows for and incentivises the collection of more personal data.
Many services have made changes so that children’s accounts are set to the highest privacy settings, by design and default. To highlight just a few of the most eye-catching changes:
Adults can no longer direct message young people who do not follow them on Instagram.
YouTube’s default upload settings have been changed to the most private setting for under 18s.
Greater user control over their online experience and positive nudges
Alongside default settings, many services have unveiled further controls to help users manage and curate their own experience:
Twitter is expanding its Safety Mode, which blocks accounts that send abusive messages from following the recipient for seven days. Twitter will now proactively scan Tweets for abusive content and positively nudge recipients to turn on Safety Mode.
Among Us now grants users the opportunity to manage data collection at sign-up and in-game, including the option to turn off a personalised game experience and opt out of data collection.
Some services have also taken innovative approaches to updating their terms of service, privacy policies and community guidelines to make them more suitable for children. For example:
King, a prominent game publisher with 16 games listed on the Apple and Google app stores, has produced a gamified version of its privacy policies to educate its users on its data practices.
Spotify has for the first time published its platform rules, setting out which content is prohibited and the consequences of uploading such content, with signposting for users on how to report any potential violations.
Changes to data practices
Services are also changing their data practices to collect, process and share less data on their child users. One notable example is the image sharing site Pinterest, which has made a number of changes since the introduction of the Age Appropriate Design Code. Where previously the service would nudge child users to turn on push notifications and daily personalised recommendations, Pinterest now has notifications switched off by default and does not pressure users to switch them on. In fact, Pinterest acknowledges the role of the Code in a Help Centre notice on how it uses data from other sites, from partners and from Pinterest activity, and on measuring ad effectiveness, stating that children in the UK will not be shown ads and that their data will not be shared or used outside the service.
Shift from parental controls
As seen above, many of the positive changes have focused on default settings, marking a shift from the pre-Code approach of large services relying on parental controls as the predominant form of child safeguard. Where parental controls have been announced since the adoption of the Code, we have seen examples of services requiring teens to agree to parental monitoring before it can be activated. For instance:
Family Centre allows parents to see who their child is friends with on Snapchat. Parents can also see who their child has communicated with over the last seven days (but not the content of those conversations). Children will have to agree before their parent can begin monitoring.
Meta will create a Parent Dashboard with supervision tools for their Quest virtual reality headsets, but these tools will only be available if the child consents to their use.
Recognition of evolving capacities
Many of the positive changes have included a tacit recognition of ‘evolving capacities’; even as the age of digital adulthood has shifted from 13 to 18, with services rolling out safeguards for all under 18s, very often they adapt those safeguards for different age groups. For instance:
Google has developed engaging and easy-to-understand materials that help children and their parents understand Google’s data practices in an age-appropriate way, with versions for 6–8-year-olds, 9–12-year-olds and 13–17-year-olds.
TikTok has created two age bands, 13–15 and 16–17, for many of its safety features, granting older teens greater scope to manage their experiences on the app.
Impact on regulation around the world
Lawmakers around the world are also taking inspiration from the Code, with politicians and campaigners in the United States, the EU, Australia and Canada – among others – now seeking to introduce similar provisions in their own countries. Often, as in the California Age Appropriate Design Code, passed this week, this is reflected in the adoption of the Code’s fundamental tenets, such as defining a child as anyone under 18 and applying regulation to all services children are likely to access in practice, rather than merely those directed or targeted at them. Given that COPPA, which sets the age of digital adulthood at 13 and applies only to services directed at children below this age, has been the defining legislation on children’s data protection for almost 25 years, the Code’s success in shifting the paradigm on these issues in only 12 months is remarkable.
The future of the Code
As well as celebrating its successes, 5Rights continues to work behind the scenes to identify gaps in compliance and to spot systemic breaches of the Code’s standards. In October 2021, we sent a letter to the Information Commissioner’s Office detailing 12 widespread and systemic breaches across the sector, including inadequate age assurance, failure to enforce community standards and the continuing use of dark patterns and nudge techniques. We will continue to monitor potential breaches in the hope that our research will assist the ICO in any compliance action it initiates, to ensure that the Code really does mark a transformation in children’s experience of the digital world.
Perhaps the biggest impact the Code has had is in showing that these changes are in fact possible. For years we heard that the online world was just fundamentally different, and that trying to introduce changes like this would bring the whole thing crashing down. But the Code has shown that the digital world can not only be changed but improved – and that the boundless creativity of the tech sector can be placed at the service of children and their best interests, rather than of ever more ‘engagement’.