Age Appropriate Design Code turns 2: how has it affected children?
Saturday 2 September marks the second anniversary of the Age Appropriate Design Code (AADC), meaning that children in the UK have now had a legally enforceable right to a high level of privacy protection online, by design and default, for two years.
In the lead-up to its introduction in 2020, and in the months ahead of its entry into force, the AADC prompted a suite of practical changes to service design for the benefit of children.
In its second year, the Code’s influence has continued to grow, with policymakers and regulators across the world looking to adopt its underlying principles in their own legislation (including in California, which passed its own AADC in September 2022), and tech companies continuing to respond to demands that their services be made safer for children.
In the last 12 months, the Information Commissioner’s Office (ICO) has continued to build on its research and guidance on the Code to give services more clarity and support. In May, the regulator issued strengthened guidance for edtech products and services, and in February it published detailed guidance to help games developers build products that comply with the Code.
In a year that saw TikTok fined £12.7 million and the Department for Education reprimanded by the ICO for the misuse of children’s data, the impact and spread of the Code have never been timelier.
The Code in action
Since its introduction, many of the world’s largest tech companies have made changes to the design of their services in recognition of the rights and needs of children. From turning off tracking and geolocation to introducing positive nudges and better transparency, the Code has been the catalyst for important changes for children online.
While we have yet to judge any service fully compliant, a number of major changes have moved services in the right direction:
Default settings: Many services have put in place privacy settings by default for their users under 18.
For example: Instagram now automatically sets the accounts of users under the age of 16 to private during initial account set-up, and Snapchat sets all under-18 accounts to private by default, meaning they do not appear as browsable public profiles. Google turns SafeSearch on by default for all under-18s.
Transparency: A number of services have taken steps to make their community and privacy policies accessible to children – either by making them easier to understand or by making them more engaging with images and cartoons.
For example: Discord launched a Safety Center that sets out community rules and regulations in age-appropriate language and explains what actions users can take to monitor and manage their use of the service and to seek redress.
Recognition of evolving capacities and positive nudges: Some services have brought in controls to help users manage and curate their own experiences.
For example: Microsoft Edge, a web browser, has launched Kids Mode, with two settings available: one for children aged 5 to 8 and another for those aged 9 to 12. YouTube introduced a ‘new to you’ feature that encourages users to diversify the content they view – to “go beyond your usual videos”.
Empowerment tools and advertising: In February, Meta introduced measures that prevent users aged 13-17 across the world from seeing advertising based on their activity. Google has announced plans to stop advertising to all under-18s in the EU.
Services have also announced that they will introduce stronger empowerment tools. Meta will make its reporting options more easily accessible, and Snapchat will allow all users to turn off tracking-based content personalisation.
Across the world
This year, the Code’s influence has continued to spread across the globe, with policymakers considering their own regulation underpinned by many of its key tenets.
Following the adoption of the Californian AADC in September 2022, children’s data protection legislation based on the Code’s fundamental principles was considered in Maryland, Minnesota, Nevada and New Mexico and similar codes were discussed in Argentina and Indonesia.
This year also saw the introduction of the EU’s landmark Digital Services Act (DSA) – which shares many of the principles of the Code and sets a new global standard for broader risk management, content moderation and transparency by tech companies.
The DSA has already driven further commitments and changes (often at a global level), and, with the largest companies having completed and delivered their risk assessments under the law last month, more is certainly to come.
Earlier this year, at the annual day on the rights of the child in March, UN High Commissioner for Human Rights Volker Türk called on states to prioritise the full implementation of General comment No. 25 (2021) on children’s rights in relation to the digital environment and to take up best practice – including the Code.
Looking ahead, the Code seems set to continue its international spread, helping to create a global standard so that children everywhere can enjoy the digital world they deserve.
Compliance
This year, 5Rights has continued to engage proactively with services and the ICO to support understanding of, and compliance with, the Code.
We are carrying out ongoing investigations with a view to working actively with more companies on more issues in the coming year. In particular, we are looking at how apps reflect their data practices in their age ratings, how they gather and share children’s geolocation data, and how the design of some services pushes harmful material to children.
The future
In its second year, the Code has continued to demonstrate that it is possible to design a digital world that serves children’s best interests. With the Code’s influence spreading internationally, the promise of a digital world children deserve looks brighter than ever.