The Age Appropriate Design Code can protect children from AI harms – if properly enforced 

As the UK Government looks to close gaps around AI, 5Rights argues that stronger enforcement of the ICO’s Age Appropriate Design Code is critical to protect children from AI-related harms. 

AI is rapidly becoming embedded in children’s lives – from the AI built into the tech they use at school, to the chatbots they talk to at home, to the recommender systems and in-game agents they engage with as they play and socialise – raising urgent questions about how to keep children safe as they use these technologies.

Recent reports that the AI tool Grok was used to generate child sexual abuse material illustrate the real-world harms that arise when companies fail to embed safety-by-design principles in their products.

Data protection is a key component of children’s online safety. Without it, AI services can profile and manipulate children, encourage emotional dependence and misuse their data – all risks set out in 5Rights’ Children & AI Design Code.

As the UK Government looks to close regulatory gaps surrounding the application of AI, it must recognise that data protection has a critical and preventative role to play in protecting children from AI-related harms. 

The ICO created the Age Appropriate Design Code (AADC) to provide a holistic data protection framework to protect children online. Applied to AI systems, as demonstrated in our Code, the AADC can ensure companies minimise data collection, stop profiling and manipulating children, set protective default settings, and design systems in ways children can understand.  

Current regulation can protect children’s privacy and safety, but only when companies apply all 15 standards set out in the AADC. Yet no service has achieved full compliance. Last month, the ICO published its 2024–25 strategy update for the Age Appropriate Design Code, revealing that while some improvements have been made, such as changes to default privacy settings, no service has yet been confirmed as fully compliant. The update shows what proper regulatory attention can achieve, but the ICO’s scattered enforcement has left gaps in protection for children in AI-driven online spaces.

As the UK Government looks to close regulatory gaps around AI, it must embed strong data protection in regulation to ensure that children are genuinely kept safe and that companies are held accountable.

Equally, the ICO must deliver a clear plan for holistic enforcement of all 15 AADC standards, addressing the grave risks and challenges children face in AI-driven online environments. If it lets companies get away with following one or two standards, children’s online safety in AI-driven environments will remain compromised and the promise of the AADC will go unfulfilled.