Children & AI Design Code
The Code sets out a process to identify, evaluate, and mitigate the known risks of AI to children and to prepare for the known unknowns. It requires those who build and deploy AI systems to consider foreseeable risks to children by design and by default.
Despite children making up 30% of the global population and being early adopters of technology, their needs, rights, and views are not represented in the public and policy debate on AI. While much has been said or suggested about oversight of AI, little of it has focused on children, and nothing practical has been done to ensure that children’s rights and developmental needs are met.
What you can find in this Code:
- Part One provides the context to the Code.
- Part Two provides advice and guidance on key considerations that are relevant at all stages of the Code.
- Part Three sets out the criteria that an AI system impacting children must meet.
- Part Four describes potential risks to children.
- Part Five is the Code itself. It includes a checklist of key actions and guidance at each stage of the lifecycle of an AI system.
- Part Six provides further information, including key definitions and concepts, as well as stages of child and adolescent development and snapshot case studies to illustrate how the criteria might apply.
The Children & AI Design Code — developed by 5Rights in collaboration with leading experts in AI, child rights, law, and policy — provides a practical, actionable framework that spans the entire lifecycle of an AI system, ensuring children’s needs are prioritised by design and by default.