A child rights audit of GenAI in EdTech: Learning from five UK case studies
This report advances research by the Digital Futures for Children centre (DFC) and 5Rights on ‘A better EdTech future for children’ and builds on earlier DFC research on EdTech and education data.
“Across all GenAI tools we studied, children’s perspectives were largely excluded from their design, governance and evaluation, and all tools undermine children’s rights to privacy and protection from commercial exploitation.” (Ayça Atabey)
Executive summary
Generative artificial intelligence (GenAI) tools are increasingly embedded in the digital services and products used in education (EdTech), raising urgent questions about their impact on children’s learning and rights. We take a holistic child rights approach to evaluate five GenAI tools used in education: Character.AI, Grammarly, MagicSchool AI, Microsoft Copilot and Mind’s Eye.
Using mixed sociolegal methods, including product walkthroughs, policy analysis and consultations with children, educators and experts around the world, we evaluate how these tools operate and assess the claims made for them. These assessments are conducted in the light of the United Nations Convention on the Rights of the Child (UNCRC) and the Committee on the Rights of the Child’s General comment No. 25 (2021) on children’s rights in relation to the digital environment.
Our primary focus is on how these tools uphold key rights under the UNCRC: the rights to education (Article 28), privacy (Article 16), non-discrimination (Article 2), appropriate support for children with disabilities (Article 23), access to information (Article 17) and freedom of expression (Article 13); the right to be heard and have their views respected (Article 12); and the principle of the best interests of the child (Article 3.1).
While each GenAI tool offers the potential to facilitate learning, for example by supporting creativity, communication and accessibility, each also presents notable risks. These risks stem from opaque data practices, poor transparency and commercial exploitation through nudges, advertising and tracking, including advertising from age-inappropriate adult websites, all of which are incompatible with children’s best interests. Overall, many claimed benefits remain unverified, and the growing, often by-default integration of GenAI reflects institutional or market priorities more than children’s needs and interests.
Across the five tools studied, children’s perspectives were largely excluded from their design, governance and evaluation. The case studies reveal that these tools undermine children’s rights to privacy and protection from commercial exploitation. The tools may also support rights such as education, play, expression and access to information, potentially enhancing children’s learning, but evidence for these benefits is limited, particularly for diverse groups of children, younger children and children with disabilities.
Read more on the Digital Futures for Children website.