5Rights and the LSE’s Digital Futures for Children Centre drive transformation in EdTech to protect children’s rights
New project drives urgent national conversation on whether technology used in the classroom is meeting children’s right to education.
5Rights, in partnership with the Digital Futures for Children Centre (DFC), its joint research centre with the LSE, has launched Better EdTech Futures for Children, a project exploring how technology affects children and their rights in the classroom – particularly their rights to education, safety and privacy.
Education technology (EdTech) – the products and services schools use to support teaching, learning and the day-to-day management of children’s education – remains largely unregulated in the UK beyond data protection laws, which research shows are poorly followed. Research by the Department for Education found that 99% of school leaders and 98% of teachers report using EdTech products in school.
Building on the DFC’s research into the data practices of EdTech products, we will examine the current evidence for technology use in schools, consult directly with children, teachers and parents, and hold expert roundtables to understand if, when and how these products should be used.
Significant gaps in protecting children’s rights in the classroom
In Parliament yesterday, members of the House of Lords raised findings from the project’s first research output with Baroness Smith of Malvern, the Minister for Education. The debate focused on amending the Children’s Wellbeing and Schools Bill to create a framework of standards for the use of EdTech products in schools in England.
Key findings demonstrate how EdTech products currently fail to protect children’s rights:
- Children’s data exposed: Despite its claims about safety and privacy, the AI-powered personalisation app MagicSchool AI enables tracking cookies for children as young as 12. This exposes children to tracking from commercial websites, including erotic and friend-finder sites. Grammarly allows companies such as Google, Meta and Microsoft to use children’s education data for advertising purposes.
- Inaccurate AI chatbots exploit children’s vulnerabilities: AI chatbots used for classroom learning have been found to sometimes confuse fictional characters with real figures and provide students with inconsistent information. AI chatbot design can also trigger unhealthy emotional dependency, with some children reporting severe mental health struggles.
- Vulnerable children abandoned when seeking help: Researchers found that children in the UK reporting bullying or suicidal thoughts to the MagicSchool AI chatbot can be given US emergency helpline numbers instead of UK resources. Further testing revealed that the system refused to engage with children seeking help until they had explicitly mentioned suicide multiple times.
The full report is available here.
Driving evidence-based change in EdTech
Over the coming months, 5Rights and the DFC will work together to examine how EdTech services and products used in schools impact children’s rights:
- Centring children’s views on EdTech: Children are rarely consulted about the EdTech products they must use in school. We will hold consultations in schools and work with our newly established EdTech Youth Advisory Board to surface their views on the products they use – and what they want to change.
- Identifying how artificial intelligence in EdTech impacts children: The project’s first research output, A child rights audit of GenAI in EdTech, has unearthed how AI tools already used in UK schools actively undermine children’s rights to privacy, safety and education.
- Building expert consensus: We will bring together specialists in children’s rights, data protection and education to discuss the key issues surrounding the use of EdTech in the classroom.
With children across the UK required to use unregulated EdTech products daily in their classrooms, this project addresses a critical gap in protecting their rights to education, safety and privacy.