New UK data law: what does it mean for children’s privacy?

The new UK data law introduces significant changes in children’s data protection, including new codes of practice for EdTech, AI and automated decision-making, following years of advocacy by 5Rights. 

The UK Government has passed the new Data (Use and Access) Act 2025, establishing changes to the UK’s data protection framework. The Act introduces new codes of practice for data processing in education technology (EdTech) and in AI and automated decision-making (ADM), alongside updates to the UK GDPR provisions underpinning the Age Appropriate Design Code (AADC).

These changes reflect the sustained leadership of 5Rights, Baroness Beeban Kidron, the Digital Futures for Children Centre (DFC) and broader civil society, whose research, legislative engagement and advocacy shaped the Act’s provisions.

Children’s right to privacy by design and default  

A significant win for children’s privacy came in the form of a key amendment to Article 25 of the UK GDPR, secured by 5Rights Honorary President, Baroness Beeban Kidron. This amendment requires services to account for the higher standard of protection children are entitled to when designing and carrying out data processing. It also reinforces that services must consider children’s differing needs at various ages and stages of development, a principle long championed by 5Rights.

New code of practice for data use in EdTech 

The House of Lords also secured a Government commitment to introduce a new code of practice to address the high risks of processing children’s data in EdTech products and services. This is a welcome development amid growing concerns over how children’s data is handled in digital learning environments. 

Research by the DFC, the joint LSE and 5Rights research centre, uncovered widespread non-compliance with data protection law among EdTech companies, with children routinely exposed to data profiling and tracking beyond the school gates.

Supporting research further illustrated the issue, revealing that a child accessing teaching resources on Vimeo via Google Classroom was tracked by 92 third-party services – including Google, TikTok, Facebook and Amazon. Many children also reported feeling unhappy with their data being collected in this way. 

New code on AI and automated decision-making  

After peers, 5Rights and others in civil society raised the alarm over these systems, the Government committed to introducing a new code of practice on AI and automated decision-making. Government minister Lord Vallance of Balham confirmed that the code will include guidance about protecting data subjects, including children. 

Strengthening investigations into children lost to online harms  

The Data (Use and Access) Act also strengthens a provision originally secured in the Online Safety Act through a campaign led by the Bereaved Families for Online Safety and supported by 5Rights. That provision gave coroners powers to request data from tech companies in investigations where the online world is thought to have played a part in a child’s death. The new law goes further, requiring tech companies to retain this data and introducing penalties for failure to comply.

Updating the Age Appropriate Design Code amid GDPR changes 

The Act also introduces changes to the UK GDPR. These include broadening the definition of scientific research (Article 4) and the circumstances in which data can be further processed (Article 5), introducing new recognised legitimate interests (Article 6) that permit processing without a balancing test, and amending rights related to automated decision-making, including profiling (Article 22). These revisions risk undermining the safeguards children currently enjoy under the Age Appropriate Design Code.

The Information Commissioner’s Office (ICO) will now review the Age Appropriate Design Code to assess how these changes, alongside the new codes of practice, may affect its scope.  

Since the AADC is recognised internationally as the gold standard in children’s data protection, 5Rights has warned that the Government and the regulator must send a strong signal that this level of protection will not be diluted.

5Rights remains committed to supporting the ICO as it reviews the AADC and produces new codes of practice, ensuring the UK maintains and builds upon its global leadership in children’s data protection. We will continue to press for robust enforcement of these rules so that children receive the highest level of protection they are entitled to.