UK Government takes aim at manipulative digital design practices
The UK Government has announced new measures to strengthen online protections for children and young people, with a clear focus on tackling addictive and harmful digital design practices.
5Rights Youth Ambassadors give evidence to UK Parliamentarians on AI
5Rights Youth Ambassadors Eashaa and Niranjana represented 5Rights at the UK Parliament this week, giving evidence to an inquiry of the All-Party Parliamentary Group (APPG) for Online Safety examining the impact of artificial intelligence on children.
Social media–style design is already in the classroom, new research finds
As Parliament debates banning children from social media, new research reveals that many of the same harmful design features are already embedded in the technology children use every day at school, raising concerns about children’s privacy, wellbeing and exposure to commercial exploitation in the classroom.
TikTok’s addictive design preliminarily found in breach of the Digital Services Act: a positive step towards the protection of minors on online platforms
5Rights Foundation has been advocating for swift and robust enforcement of the DSA to protect minors online since its entry into force. The preliminary finding on TikTok’s addictive design is a long-awaited step to enforce European rules and finally deliver for children’s safety online.
Next steps for online safety in the UK: 5Rights sets the criteria for legislative action
In the coming weeks, Parliament will debate proposals on social media bans for under‑16s – a critical moment to strengthen the UK’s online safety framework. As children, parents, and civil society call for more ambitious protections, 5Rights Foundation is setting out clear criteria to ensure regulation delivers real‑world change.
UK Information Commissioner issues first financial penalty under the Children’s Code
The UK’s Information Commissioner’s Office (ICO) has fined MediaLab, owner of image-sharing platform Imgur, £247,590 for misusing children’s data, in a long-overdue enforcement action under the Age Appropriate Design Code.
UK government declines to introduce EdTech standards as classroom tech expands unchecked
The government’s refusal to introduce enforceable standards for educational technology leaves children’s rights secondary to commercial interests, as new research reveals the mixed impact of AI and EdTech in schools.
Access limitations must be part of age-appropriate design, and effectively restrict companies from exploiting children
As legislators debate social media bans, 5Rights calls for access restrictions that deliver for children, arguing for tech-neutral measures that enforce existing restrictions on personalised services for under-13s and tiered default age-gating of risky features for teenagers.
Grok AI fails child safety: companies must build safely or face consequences
The discovery of child sexual abuse material on X’s Grok AI is the latest example of a pattern 5Rights has warned about for years. The tools to protect children exist. What’s missing is robust enforcement and the political will to hold companies accountable.
New ISO/IEC standard provides framework for privacy-preserving age assurance
5Rights contributed to the development of the new ISO/IEC 27566-1 standard, a new international framework for privacy-preserving age assurance systems.
Children are not test subjects: Joint Statement reaffirms children’s rights in the AI era
UN agencies and international organisations have come together to stress that states and tech companies must protect and respect children’s rights in the context of AI by design and by default.
The Age Appropriate Design Code can protect children from AI harms – if properly enforced
As the UK Government looks to close gaps around AI, 5Rights argues that stronger enforcement of the ICO’s Age Appropriate Design Code is critical to protect children from AI-related harms.
