IEEE analysis suggests a growing global trend toward age-appropriate design
Following international progress in 2025, there is a growing global consensus that digital systems must be designed with children’s rights and developmental needs in mind.
Five years of General comment No. 25: From promises to progress
General comment No. 25 gave the world a roadmap to realise children’s rights in the digital world. Five years on, the progress is real, but leaders must act to hold tech companies systematically accountable.
UK Government takes aim at manipulative digital design practices
The UK Government has announced new measures to strengthen online protections for children and young people, with a clear focus on tackling addictive and harmful digital design practices.
5Rights Youth Ambassadors give evidence to UK Parliamentarians on AI
5Rights Youth Ambassadors Eashaa and Niranjana represented 5Rights at the UK Parliament this week, giving evidence to an inquiry of the All-Party Parliamentary Group (APPG) for Online Safety examining the impact of artificial intelligence on children.
Social media–style design is already in the classroom, new research finds
As Parliament debates banning children from social media, new research reveals that many of the same harmful design features are already embedded in the technology children use every day at school, raising concerns for children’s privacy, wellbeing and exposure to commercial exploitation in the classroom.
TikTok’s addictive design found to likely breach the Digital Services Act
5Rights Foundation has been advocating for swift and robust enforcement of the DSA to protect minors online since its entry into force. The preliminary finding on TikTok’s addictive design is a long-awaited step towards enforcing European rules and finally delivering for children’s safety online.
From Bangkok to Bogotá, a safer internet requires tech accountability
On Safer Internet Day, experience from around the world makes one thing clear: a safer digital world for children will not come about by chance.
Next steps for online safety in the UK: 5Rights sets the criteria for legislative action
In the coming weeks, Parliament will debate proposals on social media bans for under‑16s – a critical moment to strengthen the UK’s online safety framework. As children, parents, and civil society call for more ambitious protections, 5Rights Foundation is setting out clear criteria to ensure regulation delivers real‑world change.
UK Information Commissioner issues first financial penalty under the Children’s Code
The UK’s Information Commissioner’s Office (ICO) has fined MediaLab, owner of image-sharing platform Imgur, £247,590 for misusing children’s data, in a long-overdue enforcement action under the Age Appropriate Design Code.
UK government declines to introduce EdTech standards as classroom tech expands unchecked
The government’s refusal to introduce enforceable standards for educational technology leaves children’s rights secondary to commercial interests, as new research reveals a mixed impact of AI and EdTech in schools.
Access limitations must be part of age-appropriate design and must effectively restrict companies from exploiting children
As legislators debate social media bans, 5Rights calls for access restrictions that deliver for children, arguing for tech-neutral measures that enforce existing restrictions on personalised services for under-13s and tiered default age-gating of risky features for teenagers.
Grok AI fails child safety: companies must build safely or face consequences
The discovery of child sexual abuse material on X’s Grok AI is the latest example of a pattern 5Rights has warned about for years. The tools to protect children exist. What’s missing is robust enforcement and the political will to hold companies accountable.
