Overview
The UK, where 5Rights was founded, has pioneered digital regulation for children. It introduced the world’s first enforceable Age Appropriate Design Code in 2020, followed by the Online Safety Act in 2023, making it a key testing ground for policy innovation and implementation.
“A perfect digital world should be focused on online safety of the content. Every child should be informed about the type of content before they access it.”
William, 15
Children’s experiences
Almost all 3–17-year-olds go online in the UK, mostly to watch videos, play video games, message their friends and stay connected via social media. Nearly half of 11-year-olds who go online have a social media profile, despite a minimum age requirement of 13 on most social media sites. While watching videos, children are exposed to many advertisements, and they are encouraged to spend money while playing online games. Grooming cases and self-generated child sexual imagery are also on the rise, especially among younger children. 5Rights works hard to advocate that digital spaces likely to be accessed by children provide content and experiences appropriate to their age and evolving capacities.
Our work in the UK
5Rights works closely with policymakers and regulators and leads the work of the Children’s Coalition for Online Safety. We also partner with Bereaved Families for Online Safety to keep children’s online safety at the forefront of the political agenda. In partnership with the London School of Economics, 5Rights launched the Digital Futures for Children Centre, dedicated to researching a rights-respecting digital world for children.
As part of our joint work with the Digital Futures for Children Centre, we are launching the Better EdTech Futures for Children project, which brings together young people across the UK to explore how technology and AI are shaping the classroom and to advocate for a more rights-respecting digital learning environment.
Latest
UK Information Commissioner issues first financial penalty under the Children’s Code
The UK’s Information Commissioner’s Office (ICO) has fined MediaLab, owner of image-sharing platform Imgur, £247,590 for misusing children’s data, in a long-overdue enforcement action under the Age Appropriate Design Code.
UK government declines to introduce EdTech standards as classroom tech expands unchecked
The government’s refusal to introduce enforceable standards for educational technology leaves children’s rights secondary to commercial interests, as new research reveals the mixed impact of AI and EdTech in schools.
The Age Appropriate Design Code can protect children from AI harms – if properly enforced
As the UK Government looks to close gaps around AI, 5Rights argues that stronger enforcement of the ICO’s Age Appropriate Design Code is critical to protect children from AI-related harms.
The UK’s Online Safety Act turns two: what’s changed for children?
As the landmark legislation reaches a milestone, 5Rights examines progress, challenges and what comes next for children’s rights online.
