Overview
The UK, where 5Rights was founded, has pioneered digital regulation for children. It introduced the world’s first enforceable Age Appropriate Design Code in 2020, followed by the Online Safety Act in 2023, making it a key testing ground for policy innovation and implementation.
“A perfect digital world should be focused on online safety of the content. Every child should be informed about the type of content before they access it.”
William, 15
Children’s experiences
Almost all 3-17-year-olds in the UK go online, mostly to watch videos, play video games, message their friends and stay connected via social media. Nearly half of 11-year-olds who go online have a social media profile, despite the minimum age requirement of 13 on most social media sites. While watching videos, children are exposed to many advertisements, and they are encouraged to spend money while playing online games. Grooming cases and self-generated child sexual imagery are also on the rise, especially among younger children. 5Rights works hard to advocate that digital spaces likely to be accessed by children provide them with content and experiences appropriate to their age and evolving capacities.
Our work in the UK
5Rights works closely with policy makers and regulators and leads the work of the Children’s Coalition for Online Safety. We also partner with Bereaved Families for Online Safety to keep children’s online safety at the forefront of the political agenda. In partnership with the London School of Economics, 5Rights launched the Digital Futures for Children centre, dedicated to researching a rights-respecting digital world for children.
In focus
Latest
UK’s AI Opportunities Action Plan overlooks risks and potential for children
The UK Government has set out its plans for full-scale adoption of AI across the economy, making clear its intention to scale its use in education. Children must be part of the conversation on the adoption of AI into services they have no choice but to use, with consideration of both the opportunities and the risks it poses.
5Rights Foundation escalates legal action against Meta over AI-generated child sexual abuse material on Instagram
Meta continues to ignore the prevalence of Child Sexual Abuse Material (CSAM) hosted and promoted on Instagram. We have therefore urged Ofcom and the ICO to take action.
Holes not fixed: UK regulator publishes final proposals for tackling illegal harm online
Ofcom has published its final proposals for the Illegal Harms Code, and in their current iteration they will fail to make full use of the Online Safety Act to protect children online.
Win for children: UK makes safety by design a strategic priority
The UK Government has announced new statutory strategic priorities for the enforcement of the Online Safety Act, instructing Ofcom to enforce safety by design. A big step for children’s safety, announced on World Children’s Day.