
News

Keep up to date with the latest developments in data protection, children’s rights in the digital environment and more with our news articles and press releases.



Instagram is not doing enough to keep children safe

In response to Instagram’s latest safety features, 5Rights believes more should be done to ensure children are fully protected on the platform.


Gaming platform Roblox unsafe for children

5Rights urges regulatory action after new research exposes Roblox as unsafe for children, highlighting the platform’s failure to protect young users from predators and inappropriate content despite safety claims.


5Rights challenges Meta’s inaction on AI-generated CSAM

Despite clear evidence from UK Police indicating the presence of Instagram accounts with over 150,000 followers sharing real and AI-generated Child Sexual Abuse Material (CSAM), Meta has failed to take decisive action. We have issued a legal letter demanding urgent intervention.


5Rights calls out Meta for ignoring AI-generated child sexual abuse material on Instagram

Child safety charity 5Rights has sent a legal letter to Meta detailing how it has ignored reports of illegal child sexual abuse material (CSAM) on its platform, Instagram.


End-to-End Encryption and Child Sexual Abuse Material (CSAM)

The global distribution of child sexual abuse material (CSAM) is growing exponentially. In 2018, the CyberTipline at the National Center for Missing and Exploited Children received over 18 million reports of apparent child sexual abuse,…


5Rights attends WeProtect Summit to tackle online child sexual exploitation

In December, the Global Partnership to End Violence Against Children convened the WePROTECT Global Alliance Summit to Tackle Online Child Sexual Exploitation. The Summit was co-hosted by the UK Government, the WePROTECT Global Alliance and the African Union.