
News

Keep up to date with the latest developments on data protection, children’s rights in the digital environment and more with our news articles and press releases.



Meta’s rollback on safety measures puts children at risk 

New changes announced by Meta will actively reduce existing protections for children. This is an irresponsible move – failure to implement systemic change must be challenged by regulators and policymakers worldwide as new laws and regulations come into force.


Meta continues to ignore the prevalence of Child Sexual Abuse Material (CSAM) hosted and promoted on Instagram. So, we have urged Ofcom and the ICO to take action.


US elections: bipartisan support for youth online privacy and safety must continue

As the US prepares to enter a new legislative term, 5Rights calls for continued bipartisan support to advance children’s and teens’ privacy and safety online.


Instagram is not doing enough to keep children safe

In response to Instagram’s latest safety features, 5Rights believes more should be done to ensure children are fully protected on the platform.


TikTok knows it is harming children

Internal TikTok documents reveal the company is promoting addictive design and targeting children, in full awareness of the harms of its product.

Australia: Children’s online safety measures must address systemic harms

Bold new proposals from the Australian government to ban under 16s from social media speak to the abject failure of tech companies to provide age-appropriate services.


Meta announces new changes for under 16s based on 5Rights principles 

In line with the requirements of the Age Appropriate Design Code, Meta’s new privacy settings for teen accounts on Instagram are a sign of promise, but more work is needed.


Supporting families globally: our work with The Parents’ Network

5Rights is partnering with Archewell’s Parents’ Network to work with families of children severely impacted by online harms to call for online spaces to be designed with the needs and rights of children in mind.


DSA turns 1: more potential for advancing children’s rights

Marking one year since the DSA’s enforcement for VLOPs, we look at the progress made by the European Commission and outline the need for strong guidelines and enforcement to protect child rights online.


5Rights challenges Meta’s inaction on AI-generated CSAM

Despite clear evidence from UK Police indicating the presence of Instagram accounts with over 150,000 followers sharing real and AI-generated Child Sexual Abuse Material (CSAM), Meta has failed to take decisive action. We have issued a legal letter demanding urgent intervention.


5Rights calls out Meta for ignoring AI-generated child sexual abuse material on Instagram

Child safety charity 5Rights has sent a legal letter to Meta detailing how it has ignored reports of illegal child sexual abuse material (CSAM) on its platform, Instagram.


U.S. Surgeon General calls for action on young people’s mental health crisis

The U.S. Surgeon General is calling for warning labels on social media to alert users that the services are “associated with significant mental health harms for adolescents”.