Instagram is not doing enough to keep children safe
In response to Instagram’s latest safety features, 5Rights believes more should be done to ensure children are fully protected on the platform.
Gaming platform Roblox unsafe for children
5Rights urges regulatory action after new research exposes Roblox as unsafe for children, highlighting the platform’s failure to protect young users from predators and inappropriate content despite safety claims.
5Rights challenges Meta’s inaction on AI-generated CSAM
Despite clear evidence from UK police of Instagram accounts with over 150,000 followers sharing real and AI-generated Child Sexual Abuse Material (CSAM), Meta has failed to take decisive action. We have issued a legal letter demanding urgent intervention.
5Rights calls out Meta for ignoring AI-generated child sexual abuse material on Instagram
Child safety charity 5Rights has sent a legal letter to Meta detailing how it has ignored reports of illegal child sexual abuse material (CSAM) on its platform, Instagram.
End-to-End Encryption and Child Sexual Abuse Material (CSAM)
The global distribution of child sexual abuse material (CSAM) is growing exponentially. In 2018, the CyberTipline at the National Center for Missing and Exploited Children received over 18 million reports of apparent child sexual abuse…
5Rights attends WeProtect Summit to tackle online child sexual exploitation
In December, the Global Partnership to End Violence Against Children convened the WePROTECT Global Alliance Summit to Tackle Online Child Sexual Exploitation. The Summit was co-hosted by the UK Government, the WePROTECT Global Alliance and the African Union.