5Rights challenges Meta’s inaction on AI-generated CSAM
5Rights calls out Instagram’s parent company, Meta, for wilfully ignoring the spread of Child Sexual Abuse Material (CSAM) on the platform.
Last month, 5Rights sent a legal letter to Meta accusing the company of ignoring the spread of AI-generated child sexual abuse images on Instagram, despite accounts being flagged to the service by police investigators.
The letter, submitted by law firm Shillings on behalf of 5Rights, references an investigation undertaken by the UK police unit OCCIT – the Online CSEA Covert Intelligence Team. The report presents evidence of Instagram accounts with as many as 150,000 followers dedicated to sharing both real and AI-generated CSAM, including images of children removing clothes or depicted in highly sexualised poses. Concerningly, these accounts are recommended through Instagram’s in-app features and are accessible to users – both adults and children – without restriction.
Alarmingly, some of these accounts also provide links to external sites and services where CSAM can be purchased. In the most serious cases, these links lead to groups of adult men who control children and direct them to produce CSAM for sale.
In January, investigators flagged four accounts through Instagram’s reporting feature, but no action was taken by Meta until the police made a formal inquiry several months later.
It is a criminal offence in England and Wales to distribute or show an indecent pseudo-photograph of a child, including AI-generated images. Meta’s failure to remove this content breaches not only the law but also its own published terms, including the zero-tolerance policy on sexual content involving minors set out in Instagram’s Community Guidelines.
With millions of children using Instagram daily, their safety must be a priority and these widespread risks must be urgently addressed. 5Rights calls on Meta to immediately remove CSAM content, act robustly against accounts promoting it, and detail publicly how it will avoid a similar episode in the future. An investigation into Meta’s failure to routinely enforce its Terms of Service must also be launched.
Chair and founder of 5Rights, Baroness Beeban Kidron said:
“AI Child Sexual Abuse involves images of real children, it acts as a gateway drug for child sexual abuse, and it normalises abuse that has been a stain on the sector. Once again, Meta is failing children. It is clear that the consistent failure to moderate their products requires urgent and radical design changes.”
5Rights will continue to monitor Instagram, as well as any action taken to prevent the exploitation of children online.
Read the Daily Mail’s exclusive coverage of this story