Instagram is not doing enough to keep children safe
Weeks after Meta announced the launch of “teen accounts”, Instagram announced on 17 October the rollout of new measures to help protect teens from sextortion on the platform. While this is a positive step, more needs to be done to keep children safe on the social media platform.
5Rights welcomes the fact that Instagram finally recognises sextortion involving children as a massive problem on its platform, and that platform design plays a key role in encouraging or preventing this scourge.
The new safety features announced include preventing users from directly screenshotting or screen recording ephemeral images or videos sent in private messages. However, these measures point to massive loopholes in the “teen accounts” launched just a few weeks ago.
“When it comes to action, much more needs to be done. It is positive that Meta is limiting the access users exhibiting scammy behaviour have to children’s information, but the question remains why this information is not private by default.”
Leanda Barrington-Leach, 5Rights’ Executive Director.
As for preventing the screenshotting of ephemeral content, the proposed cure may be worse than the disease: the block can be easily bypassed by anyone intent on sextortion. It is also likely to further embolden those who send unwanted sexual content to children, while denying victims of harassment the proof needed to seek redress.
According to evidence already submitted to the UK Parliament, 90% of girls (and 50% of boys) say they are frequently or sometimes sent explicit photos or videos. More than half of the girls who were sent unwanted sexual images said they did not report the incident.
The normalisation of sexualised content and behaviour online is contributing to rising sexual risks to children, with companies failing to address both the root causes and the escalating harms. In July, 5Rights sent a legal letter to Instagram’s parent company, Meta, detailing how it has ignored police reports of illegal child sexual abuse material (CSAM) on the platform.
We strongly encourage Instagram to share its risk assessment regarding sexual harms and ephemeral content sharing for children on its platform, as well as the research demonstrating the effectiveness of this proposed mitigation strategy.