The global distribution of child sexual abuse material (CSAM) is growing exponentially. In 2018, the CyberTipline at the National Center for Missing and Exploited Children received over 18 million reports of apparent child sexual abuse, equivalent to 50,000 reports a day.
Recently, however, a number of companies have announced plans to implement end-to-end encryption across their messaging services and search engines. This could render the current technologies used to flag CSAM or detect grooming ineffective, a boon for child sex offenders all over the world. As children's privacy campaigners, we welcome moves to strengthen privacy online, but we reject suggestions that there is a binary choice between maintaining privacy and protecting children, particularly from child sexual exploitation and abuse.
5Rights Foundation, working closely with Professor Hany Farid, co-developer of the CSAM-detection tool PhotoDNA, has been advocating for privacy-preserving solutions for the protection of children in encrypted environments. We have called on both the internet sector and governments around the world to ensure that end-to-end encryption is implemented only if it maintains or improves existing protections for children, and that neither commercial considerations nor other priorities are put ahead of the protection of children.