Driven into the Darkness by Amnesty International

TikTok’s content recommender system is doing significant harm to children, two reports by Amnesty International confirm.

In partnership with the Algorithmic Transparency Institute (ATI), Amnesty International has published two reports, Driven into the Darkness: How TikTok Encourages Self-harm and Suicidal Ideation and I Feel Exposed: Caught in TikTok’s Surveillance Web. They detail how TikTok’s business model of vast data collection, combined with its highly effective content recommender system, subjects young people to serious online harm.

Lisa Dittmer, a Researcher at Amnesty International, said:

“The findings expose TikTok’s manipulative and addictive design practices, which are designed to keep users engaged for as long as possible. They also show that the platform’s algorithmic content recommender system, credited with enabling the rapid global rise of the platform, exposes children and young adults with pre-existing mental health challenges to serious risks of harm.”

That children and young people are guided to harmful content is not a product of “bugs” but a designed and maintained feature of TikTok. If these digital platforms and services are to be safe, safety-by-design principles must be ingrained in these applications, with the rights of children at the centre.