TikTok’s addictive design preliminarily found in breach of the Digital Services Act: a positive step towards protection of minors on online platforms

5Rights Foundation has been advocating for swift and robust enforcement of the DSA to protect minors online since the regulation's entry into force. The preliminary finding on TikTok's addictive design is a long-awaited step towards enforcing European rules and finally delivering on children's safety online.

For too long, the digital environment has been structurally misaligned with the lived realities of children and designed to maximise their engagement rather than safeguard their wellbeing. Evidence shows that TikTok is consciously designed to be addictive: endless scrolling, short and hyper-stimulating content curated by an algorithm that reacts to the slightest sign of boredom, cleverly timed notifications and reminders…

5Rights’ Disrupted Childhood report notably provides compelling evidence that persuasive, attention-driven design strategies are not incidental but are core to how platforms capture and hold children’s attention, often to the detriment of their mental health, autonomy and social development. 

Internal documents disclosed through litigation in the United States further reinforce these concerns. In a lawsuit brought by US states, previously redacted internal materials reportedly show that TikTok was aware of the addictive nature of its design and the risks posed to teenage users.

Leaked internal communications have also revealed how valuable children are considered commercially. A senior TikTok executive referred to teens as ‘the golden audience’, underscoring the commercial incentives underpinning design choices that maximise engagement rather than minimise harm.

To keep young users engaged, these design features can constantly expose them to harmful content that undermines their development and rights. The result is that some children are served streams of content promoting sexism, racism, body dysmorphia, self-harm, porn or radicalism, and are led down algorithmic rabbit holes that they sometimes cannot escape even if they want to.

In our Pathways report, we demonstrated that it takes just one click on TikTok to go from a search for “slime” to porn. A search for “trampolining” can lead a child to pro-anorexia content in just 3 clicks, and to self-harm content in 15. A new child account can be recommended suicide content in less than 3 minutes. The algorithm is also extremely hard to reset: children experiencing eating disorders say they cannot escape the content. A child who has once engaged with such content will be served 12 times more of it than a user who has not, with the algorithm serving it up every 39 seconds. Nearly 1 in 3 TikTok users felt bad about their bodies at least weekly.

The Digital Services Act (DSA) provides the legal tools to act against the risks and harms that children and young people face online. 5Rights has been calling for strong and robust enforcement of the rules to ensure the safety, privacy and security of all children on online platforms.

With the publication of its preliminary findings in the TikTok investigation, the European Commission reflects these concerns, recognising that features such as infinite scrolling, autoplay and highly personalised recommendation systems can fuel compulsive use, blur the boundaries between online and offline childhood, and magnify the risks we identified in our research.

We welcome the Commission’s willingness to finally hold major platforms to account under EU law and to foreground the best interests and rights of children in digital policy and enforcement. This is a critical inflection point: addressing design harms at their source is essential if children’s online experiences are to be safe, empowering and healthy. We are now calling for a strong and swift decision on this file and on the other ongoing investigations under the DSA rules on the protection of minors.