
TikTok knows it is harming children

As 14 US attorneys general sue TikTok for fuelling a teen mental health crisis, internal TikTok documents reveal that the social media company promotes addictive design and targets children, in full awareness of the harms of its product and in clear disregard of online safety laws.

Two young people filming a TikTok video in front of a smartphone mounted in a ring light stand.

Internal TikTok communications, gathered through a bipartisan national investigation into the company launched in 2022, show that the company seeks to hook users, in particular children. According to TikTok’s own research, it takes only 260 videos – the equivalent of 35 minutes on the app – to form a habit, and the company concludes that “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety” as well as “interfer[ing] with essential personal responsibilities like sufficient sleep, work/school responsibilities, and connecting with loved ones.”

“TikTok’s design choices exploit the neurotransmitter dopamine, which helps humans feel pleasure as part of the brain’s reward system to encourage reinforcement,” the suit filed by the California Attorney General read. Dopamine ‘rewards’ can lead to addictive behaviour, particularly when rewards are unpredictable.

According to reporting from NPR based on internal TikTok documents leaked from the case, TikTok executives knew their proposed time-management tools would have a negligible impact, yet decided to release and heavily promote them anyway, choosing ‘increased public trust’ as the success metric rather than any meaningful reduction in screen time. Tests showed that the tool, which prompts a pause after 60 minutes, cut average teen usage from around 108.5 minutes per day to roughly 107 minutes – a reduction of about 1.5 minutes.

The States say that the internal documents show that TikTok considers users under 13 a “critical demographic” and knowingly targets them and collects their data without parental consent.

“TikTok intentionally targets children because they know kids do not yet have the defenses or capacity to create healthy boundaries around addictive content,” said California Attorney General Rob Bonta in a statement. “TikTok must be held accountable for the harms it created in taking away the time — and childhoods — of American children.”

“One TikTok executive referred to American teens as ‘the golden audience,’ and also stated ‘It’s better to have young people as an early adopter,'” Bonta said in an interview with NPR. “They deployed a suite of manipulative features that exploited young people’s psychological vulnerabilities.”

TikTok is widely used by children, with the company estimating that 95% of smartphone users under 17 use the app. Internal research shows that “across most engagement metrics, the younger the user, the better the performance.” An internal document about “younger users/U13” reveals that TikTok instructs its moderators not to take action on reports of underage users unless their account identifies them as under 13.

The leaked internal documents also suggest the company is aware that its current moderation practices are ineffective, and that children are still shown content promoting suicide and eating disorders. TikTok has acknowledged that a concerning amount of content slips through its filters, including over 30% of content ‘normalizing pedophilia’ and 100% of content ‘fetishising minors’. Indeed, a 2022 Forbes report alleged that groomers have used the platform to encourage teens to strip on camera.

This news comes against the backdrop of previous leaks suggesting that TikTok purposely demotes content from users it deems unattractive, disabled or ‘poor’ looking. The links between TikTok’s algorithm and beauty filters and their impact on youth mental health and body image are known areas of concern, and are now a key aspect of the legal suits.

“Beauty filters have been especially harmful to young girls,” New York Attorney General Letitia James wrote in a statement for the suit. “Beauty filters can cause body image issues and encourage eating disorders, body dysmorphia, and other health-related problems.”

One leaked internal report seen by NPR analysed TikTok’s main video feed and found that “a high volume of … not attractive subjects” was filling everyone’s app. In response, Kentucky investigators found, TikTok retooled its algorithm to amplify users the company viewed as beautiful. “By changing the TikTok algorithm to show fewer ‘not attractive subjects’ in the For You feed, [TikTok] took active steps to promote a narrow beauty norm even though it could negatively impact their Young Users,” the Kentucky authorities wrote.

The District of Columbia’s suit alleges that TikTok traps teens in online “filter bubbles” that expose them to the kinds of content the platform claims not to allow, including weight-loss, body-image, and self-harm content. An internal TikTok document notes that users are placed in these bubbles after just 30 minutes of continuous use.

The lawsuit also targets TikTok’s live-streaming feature, accusing it of functioning essentially as a “virtual strip club”, with the company taking 50% of the profits from the live sexual exploitation of children. An internal investigation revealed a significant number of adults messaging children to strip live, with over 1 million “gifts” sent to children in such transactions in just one month.

A year ago, more than 40 US States sued Meta on similar grounds, accusing the social media giant of designing products that are deliberately addictive and “exploit and manipulate” children.

Both companies are members of NetChoice, the industry body that is seeking to block the implementation of the Californian Age Appropriate Design Code (AADC), passed in 2022 with unanimous bipartisan support.  

While the California AADC is yet to come into force, the equivalent code has been in force in the UK since 2021, where the Information Commissioner’s Office in 2023 fined TikTok £12.7 million for exploiting the data of more than a million under-13s. The European Commission has also opened enforcement actions against TikTok under the EU’s Digital Services Act (DSA) ‘in areas linked to the protection of minors, advertising transparency, data access for researchers, as well as the risk management of addictive design and harmful content’. This is on top of other DSA enforcement actions, which argue that TikTok is not fulfilling its reporting obligations as a VLOP (Very Large Online Platform).