What we can learn from Italy’s moves against TikTok

The Italian Data Protection Authority (DPA) has opened an investigation following the death of a 10-year-old girl in Palermo who took part in a “blackout challenge” on TikTok. It has imposed an immediate block on TikTok’s processing of data belonging to users whose age cannot be established with certainty. This follows ongoing investigations into TikTok’s ‘poor attention to the protection of children.’

A TikTok spokesman told the Guardian:

“The safety of the TikTok community is our absolute priority, for this motive we do not allow any content that encourages, promotes or glorifies behaviour that could be dangerous.” 

In its December notice to TikTok, the Italian DPA raised concerns about easy-to-circumvent minimum age requirements for registration, poor transparency and clarity in published terms, unspecified data storage periods and low-privacy default settings, and gave the service 30 days to respond. TikTok did not respond to the regulator’s notice, but soon after introduced a wave of changes to its service globally. These changes brought in new privacy and safety settings for users under 16, including private accounts by default, commenting restricted to ‘Friends’ only, and algorithmic profile recommendations switched ‘off’ by default.

These changes are a step in the right direction, but they will only make a difference to children’s experiences if TikTok can verify which of its users are under 16 and therefore subject to the higher privacy and safety settings. The Italian DPA believes TikTok’s age assurance process, which simply asks users to enter their date of birth on registration, is not sufficiently robust to protect children. With the introduction of more robust age assurance, TikTok can expect to see its user base drop, given that millions of its users are likely under the age of 13. This may be why TikTok has remained largely silent on matters surrounding its age assurance process.

Italy’s moves against TikTok come as the EU considers proposals under the Digital Services Act, which would require companies to provide greater transparency over their algorithms. The proposed regulation would force companies like TikTok to be more transparent about the way they use children’s data to feed algorithmic recommendation systems that promote, spread and amplify content to targeted audiences. When that content is harmful, as was the case with the “blackout challenge”, these recommendation systems can have devastating consequences.

Measures to tackle harm on social media platforms often centre on content moderation, reporting and removal, and pay little attention to the role of data collection and processing in creating risk to children. Addressing harmful content in this way, at the scale required on larger platforms, turns content moderation into a game of whack-a-mole. TikTok’s “blackout challenge” is a failure of content moderation, but it is also a failure to assure the age of users and to deliver age-appropriate services to children.

As Dr Jennifer Cobbe and Jatinder Singh describe in their paper, Regulating Recommending: Motivations, Considerations, and Principles:

“…content that by itself or when seen only by a relatively small number of people isn’t necessarily an issue, but when algorithmically combined with other, similar content or disseminated to a large audience can contribute to systemic problems. Interventions focused on the hosting of content itself miss, to a large extent, issues relating to algorithmic dissemination.” 

5Rights commends the Italian DPA on its intervention to protect children online. Companies must prioritise children’s safety and privacy in the design and operation of their services, and be held to account by regulators. Among other measures, this means addressing the algorithmic recommendation systems that disseminate harmful content to children. The “blackout challenge” is not the first harmful challenge to surface on TikTok and, sadly, it is unlikely to be the last. But by delivering age-appropriate services that build in protections for children, platforms can greatly reduce the risk of harm.