Enforcing the DSA for children: a moral imperative and a strategic EU move

Last Saturday, the Digital Services Act (DSA) came into full force across the EU, empowering the European Commission and national authorities to enforce, among other provisions, a ground-breaking requirement that all online platforms ensure a high level of privacy, safety and security for children. Whether platforms deliver, and how quickly, will be a critical test of the new law's value and of the EU's claim to global leadership in digital regulation.

The European Commission has already said that implementing and enforcing the DSA for children is a priority. This makes sense given the act's rights-focused objectives, the massive impact the current lack of regulation is having on children's safety, well-being and development, and the uniquely robust existing regulatory framework for children's rights online on which DSA implementation can draw.

Prioritising children is a moral imperative. One in three internet users is a child, and children are the most vulnerable users, yet they are systematically ignored and exposed to risk and harm by system design that prioritises reach and engagement over privacy and safety.

As 5Rights Foundation research has shown, children's profiles are routinely recommended to strangers, who may also access their location, contact them directly or even see into their bedrooms; children are exposed to never-ending streams of content pushed by recommender systems for its shock and engagement value, regardless of its suitability, including violence, pornography, and material promoting eating disorders and self-harm; and children are constantly pestered by notifications and lured back from real life with dubious promises of rewards or popularity. The list, unfortunately, goes on and on.

All the while, companies are gathering, processing, analysing and selling troves of children's data. As children suffer, tech companies are thriving. Indeed, it was telling that the very day after Meta CEO Mark Zuckerberg apologised in the US Congress to the parents of children who had died from harms caused by the company's products, Meta announced record-breaking profits.

There can be no doubt that prioritising enforcement for children is right. But it is also smart, and not only because of the political capital involved in delivering on a priority shared by a very high proportion of voters. It is also smart because, when it comes to children, enforcers can capitalise on a uniquely comprehensive legal and regulatory framework for children's rights, clearly elaborated in the context of the digital environment. From specific guidance on how the UN Convention on the Rights of the Child applies online, to a plethora of regulatory initiatives for age-appropriate design, in particular under the GDPR, to technical industry standards for the age-appropriate design of services, the playbook for children's privacy, safety and security is already well established and, courtesy of 5Rights, compiled ready for use under the DSA.

The EU can, and must, act fast. Monday's announcement that the European Commission is opening an investigation into TikTok for addictive design and failing to protect minors is an encouraging start. Without a credible stick, rules are easily ignored. But proactive compliance, and success in the courts, will be aided by clearer, more detailed guidance. By swiftly encoding clear guidelines for companies based on existing law and best practice for children, the EU can prove it can move fast and mend things: a welcome outcome for all.

The EU has a shot at becoming the safest place in the world for children to be online, and at making its companies and innovators the ground-breakers for the next generation. If it fails, it risks falling behind the global regulatory trend in child online protection, as it has in so many other sectors. If it fails, the commercialisation of children's presence and data online could become a generational emergency: the next climate crisis, only this time we cannot wait 30 years to act.

Read the report on the best-practice baseline for the implementation of the Digital Services Act for children.