DSA turns 1: more potential for advancing children’s rights
25 August 2024 marked the one-year anniversary of the Digital Services Act (DSA) coming into force for Very Large Online Platforms (VLOPs) such as Facebook, Instagram, Snapchat, YouTube and TikTok. But how has the DSA impacted these online services so far, and what does it mean for the safety of children online?
Adopted in 2022, the DSA regulates digital services to create a safer online environment where users’ rights are protected, while also leveling the playing field for businesses – and the protection of children’s rights is among its main objectives.
The progress so far
In the past year, VLOPs have had to assess the risks their services pose to children's rights, among other fundamental rights, and the European Commission has since taken decisive steps to ensure compliance.
The Commission has opened formal proceedings against TikTok over concerns that the platform may have breached the DSA in relation to the protection of minors and the risk management of addictive design and harmful content. Similar proceedings were launched against Meta in May, and requests for information have been sent to YouTube, TikTok, Snap and Meta, notably regarding Facebook and Instagram's advertising practices and the circulation of child sexual abuse material on Instagram. Following one such request, TikTok also blocked its Lite Rewards programme in the EU due to its potential risk to children.
These actions confirm the European Commission's strong commitment to prioritising children's rights in the enforcement and implementation of the DSA. Together with the establishment of a dedicated working group on the protection of minors within the DSA Board – the governing body gathering national enforcers – this commitment will be key to making the Regulation work for children.
Clear guidelines and standards for a high level of privacy, safety and security
This strong implementation and enforcement of the DSA for children’s rights also requires clear and concrete guidelines for companies and national regulators to ensure a high level of privacy, safety and security for children.
In this regard, the European Commission's work to draft guidelines on Article 28(1) of the DSA is of paramount importance and a welcome step in the right direction.
As highlighted in our report, “A high level of privacy, safety, and security for minors,” these guidelines should be grounded in the principles of the UNCRC General comment No. 25 on children’s rights in the digital environment and other available best practices, instruments and technical standards, such as CEN-CENELEC CWA 18016 on age-appropriate design.
These guidelines should:
- Provide a framework for platforms to assess if they are likely to be accessed by children and to conduct child rights impact assessments.
- Mandate privacy-preserving and age-appropriate measures to mitigate identified risks, including age assurance where necessary.
- Focus on outcome-based approaches that prioritise the best interests of the child in service design.
- Recommend dedicated roles within companies and regulators for the protection of minors.
- Provide for ongoing re-evaluation and improvement of compliance, involving children in the process.
The Commission and national regulators also have the opportunity to complement this with technical standards, which can ensure consistent application of the DSA’s provisions related to children and provide practical guidance for companies to ease compliance and innovate in this space.
The EU has the potential to become the safest online environment for children, paving the way for European companies and innovators to lead the next generation of digital advancements and develop technology that not only protects but also benefits children and society for years to come.