Snapchat under EU investigation for inadequate protection of minors
The European Commission has opened its second case under the Digital Services Act, targeting Snapchat for failure to ensure a high level of privacy, safety and security for children.

Launching the formal investigation into Snapchat on 26 March, the European Commission said it is concerned the popular social media provider is allowing under-13s onto the service and failing to ensure an age-appropriate experience for teenagers. Snapchat’s default settings are inadequate, as are its moderation, transparency and reporting processes, the Commission says, pointing to dark patterns in its design. Snap is recommending children to strangers and enabling their exposure to illegal content and exploitation.
If confirmed, these failings would put Snap in breach of the EU’s landmark law to create a safer, more accountable online environment, the Digital Services Act (DSA). In particular, Snap is unlikely to meet the bar set by the Guidelines on the protection of minors, which detail how platforms must implement age-appropriate design to ensure the “high level of privacy, safety and security for minors” required by Article 28 of the Act.
“The EU has powerful tools at its disposal to hold tech companies accountable and end exploitation of children. Children value these services but they do not feel safe on them. Enforcing the Digital Services Act can change that.”
Leanda Barrington-Leach, Executive Director of 5Rights Foundation
The Commission’s preliminary assessment highlights systemic issues in how Snap recognises children and in the service’s design and functioning. It points to the lack of “sufficient safeguards to protect children from exposure to harmful content, contact, conduct, and other risks.” The Commission specifically notes that Snap is “exposing minors to grooming attempts and recruitment for criminal purposes, as well as to information about the sale of illegal goods, like drugs, or age-restricted products, such as vapes and alcohol.”
Recognising children online and catering for them in the digital environment is a moral and legal imperative. Currently, adults can register as minors and minors as adults on Snap, which creates enormous risks and harms. It also prevents children from accessing age-appropriate experiences, safer defaults and proportionate protections on the service.
This investigation does not occur in isolation. Coming shortly after the Commission’s preliminary findings against TikTok, and on the same day as its preliminary findings on porn platforms, it signals that robust and swift enforcement of the DSA is possible. It also demonstrates that Article 28 and its Guidelines are not merely theoretical safeguards, but actionable tools capable of driving platform accountability.
This is precisely how the DSA was intended to function. By scrutinising systemic risks embedded in platform design, rather than reacting solely to individual pieces of harmful content, the regulation shifts responsibility upstream. It incentivises companies to build safer systems from the outset, rather than retrofitting protections after harm has occurred.
“We welcome these new preliminary findings as an encouraging development. Effective enforcement is essential to ensure that the DSA lives up to its promise. Investigations like this one send a clear message: platforms operating in the European Union must take their responsibilities seriously, particularly when children are involved.”
Leanda Barrington-Leach, Executive Director of 5Rights Foundation
For 5Rights, these recent developments under the DSA represent important steps towards a digital environment where young users can engage, explore, and connect without being exposed to undue risks and harms. It is of the utmost importance that all investigations concerning the protection of minors are expedited, so that concrete protections for younger users can be put in place swiftly.