Snapchat under EU investigation for inadequate protection of minors
The European Commission has opened its second case under the Digital Services Act, targeting Snapchat for failure to ensure a high level of privacy, safety and security for children.

Launching the formal investigation into Snapchat on 26 March, the European Commission said it is concerned the popular social media provider is allowing under-13s onto the service and failing to ensure an age-appropriate experience for teenagers. Snapchat’s default settings are inadequate, as are its moderation, transparency and reporting processes, the Commission says, pointing to dark patterns in the service’s design. Snap is recommending children to strangers and enabling their exposure to illegal content and exploitation.
If confirmed, these failings would put Snap in breach of the Digital Services Act (DSA), the EU’s landmark law to create a safer, more accountable online environment. In particular, Snap is unlikely to meet the bar set by the Guidelines on the protection of minors, which detail how platforms must implement age-appropriate design to ensure the “high level of privacy, safety and security for minors” required by the Act’s Article 28.
“Core features such as disappearing messages, Snapchat streaks, the discovery tab, My AI and live location sharing make this service both highly consuming and very risky for children. Children are too often drawn to a fun and fast-paced social environment only to find themselves trapped in a much darker place of anxiety, low self-esteem, sleep deprivation, bullying, grooming and exploitation. Snap is profiting from harming children. This has to stop.”
Leanda Barrington-Leach, Executive Director of 5Rights Foundation
The Commission’s preliminary assessment highlights systemic issues in how Snap recognises child users and in the service’s design and functioning. It points to the lack of “sufficient safeguards to protect children from exposure to harmful content, contact, conduct, and other risks.” The Commission specifically notes that Snap is “exposing minors to grooming attempts and recruitment for criminal purposes, as well as to information about the sale of illegal goods, like drugs, or age-restricted products, such as vapes and alcohol.”
The Article 28 Guidelines require platforms to take measures both to protect children and to enable age-appropriate experiences. Services should, for example, set minors’ accounts to private by default, ensure children are not recommended harmful content or contacts, give them agency over their feeds, and disable persuasive, manipulative or addictive features.
“The EU has powerful tools at its disposal to hold tech companies accountable and end exploitation of children. Children value these services but they do not feel safe on them. Enforcing the Digital Services Act can change that.”
Leanda Barrington-Leach, Executive Director of 5Rights Foundation
Coming close on the heels of the Commission’s preliminary findings against TikTok in February, and accompanied by the publication of preliminary findings on porn platforms, this investigation is welcome evidence that the DSA and its Article 28 Guidelines can effectively hold tech companies accountable for their system design and its impact on children. 5Rights will continue to support the European Commission and the Member State Digital Services Coordinators in implementing and enforcing the DSA, driving positive change towards a safe and empowering digital environment for young people.