Meta’s inadequate age assurance likely in breach of the Digital Services Act
The European Commission has found Meta’s age assurance methods on Instagram and Facebook to be inadequate and linked this failure to the company’s incomplete and arbitrary risk assessment.

According to Meta’s own terms and conditions, users under 13 should not be allowed to access its services. Yet yesterday the European Commission issued preliminary findings that Meta’s apps, Instagram and Facebook, employ inadequate age assurance methods to prevent under-13s from accessing their services, and are failing to mitigate risks of harm.
These preliminary non-compliance findings are the latest development in a broader investigation under the Digital Services Act (DSA) launched in 2024, amid longstanding accusations that Meta’s platforms promote harmful content to fuel user engagement, purposely employ addictive and harmful design features, and breach EU privacy laws, among others. While the ongoing formal proceedings address several alleged DSA breaches – including some related to systemic digital architecture beyond content moderation – these latest findings focus specifically on misconduct related to age assurance.
“Children under 13 should not face systems that are designed to shape their perception and influence their behaviour, and mostly, keep them addicted for as long as possible. These preliminary findings give a strong signal to demonstrate that if robustly and strictly enforced, the DSA could truly deliver for children.”
Manon Letouche, Head of EU Affairs, 5Rights Foundation
The Commission found that both platforms employ ineffective measures to keep younger children off the apps, and fail to identify and remove the accounts of those who circumvent restrictions. More specifically, regulators found Meta’s tools to be insufficient and “difficult to use”, and linked this failure to enforce access restrictions to the company’s “incomplete and arbitrary risk assessments” – an obligation under Art. 34 DSA, and a concern that 5Rights and partners had previously flagged.
The Commission is now requiring Meta to implement more robust age assurance measures and improve its risk assessment methodology, in a manner that is privacy-preserving and prioritises users’ safety. Failure to remedy these breaches can trigger a final non-compliance decision and financial penalties.
Coming amid rising pressure across Member States to follow Australia’s example in adopting a blanket social media ban, these findings show that there is another way forward. What is urgently needed is strong enforcement of the rules that hold tech companies accountable, and proportionate, privacy-preserving and robust age assurance that enables platforms and services to deliver age-appropriate experiences to children.