Landmark verdict on Google and Meta shows why Age-Appropriate Design Codes are critical to protecting children online

A landmark jury ruling in California has found Meta and Google liable for designing products that addict and harm children. Although this private action was not brought under the California Age-Appropriate Design Code (AADC), the case reflects the Code’s influence, and the victory vindicates 5Rights’ pioneering approach: shifting the focus from content to system design.

For years, debates around online harms in the US have been constrained by arguments about speech and the limits of regulation under the First Amendment. This case cuts through that debate by focusing on how platforms are designed: how they incentivise engagement, warp personal agency and exploit children’s developmental vulnerabilities. The court has recognised that these systems can and should be scrutinised as products, not just as neutral hosts of user content.

This is a significant moment for the enforcement of AADCs across the US and of similar laws globally. The Code, championed by 5Rights and now law in five US states, was built on the principle that risks to children must be addressed upstream, by design, rather than relying solely on content moderation or user responsibility as safeguards.

For over a decade, 5Rights has evidenced that children’s experiences online are shaped by systems designed to maximise engagement. Among other findings, our Pathways research has consistently shown that children spend more time online than they feel they should, feel unable to stop scrolling and feel pressure to present themselves as older and more daring than they are.

Internal company documents confirm that these are the predictable outcomes of platforms designed to keep users engaged at any cost. In a 2020 internal exchange, Meta employees described their own platform as “a drug” and themselves as “basically pushers”, comparing social media’s hold on users to gambling.

These are not the words of a company acting in ignorance, but the words of one that knew what it was doing to children and chose to continue regardless. 

Voluntary change was never coming. Across the world, regulators have been too cautious in enforcing existing laws, and many countries have yet to establish any legal basis for protecting children online. But this is beginning to change. Governments are increasingly turning to regulation that tackles service and product design, recognising that the root causes of harm lie in how services are built. Age-Appropriate Design Codes are now in place in California, Maryland, Vermont, South Carolina and Nebraska, with legislative proposals advancing in many more states. In the UK and Europe, frameworks such as the Online Safety Act and the Digital Services Act are embedding risk-based, systemic duties on platforms. Beyond this, countries including Brazil and Indonesia are actively developing approaches that place children’s rights and safety at the centre of digital regulation, with AADC principles at their core.

Leanda Barrington-Leach, Executive Director at 5Rights, says:


“This robust and increasingly coherent regulatory framework is delivering practical change for children. Research into the impact of regulation on children’s digital lives found that between 2017 and 2024, Meta, Google, TikTok and Snap alone made 128 recorded changes to their platforms in support of child privacy and safety, with a peak of 42 changes in 2021, the year the UK’s Age-Appropriate Design Code came into force.”

The California verdict, alongside the recent ruling in New Mexico, represents one of the first confirmations in a court of law of what children have known for years: that the harms they experience online are not accidental, but the result of deliberate design choices. Together, these cases mark a turning point in legal accountability for tech companies. Now it falls to governments and regulators to respond – with robust enforcement, clear legislation, and the resolve to stop passing responsibility onto the very children they have a duty to protect.