Access limitations must be part of age-appropriate design, and effectively restrict companies from exploiting children
As legislators debate social media bans, 5Rights calls for access restrictions that deliver for children: tech-neutral measures that enforce existing restrictions on personalised services for under 13s, and tiered default age-gating of risky features for teenagers.

Tech companies have exploited children for far too long, deliberately designing and aggressively pushing services that are addictive and manipulative, with devastating consequences for an entire generation. Parents’ concerns about social media and online harms are well-founded, and children themselves tell us they find technology useful but want more age-appropriate designs and better protections.
In response, policy-makers in the EU, the UK, and beyond are taking the initiative to ban children under 15 or 16 from accessing social media. While understandable, restricting access to specific categories of services through blanket bans is misguided. 5Rights calls instead for age restrictions that are tech-neutral, targeted and proportionate, embedded in age-appropriate design, to protect children as they grow. These age restrictions should not be implemented in isolation but within a broader strategy for age-appropriate digital services grounded in children’s rights, both protecting and empowering children to access the digital world with increasing autonomy, according to their evolving capacities.
A tech-neutral approach
Targeting only social media through a blanket ban could push children toward other unregulated but equally problematic services, such as gaming, AI chatbots or even EdTech platforms. Companies apply the same harmful design choices – addictive features, data exploitation, algorithmic manipulation – across the entire digital ecosystem, exposing children to a wide range of risks. Children may be blocked from social media yet remain subject to identical exploitative and risky practices everywhere else, which is why effective policy requires tech-neutral restrictions on these practices and the companies that employ them.
Personalised services should by default not be accessible to children under 13
5Rights supports the implementation of access restrictions in line with existing laws, including data protection regulations. The processing of children’s data has enabled companies to exploit them through personalised services. Such high-risk services – including social media but also many games and AI chatbots – should by default not be accessible to younger children, who do not have the cognitive maturity to critically evaluate digital interactions, let alone services designed to shape their perceptions and influence their behaviour.
Despite existing age limits, companies allow very young children to access their platforms and defy privacy rules by excessively processing their data and deploying harmful features. Companies can no longer put in place ineffective age gates and then close their eyes to children bypassing them en masse. The prohibition on personalised services for under 13s must be enforced.
Tiered age-appropriate access restrictions should protect all children, including teenagers, from high-risk features
All under 18s have specific rights to protection and enjoyment of age-appropriate experiences, wherever they are. Older children must not be forgotten.
Tiered age-appropriate access restrictions should protect all children, including teenagers, from features, content and spaces that are high-risk for their age, ensuring they are protected, according to their evolving capacities, as they access the digital world with increasing autonomy. Age assurance and differentiated access levels through tiered default settings should ensure features and functionalities become available to children only when they reach a certain age, protecting and empowering them in line with their developmental stage.
Thoughtfully implemented, access restrictions are a key element of age-appropriate design. They should however be complemented by other measures, such as age-appropriate information, warnings, positive nudges, safety filters and reporting mechanisms.
Policy-makers must not be satisfied with partial, band-aid solutions; they must fight for a better digital world for children
Children should not be banned from the digital world; rather, companies that exploit children should be barred from reaching them. Regulators must urgently and robustly enforce market restrictions against tech companies that expose children to risk and harm, and support the full implementation of obligations for age-appropriate design. Wherever children are, they should be both protected and empowered to grow, participate and thrive.
“Tech companies have been unashamedly exploiting children for too long. Laws banning personalised services such as social media, games and AI chatbots for under 13s must be robustly enforced, and new measures are needed for the tiered age-gating of risky features for teenagers.
Most importantly, companies must not be let off the hook: they remain responsible for the safety of all children who are on their services in practice – whether older children or those clever enough to get around age gates.
Children deserve a better internet, and it is our duty to build one that is safe and helps them connect, learn and thrive as they grow.”
Leanda Barrington-Leach, 5Rights Foundation Executive Director