Access restrictions to protect children and their rights in the digital environment
5Rights Foundation welcomes the will of the EU and its Member States to take further and robust action to protect children from unsafe and exploitative practices by technology providers and ensure children’s rights are fully respected and upheld in the digital age.
In view of national initiatives and proposals to complement established EU legal and regulatory requirements (notably those enshrined in the GDPR, AVMSD, DSA and AI Act) with a ‘digital age of majority to access social media’, 5Rights calls for a thoughtful approach to access restrictions: one that respects and upholds children’s rights, and that in no way dilutes or distracts from corporate responsibility for the privacy, safety and security of all children accessing these services.
This position sets out four key points:
1. All under-18s have specific rights to protection and to age-appropriate experiences, wherever they are. Older children must not be forgotten, and all services that children use in practice must be safe for them.
2. Access restrictions should be thoughtfully implemented as part of age-appropriate design and in line with existing law:
- Personalised services – including social media but also many games and AI chatbots – should by default not be accessible to children under 13.
- Tiered age-appropriate access restrictions should protect all children, including teenagers, from high-risk features.
3. Access restrictions must not be implemented in isolation; they must be complemented by other measures that protect, support and empower children through the age-appropriate design of services.
4. Children should not be banned from accessing the digital world; rather, companies that exploit children should be banned from reaching them. No new burdens must be placed on children or parents; instead, regulators must urgently and robustly enforce market access restrictions against tech companies that fail to comply with the law.