The digital services that children and young people use are not designed to meet their needs or uphold their rights. Many services simply ignore the presence of child users (under 18s) altogether.
Design decisions are driven more by the commercial requirement for data than by advance consideration of a child's best interests. Where design decisions are taken to promote children's welfare online, they often come after the fact, provoked more by tragedy and public outrage than by any prior assessment of the impact on children and young people.
The failure to anticipate the presence, rights and needs of children and young people, by design and default, severely limits the potential of digital technology as a positive force in their lives.
Companies must design digital services that cater for the vulnerabilities, needs, and rights of children and young people by default. To fulfil their potential, digital technologies must be directed towards children and young people's flourishing. Retrofitting safety features into a service only after under-18s have experienced harm, or allowing their rights to be routinely undermined, is simply not good enough.
Crucially, this principle of advance consideration must apply to all the digital services that children and young people are likely to access in reality, not just those services that are specifically targeted at them.
Meeting the needs of childhood development and delivering on children's rights are not optional. Governments and policymakers need to prioritise the development of robust standards for the design and development of digital technology, and regulate to require that children's safety, rights, and privacy are upheld by design and default.
What is 5Rights doing?