Spain advances privacy-preserving solution for age verification
On 1 July, the Spanish Ministry of Digital Transformation launched “technical specifications for age verification”, together with an app that allows users to obtain tokens proving they are over 18, which can then be presented to access pornography sites.
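To illustrate the general idea of a token-based approach, the sketch below shows a minimal issuer/verifier flow: an issuing app signs a short-lived claim that says only “over 18”, and the site being visited checks the signature and the expiry without ever learning who the user is. This is purely illustrative and is not the Ministry’s actual specification; the payload format, the key scheme (Ed25519 via the Python cryptography library) and the function names are assumptions chosen for brevity.

```python
# Illustrative sketch only: the Spanish specification's real token format and
# cryptography are not described here; the structure below is an assumption.
import json
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)


def issue_over18_token(issuer_key: Ed25519PrivateKey, ttl_seconds: int = 1800) -> dict:
    """Issued by the verification app after it has checked the user's age.

    The token carries only the claim 'over_18' and an expiry time -- no name,
    birthdate or identifier -- in line with data minimisation.
    """
    payload = json.dumps(
        {"claim": "over_18", "exp": int(time.time()) + ttl_seconds},
        sort_keys=True,
    ).encode()
    return {"payload": payload, "signature": issuer_key.sign(payload)}


def verify_over18_token(token: dict, issuer_public_key: Ed25519PublicKey) -> bool:
    """Run by the restricted site: checks the issuer's signature and the expiry.

    The site learns nothing about the user beyond 'over 18 right now'.
    """
    try:
        issuer_public_key.verify(token["signature"], token["payload"])
    except InvalidSignature:
        return False
    claims = json.loads(token["payload"])
    return claims.get("claim") == "over_18" and claims["exp"] > time.time()


# Usage: the issuer keeps the private key; sites only need the public key.
issuer_key = Ed25519PrivateKey.generate()
token = issue_over18_token(issuer_key)
assert verify_over18_token(token, issuer_key.public_key())
```

Production systems typically go further than this sketch, for example by using single-use tokens or blinded issuance so that the issuer cannot link a token to the site where it is spent.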
“We welcome the commitment of the Spanish regulator to address this issue which is a critical component to ensure age-appropriate experiences for children online,” said 5Rights Executive Director Leanda Barrington-Leach. “It is positive that more stakeholders are working on practical solutions that are effective and rights-respecting. We hope this will contribute to the development of a robust regulatory framework for age assurance systems based on best practice as set out in the industry standard IEEE 2089-1.”
When age-assurance systems are in place, they should meet the following minimum standards:
- Adhere to data minimisation in order to be privacy-preserving, only collecting data that is necessary to identify the age, and age only, of a user
- Protect the privacy of users in line with GDPR and other data protection rules and obligations
- Be proportionate to the risk of harm arising from the service, or a feature of the service, and the purpose of the age assurance solution used
- Be easy for children to understand and consider their evolving capacities
- Be secure and prevent unauthorised disclosure or safety breaches
- Provide routes to challenge and redress if the age of a user is wrongly identified
- Be accessible and inclusive to all users, particularly those with protected characteristics
- Not restrict children from services or information that they have a right to access
- Provide sufficient and meaningful information for a user to understand how the age assurance system works, in a format and language they can easily understand – including children
- Be effective in assuring the actual age, or age range, of a user
- Anticipate that users may not tell the truth, and not rely solely on self-declared information.
Effective age verification is a necessity for access to products and services that are legally restricted by age, such as pornography or gambling. Age assurance more broadly, including age estimation techniques, is an important starting point for providing age-appropriate experiences; it should not be used to shut children out, or as a substitute for age-appropriate design of services. In many cases, making services safe for the youngest users is the preferable option, one that removes the need to know a user’s age at all.