How data-hungry companies expose children to risk online: the case of recommendation systems

New 5Rights project uncovers how recommendation systems powered by data are leading to harm for young people online

Recommendation systems are a defining feature of the digital world. They suggest which videos children should watch, what information they should read, which games they should play, and even who they should be friends with.

When used effectively, they can play an important role in helping children navigate the online world, refining the mass of available content in a way that is supportive and diversifies their information ecosystem.

But too often these systems are configured primarily to meet commercial goals without prioritising the best interests of children. 5Rights Foundation’s latest Risky by Design case study identifies nine features which make use of these automated decision-making processes in ways that can lead to harm. They include:

  • Targeted advertising, where services collect user data to build ‘profiles’ that can be used to target highly individualised adverts. This could include pushing age-restricted products and services to children.
  • Friend/follower suggestions. Designed to expand networks, they can pressure children to connect with strangers or with accounts featuring inappropriate material.
  • Autocomplete, where some search engines suggest possible search terms based on the first few letters entered by the user. These can interrupt, misinterpret and possibly redirect a child’s thought process, sometimes towards extreme, stereotypical or unwelcome views.

A lack of transparency around the design and operation of recommendation systems is a major obstacle to addressing the risks they create.

Around the world, legislators and regulators are becoming more aware of the need for greater oversight of systems like these, and are working to mitigate the risks associated with them. In the UK, for example, the Joint Committee scrutinising the Online Safety Bill has recommended that the design of algorithms be a central part of a Safety by Design code of practice, requiring services to address risks created by recommendation systems before harm can occur. It has also recommended that the highest-risk services be obliged to commission annual audits of their algorithms.

But in the meantime, providers should follow best practice when designing and operating recommendation systems, as set out in this case study. With greater transparency and effective oversight – and privacy-preserving age assurance that gives children age-appropriate experiences online – those designing digital products and services can reduce the risks that recommendation systems pose to young people.

About Risky by Design

Every digital service or environment is the product of a series of design decisions that shape the experiences of young people online.

Risky by Design is a 5Rights Foundation project that examines common design features and how they pose risks to young people in the digital world. It’s an illustration of why products and services must be designed with the needs and rights of children in mind.