5Rights Foundation is calling for mandatory rules for the design of digital services after new research shows that online accounts registered to children are being targeted with sexual and suicide-related content.
The research, undertaken in partnership with Revealing Reality, establishes the pathways between the design of digital services and the risks children face online. It shows services such as Facebook, Instagram and TikTok are allowing children, some as young as 13 years old, to be directly targeted within 24 hours of creating an account with a stream of harmful content. Despite knowing the children’s age, the companies are enabling unsolicited contact from adult strangers and are recommending damaging content including material related to eating disorders, extreme diets, self-harm and suicide as well as sexualised imagery and distorted body images.
These services are not deliberately designed to put children at risk, but this new research shows the risks they pose are not accidental. These are not "bugs" but features. Revealing Reality interviewed engineers and designers who explained that they design to maximise engagement, activity and followers (the three drivers of revenue), not to keep children safe.
To test a child’s experience without putting anyone at risk, the researchers created avatar accounts. Each new account was based on a real child, and it liked and searched for content in ways that reflected the behaviour of the child on which it was based.
The research found:
This Pathways research connects the dots between how digital products are designed and the impact they have on the lives of children.