Poki: a case study in service redesign for children

Poki, a free gaming platform with more than 60 million users worldwide, has overhauled its UK service to comply with the Age Appropriate Design Code.

“It will break the internet”, “It will dumb down the experience of adults, or lead to violations of their privacy”… the reactions from tech companies and lobbyists to the pioneering UK Age-Appropriate Design Code (AADC) were hyperbolic.

Yet since its passage into law in 2020, slowly but surely – and often quietly – changes have taken place that demonstrate just the opposite. Companies can design – or redesign – their services to comply with the AADC’s 15 standards for children’s data protection, without undermining their business viability, shutting children out, or impinging upon adults’ experience.

“Poki’s redesign demonstrates that tech platforms – when pushed – can deliver privacy for kids (and adults). This was a ground-breaking engagement between 5Rights Foundation and Poki that proves regulation works, and benefits can accrue to all.”

– Leanda Barrington-Leach, Executive Director at 5Rights

When 5Rights Foundation first approached Poki in early 2023, the popular gaming platform was in clear breach of the Code. The free service was monetising children’s data and exposing them to an array of risks to their privacy and, ultimately, their safety. Despite evidence that the platform was being accessed by millions of UK users, many of them children as young as six, playing from both homes and schools, Poki was:

  • tracking children by default
  • embedding monitoring technology without the consent or knowledge of users
  • sharing children’s data with third parties, often for “unspecified purposes”
  • nudging and misleading children into lowering their privacy protections.

Children’s profiles, precise geolocation and detailed data relating to their gaming practices and habits were directly shared with some 360 third parties, from advertisers and marketing firms to analytics companies and data brokers, located from the UK to the US, China and beyond.

Following a legal letter from 5Rights in March 2023 and eight months of subsequent exchanges, Poki has radically overhauled its service design to comply with the Age-Appropriate Design Code. The changes include:

  • changing default settings to high privacy
  • restricting cookies
  • switching out advertising based on profiling for contextual ads
  • ending precise location tracking
  • making the privacy policies more intelligible and accessible.

In a surprise move, Poki chose to implement the changes for all UK users, meaning that adults benefit from the changes the Code requires for children.

Since the AADC came into force, tech companies large and small have implemented changes to protect children. Among them: Google instituted ‘safe search’ for all under-18s; YouTube and Snapchat defaulted all accounts of under-18s to high privacy settings; Instagram stopped unknown adults from messaging children; TikTok turned off notifications through the night; Pinterest stopped showing children advertising.

Positive as all these moves are, they rarely amount to a comprehensive review of service design to ensure compliance with all 15 interdependent standards of the Code. The Poki case demonstrates that it is perfectly feasible for companies to implement the provisions of the AADC and comprehensively review their services to ensure they are safe and privacy-preserving by design and default. It speaks to the effectiveness of the AADC and the existing UK GDPR as a means to drive positive change, as well as to hold companies to account when required.