Raft of tech changes to protect children as new rules come into force

Today, new legislation comes into force that requires tech companies (such as social
media, search, gaming and streaming platforms) to give children and young people
specific protections for their data.

The Age Appropriate Design Code (AADC) is first-of-its-kind legislation, giving children a high level of privacy for their personal data and requiring companies to change features that use data in ways that expose children to risk and intrusion.

Several big tech companies have already made the changes required by the AADC, changing their policies not just in the UK but across their global operations.

  • Instagram will no longer allow unknown adults to direct message under-18s.
  • TikTok users under the age of 16 will have their accounts set to private by
    default.
  • Google will stop targeted advertising to under-18s, taking children out of the
    business model. It has also introduced safe search by default.
  • YouTube will remove autoplay, to prevent children being fed endless videos.
  • A wide range of wellbeing features offering breaks and time limits has been
    introduced across the sector.

From 2 September 2021, the Information Commissioner’s Office (ICO) will begin
monitoring for compliance with the code. Companies that breach the code and put
children at risk will be liable for a fine of up to £17.5 million or 4% of their
annual worldwide turnover.

Baroness Beeban Kidron, Chair of 5Rights, said:

“For years, the tech sector has neglected to protect children.
This has led to a toxic digital environment with harmful
content and risky behaviours amplified and promoted – it is
commercial exploitation of a vulnerable population, causes
widespread distress and at worst ends in tragic self-harm and
loss of life.”

“This new legislation recognises for the first time that the
digital world, like the real world, must treat children
differently – observe their rights, ensure their privacy and
promote their wellbeing. It is the work of scores of
individuals, campaigners and parliamentarians, in and out of
government, and I thank them all for their commitment.”

“Importantly we see lawmakers in the US, EU, Australia and
Canada mirroring the provisions of the AADC, and many of
the changes will be available to children around the world.
This marks a new era of responsibility from the tech industry.
It’s a great day for children and their parents and puts the UK
at the front of child online protection globally.”

First piece of legislation for children’s data marks new era

Baroness Kidron, Chair of children’s digital rights charity 5Rights, introduced the AADC
into the Data Protection Act 2018 to transform how digital services interact with
children. Key changes include:

  • The code protects everyone under 18: previously, teenagers were treated the
    same as adults, but they will no longer be left vulnerable to harmful data
    practices.
  • Companies are now responsible for ensuring that they do not use children’s
    data to serve detrimental content to under-18s. This will prevent children from
    being targeted with harmful content, such as material that promotes suicide or
    self-harm.
  • Companies can no longer claim their services are not aimed at kids. The AADC
    introduces ‘likely to be accessed by children’ as the scope of the regulation,
    meaning the code applies to the wide range of services that children use in
    practice.
  • The code bans ‘nudge techniques’, meaning platforms are no longer allowed to
    encourage children to provide unnecessary personal data. For example, a
    child’s real-time location must not be made publicly available, and kids will no
    longer be encouraged to stream to large groups of unknown adults.
  • Services must now undertake data protection impact assessments to identify
    the risks their service poses to children.
  • Children must be informed about parental controls and the nature of monitoring
    of their activities by their parent or carer.
  • Digital products and services must be designed in the best interests of children.
    The code specifically states that the best interests of the child must be treated
    as a primary consideration when they conflict with commercial interests.

These changes build a high standard of data protection into digital products and services
that are likely to be accessed by children.