UK Government must uphold children’s privacy in new data law
Last week, the UK Government published new draft legislation that will amend key principles of the UK GDPR. The Data (Use and Access) Bill has the aim of “harness[ing] the power of data for economic growth, to support a modern digital government, and to improve people’s lives.”
5Rights welcomes the draft proposals in the bill that will increase the accountability of tech firms for children’s safety: giving independent researchers access to data from social media companies to build understanding of online harms, and supporting coroners’ access to data when investigating the deaths of children. We also welcome measures to increase the accountability of the Information Commissioner’s Office, which will be placed under a new duty to consider children’s vulnerabilities when exercising its regulatory functions.
However, we are concerned that many of the most problematic aspects of the previous government’s attempts to water down data protection remain – including the use of personal data for commercial scientific research, the ability for companies to reuse data without consent, and the use of personal data in automated decision-making. Children’s personal data is exploited and used against their best interests by online platforms every day – giving children a high level of privacy and data protection is an essential part of keeping them safe.
We will be studying the detail of the bill and seeking assurances from the Government that the vital protections children currently have under the Age Appropriate Design Code, introduced under the Data Protection Act 2018, will be maintained.
5Rights recently called on the ICO to take stronger action against tech firms for widespread non-compliance with the Code. The Code is a key part of the UK’s child online safety legislative framework, setting out 15 standards for how businesses must treat children’s data, including putting in place default privacy settings that ensure children’s location is kept private and that they are not bombarded with harmful content through toxic algorithms or negatively affected by addictive design features.