Friday 19 November was a busy day in the office for those of us working in the data and privacy field, as the government’s consultation on data reform closed. The almost 150-page consultation document articulates the government’s plans to generate a ‘Brexit dividend’ as the UK moves away from the GDPR (the European data protection legislation). However, it is hard to see how chipping away at newly won data rights, or this version of the ‘free and responsible flow of personal data’, is designed to benefit children, or indeed the wider public. In reality there is little chance of realising any economic benefit from these proposals, while at the same time they undermine many important pillars of the Age Appropriate Design Code and the existing GDPR-based data protection regime (the Data Protection Act 2018).
The Age Appropriate Design Code (AADC) is the UK’s ground-breaking data protection code that has set the globe thinking about how children’s privacy intersects with safety and participation in the digital world. It sets out 15 standards that digital services likely to be accessed by children need to follow. The code’s coming into force in September 2021 was preceded by a number of announcements from all the major technology platforms, from Google to TikTok to Facebook, adjusting their services in order to meet some of its requirements. There is a certain irony to the situation we currently find ourselves in, where around the world, from the EU to the US to Australia, people are considering mirroring the code, while domestically the proposed data reforms seem set to undermine it.
The proposals as currently formulated would undermine the requirement for data controllers’ primary consideration to be the best interests of the child (standard 1); reduce transparency in some key areas (standard 4); weaken the obligation to ensure that children’s data is not used in ways that have been shown to be detrimental to their wellbeing (standard 5); negate the need for default settings to be set to high privacy (standard 7); make meaningless the requirement to minimise the data collected (standard 8); allow children’s data to be shared without a compelling reason or regard to the best interests of the child (standard 9); increase the profiling of children by default (standard 12); allow nudge techniques (standard 13); and restrict the provision of accessible tools to help children exercise their data rights (standard 15). Nothing in the new data regime should undermine these requirements, or indeed any aspect of the code.
If the proposed reforms do not cement or increase digital rights, including those delivered by the AADC, then presumably the rationale for altering our data protection regime is economic. However, according to the government’s own impact assessment, if all the changes proposed in the consultation were implemented, they would generate a net benefit to the UK economy of £1.5bn over 10 years. This may seem like a lot, but compared to the size of the economy as measured by GDP - £1,960bn a year - the total amounts to less than 0.08% of a single year’s output, spread over a decade. Even this benefit is undermined by the very real and serious risk that the UK loses not only user protections but also its adequacy decision from the EU, which allows data to flow freely between the EU and UK in the post-Brexit environment. Losing adequacy would cost UK business an estimated £1.6bn over 10 years in additional compliance costs, a figure that does not include the resultant loss of trade with the EU, which is likely to be even higher. It is therefore hard for anyone to claim that we are reforming our data regime for economic reasons.
This analysis confirms that economics cannot be the driving force behind the proposal. We must therefore look in more detail at the risks to children’s digital rights.
The consultation proposes to change the subject access request regime, which currently entitles everyone to demand the data held on them by data controllers. The proposed changes have two broad aims: to make it easier for data controllers to reject requests, and to introduce a nominal fee for a data subject to access their own data. Subject access requests are an invaluable tool to promote accountability, enabling individuals to challenge decisions or data uses that discriminate against or harm them, often at the hands of companies that are vastly more powerful than they are. Imposing subject access request fees would disempower individuals and allow organisations to profit by collecting fees from their users to access their own data. No child should be deterred from exercising their data rights, nor have their request unreasonably denied. Indeed, the action that is actually needed in this area is for all companies to be required to provide automated tools for anyone to download the data held about them.
The second major area that the consultation seeks to reform is the Information Commissioner’s Office (ICO) itself – the UK’s data regulator – as well as its relationship to government. Some of the proposals, around regulatory cooperation, increasing the range of studies that need to be publicly available, and requiring companies to have documented and accessible complaints procedures, are to be welcomed. These are balanced against some more problematic suggestions, including an obligation to have ‘regard to innovation and economic growth’ when enforcing our data rights, as well as giving the Secretary of State more powers to direct or approve codes of practice and guidance. It is vital that the government does not confuse the role of the ICO, nor interfere too much, especially in its work to ensure compliance with the Age Appropriate Design Code. Crucially, a proportionate, trusted and well-enforced regime provides certainty and sustainability to a sector that has been riven by scandal and has lost public trust. Establishing an accountable and enforceable regime will provide the necessary environment for growth and innovation.
The final two principal areas of concern involve reform of the legal bases that data controllers can use to collect and process data. The consultation proposes to extend and alter the use of ‘legitimate interests’ as a legal basis for processing personal data. ‘Legitimate interests’ differs from the other legal bases contained in the GDPR in that it is not purpose-specific: it instead requires that the processing is ‘necessary’ and that the controller’s interests are balanced against those of the data subject. The government is proposing to create a list of activities for which legitimate interests could be claimed without any balancing of the interests of data subjects, even in the case of children’s data – something that goes against the core principles of the code and many of its standards. Many of the proposed legitimate interests are so generic, such as ‘internal research and development’ or ‘managing or maintaining a database to ensure that records of individuals are accurate and up to date’, that it is hard to think of data that could not be processed under one of them. Rather than expanding its use, the government should retain the existing definition of and requirements around legitimate interests, and additionally impose a duty on organisations that rely on them to make their assessments publicly available.
And finally, the proposals would remove the requirement for digital services to gain consent in order to place cookies on a person’s device. Cookies are not software but small pieces of data stored by the browser, which allow digital services to identify and track us online. It is hard to see how it could be acceptable for digital service providers to be free to place any cookie they wish, collecting any data that they wish, on any user’s system – especially when that user is a child – without their knowledge or consent. It is vital that new data protection legislation is not used to relax existing regulatory protections, particularly those provided to children by the Age Appropriate Design Code.
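To make the mechanism concrete: a cookie is nothing more than a short name=value string carried in HTTP headers, which the browser echoes back on subsequent requests. The Python sketch below (the `user_consented` flag is a hypothetical stand-in for a real consent check, not any particular site’s implementation) shows what consent-gated cookie-setting amounts to in practice.

```python
# Minimal sketch of consent-gated cookie setting. A cookie is just a
# name=value string sent in a Set-Cookie response header; it is data,
# not software. The consent flag here is purely illustrative.
from http.cookies import SimpleCookie


def set_cookie_header(user_consented):
    """Return a Set-Cookie header line, or None if consent is absent."""
    # Under the current consent rules, a non-essential tracking cookie
    # should only be placed once the user has agreed to it.
    if not user_consented:
        return None
    cookie = SimpleCookie()
    cookie["session_id"] = "abc123"          # illustrative value
    cookie["session_id"]["max-age"] = 3600   # expires after one hour
    return cookie.output(header="Set-Cookie:")


if __name__ == "__main__":
    print(set_cookie_header(True))   # a plain text header line
    print(set_cookie_header(False))  # no consent, no cookie
```

The point of the sketch is simply that the entire mechanism is a line of text; removing the consent requirement removes the only gate on it.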
Thankfully, the government also seeks feedback on the benefits of using ‘automated privacy signals’ or ‘APS’. These are systems that enable people to set their default preferences around data processing, which can then be communicated automatically to the digital services that the person visits. This means that a person no longer needs to read and understand the specific data policies contained in each set of terms and conditions; instead, their preferences are communicated and their rights respected. Two exciting systems being developed and deployed right now are the US-based Global Privacy Control and the EU-based Advanced Data Protection Control. The children and young people that we work with have consistently asked to be able to set a single set of preferences, reflecting their individual expectations and tolerances, that can be used across all the services they use. They are clear that while they would welcome engaging on a single occasion with privacy questions (something that automated privacy signals enable), they think the status quo does not constitute informed consent in any meaningful way.
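To show how lightweight such a signal is: under the Global Privacy Control proposal, a participating browser attaches a `Sec-GPC: 1` header to each request it makes. A service honouring the signal needs only a check like the Python sketch below (the function names are my own, for illustration; this is not code from either initiative).

```python
# Sketch of honouring the Global Privacy Control (GPC) signal on the
# server side. Per the GPC proposal, participating browsers send the
# "Sec-GPC: 1" request header. Function names are illustrative only.

def opted_out(headers):
    """True if the request carries the GPC opt-out signal."""
    return headers.get("Sec-GPC", "").strip() == "1"


def may_track(headers):
    # A compliant service disables tracking-based processing on opt-out.
    return not opted_out(headers)


if __name__ == "__main__":
    print(may_track({"Sec-GPC": "1"}))  # user has opted out: no tracking
    print(may_track({}))                # no signal sent
```

The user expresses a preference once, in the browser, and every service they visit receives it automatically on every request.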
What these initiatives from Global Privacy Control and Advanced Data Protection Control demonstrate is that there are existing technical solutions that allow users to take effective control of their online data and easily communicate their preferences around data collection, sale and advertising. If implemented more widely, and if publishers and advertisers respected user-defined preferences, these tools would enable those who want to exclude themselves from tracking-based advertising to do so. The massive potential impact of these tools is also why there has been so little voluntary adoption by publishers. To be transformative, they will need legislative backing.
These four areas are just some of those covered by the consultation. It also seeks to make it easier for researchers to get access to data, even where they are not bound by a public interest requirement; to curtail our rights to an explanation from automated decision systems; and to radically reshape the international data transfer arena – changes that, if implemented, would have a dramatic impact on the UK’s domestic and international data regime.
The consultation marks the start of a long journey before we see new law in this area, which will give stakeholders plenty of time to ensure that the shortcomings of the proposals are highlighted and addressed. The Age Appropriate Design Code and the forthcoming Online Safety Bill are world-leading measures advancing our privacy and protections online. It would be a real shame, and a government failure, if all the progress made were undermined by a future data bill.