Last week, following on almost immediately from the publication of the UK government’s long-awaited Online Harms response, came the European Commission’s plans for a Digital Services Act (DSA).
As with the Online Harms Bill in the UK, the Digital Services Act sets out legislative proposals applying to an array of digital services, from the small “picks and shovels” hosting and caching websites that provide the internet’s underlying infrastructure to the large social media platforms and online marketplaces that absorb much of its traffic.
If approved by the European Parliament and the Member States, the Act would be the first reform in twenty years to the framework of obligations applying to digital services that connect consumers to goods, services and content. The Act seeks to "give better protection to consumers and to fundamental rights online, establish a powerful transparency and accountability framework for online platforms and lead to fairer and more open digital markets."
More specifically, the Act will include obligations to introduce measures to combat illegal content online, along with effective safeguards for users, including the ability to challenge platforms' moderation decisions.
The proposal also includes obligations on very large online platforms to prevent abuse of their systems by taking risk-based actions, with oversight through independent audits of their risk management measures. Should these audits reveal evidence that platforms are failing to meet their obligations under the Act, a new set of national Digital Services Coordinators will have the power to impose fines of up to 6% of a platform's annual turnover.
This is a welcome and crucial step in the right direction. The underlying issue for young people's experience of the digital world is that the majority of products and services they use are not designed with their specific needs and vulnerabilities in mind.
A duty to conduct impact assessments would ensure that young people's best interests are a primary consideration in the design and delivery of digital services. Impact assessments help service providers think through the likely risks and unintended consequences of features of their services; once those risks are identified, providers can address them and take mitigating action by designing and adapting their services accordingly.
An important issue that the DSA seeks to address, but which is given little attention in the Online Harms response, is that of recommender systems: the algorithms that direct users towards certain forms of content. Under the new EU proposal, service providers will be required to adhere to a number of transparency standards, including publishing details of the algorithms used for recommendations. Transparency is an important way of determining how algorithms amplify, encourage, moderate and/or discourage particular behaviours and outcomes.
Recommendation algorithms power the majority of services directed at young people. 70% of views on YouTube are a direct result of its recommendation algorithm, and 80% of content hours watched on Netflix come from its recommendation algorithm. These systems are presented as personalisation for the user, based on what they have watched, shared or interacted with previously, but it has become increasingly clear that recommender algorithms amplify more extreme content and are responsible for spreading harmful material. Algorithms are used not only to recommend content but to recommend children as 'friends' or 'followers' to adults, or to recommend videos of partially clothed prepubescent children to adults who have viewed similar content previously. Without transparency and algorithmic oversight, it is impossible to understand how these systems create the conditions for harm to come to young people in online spaces.
With the arrival of both the DSA and the Online Harms response, the starting pistol has been fired on a process that is going to take many months. There is a long way to go, but both the DSA and the Online Harms response set out legislative proposals that take a step in the right direction. It is now important that the wider public sector, civil society, academia and leading experts have a chance to engage with the process as the proposals move into draft legislation, to ensure both fulfil their purpose of keeping people safe online. This is an important period for both the UK and the EU as we begin to see new obligations on digital service providers set out in law, obligations that will help to build the digital world that young people deserve.