
Bereaved families push for critical amendments to Online Safety Bill

Last week, five bereaved families came to Parliament to support Baroness Kidron’s amendments to the Online Safety Bill, which would ensure that parents and coroners can access data held by technology companies in cases where online services may have played a part in a child’s death.


These families have battled with tech companies for information relating to the deaths of their children. Their attempts to retrieve data have been obstructed and their pleas for information ignored. The details of the algorithms that targeted vulnerable children with harmful material are still being withheld by companies refusing to accept culpability. In response to their shared experiences, these parents have formed the Bereaved Parents for Online Safety group to demand changes to the Online Safety Bill.

On 5 December, the day the Online Safety Bill returned to the House of Commons, the families of Molly Russell, Frankie Thomas, Olly Stephens, Sophie Parkinson and Breck Bednar sat in the gallery as MPs debated the Bill. Their presence was noted by speakers from all sides of the House, who later met the families in the central lobby to lend support to their campaign and share condolences.

Baroness Kidron’s amendments would give Ofcom the role of mediator between families and companies in cases where a regulated company may hold relevant information when a child has died. The amendments would also require coroners to consider whether technology companies hold information relevant to an inquest, and would require those companies to preserve, share and, where necessary, redact information to protect the privacy of other users. They would also impose tough sanctions on senior managers who fail to hand over evidence, building on the responsibilities the Bill already gives to managers in relation to information requests. All of the proposed amendments would preserve existing data protection and privacy rights.

“Losing a child is every parent’s worst nightmare. But to then have to fight for years against intransigent tech platforms just to find out the type of content which was being accessed can only add to the trauma.” – Baroness Nicky Morgan, former Culture Secretary

Coroners for the cases of Frankie Thomas and Molly Russell were unable to acquire data from companies that they believed to be relevant to their inquests. In Frankie’s case, the coroner closed the inquest noting that the company in question, Wattpad, had “more than minimally contributed” to Frankie’s death by suicide, but had failed to provide relevant information.

Similarly in Molly’s case, both the coroner and Molly’s family spent five years trying to find out from Instagram and Pinterest what Molly was experiencing online before she ended her life. When Meta finally handed over information, the court saw that Molly had shared, saved or liked over 2,000 posts related to depression, self-harm or suicide, many of which were recommended to her. This was crucial to understanding how Instagram’s profiling and recommendation systems had created the steady drip-feed of harmful content that contributed to Molly’s state of mind over several months. The coroner for Molly’s inquest said that the content she had seen “normalised, glamorised and even glorified” self-harm, and that some of it was “nearly impossible” to watch. His verdict concluded that Molly died from an act of self-harm while suffering from depression and “the negative effects of online content” that had “more than minimally contributed” to her death.

“We can no longer leave bereaved families and coroners at the mercy of social media companies. There is a dire need for managing this process to make it more straightforward, more compassionate and more efficient. The experience of living through Molly’s prolonged inquest is something that no family should have to endure.”

– Ian Russell, father of Molly Russell

The current system is not set up to deal with these tragic cases. The majority of children do not have a will, and few think about how they would want their digital assets, including online accounts, to be handled if they died. Similarly, coroners have no existing duty to consider the role that digital products or services may have played in a child’s death, nor the powers to demand information from companies when they do consider it.

“These families suffer agony trying to uncover what their children were looking at in the days and weeks leading up to their deaths, and how much of the material was recommended to them through the algorithms used by tech companies to maximise profits. We need to end the tech sector’s tactics of obfuscation and create a transparent, independent process for all to avoid further tragedies of this kind.”

– Baroness Beeban Kidron