This Teenager’s Death Must Mark A Watershed Moment For Social Media!



Online Safety Bill

When the news of 14-year-old Molly Russell’s death came to light in 2019, it seemed to mark a watershed for social media.

The case was so shocking, so haunting, that it seemed impossible to imagine social media sites staying the same.

Molly’s father, Ian Russell, became a champion for online regulation, a cause the government eagerly picked up, promising to fix the problem once and for all with its Online Safety Bill.

Even the social media companies were jolted into action.

As public outrage grew, Instagram – where, in the months before her death, Molly “liked” videos of people falling off buildings or jumping in front of trains, presented in a way that glamourised suicide – changed its guidance to ban “graphic” suicide and self-harm content.

Pinterest, which emailed Molly a list of “ten depression pins you might like”, started suggesting mental health exercises to anyone searching for self-harm material.

The changes were small, especially compared to Ian Russell’s demands, but they appeared to signal a shift in mood. At last, decisive action was coming.

Three years on – and five years from Molly’s death in November 2017 – we are still waiting for it to arrive.

The Online Safety Bill finally made it to the House of Commons in April this year, only to be met with fierce opposition from digital rights groups, industry figures and Tory MPs concerned about its impact on free speech.

One of those MPs is now prime minister, and although Liz Truss has confirmed the Bill will proceed, she has said “there may be some tweaks required”.

Importance of mental health

The social media companies have also cooled on the idea of a campaign to safeguard mental health.

At the coroner’s inquest into Molly’s death, which concluded today, a senior executive at Instagram’s parent company Meta argued that many of the posts Molly viewed before her suicide were “safe”, saying they might have been a “cry for help”.

Asked by the Russell family’s lawyer whether it was appropriate to expose a depressed 14-year-old to the cries of adults with severe mental health problems, Elizabeth Lagone replied that Meta worked with experts, but was unable to name any who supported her position.

The message was clear: Meta, which owns WhatsApp and Facebook as well as Instagram, would not be changing its approach, nor altering the basic nature of the algorithms which actively promoted and recommended messages of despair to Molly Russell. With some rare exceptions, most tech companies take the same line.

Still in danger

No wonder Ian Russell told the inquest that, as far as he was concerned, young people were still in danger on social media. His view was supported by the Children’s Commissioner for England, who released a survey showing that 45% of children aged eight to 17 have seen content online they felt was inappropriate, or that made them worried or upset.

Delivering his verdict today, the coroner concluded that the material viewed by Molly Russell “was not safe” and “shouldn’t have been available for a child to see”.

Andrew Walker went further, directly implicating the social media firms in Molly’s death. Rather than delivering a verdict of suicide, he concluded that she died “from an act of self-harm, suffering from depression and the negative effects of online content”.

On the larger question of reform, however, Mr Walker remained silent. As a matter of law, he cannot make recommendations, only express concerns. He will now compile a “prevention of future deaths” report, but its conclusions will inevitably be limited.

So where does that leave the search for safety on social media? Given the flaws of the Online Safety Bill and the recalcitrance of social media firms, it may seem to have reached another dead end.

The bleakest corners of the internet

But the effort has not been wasted, because the inquest into Molly Russell’s death has shone a light into one of the bleakest corners of the internet, exposing in forensic detail how it came to envelop a sad, lonely teenage girl.

This is far rarer than you might imagine. Although social media appears open and available to view, that is an illusion. We never know precisely what someone else’s feed looks like, nor how they are poked and prodded as they move around the web.

We don’t have access to the algorithms which serve up material to us whether we want it or not. We don’t even have accurate data on people’s behaviour online, which is jealously hoarded by the tech companies. They conduct their own studies, thousands of them, but we are never allowed to see the full results.

Even with the legal powers granted by the inquest and substantial aid from the Metropolitan Police, the Russell family’s lawyers still had to fight tooth and nail to prise every piece of Molly’s data from the tech companies.

“If Instagram and Pinterest had been based in the US, not Ireland, their attention and interest would have been similar to Snapchat – that is nothing,” said Oliver Sanders KC, the Russell family’s lawyer.

“We have compiled a detailed picture of her online world. It is a fragmentary picture but we are lucky to have it. It is very unlikely it will be achieved again.”

Without firm evidence, it has been hard to prove the existence of harm, and harder still to build a plan for lasting reform.

Now we have evidence of the firmest possible kind, ruled on by a coroner. Just one case, yes, but one that was expressive of so many others.

Years may have passed, but for many at the inquest it felt as if it was 2019 all over again.

Surely, Molly’s death will mark a watershed for social media. This time, surely, it must.


Source: Sky News
