‘Negative effects of online content’ led to UK teen’s death, coroner says


A 14-year-old girl in the UK died from an act of self-harm while suffering from the “negative effects of online content”, a coroner said. The case has once again put social media in the spotlight, raising questions about its impact on young people.

Molly Russell was “exposed to material that may have influenced her in a negative way and, in addition, what had started as a depression had become a more serious depressive illness,” Andrew Walker ruled at North London Coroner’s Court.


Walker said that it would not be “safe” to conclude that it was suicide, and the teenager in fact died from an “act of self-harm while suffering depression”. Some of the content she viewed was “particularly graphic” and “normalised her condition”, said Walker.

Russell saved, shared or liked 16,300 posts on Instagram in the six-month period before her death, with 2,100 related to depression, self-harm or suicide, the inquest was told.

Russell, from Harrow in northwest London, died in November 2017. Her family started a campaign highlighting the dangers of social media.

“Molly was a thoughtful, sweet-natured, caring, inquisitive, selfless, beautiful individual — although a few words cannot possibly encapsulate our wonderful girl,” her father Ian said in a statement.

“We have heard a senior Meta (Instagram parent company) executive describe this deadly stream of content the platform’s algorithms pushed to Molly as ‘safe’ and not contravening the platform’s policies.

“If this demented trail of life-sucking content was safe, my daughter Molly would probably still be alive and instead of being a bereaved family of four, there would be five of us looking forward to a life full of purpose and promise that lay ahead for our adorable Molly.

“It’s time the toxic corporate culture at the heart of the world’s biggest social media platform changed,” he urged.

During a week-long hearing, the family’s lawyer, Oliver Sanders, asked Elizabeth Lagone, the head of health and wellbeing at Meta, why the platform allowed children to use it when it was “allowing people to put potentially harmful content on it”.


“You are not a parent, you are just a business in America. You have no right to do that. The children who are opening these accounts don’t have the capacity to consent to this,” he said.

Lagone apologised after being shown footage that Russell had viewed. 

The children’s charity NSPCC said the ruling “must be a turning point”, stressing that any delay to a government bill dealing with online safety “would be inconceivable to parents”.

(With inputs from agencies)
