Do you ever check what your kids are watching or doing online? It’s well known that social media can have damaging effects, and keeping tabs on what children are viewing is critical for their well-being.
On Friday, at the end of a two-week inquest held five years after a young girl’s suicide, a coroner ruled that social media content likely contributed to the death of Molly Russell, a 14-year-old British teenager. The girl took her own life in 2017 after reading and viewing thousands of posts related to suicide, self-harm and depression.
Delivering his conclusions, coroner Andrew Walker noted that Russell died from an act of self-harm while suffering from depression, and that the negative online content she consumed fueled her actions. Walker observed that exposure to the disturbing material might have worsened the teenager’s depressive illness and contributed to her decision to take her own life.
In the six months leading up to her death, Russell is believed to have viewed 2,100 Instagram posts about suicide, self-harm and depression. She had also saved 469 images on her Pinterest board related to the same subjects and subscribed to other websites carrying content unsuitable for teenagers. Walker noted that the platforms’ recommendation algorithms served up disturbing videos, images and text that Russell had not necessarily requested or searched for, leading to binge viewing. Worse, some of the content encouraged the teenager to harm herself and discouraged her from seeking help.
Walker’s conclusions were echoed by child psychiatrist Dr. Navin Venugopal, who told the inquest that the material Molly had accessed was deeply distressing and disturbing, and that reviewing it left him unable to sleep well for weeks.
At the end of the inquest, Ian Russell, Molly’s father, briefly addressed members of the press. He spoke of his sadness at his daughter’s tragic death and noted that there had been other similar cases. He added, however, that there was hope, and urged anyone who is struggling to seek help from support organizations or people they trust rather than relying on online content, which may be harmful.
At another press conference that evening, Mr. Russell condemned statements made by Elizabeth Lagone, a senior Meta executive, who had testified that most of the posts Molly viewed were safe for children her age. Had the content been as safe as Meta claims, he argued, his daughter might still be alive.
“If this demented trail of life-sucking content was safe, my daughter Molly would probably still be alive and instead of being a bereaved family of four, there would be five of us looking forward to a life full of purpose and promise that lay ahead for our adorable Molly,” Mr. Russell lamented.
Pinterest and Meta acknowledged failings on their part. Pinterest’s head of community operations admitted during the inquest that the platform was not safe when Molly used it, and apologized that she had been exposed to harmful graphic content.
According to rights campaigners, the ruling could push social media platforms to take responsibility for safeguarding the children who visit their sites. Sir Peter Wanless, CEO of the NSPCC, said it should remind tech companies that they will be held accountable when they neglect children’s safety for commercial gain.
The ruling drew support from Prince William, who said in a Twitter post that children’s online safety should not be an afterthought, and that no family should have to endure what Ian Russell and his family have been through.