Naked Security

Teen entered ‘dark rabbit hole of suicidal content’ online

Molly Russell's grieving father has backed a psychiatrists' report, saying that tech companies must be forced to hand over anonymized data.

You’re fat. You’re worthless. You don’t deserve to be alive.
Those are the kind of comments left on social media posts as innocent as a picture of a flower, as Sarah Lechmere – who has struggled with eating disorders – told the BBC. Social media posts also pointed her to pro-anorexia sites that gave her “tips” on how to self-harm, she said.
This is precisely why UK psychiatrists want to see social media companies forced to hand over their data – and to be taxed into paying – for research into the harms and benefits of social media use. The report, published by the Royal College of Psychiatrists, contains a foreword written by Ian Russell, the father of Molly Russell, a 14-year-old who committed suicide in 2017 after entering what her father called the “dark rabbit hole of suicidal content” online.
Ian Russell describes how social media’s “pushy algorithms” trapped Molly, sequestering her in a community that encourages suffering people not only to self-harm but to also avoid seeking help:

I have no doubt that social media helped kill my daughter. Having viewed some of the posts Molly had seen, it is clear they would have normalized, encouraged and escalated her depression; persuaded Molly not to ask for help and instead keep it all to herself; and convinced her it was irreversible and that she had no hope.
… Online, Molly found a world that grew in importance to her and its escalating dominance isolated her from the real world. The pushy algorithms of social media helped ensure Molly increasingly connected to her digital life while encouraging her to hide her problems from those of us around her, those who could help Molly find the professional care she needed.

Ian Russell backs the report’s findings – particularly its calls for government and social media companies to do more to protect users from harmful content, both by sharing data with researchers and by funding research through a “turnover tax”. That tax would also pay for training to help clinicians, teachers and others working with children identify those struggling with their mental health and understand how social media might be affecting them.

A new regulator and a 2% tax on big tech companies

Last year, the UK government announced plans to set up an online safety regulator to improve internet safety. The College is calling for that regulator to be empowered to compel social media companies to hand over their data.
As far as funding for research and self-harm prevention training goes, the UK has passed the Digital Services Tax. Scheduled to go into effect in April 2020, it will impose a 2% levy on the revenues of search engines, social media platforms and online marketplaces that “derive value from UK users.” The 2% is assessed on the UK-derived portion of those revenues, and only large groups whose global digital revenues exceed a set threshold fall within its scope.
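As a rough illustration of how a flat 2% levy scales, here is a minimal sketch. The figures and function name are invented for illustration; the real Digital Services Tax has thresholds and allowances that are not modeled here.

```python
# Hypothetical illustration of a flat 2% levy on in-scope revenues.
# Figures are invented; the actual Digital Services Tax applies revenue
# thresholds and allowances that this sketch does not model.

def digital_services_levy(in_scope_revenue_gbp: float, rate: float = 0.02) -> float:
    """Return the levy due on in-scope revenue at the given flat rate."""
    return in_scope_revenue_gbp * rate

# A company with £500 million of in-scope revenue would owe £10 million.
print(digital_services_levy(500_000_000))  # 10000000.0
```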
Dr. Bernadka Dubicka, chair of the child and adolescent faculty at the Royal College of Psychiatrists and co-author of the report, said that she’s seeing more and more children self-harming and attempting suicide as a result of their social media use and online discussions. Whatever social media companies are doing to protect their most vulnerable users, it’s not enough, she said:

Self-regulation is not working. It is time for government to step up and take decisive action to hold social media companies to account for escalating harmful content to vulnerable children and young people.

In November 2019, Facebook included Instagram in its transparency report for the first time. Facebook is getting better at finding self-harm content before it spreads: it said that since May, it’s removed about 845,000 pieces of suicide-related content, 79% of which it was able to proactively find before users reported it.

Privacy implications

The College said that the data to be collected from tech companies would be anonymous and would include the nature of content viewed, as well as the amount of time users are spending on social media platforms.
The civil rights group Big Brother Watch told the BBC that it agrees research into the impact of social media is important, but that users must be “empowered to choose what data they give away, who to and for what purposes”.
The campaign group’s director, Silkie Carlo, said young people should have “autonomy” on social media “without being made to feel like lab rats”. She noted that in the wake of the 2018 Cambridge Analytica scandal, data and privacy rights are facing “significant threats” online, and that user trust is low. That’s why user control should be treated as a priority, she said.

Being online is bad for kids

While the psychiatrists say there’s a need for more research, there’s already a growing body of research demonstrating that excessive use of digital devices and social media is harmful to children and teens. Back in January 2018, after Facebook had rolled out Facebook Messenger for Kids, children’s health advocates said that the app was likely to “undermine children’s healthy development” and urged Facebook to scrap it.
Some of the findings cited by the Campaign for a Commercial-Free Childhood (CCFC):

  • Eighth graders who are heavy users of social media have a 27% higher risk of depression, while those who exceed the average time spent playing sports, hanging out with friends in person, or doing homework have a significantly lower risk.
  • US teenagers who spend three hours a day or more on electronic devices are 35% more likely, and those who spend five hours or more are 71% more likely, to have a risk factor for suicide than those who spend less than one hour.
  • Teens who spend five or more hours a day (versus less than one hour) on electronic devices are 51% more likely to get less than seven hours of sleep (the recommended amount is nine hours). Sleep deprivation is linked to long-term issues like weight gain and high blood pressure.
  • A study by UCLA researchers showed that after five days at a device-free outdoor camp, children performed far better on tests for empathy than a control group.

Parents, you can check out the BBC’s article for a list of the College’s advice on how to negotiate your children’s online use.

Isn’t this the same kind of talk they had when printed books first came out? “Sharing ideas is dangerous” and whatnot. I think it’s way too vague to say “children that spend x hours on devices are at a higher risk for y”. It would be more responsible to look into WHAT kids are getting into. With all that being said, as a father of a girl, I’ll be keeping a much closer eye on her online activities than I would have before reading this.


Another example of a society that can’t take responsibility for its own actions. The fact of the matter is that social media didn’t force this child to use its services. Social media isn’t responsible for the child’s supervision. It’s easier to blame large companies with lots of money than a failure in parenting.

