Recently, HuffPost UK searched for self-harm images on Instagram to see just how accessible the content was to young users. It took a matter of seconds to find a self-harm hashtag with tens of thousands of posts, including some deeply disturbing images.
Pressure on social media firms to clamp down on this type of content has mounted since the death of 14-year-old Molly Russell.
She was found dead in her bedroom in November 2017 after showing “no obvious signs” of severe mental health issues. Her family later discovered she had been viewing material on social media linked to anxiety, depression, self-harm and suicide.
Molly’s father, Ian Russell, claimed the algorithms used by Instagram enabled Molly to view more harmful content, possibly contributing to her death.
“Social media companies, through their algorithms, expose young people to more and more harmful content, just from one click on one post,” he said.
“In the same way that someone who has shown an interest in a particular sport may be shown more and more posts about that sport, the same can be true of topics such as self-harm or suicide.”
Mosseri met with Matt Hancock today, after the Health Secretary warned last week that social media sites could be banned if they failed to remove harmful content.
Jackie Doyle-Price, the minister for suicide prevention, also said sites should be held to account and legally treated as publishers, so they could be punished for allowing graphic self-harm and abusive images to be shared.
Useful websites and helplines: