Instagram and Facebook will hide content about suicide, self-harm, and eating disorders from children, says the social media platforms’ owner Meta.
Under the new rules, users aged under 18 will not be able to see this type of content on their feeds, even if it is shared by someone they follow. Users must be at least 13 to sign up for Instagram or Facebook.
The platforms will instead share resources from mental health charities when someone posts about their struggles with self-harm or eating disorders.
Teens will be automatically placed into the most restrictive content control setting on Instagram and Facebook, which makes it more difficult for them to come across sensitive content.
“We already apply this setting for new teens when they join Instagram and Facebook, and are now expanding it to teens who are already using these apps,” the company said in a blog post.
Meta will roll out the measures on Facebook and Instagram over the coming months.
The measures are welcome but don’t go far enough, according to an adviser to a charity set up in memory of a British teenager who died from self-harm after consuming damaging content online.
Molly Russell, a 14-year-old girl from Harrow, northwest London, was found dead in her bedroom in November 2017 after watching 138 videos related to suicide and depression online.
In a landmark conclusion at an inquest in 2022, a coroner found she died not from suicide but from “an act of self-harm while suffering from depression and the negative effects of online content”.