Instagram and Facebook will hide content about suicide, self-harm and eating disorders from children, says the social media platforms’ owner Meta.
Under the new rules, users aged under 18 will not be able to see this type of content on their feeds, even if it is shared by someone they follow. Users must be at least 13 to sign up for Instagram or Facebook.
The platforms will instead share resources from mental health charities when someone posts about their struggles with self-harm or eating disorders.
Teens will be automatically placed into the most restrictive content control setting on Instagram and Facebook, which makes it more difficult for them to come across sensitive content.
“We already apply this setting for new teens when they join Instagram and Facebook, and are now expanding it to teens who are already using these apps,” the company said in a blog post.
Meta will roll out the measures on Facebook and Instagram over the coming months.
The measures are welcome but don’t go far enough, according to an adviser to a charity set up in memory of a British teenager who died from self-harm after consuming damaging content online.
Molly Russell, a 14-year-old girl from Harrow, northwest London, was found dead in her bedroom in November 2017 after watching 138 videos related to suicide and depression online.
In a landmark ruling at an inquest in 2022, a coroner ruled she died not from suicide, but from “an act of self-harm while suffering from depression and the negative effects of online content”.
Andy Burrows, an adviser to the Molly Rose Foundation, said teenagers continue to be “bombarded with content on Instagram that promotes suicide and self-harm and extensively references suicide ideation and depression”.
“While Meta’s policy changes are welcome, the vast majority of harmful content currently available on Instagram isn’t covered by this announcement, and the platform will continue to recommend substantial amounts of dangerous material to children,” he said.
Mr Burrows added: “Unfortunately this looks like another piecemeal step when a giant leap is urgently required.”
Research by the charity found that last November, almost half of the most-engaged Instagram posts under well-known suicide and self-harm hashtags contained material that glorified suicide and self-harm, referenced suicide ideation, or otherwise carried themes of misery, hopelessness or depression.
Much of the content identified was posted by meme-style accounts, which Mr Burrows is concerned would not be covered by the new measures.
Suicide is the third leading cause of death among 15 to 19-year-olds in the UK, according to figures published last year by the Office for National Statistics.
Meta and other social media companies are facing pressure from governments across the world to improve the safety of children using their platforms.
In the UK, the Online Safety Act, which became law in October 2023, requires online platforms to comply with child safety rules or face hefty fines.
Ofcom, the communications regulator, is currently drawing up its guidelines on how the laws will be enforced.
In the US, Mark Zuckerberg, Meta’s chief executive, will testify before the Senate on child safety alongside the bosses of other tech giants at the end of January.
The EU’s Digital Services Act, which came into force last year, forces online giants to better police content published on their platforms within the European Union.