NHS to Offer Instagram Advice on Removing Harmful Images

Peter Russell

February 11, 2019

The Government has welcomed a pledge by Instagram to remove graphic images of self-harm and suicide.

The decision came after the father of Molly Russell, 14, who took her own life, said he believed the social media site was partly responsible for his daughter's death.

The family found material relating to suicide when they looked at her account on the image-sharing platform after her death in 2017.

In response to growing public concern about children's online safety, Instagram admitted that "we are not where we need to be on self-harm and suicide" and promised to "do more to keep the most vulnerable people who use Instagram safe".

Matt Hancock, the Secretary of State for Health and Social Care, who described the death of Molly Russell as "every parent's modern nightmare", welcomed the pledge and said Instagram would be offered expert NHS advice on which images should be taken down.

In an interview with The Sun on Sunday, Mr Hancock said the issue was "far too important to be left to the whims of social media companies".

Removing Graphic Images

Mr Hancock was reported to have given Instagram bosses 2 months to deal with the problem of self-harm pictures.

In response, the company said it would:

  • Not allow any graphic images of self-harm, such as cutting, on Instagram and continue to remove posts that promoted or encouraged suicide and self-harm

  • Not show non-graphic, self-harm related content – such as healed scars – in search, hashtags, or the Explore tab, although it would not remove this type of material entirely, as it did not want to stigmatise or isolate people in distress or those posting such content as a 'cry for help'

  • Support people posting and searching for self-harm related content by directing them to organisations that could help

  • Continue to consult with experts to find out what more it could do, such as blurring certain images with a 'sensitivity screen'

In a statement, Adam Mosseri, head of Instagram, said self-harm and suicide were "complex issues" and it had previously allowed content that shows contemplation or admission of self-harm "because experts have told us it can help people get the support they need". However, more recent advice had focused on "the potential to unintentionally promote self-harm".

Safer Internet Use

Sophie Beer, manager of the mind and body project at Addaction, which supports young people's mental health, said: "For many young people, when they are having negative thoughts and struggling to manage their emotions, their go-to distraction is often to pick up their mobile phone and go on social media.

"When they open their newsfeed the perfect bodies and lifestyles that greet them can reduce their self-esteem. And the prevalence of 'self-harm content' can influence young people.

"But many also say social media helps them feel less isolated and more connected with friends. It has become an integral part of their lives; any recommended regulations need to take this into consideration. 

"We need to remember that self-harm is often caused by a multitude of factors.

"Social media companies need to work with experts to do more to educate young people about safer internet use and minimise the potential of exposure to harmful content.  But we also need to increase efforts to engage young people face-to-face and teach them positive ways of coping with life’s pressures.

"Open conversations around mental health in school and at home are key to helping young people reach out for the right support."

Nick Hodgson, media manager at the Royal College of Psychiatrists, tweeted that it was good to see social media companies like Instagram "finally waking up to the harm their products can cause".

Last week, the UK's chief medical officers issued advice on screen time and social media use.

They said that although a recent review found insufficient evidence of a link between screen time or social media use and mental health and social problems in children and young people, it was right to take a precautionary approach.

The chief medical officers also said technology companies needed to take some of the responsibility for young people's safety online.
 
