NEW YORK ― Instagram, the popular picture-based social media platform with more than 80 million users, contains "alarming" prosuicidal or self-harm images, despite its stated policy against such content, new research suggests.
Investigators report that suicidal and self-harm themes they found on Instagram were often graphic in nature and included acts in progress.
"Our findings are quite alarming. It appears that the Instagram policy of identifying and removing content with suicidal/self-harm themes is ineffective," lead author Arshya Vahabzadeh, MD, resident physician in child and adolescent psychiatry, Massachusetts General Hospital in Boston and McLean Hospital in Belmont, Massachusetts, said.
The study was presented here at the American Psychiatric Association's 2014 Annual Meeting.
Lack of Monitoring?
"A lot of our patients engage with social media, and they often describe incidents of bullying or difficult interactions they've had with their peers. It's a very topical issue, something we are seeing more and more," Dr. Vahabzadeh told Medscape Medical News.
"There is almost nothing on the photographic content of Instagram in the literature. No one has really looked at this in a comprehensive and in-depth way," he added.
Dr. Vahabzadeh and coauthor Holly Peek, MD, MPH, searched Instagram using the hashtag #suicide and analyzed the content of the first 60 images and quotes returned.
More than a quarter of the images (27%) had either a prosuicide (15%) or pro-self-harm (12%) theme, and none of them identified mental health resources or help, they report.
For example, 4 images showed bleeding cuts, including active cutting; 1 image was instructive on the method of cutting; and 1 described the cuts as "beautiful." Four images visually demonstrated actual methods of suicide, 2 with hanging and 2 with pill overdose.
"It's tricky identifying who is posting these suicidal images on Instagram because you can't really analyze their profiles, but we felt the majority of people seem to be female, and they post multiple times on the same theme, which makes me think these are people who need help," said Dr. Vahabzadeh.
"We are very worried by our findings," he said. "Only 5% of the images we saw demonstrated any sort of hopeful theme. Most of the pictures had a lot of depressive content, and you have to wonder who is looking at this; it could very well be vulnerable individuals or children, and what do the parents know about this? Instagram now has videos, which is even more worrisome."
Although Instagram has a policy of identifying and removing content with suicide/self-harm themes, it does not appear to be monitoring itself, Dr. Vahabzadeh concludes. "There are hashtags for things like 'I want to die,' and you can search for many different hashtags on the same theme."
Instagram does have a message that appears onscreen before a user enters the "#suicide" area and that directs the user to resources for suicide prevention. "But once you are in the area, you are in, and I think it could be overwhelming for vulnerable individuals," he said.
Dr. Vahabzadeh said he sent a request to Instagram for comment but did not receive a response.
An email request for comment from Medscape Medical News to Instagram generated this response:
"While Instagram is a place where people can share their lives with others through photographs, any account found encouraging or urging users to embrace anorexia, bulimia, or other eating disorders; or to cut, harm themselves, or commit suicide will result in a disabled account without warning. We believe that communication regarding these behaviors in order to create awareness, come together for support and to facilitate recovery is important, but that Instagram is not the place for active promotion or glorification of self-harm."
Instagram would not attribute the statement to a particular spokesperson.
"We must engage with social media networks and the wider public to ensure that these issues do not go unnoticed and attempts are made to remedy them," Dr. Vahabzadeh said.
Interpret With Caution
Reached for comment, Yi-Chun (Yvonnes) Chen, PhD, of the William Allen White School of Journalism and Mass Communication at the University of Kansas in Lawrence, urged caution in interpreting the results from this "exploratory study."
"I would not go as far as claiming that the policy of self-harm isn't strictly enforced, for a number of reasons," she said. "It's uncertain whether these 60 images and quotes were representative of the self-harm content on Instagram. Analyzing a large number of Instagram images is a challenge, and I understand why the authors chose to analyze the first 60," she said.
"It's also uncertain whether these 60 images reflected the real-time conversations about self-harm, given the constant updates and instantaneous nature of social media. Finally, based on the Instagram policy, Instagram also relies on users' help to enforce the level of content appropriateness. It'd be difficult for Instagram to examine every post carefully without the support of its users," Chen said.
"Instead of enforcing strict policy or guidelines, Instagram has the power and means to educate its users about this content. It is likely that users who posted these self-harm images or quotes may tend to attract certain audiences who may be less likely to report to Instagram about content violation. However, there are opportunities for an education campaign," Chen added.
The authors and Dr. Chen report no relevant financial relationships.
American Psychiatric Association's 2014 Annual Meeting. Abstract NR1-55. Presented May 3, 2014.