Machine Learning, MRI Accurately Identify Suicidal Intent

Pauline Anderson

November 07, 2017

Machine learning combined with fMRI accurately identifies young adults with suicidal thoughts, new research shows.

Using fMRI and computer-generated algorithms to measure the brain's response to death, suicide, and other concepts, researchers reliably distinguished youth with suicidal thoughts from control persons and accurately identified individuals who had made a suicide attempt.

This is important because most people do not report suicidal feelings. Research shows that almost 80% of patients who die by suicide denied they had suicidal thoughts during their last contact with a mental healthcare professional.

"The biological processes that are involved in suicidal thinking and behavior reflect changes in the way people feel and think about related concepts," study author David Brent, MD, a child and adolescent psychiatrist and professor of psychiatry, University of Pittsburgh Medical Center and School of Medicine, told Medscape Medical News.


"This approach gives us hope that we will be able to more precisely target those issues and be able to help more people."

The study was published online October 30 in Nature Human Behaviour.

Machine Learning Classifier

The study included 17 patients with suicidal thoughts, many of whom had been recently discharged from an inpatient facility, and 17 healthy volunteers who had no personal or family history of psychiatric disorder or suicide attempts.

The groups were matched with respect to intelligence, sex ratio (24% male), and age (mean age, about 22 years).

The researchers assessed history of suicide attempt with the Suicide History Form and Suicide Intent Scale. They assessed severity of suicidal ideation using the interviewer-rated Columbia-Suicide Severity Rating Scale (C-SSRS) and the self-reported Adult Suicide Ideation Questionnaire (ASIQ).

While the participants were exposed to various stimuli, the researchers used fMRI to measure activity in various brain regions.

The stimuli included three word groups, each of which had 10 words. The word groups were related to suicide, negative affect, and positive affect.

The 30 stimulus items were presented six times each in random order. Each item was displayed for 3 seconds, followed by a 4-second interval; longer intervals were included periodically.

Participants were asked to actively think about the concepts to which the stimulus words referred.

A psychiatrist was present during testing of the patients with suicidal ideation "to ensure they were safe when they were invited to think about these things," said Dr Brent.

The researchers distinguished the groups through the use of computer-generated algorithms.

"When you're looking at patterns of brain activation, you're talking about a huge amount of data," said Dr Brent. "So in order to separate the responses and classify them, you can't use conventional methods very effectively."
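The article does not detail the study's actual classification pipeline, but the general idea — learning a decision rule from high-dimensional activation patterns, evaluated with leave-one-out cross-validation as is typical for samples this small — can be sketched with a minimal nearest-centroid classifier on simulated data. All names, dimensions, and numbers below are illustrative, not from the study:

```python
import random

# Toy sketch (not the study's pipeline): nearest-centroid classification of
# simulated "activation patterns" with leave-one-out cross-validation.
random.seed(0)
DIM = 200  # stand-in for the number of voxel features per participant

def simulate(group_shift, n):
    """Simulated per-participant patterns: Gaussian noise plus a small group offset."""
    return [[random.gauss(group_shift, 1.0) for _ in range(DIM)] for _ in range(n)]

patterns = simulate(0.25, 17) + simulate(-0.25, 17)  # "ideators" then "controls"
labels = [1] * 17 + [0] * 17

def centroid(rows):
    # Mean pattern across participants, one value per feature
    return [sum(col) / len(rows) for col in zip(*rows)]

def dist2(a, b):
    # Squared Euclidean distance between two patterns
    return sum((x - y) ** 2 for x, y in zip(a, b))

correct = 0
for i in range(len(patterns)):  # leave-one-out: hold out participant i
    train = [(p, y) for j, (p, y) in enumerate(zip(patterns, labels)) if j != i]
    c1 = centroid([p for p, y in train if y == 1])
    c0 = centroid([p for p, y in train if y == 0])
    pred = 1 if dist2(patterns[i], c1) < dist2(patterns[i], c0) else 0
    correct += pred == labels[i]

print(f"leave-one-out accuracy: {correct}/{len(patterns)}")
```

Even this crude rule separates the simulated groups, because a weak signal spread across many features adds up; the study's classifier applies the same principle to far richer fMRI data.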

The machine-learning classifier identified suicidal individuals with a very high accuracy rate (0.90; P < .000001), correctly identifying 15 of the 17 suicidal participants and 16 of the 17 control persons (sensitivity = 0.88, specificity = 0.94, positive predictive value [PPV] = 0.94, negative predictive value [NPV] = 0.89).
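As a quick sanity check, the reported sensitivity, specificity, PPV, and NPV follow directly from the stated counts (15 of 17 suicidal participants and 16 of 17 controls correctly classified) under the standard confusion-matrix definitions; note the raw proportion correct works out to 31/34 ≈ 0.91, so the paper's 0.90 accuracy figure may reflect a slightly different accuracy measure:

```python
# Recompute the reported metrics from the confusion counts in the article.
tp, fn = 15, 2   # suicidal group: correctly vs incorrectly classified
tn, fp = 16, 1   # control group: correctly vs incorrectly classified

sensitivity = tp / (tp + fn)                # 15/17 ≈ 0.88
specificity = tn / (tn + fp)                # 16/17 ≈ 0.94
ppv = tp / (tp + fp)                        # 15/16 ≈ 0.94
npv = tn / (tn + fn)                        # 16/18 ≈ 0.89
accuracy = (tp + tn) / (tp + tn + fp + fn)  # 31/34 ≈ 0.91

print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} "
      f"PPV={ppv:.2f} NPV={npv:.2f} accuracy={accuracy:.2f}")
```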

Key Brain Regions

The investigators found that the high degree of accuracy in classification remained after statistically controlling for group differences, such as variations in anxiety and history of childhood trauma.

The concepts that most strongly distinguished the groups were death, cruelty, trouble, carefree, good, and praise.

The most discriminating brain regions included the left superior medial frontal area, the medial frontal/anterior cingulate, the right middle temporal area, the left inferior parietal area, and the left inferior frontal area. All these regions have repeatedly been strongly associated with self-referential thought.

"It seemed to indicate that if you gave a word like 'suicide' or 'death,' the ideators would think about 'my death,' whereas the controls would just think of the concepts of death without necessarily thinking of themselves," said Dr Brent.

For individuals with suicidal thoughts, there also appeared to be some disconnection in prefrontal activation, said Dr Brent.

"When healthy people experience a negative thought or a negative emotion, they are able to effectively redirect it by activating parts of the prefrontal cortex to modify things. We didn't see that to the same extent in the people who were suicidal."

An intervention such as cognitive-behavioral therapy may help change the way a patient thinks about these concepts, said Dr Brent.

He noted that researchers have developed a game that conditions people to associate "self" with positive thoughts and words related to "suicide" with negative thoughts. Use of this game has resulted in decreased self-harm and suicidal behavior, he said.

"So you may be able to try to help people uncouple things that are driving them towards unhealthy conclusions and behavior."

The machine-learning approach could also identify when patients are improving and are becoming less fixated on suicide. "In that way, it could also have a therapeutic benefit," said Dr Brent.

Neural Signatures

The authors noted that the six concepts that were altered in those with suicidal ideation include items from all three stimulus categories ― one related to suicide, two negative concepts, and three positive concepts.

"The valuation of what is important and good in life and what is not seems to be altered in ideators," the authors write. "Our results provide a neurally based, quantitative measure of this alteration."

In previous research, the study's lead author, Marcel Adam Just, PhD, D.O. Hebb Professor, Department of Psychology, Carnegie Mellon University, Pittsburgh, identified the neural "signatures" of different emotions.

Using those algorithms, he and his colleagues identified the "signatures" for "sadness," "shame," "anger," and "pride" within the neural representations of the six concepts.

In the group with suicidal ideation, the concept of "death" evoked more shame than in the control group, whereas the concept of "trouble" evoked more sadness. In addition, "trouble" evoked less anger and the positive concept "carefree" evoked less pride in the group with suicidal ideation.

This type of neurally acquired information may provide specific targets for intervention. For example, said Dr Brent, if a suicidal patient is feeling a lot of shame, a psychotherapist could talk about this with the patient.

"It may give us a sense of things that are driving people toward suicide that they may or may not be able to express explicitly."

Within the suicidal ideation group, the machine-learning classifier was able to distinguish the nine patients who had made a suicide attempt from participants who had not attempted suicide.

The concepts that best discriminated between those who attempted suicide and those who did not were "death," "lifeless," and "carefree." These terms include two suicide-related concepts and one positive concept. The most discriminating brain regions here were the left superior medial frontal area, the medial frontal/anterior cingulate, and the right middle temporal area.

"Fascinating" Finding

There were also differences in emotional signatures between those who had attempted suicide and those who had not. For example, the concept of death evoked less sadness in those who attempted suicide.

"We speculate that for those who are conflicted about engaging in a suicidal act, the thought of facilitating death is shameful, whereas those ideators who have made an attempt show greater attraction to and acceptance of death, and hence less sadness in thinking about it," the authors note.

The ability of the machine-learning approach to identify people who had attempted suicide, and the fact that these individuals had a different emotional experience than ideators who had not made an attempt, were "the most fascinating" aspects of the study, said Dr Brent.

Again, this may point to some therapeutic possibilities, he said.

The researchers did not investigate sex differences in patterns of thinking when exposed to the various concepts.

Dr Brent acknowledged that some suicidal patients may not want others to know what they are planning and so might simply block out thoughts that would give away their intentions.

At present, it is not practical to use fMRI clinically to identify patients with suicidal ideation. But Dr Brent and his colleagues hope to eventually develop a simple computer task or screening test to identify people at risk.

Dr Just is investigating the use of EEG, which would be less expensive and more widely accessible, instead of fMRI.

Other researchers are using machine learning and data from electronic health records to identify patients who are at risk of attempting suicide within the following 7-day period, said Dr Brent.

He noted that applying machine learning to neural representations of suicide is "not a silver bullet." He described the new research as "proof of concept."

"It's a way of opening a window into how people think, and possible mechanisms, but in terms of public health ways of identification, I think that screening either through electronic health records or other kinds of self-report, or through computer tasks, will be a much more efficient way to go."

Pioneering Research

Commenting on the study for Medscape Medical News, E. David Klonsky, PhD, professor, Department of Psychology, University of British Columbia, Vancouver, Canada, whose research interests include suicide, described it as "fascinating" and "pioneering."

"It combines two promising technologies ― machine learning and brain imaging ― to advance knowledge about suicide risk and prediction."

However, said Dr Klonsky, as with many studies of new approaches, there are considerable obstacles to applying these methods to improve risk detection and prevention.

For example, he said, in real-world clinical settings, most patients have depression or another psychiatric disorder, and health professionals try to identify those at heightened risk for suicide "above and beyond their psychiatric diagnosis."

But this study compared individuals with suicidal ideation to healthy control persons who had no history of suicidal ideation and no personal or family history of a psychiatric disorder.

"As a result, differences observed in the study between suicide ideators and controls could be due to differences between one group with psychiatric disorders ― the ideators ― and one group without such disorders ― the controls."

Like Dr Brent, Dr Klonsky found that the ability of the machine-learning approach to distinguish suicidal ideators who had attempted suicide from those who had not was "the most exciting" finding.

"Most commonly cited risk factors for suicide, such as depression and hopelessness, are strong correlates of suicidal ideation but are poor predictors of attempts among ideators. If the study's technique can illuminate differences between attempters and ideators, that could advance both suicide theory and prevention."

The research was partially supported by the National Institute of Mental Health and an Endowed Chair in Suicide Studies at the University of Pittsburgh School of Medicine. The study authors and Dr Klonsky have disclosed no relevant financial relationships.

Nat Hum Behav. Published online October 30, 2017. Abstract
