Here's a fact that may shock you: according to the World Health Organisation, approximately one million people die by suicide each year. That figure equates to roughly one person consciously taking their own life every 40 seconds, which adds up to more than 3,000 deaths every single day, week after week and month after month. If these people had died in any way other than nooses or pills, slashed wrists or vertiginous heights, then a loss of life on this scale would be classified as nothing short of genocidal. It would be a worldwide epidemic.
But these are suicides, self-inflicted wounds, and thus regarded as unpreventable, to say nothing of the many people who feel suicidal yet cannot go through with the violence required, or who attempt suicide multiple times and never die. Our society simply shrugs, says "what a shame", and things stay much the same. Part of the reason suicide is so pervasive (apart from the fact that it is an act of free will on the part of the suicidal individual) is that it is often difficult to spot and diagnose. In fact, by the time someone is discovered to be suicidal, it is often already too late.
If you want an example of how suicide can remain undetected, look no further than the late actor and comedian Robin Williams. He was a beloved star of many movies, and despite his open attitude towards his struggles with his mental health, the public was stunned when Williams took his own life in August of 2014. Had others known beforehand that he was struggling with feelings of intense anguish and self-loathing, then professionals could have made an attempt to save his life. But we don't always pick up on subtle clues, or read people properly. What if we could exchange human fallibility for a programmable machine mind? What if the best detector of suicide risk wasn't a man or a woman, but an artificial intelligence?
An article published in the scientific journal Nature Human Behaviour, entitled "Machine Learning of Neural Representations of Suicide and Emotion Concepts Identifies Suicidal Youth", reports that researchers from Carnegie Mellon University employed an AI to analyse brain scans from selected patients to determine which of them were suicidal. The article itself states: "The assessment of suicide risk is among the most challenging problems facing mental health clinicians, as suicide is the second-leading cause of death among young adults. Furthermore, predictions by both clinicians and patients of future suicide risk have been shown to be relatively poor predictors of future suicide attempts."
The researchers closely observed 34 participants, 17 of whom were already suicidal. The participants had their brains scanned using functional magnetic resonance imaging (fMRI). While being scanned, they were shown selected words related to suicide, alongside words and phrases relating to a range of positive and negative emotions. By examining how the participants' brains responded to the suicide-related terms, the researchers were able to pinpoint five regions of the brain and six words that best distinguished the suicidal participants from the others. Based on this, they built an effective algorithm to identify suicidal patients from the relevant data.
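To make the approach concrete, here is a minimal toy sketch of that kind of classifier. The data below are entirely synthetic, and the feature layout (one activation value per brain region and word pair, 5 regions by 6 words) and the choice of a Gaussian naive Bayes model with leave-one-out cross-validation are illustrative assumptions, not the study's exact pipeline:

```python
import numpy as np

# Hypothetical sketch: each participant is a feature vector of mean activation
# levels, one value per (brain region, stimulus word) pair: 5 x 6 = 30 features.
# Labels: 1 = suicidal group, 0 = control. Data here are synthetic; the real
# study derived activations from fMRI scans.
rng = np.random.default_rng(0)
n_per_group, n_features = 17, 30
controls = rng.normal(0.0, 1.0, (n_per_group, n_features))
suicidal = rng.normal(0.8, 1.0, (n_per_group, n_features))  # shifted responses
X = np.vstack([controls, suicidal])
y = np.array([0] * n_per_group + [1] * n_per_group)

def gaussian_nb_predict(X_train, y_train, x):
    """Classify one held-out participant with Gaussian naive Bayes."""
    scores = []
    for label in (0, 1):
        grp = X_train[y_train == label]
        mu, var = grp.mean(axis=0), grp.var(axis=0) + 1e-6
        # log-likelihood of x under an independent per-feature Gaussian model
        ll = -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
        scores.append(ll)
    return int(np.argmax(scores))

# Leave-one-out cross-validation, a common choice for samples this small:
# each participant is classified by a model trained on the other 33.
correct = 0
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    correct += gaussian_nb_predict(X[mask], y[mask], X[i]) == y[i]
print(f"leave-one-out accuracy: {correct}/{len(y)}")
```

The key point the sketch illustrates is that with only 34 participants, the model must be tested by holding each person out in turn, which is why headline figures like "15 of 17" are reported per-participant rather than as a single accuracy score on a separate test set.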
The results showed that neurotypical participants displayed markedly different reactions to the words than those who were suicidal. For instance, when the suicidal participants were shown the word "death," their amygdala, the part of the limbic system responsible for processing guilt, fear and shame, showed far more activity than it did in the control group. Of the 34 participants, the AI correctly identified 15 of the 17 suicidal patients and 16 of the 17 members of the control group. The researchers then divided the suicidal participants into two groups: those who had previously attempted to take their own lives, and those who hadn't. Using the same algorithmic model, the AI correctly categorised 16 of those 17 patients.
The article in question also notes: "The identification of differential patterns of regional activation could suggest brain regions to target using brain stimulation techniques, such as trans-cranial magnetic stimulation or trans-cranial direct current stimulation. The identification of altered emotional responses to suicide-related concepts could prove very useful to a psychotherapist in trying to improve the patient's attraction to life and decrease the attraction to suicide and death."
This research will do little to assuage fears that AI and robots will take over more jobs in the future, but whatever one's views on automation, diagnostic tools such as this one are extremely welcome, and, if properly applied, will doubtless save a great many lives. If you or anyone you know is grappling with suicidal thoughts and feelings, please seek urgent professional medical care, or call the Samaritans on 116 123 for free 24-hour counselling.