This brain scan can determine if a criminal is guilty or innocent
In a recent episode of hit TV show Black Mirror, we were left shaken as we saw British architect Mia hooked up to technology that could see straight into her mind and capture her memories. Indeed, the idea that someone can read your thoughts has been a mainstay of science fiction.
If you're a fan of Black Mirror, you'll know the harrowing story ends even more devastatingly than it begins. However, as devotees will also know, the show has an uncanny habit of predicting the future. And while tech giants have a fair way to go before they invent a computer that can read your mind and record your memories, that's not to say that something similar hasn't already been created.
In fact, multiple companies have marketed fMRI lie detectors: neuroscientific brain scans claimed to reveal whether or not a person is lying. The technique works by detecting neurological responses - picking up on the activation of decision-making areas of the brain when a person tells a lie - and using these to determine whether an individual has intimate knowledge of a crime. Some proponents even claim it can determine whether a suspect is innocent or guilty.
There's a strong chance you may be thinking "but we already have something like this". The polygraph lie detector test, which measures and records several physiological indices such as blood pressure, pulse, respiration and skin conductivity, has been around for almost one hundred years now and is often used as an interrogation tool by law enforcement and federal government agencies such as the FBI and the CIA.
However, experts have discovered that, when it comes to lying, our brains are significantly more likely to give us away than our sweaty palms or spikes in heart rate. When researchers from the University of Pennsylvania's departments of Psychiatry and Biostatistics and Epidemiology compared the two technologies, they found that neuroscience experts with no prior experience in lie detection, using fMRI data, were 24 per cent more likely to detect deception than professional polygraph examiners reviewing polygraph recordings.
In the study, participants were asked to secretly write down a number between three and eight. Each person was then administered a Concealed Information Test twice: once while hooked to a polygraph and once while lying inside an MRI scanner. During both sessions, they were instructed to answer "no" to questions about all six numbers, making one of their answers a lie. The recordings were then evaluated separately by three polygraph experts and three neuroimaging experts and compared to determine which technology was better at detecting the lie, with the MRI scan coming out on top.
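As a rough illustration - not the study's actual analysis pipeline - the logic of a Concealed Information Test can be sketched as follows: each probed number elicits some response signal, and the examiner flags the question that evoked the strongest response as the likely lie.

```python
# Illustrative sketch of Concealed Information Test scoring. This is NOT the
# University of Pennsylvania study's real analysis; it simply assumes each of
# the six probed numbers (3-8) yields a scalar "response strength" (e.g. a
# summarised fMRI activation or polygraph reading), and that the examiner
# flags the strongest response as the concealed item.

def detect_concealed_number(responses):
    """responses: dict mapping each probed number to a response strength.

    Returns the number whose probe evoked the strongest response.
    """
    return max(responses, key=responses.get)

# Hypothetical readings: the participant secretly chose 6, so questions
# about 6 evoke an elevated response relative to the other five numbers.
readings = {3: 0.9, 4: 1.1, 5: 0.8, 6: 2.4, 7: 1.0, 8: 0.95}
print(detect_concealed_number(readings))  # -> 6
```

In practice the difficulty lies entirely in extracting a reliable response signal from noisy physiological or neuroimaging data, which is where the fMRI and polygraph approaches in the study differed.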
Unsurprisingly in this day and age, this is far from the only technology being used in relation to criminal activity. In 2017, neuroscientists used brain scans to distinguish people who committed crimes on purpose from those who broke the law through sheer recklessness. It was the first time a person's intention to perform a criminal act had been decoded in a brain scanner - an important breakthrough, as the punishment a criminal receives can be profoundly influenced by their intentions at the time the crime was committed.
Take the case of Oscar Pistorius, for example. Intention was key in his infamous 2013 case: did he really believe that girlfriend Reeva Steenkamp was an intruder, or did he aim to kill? Initially found guilty of culpable homicide - the rough equivalent of involuntary manslaughter - the former athlete was sentenced to just six years in prison. However, a South African appeal court later more than doubled his sentence, finding him guilty of murder and condemning him to 13 years and five months in jail. If a scan had detected brain activity indicating that he definitely had, or hadn't, intended to commit murder, things may have turned out very differently for the former Olympic athlete.
In spite of an initial successful experiment - in which they scanned the brains of 40 people taking part in a computerised task that offered rewards for carrying suitcases across a border - scientists are keen to see similar scans from hundreds, if not thousands, more people before drawing any strong conclusions about this intent-detection technology. However, the initial positive results suggest that brain scans could play a serious role in court cases in the future.
However, this is not to say MRI brain scans are perfect when it comes to recognising fabrications. Experts have offered several different estimates of just how accurate this type of lie detector test is, with many putting it at around 80 to 90 per cent. Back in 2010, defence attorneys for Lorne Semrau, a psychologist accused of defrauding Medicare and other health-benefit providers, sought to introduce fMRI brain scans to show that their client had no intention of cheating the government and other insurers. But after a pre-trial hearing featuring testimony from scientists on both sides of the issue, Magistrate Judge Tu Pham concluded that the scans don't measure up to the standards of scientific evidence required by federal courts.
Even when the two companies marketing fMRI lie detectors, No Lie MRI in California and Cephos in Massachusetts, reported accuracy rates of 75 to 98 per cent in 2010, many insisted this was simply not enough to admit the scans into the court decision-making process. "That's not good enough," said Joy Hirsch, director of the Program for Imaging and Cognitive Sciences at Columbia University. "Someone's life could be in the hands of this technology."
Although this technology isn't ready for use in the courts at the moment, there's a chance we could be using it in the future. But the question on experts' minds is: should we? Using it may violate an individual's legal right to privacy, and many have questioned whether neuroscientific techniques should be permitted in court at all.
"In the United States, current rules of federal evidence provide strict criteria, which constrain how brain science can be used," explains Professor Giordano. "Yet, threats to individual rights persist when considering the use of neurological evidence. These threats include vague definitions of what constitutes the 'private domain' of the mind, how this relates to the right to privacy, and a lack of guidelines for informed consent when using neuroscientific evidence."
The idea of leaving someone's life up to a piece of machinery is daunting. Already we type our banking details into a computer in the blink of an eye, and add our personal information to social media sites without a second thought. Are we ready to put someone's life into a computer's soulless hands? Is the opportunity to determine guilt or innocence once and for all too good to pass up, or should we heed Black Mirror's warning and leave technology out of it?