A group of paralysed musicians just delivered a breathtaking performance using their brainwaves
A performance by a group of paralysed musicians is proof that technology can help people in truly astonishing ways. The project was carried out jointly by Plymouth University's Interdisciplinary Centre for Computer Music Research (ICCMR) and London’s Royal Hospital for Neuro-disability.
It aims to help people who would otherwise be unable to do so collaborate to create music. This includes those with locked-in syndrome, where a patient is fully mentally present but cannot move their body, and ALS (also known as Lou Gehrig's disease or amyotrophic lateral sclerosis), for which the viral "Ice Bucket Challenge" raised both awareness and more than $100m.
These particular patients are in a unique position: they used to play, and still have a deep understanding of, music. Without a creative outlet, however, they could only listen, never perform. But with the help of cutting-edge neurological technology, they are able to use their brainwaves to do what their bodies can’t.
“We have been working with the RHN for around four years,” explains Professor Eduardo Miranda, Director of the ICCMR, “and our collaboration is having a hugely positive impact on everyone involved and changing perceptions at the same time.
“Our work is giving people an opportunity to put their physical impediments aside,” he continues, “and use music to communicate in ways that would not normally be possible because of their medical conditions. It is an amazing example of research being taken out of the laboratory and into the real world, with both inspiring and very emotional results.”
“This is a truly magical experience,” stated Steve Thomas, one of the performers involved in the project. “It is a chance to play with other severely disabled musicians, and it actually sounds impressive.”
The project was co-managed by PhD student Joel Eaton, with support from Dr Sophie Duport of the Royal Hospital for Neuro-disability. “Music as a mechanism for neuro-feedback presents an interesting medium for artistic exploration, especially with regard to passive BCI control,” writes Eaton in the abstract of the accompanying research paper, entitled The Space Between Us: Evaluating a multi-user affective brain-computer music interface. “Passive control in a brain-computer music interface (BCMI) provides a means for approximating mental states that can be mapped to select musical phrases, creating a system for real-time musical neuro-feedback. This article presents a BCMI for measuring the affective states of two users, a performer and an audience member, during a live musical performance of the piece titled The Space Between Us.”
Alongside a number of able-bodied performers, the project uses a rubber cap that picks up the brain’s electrical signals and, by extension, the paralysed performer’s brain activity. Each participant can choose between a number of musical phrases, and this selection is then sent to a musician who plays as instructed.
“For the Paramusical Ensemble, each of the four patients connected to the BCMI generates the musical parts to be performed by a different member of the string quartet in real-time,” the university explains on their website. "The participants are given four options of musical phrases displayed on a panel, which they can select by staring at lights flashing next to them."
“The BCMI detects which phrase has been selected by each participant - by reading the electrical activity of their visual cortex - and sends the phrases to the string quartet to perform,” they continue. “The resulting piece,” they add, “lasts for up to 20 minutes” and - aptly - is called The Space Between Us. A documentary about the performance, meanwhile, was created by Tim Grabham with help from Professor Eduardo Miranda.
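The selection mechanism described above resembles a steady-state visually evoked potential (SSVEP) paradigm: each light flickers at a distinct frequency, and staring at one drives the visual cortex at that same frequency, which shows up in the EEG. As a rough illustration only (the frequencies, sampling rate, and band width below are assumptions, not details from the project), a detector of this kind might look like:

```python
import numpy as np

def detect_selected_phrase(eeg, fs, stim_freqs):
    """Guess which flashing light the participant is staring at.

    SSVEP principle: staring at a light flickering at f Hz evokes
    EEG activity at f Hz over the visual cortex. We compare spectral
    power at each candidate flicker frequency and pick the maximum.

    eeg        : 1-D array of EEG samples from an occipital channel
    fs         : sampling rate in Hz (illustrative value)
    stim_freqs : flicker frequency in Hz for each phrase option
    """
    n = len(eeg)
    # windowed magnitude spectrum of the EEG segment
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(n)))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    powers = []
    for f in stim_freqs:
        # sum power in a narrow band around each stimulus frequency
        band = (freqs > f - 0.5) & (freqs < f + 0.5)
        powers.append(spectrum[band].sum())
    return int(np.argmax(powers))  # index of the chosen phrase
```

In practice an SSVEP system would average over channels and use more robust spectral estimation, but the core idea, matching dominant EEG frequency to a known flicker rate, is the same.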
“The system adapts to the affective states of the users and selects sequences of a pre-composed musical score,” adds Eaton in his abstract. “By affect-matching music to mood and subsequently plotting affective musical trajectories across a two-dimensional model of affect, the system attempts to measure the affective interactions of the users, derived from arousal and valence recorded in EEG.”
“Furthermore, an affective channel of communication shows potential for enhancing collaboration in music-making in a wider context, for example in the roles of therapist and patient.”
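The two-dimensional model of affect Eaton refers to typically places valence (unpleasant to pleasant) on one axis and arousal (calm to excited) on the other. As a purely illustrative sketch of how estimated affect might be mapped to a region of such a model, and from there to a pre-composed section of score, consider the following; the quadrant labels and thresholds are assumptions, not the paper's actual mapping:

```python
def affect_quadrant(arousal, valence):
    """Map an (arousal, valence) estimate, each in [-1, 1],
    to a quadrant of a two-dimensional affect model.

    The labels are illustrative; a real BCMI would associate
    each region with pre-composed musical material.
    """
    if valence >= 0:
        return "excited" if arousal >= 0 else "calm"
    return "tense" if arousal >= 0 else "sad"
```

A score pre-composed for such a system would then supply one or more musical sections per quadrant, and the performance follows the trajectory of the users' estimated affect through the space.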
It’s hoped that the success of this project will raise awareness of these techniques for enabling paralysed patients to communicate through music. While some forms of technology are criticised as frivolous, this shows how new innovations can make a real difference in the lives of those who stand to gain the most.