A device attached to the brain enables the patient to communicate by decoding his brain signals and converting them into letters on a screen.
The new method was introduced by scientists from Germany and Switzerland, including a Greek researcher of the diaspora. It made it possible to communicate with a patient who is unable to speak or move, completely locked in his own body, as the Athens-Macedonian News Agency (APE-MPE) reports.
The study, published in the journal Nature Communications, was led by Niels Birbaumer of the University of Tübingen and Jonas Zimmermann of the Wyss Center for Bio and Neuroengineering in Geneva. It shows that, thanks to the new brain-computer interface (BCI), verbal communication is possible even with patients with locked-in syndrome.
Such devices are able to restore communication with people who have lost the ability to move or speak. So far, research in this area has focused on patients with amyotrophic lateral sclerosis (ALS), also known as motor neurone disease, which is a neurodegenerative disease that leads to a gradual loss of spontaneous muscle control.
To date, various technologies have been developed that allow these patients to communicate with their eyes or facial muscles. But when muscle control is completely lost, patients also lose the ability to communicate. They remain "locked in" their own bodies.
This time, after more than two years of work with a 34-year-old man, the researchers used an auditory neurofeedback system (a type of BCI) surgically implanted in the motor cortex of his brain. With it, the completely locked-in patient was able to form words and phrases.
How the patient “spoke”
The patient was given auditory feedback on his neuronal activity and instructed to match the frequency of a tone with the activity of his brain neurons in a way that could be interpreted as "yes" or "no". By modulating the firing of his neurons based on this auditory feedback, he could then choose letters to form words and phrases for communication.
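The frequency-matching idea described above can be sketched in a few lines. This is only an illustration of the concept, not the study's actual software: the firing-rate bounds, tone range, and "yes"/"no" frequency bands below are all hypothetical values chosen for the example.

```python
# Illustrative sketch of auditory neurofeedback: a neural firing rate is
# mapped to a feedback tone the patient hears and tries to steer into a
# "yes" or "no" frequency band. All numbers are assumptions, not the
# study's real parameters.

def rate_to_tone(firing_rate_hz, lo_rate=5.0, hi_rate=50.0,
                 lo_tone=120.0, hi_tone=480.0):
    """Linearly map a (clamped) firing rate to a feedback tone in Hz."""
    r = min(max(firing_rate_hz, lo_rate), hi_rate)
    frac = (r - lo_rate) / (hi_rate - lo_rate)
    return lo_tone + frac * (hi_tone - lo_tone)

def classify_response(tone_hz, yes_band=(360.0, 480.0), no_band=(120.0, 240.0)):
    """Interpret a held tone as 'yes', 'no', or None if ambiguous."""
    if yes_band[0] <= tone_hz <= yes_band[1]:
        return "yes"
    if no_band[0] <= tone_hz <= no_band[1]:
        return "no"
    return None

high = classify_response(rate_to_tone(45.0))  # high firing rate
low = classify_response(rate_to_tone(8.0))    # low firing rate
print(high, low)  # → yes no
```

The point of the linear mapping is that the patient only needs to push neural activity up or down; the system, not the patient, handles the translation into a discrete answer.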
Brain signals are recorded by implanted microelectrodes and then decoded in real time by an artificial intelligence (machine learning) system. The algorithm interprets the brain's signals as "yes" or "no". A speller program reads the letters of the alphabet aloud, and through the audio feedback the patient answers "yes" or "no" to each letter; by accepting or rejecting letters one at a time, he can slowly form a whole word or sentence.
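The letter-by-letter speller loop just described can be sketched as follows. This is a minimal toy version, not the actual clinical software: the `decoder` callback stands in for the real-time brain-signal classifier, and the "stop" answer is an assumed convention for ending a message.

```python
# Toy sketch of a yes/no auditory speller: the system offers letters one
# at a time; the decoded "yes"/"no" accepts or rejects each offered letter.
import string

def spell(decoder, alphabet=string.ascii_uppercase + " ", max_chars=50):
    """Build a message by offering letters and querying a yes/no decoder.

    `decoder(letter)` stands in for the real-time classifier: it returns
    "yes" to accept the offered letter, "no" to reject it, or "stop" to
    end the message (a convention assumed for this example).
    """
    message = []
    while len(message) < max_chars:
        for letter in alphabet:
            answer = decoder(letter)
            if answer == "stop":
                return "".join(message)
            if answer == "yes":
                message.append(letter)
                break  # restart the alphabet scan for the next character
    return "".join(message)

# Simulated patient who wants to spell "HI": accepts only matching letters.
_target = iter("HI")
_wanted = next(_target, None)

def fake_decoder(letter):
    global _wanted
    if _wanted is None:
        return "stop"
    if letter == _wanted:
        _wanted = next(_target, None)
        return "yes"
    return "no"

result = spell(fake_decoder)
print(result)  # → HI
```

A linear scan like this is slow (many "no" answers per letter), which is consistent with the article's note that words are formed slowly; real spellers typically speed this up by grouping letters.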
Zimmermann said: "The study answers the long-standing question of whether people with complete locked-in syndrome, who have lost all voluntary muscle control, including eye or mouth movements, also lose their brain's ability to generate commands for communication. Ours was the first study to achieve communication with someone who has no remaining voluntary movement, so the BCI is now his only means of communication."
Using the device even at home
The researchers said this method can be used even at the patient's home. However, before it can enter wide clinical use, they noted, more research with more patients is needed to confirm its safety and efficacy. The number of people with ALS worldwide is rising steadily and is expected to exceed 300,000 by 2040; many of them reach a point where they cannot even speak.
Ioannis Vlachos, a neuroscientist at the Wyss Center, who holds a PhD (2011) in Computational Neuroscience from the German University of Freiburg and a degree from the NTUA's School of Electrical Engineering, was instrumental in developing the new method.