Dr. Ariel Tankus states that what he does is “mind reading.”
Tankus, of Tel Aviv University Hospital, the Health Sciences Center and Ichilov Hospital, said he and his team of researchers programmed a computer to “read” a patient’s mind and express his thoughts by decoding signals emitted by neurons in his brain.
These findings offer hope for people who are unable to speak after a stroke, brain injury or amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig's disease.
The research was published in June in the peer-reviewed journal Neurosurgery.
Tankus collaborated with Dr. Ido Strauss of Tel Aviv University's Faculty of Medical and Health Sciences, director of the Functional Neurosurgery Unit at Ichilov.
For their research, they worked with an epilepsy patient who agreed to have electrodes implanted deep in his brain. The implanted electrodes are a few centimeters long and contain eight to nine tiny wires, “as small and thin as a human hair,” Tankus said.
Surgeons implanted the electrodes to locate the “very strong electrical currents” that might be causing the patient's seizures; the electrodes then allowed them to pinpoint the exact area of the brain where they could operate.
For Tankus, this was perhaps the “only opportunity” to record the activity of a single brain cell.
In the first part of the experiment, Tankus asked the patient to repeat two sounds, called phonemes, which are the basic elements of speech: “a,” pronounced “ah,” and “e,” pronounced “eh.”
These phonemes are the starting point for building a language that allows the computer to “speak for the patient,” Tankus explained.
The researchers then recorded brain activity each time the patient pronounced the vowels.
The electrodes were implanted in parts of the brain — the anterior cingulate cortex and the orbitofrontal cortex — that are associated with cognitive and emotional processes, but not typically associated with speech, Tankus says.
“I discovered these neurons in the cerebral cortex and how they relate to speech,” Tankus continued.
He then used artificial intelligence algorithms to “train the computer” to distinguish between the two sounds by recognizing the corresponding brain activity.
In the second part of the experiment, the patient silently imagined saying “a” or “e.” The computer then predicted, in real time, which vowel the patient was imagining, based on the activity of “multiple neurons in the brain,” Tankus said. “The computer is speaking for the patient.”
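The two-step process described above — record brain activity during overt speech, then decode imagined speech in real time — follows the general pattern of a supervised classifier. The sketch below is purely illustrative and is not the team's actual pipeline: it uses synthetic firing rates, a made-up neuron count, and a simple nearest-centroid rule standing in for the unspecified AI algorithms.

```python
# Hypothetical sketch of two-vowel decoding from neural firing rates.
# All data here is synthetic; the real study's features and model are not public.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "training" recordings: firing rates (spikes/sec) from 8 neurons,
# captured over many trials while the patient overtly says each vowel.
n_neurons = 8
rates_a = rng.normal(loc=20.0, scale=2.0, size=(50, n_neurons))  # trials x neurons
rates_e = rng.normal(loc=14.0, scale=2.0, size=(50, n_neurons))

# Training: average the firing pattern per vowel (a nearest-centroid classifier).
centroid_a = rates_a.mean(axis=0)
centroid_e = rates_e.mean(axis=0)

def decode(firing_rates):
    """Predict the intended vowel from one trial's firing-rate vector."""
    dist_a = np.linalg.norm(firing_rates - centroid_a)
    dist_e = np.linalg.norm(firing_rates - centroid_e)
    return "a" if dist_a < dist_e else "e"

# Real-time use: a new trial of *imagined* speech arrives and is decoded at once.
new_trial = rng.normal(loc=20.0, scale=2.0, size=n_neurons)
print(decode(new_trial))
```

The key idea the article describes survives even in this toy version: the system learns each sound's neural signature during spoken trials, then matches imagined-speech activity against those signatures.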
He stressed that “the research aims to help people with complete paralysis to speak again.” In the late stages of ALS, patients lose the ability to communicate, “not even by blinking.”
In the future, electrodes would be implanted while a patient in the early stages of ALS is still able to speak. Artificial intelligence algorithms developed by the research team would decode the patient's brain patterns, and the computer would be programmed to understand what the patient is trying to say.
Although more research is needed, Tankus believes that even in ALS, a person's brain has activity similar to normal activity, but “can't express itself.”
“In the future, the computer will understand the patient and speak on his behalf,” Tankus said.
“If a paralyzed person can tell us ‘yes’ or ‘no’ when we ask them questions like ‘Are you hungry?’ or ‘Are you in pain?’, a computer can help them answer,” Tankus continued.
“This can be a huge change for a patient who is completely paralyzed and confined.”
If Israel is important to you…
…then it's time to get to work. The Times of Israel is committed to the existence of a Jewish and democratic Israel, and independent journalism is one of the best safeguards of those values. If these values are important to you too, help us by joining our community.