
Specialized brain devices can predict what you are thinking before you speak: Study

by Pragati Singh

A new study at the California Institute of Technology demonstrates that devices implanted in the brain can precisely detect which words a tetraplegic participant is thinking, without the participant speaking or miming them.
The findings, titled “Online internal speech decoding from single neurons in a human participant,” were presented at the 2022 Society for Neuroscience conference in San Diego. The researchers used the opportunity to demonstrate their brain-machine interfaces (BMIs), which might eventually help speech-impaired patients.

“You may already have seen videos of people with tetraplegia using BMIs to control robotic arms and hands, for example to grab a bottle and to drink from it or to eat a piece of chocolate,” says Sarah Wandelt, a Caltech graduate student in the lab of Richard Andersen, James G. Boswell Professor of Neuroscience and director of the Tianqiao and Chrissy Chen Brain-Machine Interface Center at Caltech.

“These new results are promising in the areas of language and communication. We used a BMI to reconstruct speech,” says Wandelt, who presented the results at the conference on November 13.

Previous research has shown that brain signals recorded from motor regions while a participant whispered or mimed words can be used to predict the participant’s speech. Internal speech, on the other hand, is far more difficult to predict because it involves no movement at all, Wandelt notes. “In the past, algorithms that tried to predict internal speech have only been able to predict three or four words and with low accuracy or not in real time,” Wandelt says.

The current study achieves the most accurate prediction of internal speech to date. Brain signals were recorded from single neurons in the supramarginal gyrus, located in the posterior parietal cortex. In a prior study, the researchers found that this brain region represents spoken words.

The team has now extended those results to internal speech. The researchers trained the BMI device to recognize the brain signals produced when the tetraplegic participant internally spoke, or thought, particular words. This training session lasted roughly 15 minutes. The researchers then displayed a word on a screen and asked the participant to say it internally.

The results indicated that the BMI algorithms could identify eight words with up to 91 percent accuracy.
The research is still in its early stages, but it might benefit people with brain injuries, paralysis, or disorders that impact speech, such as amyotrophic lateral sclerosis (ALS).
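To make the decoding step more concrete, the sketch below shows how such a word classifier might be set up in principle. Everything in it is an assumption made for illustration: the simulated spike-count features, the generic word labels, and the linear discriminant classifier (via scikit-learn) are stand-ins, not the Caltech team’s actual method, which has not yet been published in full.

# Minimal, hypothetical sketch of a word-decoding pipeline like the one
# described above. Simulated spike counts stand in for the real
# single-neuron recordings, and a linear discriminant classifier stands in
# for whatever decoder the researchers actually used.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

N_NEURONS = 100        # hypothetical number of recorded neurons
TRIALS_PER_WORD = 20   # hypothetical number of training trials per word
WORDS = [f"word_{i}" for i in range(8)]   # generic stand-ins for the cue words

# Simulate spike-count feature vectors: one row per trial, one column per
# neuron, with a different mean firing pattern for each cued word.
X = np.vstack([
    rng.poisson(lam=5 + 3 * rng.random(N_NEURONS),
                size=(TRIALS_PER_WORD, N_NEURONS))
    for _ in WORDS
])
y = np.repeat(np.arange(len(WORDS)), TRIALS_PER_WORD)

# Train a linear classifier to map neural activity to the cued word and
# estimate accuracy by cross-validation; chance level for 8 words is 12.5%.
decoder = LinearDiscriminantAnalysis()
accuracy = cross_val_score(decoder, X, y, cv=5).mean()
print(f"Decoding accuracy: {accuracy:.1%} (chance: {1 / len(WORDS):.1%})")

Real decoders operate on windows of neural activity aligned to each cue rather than simulated counts, but the basic structure is the same: a short training session like the 15-minute one described above supplies labeled trials, and a classifier then maps new activity patterns to words.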

“Neurological disorders can lead to complete paralysis of voluntary muscles, resulting in patients being unable to speak or move, but they are still able to think and reason. For that population, an internal speech BMI would be incredibly helpful,” Wandelt says.

“We have previously shown that we can decode imagined hand shapes for grasping from the human supramarginal gyrus,” says Andersen. “Being able to also decode speech from this area suggests that one implant can recover two important human abilities: grasping and speech.”

The researchers also point out that BMIs cannot be used to read people’s minds since the device would have to be trained in each person’s brain independently, and they only operate when the person concentrates on the word.

The National Institutes of Health, the Tianqiao and Chrissy Chen Brain-Machine Interface Center, and the Boswell Foundation financed the study, which is now being peer reviewed but has not yet been published. Aside from Wandelt and Andersen, other Caltech contributors include David Bjanes, Kelsie Pejsa, Brian Lee (PhD ’06), and Charles Liu. Lee and Liu are Caltech visiting associates who also teach at USC’s Keck School of Medicine.
