Mind Reading Computer System May Help People With Locked-in Syndrome
Imagine living a life in which you are fully aware of the world around you but unable to engage with it because you are completely paralyzed. Even speaking is impossible. For an estimated 50,000 Americans, this is a harsh reality. It's called locked-in syndrome, a condition in which people with normal cognitive brain activity suffer severe paralysis, often from an injury or an illness such as Lou Gehrig's disease.
"Locked-in people are unable to move at all except possibly their eyes, and so they're left with no means of communication but they are fully conscious," says Boston University neuroscientist Frank Guenther.
Guenther works with the National Science Foundation's (NSF) Center of Excellence for Learning in Education, Science and Technology (CELEST), which is made up of eight private and public institutions, mostly in the Boston area. Its purpose is to synthesize experimental, modeling, and technological approaches to research in order to understand how the brain learns as a whole system. In particular, Guenther's research looks at how brain regions interact, with the hope of melding mind and machine and, ultimately, making life much better for people with locked-in syndrome.
"People who have no other means of communication can start to control a computer that can produce words for them or they can manipulate what happens in a robot and allow them to interact with the world," Guenther says about his research.
His team demonstrated two experiments on the day Science Nation stopped by. In one experiment, run by assistant research professor Jonathan Brumberg, a volunteer shows how she uses a speech synthesizer to make vowel sounds just by thinking about moving a hand or foot. She never moves her body or says anything.
"We use an EEG cap to read the brain signals coming from her brain through her scalp," explains Brumberg, who tracks the brainwaves with a computer. "Depending on what body part she imagines moving, the cursor moves in different directions on the screen. Brumberg explains that he is able to, "translate those brain activities into audio signals that can be used to drive a voice synthesizer. We've mapped the "uw" sound to a left hand movement, the "aa" sound to right hand movement, and the "iy" sound to a foot movement."
As the subject sits perfectly still, the cursor starts to move across the screen. Each of the three sounds is represented by a circle on the screen, and to make the synthesizer produce a given vowel, she needs to steer the cursor into the center of the corresponding circle.
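To make that control loop concrete, here is a minimal sketch in Python of the kind of mapping Brumberg describes: a decoded imagined movement nudges a cursor, and landing inside a vowel's circle would trigger the synthesizer. The velocities, circle positions, and names are illustrative assumptions, and the hard part, decoding the imagined movement from EEG in real time, is stubbed out entirely.

```python
import math

# Hypothetical mapping from a decoded imagined movement to a 2-D cursor
# velocity; the article only says different body parts move the cursor
# in different directions.
MOVEMENT_TO_VELOCITY = {
    "left_hand":  (-1.0, 0.0),
    "right_hand": ( 1.0, 0.0),
    "foot":       ( 0.0, 1.0),
}

# Vowel targets as circles on the screen, following the mapping Brumberg
# describes: "uw" from left hand, "aa" from right hand, "iy" from foot.
# Positions and radii are placeholders, not values from the experiment.
VOWEL_TARGETS = {
    "uw": {"center": (-5.0, 0.0), "radius": 1.0},
    "aa": {"center": ( 5.0, 0.0), "radius": 1.0},
    "iy": {"center": ( 0.0, 5.0), "radius": 1.0},
}

def step_cursor(position, decoded_movement, gain=0.5):
    """Advance the cursor one step using the decoded imagined movement."""
    vx, vy = MOVEMENT_TO_VELOCITY[decoded_movement]
    x, y = position
    return (x + gain * vx, y + gain * vy)

def vowel_hit(position):
    """Return the vowel whose target circle contains the cursor, if any."""
    x, y = position
    for vowel, target in VOWEL_TARGETS.items():
        cx, cy = target["center"]
        if math.hypot(x - cx, y - cy) <= target["radius"]:
            return vowel
    return None

# Toy run: a stream of decoded "left hand" imaginations drives the cursor
# toward the "uw" circle, at which point a synthesizer would play "uw".
cursor = (0.0, 0.0)
for _ in range(12):
    cursor = step_cursor(cursor, "left_hand")
    vowel = vowel_hit(cursor)
    if vowel:
        print("Synthesizer plays:", vowel)
        break
```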
We watch as the subject imagines moving her left hand to steer the cursor into the center of the "uw" circle, and we hear a synthetic "uw" droning from the synthesizer. Brumberg has experimented with locked-in patients, too, and the results have been startling.
"We started with helping a locked-in patient regain an ability to make certain vowel sounds and that was amazing. He hasn't been able to talk in years and the first time he made a movement with our formant synthesizer, he nearly, you know, jumped out of his chair with excitement," says Brumberg. "Although the patient has no actual voluntary movement, involuntary motor actions are often seen when the patient gets excited."
Guenther says this technology holds great promise not just for locked-in patients. "We hope these technologies would be applied to people who have other communication disorders that cause them to be unable to speak," he says. "This sort of thing would allow them to produce synthetic speech, which could be used to talk to the people around them and mention their needs."
In another experiment, graduate student Sean Lorenz takes a robot out for a spin using only brainwaves. Checkerboard patterns on the sides of a computer screen flash at slightly different frequencies; to the naked eye, the differences are subtle. "But the neurons in his visual cortex start firing in synchrony with the checkerboard he's looking at, and so we can pick up the frequency and, from that, determine which choice he was trying to make: left, right, forward or backward, for example," explains Guenther.
For locked-in patients, Guenther adds, "If they're pointing their eyes at a visual screen, they can focus their attention on one of the different frequencies and they can manipulate what happens in a robot or in a computer."
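The approach Guenther describes is generally known as a steady-state visual evoked potential (SSVEP) interface: the brain's response over visual cortex follows the flicker rate of whichever checkerboard the subject attends to. A minimal sketch of the frequency-detection step might look like the following; the specific flicker frequencies, sampling rate, and command names are assumptions, and real systems use more robust detectors than a single-channel FFT.

```python
import numpy as np

# Candidate flicker frequencies (Hz) and the robot command each maps to.
# The article does not give the actual frequencies; these are placeholders.
FREQ_TO_COMMAND = {8.0: "left", 10.0: "right", 12.0: "forward", 15.0: "backward"}

def classify_ssvep(eeg_window, sample_rate):
    """Pick the command whose flicker frequency shows the most power in an
    EEG window recorded over visual cortex (simple FFT-based detection)."""
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / sample_rate)
    scores = {}
    for freq, command in FREQ_TO_COMMAND.items():
        bin_idx = int(np.argmin(np.abs(freqs - freq)))  # nearest FFT bin
        scores[command] = spectrum[bin_idx]
    return max(scores, key=scores.get)

# Toy demo: synthesize 2 seconds of "EEG" dominated by a 12 Hz response plus
# noise, as if the subject were attending the checkerboard flickering at 12 Hz.
fs = 256
t = np.arange(0, 2.0, 1.0 / fs)
fake_eeg = np.sin(2 * np.pi * 12.0 * t) + 0.5 * np.random.randn(t.size)
print(classify_ssvep(fake_eeg, fs))  # expected: "forward"
```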
According to Guenther, it's just a matter of time before these technologies are commercially available. It's all part of a vision that pairs biology with technology to find a way out for those who are locked in.