Give it a thought -- and make it so

May 02, 2000

Glancing at a stereo and turning it on with a thought may once have been science fiction, but inside a virtual world at the University of Rochester, people are listening to music simply by wishing it so. Outfitted with a virtual reality helmet and a computer program adept at recognizing key brain signals, volunteers use their thoughts to take the actions of any apartment dweller: turning on the television or the stereo, for instance. This line of research, which links a brain and a computer in a near real-world environment, may someday allow patients with extreme paralysis to regain some control of their surroundings, say the project's developers. It could eventually eliminate keyboards and computer mice as the go-betweens connecting our thoughts and the actions we wish to see in our environment.

While several teams around the world are working on brain-computer interfaces (BCI), computer science graduate student Jessica Bayliss is the first to show that detection of the brain's weak electrical signals is possible in a busy environment filled with activity. She has shown that volunteers who don a virtual reality helmet in her lab can control elements in a virtual world, including turning lights on and off and bringing a mock-up of a car to a stop by thought alone. Though all this is currently taking place only in virtual reality, the team is confident that the technology will make the jump to the "real world" and should soon enable people to look around a real apartment and take control in a way they couldn't before.

"This is a remarkable feat of engineering," says Dana Ballard, professor of computer science and Bayliss' adviser. "She's managed to separate out the tiny brain signals from all the electric noise of the virtual reality gear. We usually try to read brain signals in a pristine, quiet environment, but a real environment isn't so quiet. Jessica has found a way to effectively cut through the interference."

The National Institutes of Health is supporting Bayliss' research because it may someday give back some control to those who have lost the ability to move. A person so paralyzed that he or she is unable even to speak may be able to communicate once again if this technology can be perfected, explains Bayliss. By merely looking at the telephone, television or thermostat and wishing it to be used, a person with disabilities could call a friend or turn up the heat on a chilly day. Bayliss hopes that someday such people may even be able to operate a wheelchair by themselves simply by thinking their commands.

"Virtual reality is a safe testing ground," says Bayliss. "We can see what works and what doesn't without the danger of driving a wheelchair into a wall. We can learn how brain interfaces will work in the real world, instead of how they work when someone is just looking at test patterns and letters. The brain normally interacts with a 3-D world, so I want to see if it gives off different signals when dealing with a 3-D world than with a chart."

The brain signal Bayliss listens for is called the "P300 evoked potential." It's not a specific signal that could be translated as "Aunt Nora" or "stop at the red light," but rather a sign of recognition: more like "That's it!"

"It's as if each neuron is a single person who's talking," explains Bayliss. "If there's just one person, then it's easy to hear what he's saying, but the brain has billions of neurons, so imagine a room full of a billion people all talking at once. You can't pick out one person's voice, but if everyone suddenly cheers or oohs or aahs, you can hear it. That's what we listen for, when several neurons suddenly say 'that's it!' "

Bayliss looks for this signal to occur in sync with a light flashing on the television or stereo. If the rhythm matches the blinks of the stereo light, for instance, the computer knows the person is concentrating on the stereo and turns it on. A person doesn't even have to look directly at the stereo; as long as the object is in the field of view, it can be controlled by the person's brain signals. Since it's not necessary to move even the eyes, this system could work for paralysis patients who are completely "locked in," a state where even eye blinks or movement are impossible.
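The selection scheme described above can be sketched as a simple matching problem: each appliance flashes on its own schedule, and the computer picks the object whose flash times best line up (after the typical P300 lag) with the detected brain responses. Everything below is a toy illustration; the flash schedules, latency, and tolerance values are assumptions, not the project's actual parameters.

```python
# Hypothetical flash schedules for two appliances, in seconds
STEREO_FLASHES = [0.0, 1.0, 2.0, 3.0, 4.0]
TV_FLASHES     = [0.4, 1.4, 2.4, 3.4, 4.4]

# Simulated P300 detection times: roughly 0.3 s after each stereo flash
p300_times = [0.31, 1.29, 2.32, 3.30, 4.28]

LATENCY = 0.3       # assumed typical P300 lag after a stimulus
TOLERANCE = 0.08    # how closely a detection must match flash + latency

def score(flashes, detections, latency=LATENCY, tol=TOLERANCE):
    """Count detections landing within `tol` seconds of flash_time + latency."""
    return sum(
        any(abs(d - (f + latency)) <= tol for f in flashes)
        for d in detections
    )

scores = {"stereo": score(STEREO_FLASHES, p300_times),
          "tv": score(TV_FLASHES, p300_times)}
target = max(scores, key=scores.get)

print(target, scores)  # stereo 5, tv 0: the stereo's rhythm matches best
```

Because the match is on timing rather than gaze direction, the scheme works as long as the flashing object is somewhere in the field of view, which is what makes it plausible for "locked-in" patients.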

The virtual apartment in which volunteers have been turning appliances on and off is modeled after Bayliss' own. Such a simple, virtual world is the first step toward developing a way to accurately control the real world. Once Bayliss has perfected the computer's ability to determine what a person is looking at in the virtual room, the next hurdle will be to devise a system that can tell what object a person is looking at in the real world. BCI groups are also close to surmounting another obstacle: attaching the sensors to the head. Right now dozens of electrodes must be attached to the scalp one at a time with a gooey gel, but Bayliss says dry sensors are just around the corner, and simple slip-on head caps should not be far behind.

"One place such an interface may be very useful is in wearable computers," Ballard says. "With the roving eye as a mouse and the P300 wave as a mouse-click, small computers that you wear as glasses may be more promising than ever."

BCIs are divided into two categories: biofeedback and stimulus-response. Bayliss uses the latter approach, which simply measures the response the brain has to an event. Biofeedback is a method where a person learns to control some aspect of his or her body, such as relaxing, and the resulting change in the brain can be detected. Though many BCI groups use this approach, Bayliss decided against it because people must be trained, sometimes for a year or more, and not everyone can learn to accurately control their thought patterns.

Bayliss and Ballard work in the University's National Resource Laboratory for the Study of Brain and Behavior, which brings together computer scientists, cognitive scientists, visual scientists, and neurologists to study neural functions in complex settings. The laboratory's research combines tools that mimic real-world sensations, such as virtual reality driving simulators and gloves that simulate the feel of virtual objects, with sensory trackers that measure eye, head, and finger movements. Recently the lab added virtual people, robot-like actors with which volunteers can interact in a limited way.

So in the future will we all be wearing little caps that will let us open doors, channel surf and drive the car on a whim? "Not likely," Bayliss says. "Anything you can do with your brain can be done a lot faster, cheaper and easier with a finger and a remote control."

University of Rochester
