Researchers explain how the brain integrates head position and acoustics

December 16, 2002

MADISON--The slightest turn of the head can significantly change the way a person or animal detects sound. A subtle tilt alters the angle at which high-frequency sound waves hit the ear, providing cues to localize the sound. To use those cues, the brain must put what it hears into the context of the position of the head. Until recently, scientists were not sure how this was done.

Now researchers at the University of Wisconsin Medical School appear to have the explanation. They have discovered that in the cochlear nucleus, the brain's first sound-processing station, certain cells accomplish the job by integrating the two kinds of information, each of which travels along a distinct pathway.

The researchers compared activity in both pathways, examining currents running through synapses--or signal-transmitting junctions--in fusiform cells of the cochlear nucleus. To their surprise, they learned that synapses transmitting acoustic information were not influenced by previous activity--they were stable. On the other hand, synapses carrying information about head and ear position were continually strengthened or weakened depending on the amount of activity--they were plastic.

The study, by Donata Oertel of the Department of Physiology at the University of Wisconsin Medical School, and Kiyohiro Fujino, now of the Department of Otolaryngology at Kyoto University Graduate School of Medicine, appears in the December 16 issue of the Proceedings of the National Academy of Sciences.

The auditory system's main responsibilities are to locate sounds, analyze their properties, and then recognize what they mean. The initial duties take place in the cochlear nucleus.

"Sound localization is an especially important function of the auditory system because it allows us to figure out what's happening around corners, in the dark or when vision can't help," said Oertel, a UW professor of physiology who is an expert on the cochlear nucleus. For locating sounds on the horizontal plane-those coming from the left or right of the head-factors such as relative sound intensity in each ear, and the difference in sound's arrival time at each ear are important cues. Cells in the ventral cochlear nucleus are responsible for pinpointing horizontal sounds.

"But you don't have those cues in the vertical plane. If you're trying to distinguish sounds coming from above or below the head, or in front of or behind it, time and intensity differences at the two ears don't help at all," Oertel said. "High frequency sound waves are distorted differently when they are heard from straight on rather than high up, and the asymmetry of our ears distorts the sound waves in another way when they come from the front or back."

Following the lead of other investigators, including some at UW, Oertel looked to the dorsal cochlear nucleus as the source of sound localization in the vertical plane. In her earlier research, she showed that fusiform cells, the principal cells of the dorsal cochlear nucleus, are activated through two sets of dendrites, or threadlike arms of nerve cells. One set receives acoustic input through auditory nerve fibers; the other carries information about the position of the ears, head and neck through parallel fibers.

In the current work, Fujino, who was a post-doctoral fellow with Oertel, used a technique called patch-clamping to record activity at the synapses of single fusiform cells. The experiment showed remarkable differences.

Currents evoked by activating signals through the parallel fibers were greatly strengthened with increasing use and weakened with decreasing use. This plasticity presumably aids in adapting to differing head positions, Oertel said. Signals evoked through the auditory nerve, which are involved in sound processing, were stable and not influenced by use.
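The contrast can be pictured with a toy model. The Python sketch below (an invented caricature, not the authors' biophysical description) tracks two synaptic weights on a single model cell: one that drifts with recent use, like the parallel-fiber synapses, and one that ignores use, like the auditory-nerve synapses. The update rule and rates are assumptions for illustration, not measured values.

    # A toy contrast between use-dependent (plastic) and use-independent
    # (stable) synapses on one model fusiform cell. The potentiation and
    # depression rates are illustrative assumptions.

    def update_plastic_weight(weight: float, active: bool,
                              potentiation: float = 0.05,
                              depression: float = 0.02) -> float:
        """Strengthen the parallel-fiber-like synapse when it is driven;
        let it weaken when it is not."""
        return weight + potentiation if active else max(0.0, weight - depression)

    def update_stable_weight(weight: float, active: bool) -> float:
        """Auditory-nerve-like synapse: activity leaves its strength unchanged."""
        return weight

    plastic, stable = 1.0, 1.0
    activity = [True] * 10 + [False] * 10     # a burst of use, then silence
    for is_active in activity:
        plastic = update_plastic_weight(plastic, is_active)
        stable = update_stable_weight(stable, is_active)

    print(f"plastic synapse weight after the burst and silence: {plastic:.2f}")
    print(f"stable synapse weight over the same period: {stable:.2f}")

In this caricature the plastic weight ends up away from its starting value while the stable weight never moves, mirroring why acoustic signals stay trustworthy from moment to moment while positional signals can be continually recalibrated.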

Oertel said it is extremely rare for single cells to exhibit both the plasticity and the stability she and Fujino found.

"The observation that the strength of synapses can vary as a function of their activity has been of great interest because it underlies the brain's ability to learn and respond to the environment," she said. "However, if this part of the auditory system were plastic, it would cause what we hear now to be confused with what we heard just moments before."

Oertel's knowledge of the cochlear nucleus and its role in how the brain processes sound is helping computer scientists in their efforts to develop computer speech processors.

"The engineers are making computers that work in a way that's similar to how the brain functions," she said. "The computers are being made to work well even in noisy environments."
-end-


University of Wisconsin-Madison
