
Researchers study how cochlear implants affect brain circuits

June 30, 2016

Four-year-old William Wootton was born profoundly deaf, but thanks to cochlear implants fitted when he was about 18 months old, the Granite Bay preschooler plays with a keyboard synthesizer and reacts to the sounds of airplanes and trains, while still learning American Sign Language.

"He has done extremely well," said William's mother, Jody Wootton. "He really appreciates music and is learning to speak."

First approved for adults in the 1980s, cochlear implants have been used by hundreds of thousands of people worldwide. The implant bypasses most of our normal hearing process, electronically connecting a microphone directly to the cochlea, the structure in the inner ear that converts sound into the nerve signals sent to the brain.

But not all children respond as well as William to the implants.

"Cochlear implants are very successful for some kids, but we don't understand why some kids do well and not others," said Professor David Corina of the Center for Mind and Brain at the University of California, Davis.

Supported by a five-year grant from the National Institutes of Health, Corina and Lee Miller, associate professor of neurobiology, physiology and behavior at UC Davis, are working to understand why some children respond better to the implants than others.

'Balance of power' between auditory and visual brain areas

One idea is that areas of the brain that are not being used, such as the auditory cortex in profoundly deaf children, get taken over for other functions, such as visual processing. When the child gets an implant, that part of the brain is no longer available to support hearing.

"We're using measures of brain function to get a snapshot of the 'cerebral balance of power' and how it is influencing auditory and visual experiences," Corina said. The ultimate goal is to identify clinical interventions that would help children better adapt to using cochlear implants, Miller said.

Now about a year into the study, Corina and Miller are recruiting children from 18 months to 8 years old who use cochlear implants, as well as hearing children in the same age group. They use electroencephalography, or EEG, to measure brain activity during visual and auditory processing.

During the experiment, the children watch a cartoon while a mixture of specially designed speech is played to them. The speech is designed to elicit responses from the different levels of processing in the auditory system, "from the ear to deep cortex," Corina said.

"It takes time for speech to move through the auditory system and there are different levels at which the visual system could interfere, if it does," he said.

The researchers plan to recruit about 60 children a year into the study, which began in 2015, and follow them for five years to track their progress.

Bilingual in sign language

Many American children grow up with more than one spoken language. Is adding a signed language any different?

"For some kids, their first language may be signed," Corina said. "How does this affect cerebral balance?"

The important thing is that children grow up linguistically capable in whichever languages they use, he said.

William, for example, is now in a preschool program at Ophir Elementary School near Auburn, which uses American Sign Language in addition to English. So far, he's embracing both spoken and signed languages and transitioning between the two, his mother said.

"We're absolutely pleased to have got the implants. It's really changed our lives and changed his life," she said.
The researchers are collaborating with the Weingarten Children's Center, Redwood City, California; the CCHAT Center, Sacramento; the Hearing Speech and Deafness Center, Seattle; the California School for the Deaf, Fremont, California; and The Learning Center for the Deaf, Boston. The work is funded by the National Institute on Deafness and Other Communication Disorders, part of the National Institutes of Health.

University of California - Davis
