Overcoming 'Cocktail Party Effect' May Help Babies Learn Language

November 11, 1996

Infants at a surprisingly young age are able to distinguish one voice amid a clutter of distracting background noise, psychologists have discovered.

The infants' ability to deal with the so-called "cocktail party effect" and pay attention to a single sound or speaker's voice may help them learn language quickly, the researchers said.

The effect has been studied in adults, though scientists still do not understand how the brain accomplishes the feat. Until now, there had been no strong evidence that infants share the same ability.

The psychologists tested infants averaging 7 1/2 months of age and found that the babies have the ability.

The findings are detailed in a scientific paper published in the November issue of Perception & Psychophysics, a scientific journal for experimental psychologists. The paper was co-authored by Rochelle S. Newman, a psychology graduate student at the State University of New York at Buffalo, and by Peter Jusczyk, a professor of psychology at The Johns Hopkins University.

They suggest that the ability may play an important role in learning language by increasing the amount of information available to infants, who are trying to understand the sounds of words and the melody of sentence structure.

"Put simply, in order to learn a language, infants must be able to hear it, and this requires the ability to separate it from background noise," Newman and Jusczyk said in the paper. "They may have to 'tune out' all sorts of competing noises around the home to gain any information from their caregivers' speech. If they were unable to do this, they would have to draw on a much smaller set of utterances in order to discover the structure and organization of their native language."

The psychologists noted that there has been "a surprising dearth of research" to explore the cocktail-party effect in infants.

"Like adults, infants often are engaged in noisy situations. ... Infants also have to deal with competition from other speech, such as that from the television down the hall, and from their siblings in the next room. Yet there has been almost no research on infants demonstrating the extent to which they can succeed at this difficult task."

Because infants cannot be questioned directly, researchers must design elaborate testing methods. "We look for other kinds of responses," Jusczyk said in an interview. "Do they listen longer to this one, or how are they responding to something? With an infant, you have to ask the questions indirectly."

To learn whether infants have the ability, the psychologists studied the responses of 24 babies. A woman repeated a specific word, such as "cup" or "dog," over and over, while a man in the background read a mundane narrative describing the scientific method used to carry out the experiment. Later, taped passages were played to the infants, some of which revolved around the theme of cup or dog. Most of the infants paid more attention to the passages containing the familiar word than to passages that contained neither "cup" nor "dog." The results demonstrated that the infants had learned the word despite the background noise.

Understanding the effect in babies could provide insights into how they learn language. But such insights also might be helpful to scientists trying to design computers that understand speech.

"What is interesting about this is how quickly babies are able to progress," whereas the most powerful computers are unable to efficiently decipher spoken language, Jusczyk said.

"It is mysterious. If we had a better understanding of how infants are able to do this, we probably would get a lot farther in terms of how we could get machines to do this," he said.

Jusczyk has written a book about the history of research into how infants learn language, focusing on the first year of life. The field of infant speech perception began only about 25 years ago. Jusczyk's book, The Discovery of Spoken Language, to be published by MIT Press, is scheduled for release in early 1997.
-end-


Johns Hopkins University
