Newborn babies have inbuilt ability to pick out words, study finds

January 29, 2019

Newborn babies are born with the innate skills needed to pick out words from language, a new study published in Developmental Science reveals.

Before infants can learn words, they must identify those words in continuous speech. Yet, the speech signal lacks obvious boundary markers, which poses a potential problem for language acquisition.

Studies have found that by the middle of the first year, infants seem to have solved this problem, but it is unknown if segmentation abilities are present from birth, or if they only emerge after sufficient language exposure and/or brain maturation.

Near-Infrared Spectroscopy

An international team of researchers from the University of Liverpool, SISSA in Italy, the Neurospin Centre in France and The University of Manchester conducted experiments to find the cues crucial for the segmentation of human speech.

The researchers played the infants a three-and-a-half-minute audio clip in which four meaningless words were buried in a stream of syllables.

Using a painless technique called Near-Infrared Spectroscopy, which shines near-infrared light into the brain, they were able to measure how much light was absorbed, telling them which parts of the brain were active.

'Key Insight'

The researchers discovered two mechanisms in three-day-old infants that give them the skills to pick out words in a stream of sounds.

The first mechanism, known as prosody, the melody of language, allows us to recognise when a word starts and stops.

The second is called the statistics of language: the brain tracks how often sounds occur together, so syllables that reliably follow one another are likely to belong to the same word.

The discovery provides a key insight into a first step to learning language.
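The statistical mechanism the researchers describe can be sketched with a small toy simulation. The four "words" and the syllable stream below are invented for illustration (they are not the study's actual stimuli); the sketch computes transitional probabilities between adjacent syllables and shows that they dip at word boundaries, which is the cue statistical learning exploits.

```python
import random
from collections import Counter

# Four made-up three-syllable "words", loosely modelled on this
# kind of experiment (hypothetical, not the study's real stimuli).
words = ["tupiro", "golabu", "bidaku", "padoti"]

# Build a continuous stream by concatenating randomly chosen words,
# then split it into two-letter syllables.
random.seed(0)
syllables = []
for _ in range(100):
    w = random.choice(words)
    syllables += [w[i:i + 2] for i in range(0, 6, 2)]

# Transitional probability: P(next syllable | current syllable)
# = count(pair) / count(current syllable).
pair_counts = Counter(zip(syllables, syllables[1:]))
syl_counts = Counter(syllables[:-1])
tp = {pair: n / syl_counts[pair[0]] for pair, n in pair_counts.items()}

# Within a word, each syllable fully predicts the next ("tu" -> "pi"),
# so the transitional probability is 1.0; across a word boundary
# ("ro" -> first syllable of a random next word) it is much lower.
print(tp[("tu", "pi")])                                  # within-word: 1.0
print(max(v for (a, b), v in tp.items() if a == "ro"))   # boundary: below 1.0
```

Dips in transitional probability like the one after "ro" are exactly the kind of statistical boundary cue that, per the study, even newborns appear sensitive to.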

Important tools

Dr Alissa Ferry, University of Manchester, said: "We think this study highlights how sentient newborn babies really are and how much information they are absorbing. That's quite important for new parents and gives them some insight into how their baby is listening to them."

Dr Perrine Brusini, University of Liverpool, said: "We then had the infants listen to individual words and found that their brains responded differently to the words that they heard than to slightly different words.

"This showed that even from birth infants can pick out individual words from language."

Dr Ana Flò, Neurospin, said: "Language is incredibly complicated and this study is about understanding how infants try to make sense of it when they first hear it. We often think of language as being made up of words, but words often blur together when we talk. So one of the first steps in learning language is to pick out the words.

"Our study shows that at just three days old, without understanding what it means, they are able pick out individual words from speech. And we have identified two important tools that we are almost certainly born with, that gives them the ability to do this."
The study was funded by the European Research Council.

The full study, entitled 'Newborns are sensitive to multiple cues for word segmentation in continuous speech', can be found here https://onlinelibrary.wiley.com/doi/abs/10.1111/desc.12802

University of Liverpool

