Pointing is infants' first communicative gesture

February 24, 2014

Catalan researchers have studied the acquisition and development of language in babies on the basis of the temporal coordination of gestures and speech. The results are the first to show how and when infants acquire the pattern of coordination between the two elements that allows them to communicate very early on.

A new study carried out by two researchers from Pompeu Fabra University in Barcelona analyses the temporal coordination between gestures and speech in babies during the very early stages of language development, from the babbling period to the production of their first words.

The results, published in the journal Speech Communication, are the first to show how and when babies acquire the coordination between gesture and speech.

"A growing number of studies show that language and human communication cannot be studied through an analysis of speech alone," Núria Esteve-Gibert, one of the authors, explained to SINC.

In fact, in communicative interactions, meanings and emotions are transmitted through speech as well as through non-verbal elements (hand gestures, facial expressions and body position).

"Our analysis indicates that during the transition between the babbling period and the first words (that is, before the infant is capable of producing two words joined one after the other), the gestural and speech systems are already closely linked," affirmed Esteve-Gibert.

According to the authors, this study supports the view that speech and body language are both necessary for studying human communication, as there is growing evidence that the two modes develop at the same time and are closely coordinated, both semantically and temporally.

The aim of this pioneering work was to investigate the acquisition and development of language in relation to the temporal coordination of gestures and speech.

In order to do so, the researchers filmed four babies, born into Catalan-speaking families, while they played with their parents at home, from the age of 11 months until they reached 19 months.

"These recordings were used to investigate when children start to combine gesture and speech in the same way as adults and whether, when they combine the two modes, the patterns of temporal coordination between gesture and speech are adult-like," Esteve-Gibert continued.

In total, more than 4,500 communicative acts produced by the babies were collected over 24 hours of recordings spanning the months analysed, and these were studied in terms of both the gestures and the acoustic properties of the children's vocalisations.

"Special importance was given to the analysis of the temporal coordination between speech and the act of pointing, because this gesture is crucial in linguistic and cognitive development: it is the first communicative gesture that babies are capable of understanding and producing," the expert pointed out.

Moreover, the correct development of this coordination is closely linked to the child's linguistic abilities at a more advanced stage.

Combination of gesture and speech

During the babbling stage, babies still produce many gestures without combining them with vocalisations. However, from the beginning of the first-word period (defined as producing four words during half an hour of recording), babies produce the majority of their hand gestures in combination with vocalisations, just as adults do.

Furthermore, analysing the gesture-vocalisation combinations that babies produce at this early age shows that most of the gestures combined with vocalisations are deictic gestures (pointing and reaching) with a declarative communicative intention (to inform) rather than an imperative intention (to obtain an object).

"Already in the first combinations of gesture and vocalisation, the pattern of temporal coordination between the two modes (which consists in synchronising the most prominent interval of the deictic gesture with the most prominent interval of the vocalisation) is very similar to that of adults," concluded Esteve-Gibert.
-end-
Reference:

Esteve-Gibert, N. & Prieto, P. (2014). "Infants temporally coordinate gesture-speech combinations before they produce their first words". Speech Communication 57, pp. 301-316.

Contact:

Núria Esteve-Gibert
Pompeu Fabra University, Barcelona
E-mail: nuria.esteve@upf.edu
Tel.: +34 935422409

FECYT - Spanish Foundation for Science and Technology
