Babies get hands-on with language

July 14, 2004

BABIES exposed to sign language babble with their hands, even if they are not deaf. The finding supports the idea that human infants have an innate sensitivity to the rhythm of language and engage with it however they can, claim the researchers who made the discovery.

Everyone accepts that babies babble as a way to acquire language, but researchers are polarised about its role. One camp says that children learn to adjust the opening and closing of their mouths to make vowels and consonants by mimicking adults, but that the sounds are initially without meaning.

The other side argues that babbling is more than just random noise-making. Much of it, they contend, consists of phonetic-syllabic units - the rudimentary forms of language. Laura-Ann Petitto at Dartmouth College in Hanover, New Hampshire, a leader in this camp, has argued that deaf babies who are exposed to sign language learn to babble with their hands the way hearing babies do with their mouths. Petitto believes that this hand-babbling is functionally identical to verbal babbling - only the input is different. But critics counter that deaf children cannot be directly compared with their hearing counterparts.

Now Petitto and her colleagues have tested three hearing babies who, because their parents are deaf, were exposed only to sign. Three control infants had hearing, speaking parents. To analyse the hand movements of the six children, the researchers placed infrared-emitting diodes on the babies' hands, forearms and feet. Sensors tracked the movements of the babies' limbs as they engaged in a variety of tasks, including grasping for toys and watching two people communicate. Petitto reasoned that if her opponents were right, what the babies did with their hands would be irrelevant - and the two groups' movements would be indistinguishable. Instead, the team found that the two groups had different hand movements.

Sign-exposed babies produced two distinct types of rhythmic hand activity: a low-frequency type at 1 hertz and a high-frequency one at 2.5 hertz. The speech-exposed babies produced only the high-frequency movements. The low-frequency movements, the team reports, carry a "unique rhythmic signature of natural language" (Cognition, vol 93, p 43). "What is really genetically passed on," Petitto says, "is a sensitivity to patterns."

But Peter MacNeilage, of the University of Texas at Austin, is not persuaded. "She makes a blanket statement that there is an exact correspondence between the structures of speech and sign," he says. "But there is no accepted evidence for this view at the level of phonological structure or in the form of a rhythm common to speech and sign."
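The 1-hertz and 2.5-hertz figures describe dominant rhythms in the tracked hand positions. The paper's own analysis pipeline is not detailed in this story, but as a minimal, purely illustrative sketch (the sampling rate, function name and test signals below are assumptions, not details from the study), a dominant movement frequency can be estimated from a position trace with a Fourier transform:

```python
import numpy as np

def dominant_frequency(positions, sample_rate_hz):
    """Estimate the dominant rhythmic frequency (Hz) of a 1-D hand-position
    trace. Illustrative only: the preprocessing and sampling rate are
    assumptions, not details from the Cognition paper."""
    trace = np.asarray(positions, dtype=float)
    trace = trace - trace.mean()                  # remove the static offset
    spectrum = np.abs(np.fft.rfft(trace))         # magnitude spectrum
    freqs = np.fft.rfftfreq(trace.size, d=1.0 / sample_rate_hz)
    peak = np.argmax(spectrum[1:]) + 1            # skip the zero-frequency bin
    return freqs[peak]

# Hypothetical traces: pure oscillations sampled at 60 Hz for 10 seconds.
t = np.arange(0, 10, 1 / 60)
print(dominant_frequency(np.sin(2 * np.pi * 1.0 * t), 60))   # ~1.0 Hz
print(dominant_frequency(np.sin(2 * np.pi * 2.5 * t), 60))   # ~2.5 Hz
```

On real recordings one would examine the whole spectrum rather than a single peak, since a baby's movements mix several rhythms; the sketch only shows how a 1 hertz oscillation and a 2.5 hertz one can be told apart.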
-end-
Author: Alison Motluk

This article appears in New Scientist issue: 17 JULY 2004

