When using gestures, rules of grammar remain the same

June 30, 2008

The mind apparently has a consistent way of ordering the elements of an event, one that can defy the order in which subjects, verbs, and objects typically appear in a speaker's language, according to research at the University of Chicago.

"Not surprisingly, speakers of different languages describe events using the word orders prescribed by their language. The surprise is that when the same speakers are asked to 'speak' with their hands and not their mouths, they ignore these orders - they all use exactly the same order when they gesture," said Susan Goldin-Meadow, the Beardsley Ruml Distinguished Service Professor in Psychology and lead author of the paper, "The Natural Order of Events: How Speakers of Different Languages Represent Events Nonverbally," published in the current issue of the Proceedings of the National Academy of Sciences.

For the study, the team tested 40 speakers of four different languages: 10 English, 10 Mandarin Chinese, 10 Spanish and 10 Turkish speakers. They showed them simple video sequences of activities and asked them to describe the action first in speech and a second time using only gestures. They also gave another 40 speakers of the same languages transparencies to assemble after watching the video sequences. Some of the videos portrayed real people and others showed animated toys, representing a variety of sentence types: a girl waves, a duck moves to a wheelbarrow, a woman twists a knob and a girl gives a flower to a man.

When asked to describe the scenes in speech, the speakers used the word orders typical of their respective languages. English, Spanish, and Chinese speakers first produced the subject, followed by the verb, and then the object (woman twists knob). Turkish speakers first produced the subject, followed by the object, and then the verb (woman knob twists).

But when asked to describe the same scenes using only their hands, all of the adults, no matter what language they spoke, produced the same order: subject, object, verb (woman knob twists). When asked to assemble the transparencies after watching the video sequences (another nonverbal task, but one that is not communicative), people also tended to follow the subject-object-verb ordering found in the gestures produced without speech.

The grammars of modern languages developed over long stretches of time, shaped by historical and cultural forces too distant for linguists to study directly.

Newly emerging sign languages, however, offer intriguing corroborating evidence that the subject-object-verb (SOV) order is a fundamental one.

SOV is the order currently emerging in a language created spontaneously without any external influence. Al-Sayyid Bedouin Sign Language arose within the last 70 years in an isolated community with a high incidence of profound prelingual deafness. In the space of one generation, the language assumed grammatical structure, including the SOV order.

Moreover, when deaf children invent their own gesture systems, they use the OV order. Chinese and American deaf children, whose hearing losses prevent them from acquiring spoken language and whose hearing parents have not exposed them to sign language, use the OV order in the gesture sentences they create.

The research challenges the idea that the language we speak inevitably shapes the way we think when we are not speaking. This study is the first to test the notion with respect to word order.

"Our data suggest that the ordering we use when representing events in a nonverbal format is not highly susceptible to language's influence," Goldin-Meadow and her co-authors write. "Rather, there appears to be a natural order that humans use when asked to represent events nonverbally. Indeed, the influence may well go in the other direction--the ordering seen in our nonverbal tasks may shape language in its emerging stages."
Joining Goldin-Meadow in writing the paper were Wing Chee So, of the National University of Singapore; Asli Ozyurek, of Radboud University Nijmegen; and Carolyn Mylander, a researcher at the University of Chicago.

University of Chicago
