Deaf sign language users pick up faster on body language

January 12, 2012

Deaf people who use sign language are quicker at recognizing and interpreting body language than hearing non-signers, according to new research from investigators at UC Davis and UC Irvine.

The work suggests that deaf people may be especially adept at picking up on subtle visual cues in the actions of others, an ability that could be useful for some sensitive jobs, such as airport screening.

"There are a lot of anecdotes about deaf people being better able to pick up on body language, but this is the first evidence of that," said David Corina, professor in the UC Davis Department of Linguistics and Center for Mind and Brain.

Corina and graduate student Michael Grosvald, now a postdoctoral researcher at UC Irvine, measured the response times of both deaf and hearing people to a series of video clips showing people making American Sign Language signs or "non-language" gestures, such as stroking the chin. Their work was published online Dec. 6 in the journal Cognition.

"We expected that deaf people would recognize sign language faster than hearing people, as the deaf people know and use sign language daily, but the real surprise was that deaf people also were about 100 milliseconds faster at recognizing non-language gestures than were hearing people," Corina said.

This work is important because it suggests that the human capacity for communication is modifiable and not limited to speech, Corina said. Deaf people show us that language can be expressed by the hands and perceived through the visual system. When this happens, deaf signers gain the added benefit of recognizing non-language actions better than hearing people who do not know a sign language, Corina said.

The study supports the idea that sign language is based on a modification of the system that all humans use to recognize gestures and body language, rather than working through a completely different system, Corina said.
-end-

The research was supported by grants from the National Institutes of Health and the National Science Foundation.

UC Davis is a leader in brain science, with three major centers -- the Center for Mind and Brain, the Center for Neuroscience and the MIND Institute -- that bring together experts from across the university to work on topics ranging from autism and memory to meditation and the effects of music on the brain.

University of California - Davis
