Gestures heard as well as seen

May 18, 2020

Gesturing with the hands while speaking is a common human behavior, but no one knows why we do it. Now, a group of UConn researchers reports in the May 11 issue of PNAS that gesturing does add emphasis to speech, though not in the way researchers had thought.

Gesturing while speaking, or "talking with your hands," is common around the world. Many communications researchers believe that gesturing is done either to emphasize important points or to elucidate specific ideas (think of this as the "drawing in the air" hypothesis). But there are other possibilities. For example, it could be that gesturing, by altering the size and shape of the chest, lungs, and vocal muscles, affects the sound of a person's speech.

A team of UConn researchers led by former postdoc Wim Pouw (currently at Radboud University in the Netherlands) decided to test whether this idea was true, or just so much hand waving. The team had volunteers move their dominant hand as if they were chopping wood, while continuously saying "a" as in "cinema." They were instructed to keep the "a" sound as steady as they could.

Despite that instruction, when the team played audio recordings of these sessions to other people, the listeners could hear the speaker's gestures. And when listeners were asked to move their own arms to the rhythm they heard, their movements matched those of the original speaker.

Because of the way the human body is constructed, hand movements influence the muscles of the torso and throat, which means gestures are tightly tied to the loudness of the voice. Rather than relying on the chest muscles alone to produce airflow for speech, moving your arms while you speak can add acoustic emphasis. And listeners can hear those motions, even when the speaker is trying to keep their voice steady.

"Some language researchers don't like this idea, because they want language to be all about communicating the contents of your mind, rather than the state of your body. But we think that gestures are allowing the acoustic signal to carry additional information about bodily tension and motion. It's information of another kind," says UConn psychologist and director of the Center for the Ecological Study of Perception and Action James Dixon, one of the authors of the paper.
-end-


University of Connecticut
