Human brain tunes into visual rhythms in sign language

June 08, 2017

The human brain works in rhythms and cycles. These patterns occur at predictable frequencies that depend on what a person is doing and on what part of the brain is active during the behavior.

Similarly, there are rhythms and patterns out in the world, and for the last 20 years scientists have been perplexed by the brain's ability to "entrain," or match up, with these patterns. Language is one area in which scientists observe neural entrainment: when people listen to speech, their brain waves lock onto the volume-based rhythms they hear. Since people can't pay attention to everything in their environment at once, this phase locking is thought to help the brain anticipate when important information is likely to appear.
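The "volume-based rhythm" here is the slow amplitude envelope of speech. As a minimal illustration of the kind of signal the brain is thought to track (a generic sketch, not a method described in this article), the envelope can be extracted from an audio waveform like this:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def amplitude_envelope(audio, sr, cutoff_hz=10.0):
    """Slow 'volume' rhythm of a speech waveform: the low-pass-filtered
    magnitude of the analytic signal (Hilbert envelope).

    audio: 1-D waveform; sr: sampling rate in Hz.
    The function name and 10 Hz cutoff are illustrative assumptions.
    """
    envelope = np.abs(hilbert(audio))        # instantaneous amplitude
    b, a = butter(4, cutoff_hz / (sr / 2))   # keep fluctuations below ~10 Hz
    return filtfilt(b, a, envelope)          # zero-phase smoothing
```

Speech envelopes of this kind fluctuate mostly below about 8 Hz, the band where entrainment to speech is typically reported.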

Many studies have documented this phenomenon in language processing; however, it has been difficult to tell whether neural entrainment is specialized for spoken language. In a new study in the Proceedings of the National Academy of Sciences, University of Chicago scholars designed an experiment using sign language to answer that question.

"To determine if neural entrainment to language is specialized for speech or if it is a general-purpose tool that humans can use for anything that is temporally predictable, we had to go outside of speech and outside of auditory perception," said Geoffrey Brookshire, the study's lead author and a PhD student in the Department of Psychology.

Brookshire worked with Daniel Casasanto, assistant professor of psychology and leader of the Experience and Cognition Lab; Susan Goldin-Meadow, the Beardsley Ruml Distinguished Service Professor in the Department of Psychology and an acclaimed scholar of language and gesture; Howard Nusbaum, professor of psychology and an expert in spoken language and language use; and Jenny Lu, a PhD student specializing in sign language, gesture and language development.

"By looking at sign, we're learning something about how the brain processes language more generally. We're solving a mystery we couldn't crack by studying speech alone," Casasanto said.

In speech, the brain locks on to syllables, words and phrases, and those rhythms occur below 8 Hz, or eight cycles per second. Vision also has a preferred frequency onto which it latches.

"When we focus on random flashes of light, for example, our brains most enthusiastically lock on to flashes around 10 Hz. By looking at sign language, we can ask whether the important thing for entrainment is which sense you're using, or the kind of information you're getting," Brookshire said.

To determine whether people tune into visual rhythms the way they tune into the auditory rhythms of language, the researchers showed videos of stories told in American Sign Language to fluent signers and recorded their brain activity with electroencephalography (EEG) as they watched. Once the researchers had these EEG readings, they needed a way to measure the visual rhythms of sign language itself.

While there are well-established methods to measure rhythms in speech, there are no automatic, objective equivalents for the temporal structure of sign language. So the researchers created one.

They developed a new metric, called the instantaneous visual change, which summarizes the degree of visual change at each moment during signing. They ran the experimental videos, the same ones participants watched, through their new algorithm to identify peaks and valleys in visual change between frames. The largest peaks corresponded to large, quick movements.
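The paper defines the metric precisely; the sketch below only conveys the general idea of frame-to-frame differencing. The function name, the grayscale conversion and the summed absolute pixel difference are assumptions made for illustration, not details taken from the study:

```python
import cv2
import numpy as np

def instantaneous_visual_change(video_path):
    """Rough sketch: total absolute pixel change between consecutive
    grayscale frames. Large, quick movements yield the biggest peaks."""
    cap = cv2.VideoCapture(video_path)
    changes, prev = [], None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
        if prev is not None:
            changes.append(np.abs(gray - prev).sum())  # one value per frame pair
        prev = gray
    cap.release()
    return np.array(changes)  # a time series at the video's frame rate
```

Whatever the exact formulation, the output is a one-dimensional time series sampled at the video's frame rate, which is what makes it directly comparable to the EEG.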

With this roadmap of the magnitude of visual change over time in the videos, Brookshire overlaid the participants' EEG recordings to see whether people entrain around vision's usual frequency of about 10 Hz or at the lower frequencies of signs and phrases in sign language, about 2 Hz.
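One common way to quantify this kind of entrainment, offered here as a generic illustration rather than the study's actual analysis, is the spectral coherence between an EEG channel and the visual-change time series once both are at a common sampling rate:

```python
import numpy as np
from scipy.signal import coherence

def entrainment_spectrum(eeg, visual_change, fs):
    """Coherence between one EEG channel and the visual-change signal,
    assuming both have been resampled to a common rate fs (in Hz)."""
    # 4-second windows give 0.25 Hz frequency resolution, enough to
    # separate the ~2 Hz sign/phrase rate from vision's ~10 Hz preference.
    freqs, coh = coherence(eeg, visual_change, fs=fs, nperseg=int(4 * fs))
    return freqs, coh
```

Entrainment to sign language would then show up as elevated coherence near 2 Hz, the rate of signs and phrases, rather than near vision's preferred 10 Hz.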

Their discovery answers a fundamental question that has lingered for years in research on speech entrainment: is it specialized for auditory speech? The study reveals that the brain entrains to the information in the signal, not to the difference between seeing and hearing. Participants' brain waves locked onto the specific frequencies of sign language rather than the higher frequency that vision tends to prefer.

"This is an exciting finding because scientists have been theorizing for years about how adaptable or flexible entrainment may be, but we were never sure if it was specific to auditory processing or if it was more general purpose," Brookshire said. "This study suggests that humans have the ability to follow perceptual rhythms and make temporal predictions in any of our senses."

In a broader sense, neuroscientists want to understand how the human brain creates and perceives language, and entrainment has emerged as an important mechanism. By revealing neural entrainment as a general-purpose strategy for improving sensitivity to informational peaks, the study advances the understanding of human language and perception.

"The piece of the paper that I find particularly exciting is that it compares how signers and non-signers process American Sign Language stimuli," Goldin-Meadow said. "Although both groups showed the same level of entrainment in early visual regions, they displayed differences in frontal regions -- this finding sets the stage for us to identify aspects of neural entrainment that are linked to the physical properties of the visual signal compared to aspects that appear only with linguistic knowledge."

University of Chicago
