What articulation-relevant brain regions do when we listen

July 02, 2018

Brain regions involved in the articulation of language are also active during the perception of language. This finding by a team from the BrainLinks-BrainTools Cluster of Excellence at the University of Freiburg makes a significant contribution to settling a research question that has been hotly debated for decades. The scientists have published their results in the journal Scientific Reports.

Spontaneous oral communication is a fundamental part of our social life. But what happens in the human brain while we speak and listen? The neuroscience of language has advanced steadily over the past decades thanks to experimental studies. However, little is known about how the brain supports spoken language under everyday, non-experimental, spontaneous conditions. The question of whether brain regions responsible for articulation are also activated during the perception of language has divided scholars into two camps. Some have observed such activation in experimental studies and concluded that it reflects a mechanism necessary for the perception of language. Others have not found this activation in their experiments and deduced that it must be rare or may not exist at all.

Nevertheless, both camps shared the same concern: brain activity in regions relevant to articulation could be an artifact of the experimental design, since experimental conditions differ massively from those of spontaneous language use. A study based on natural conversations was therefore needed.

Using an unusual study design, the researchers from Freiburg succeeded in studying neuronal activity during such conversations. They analyzed brain activity that had been recorded for diagnostic purposes while neurological patients held everyday conversations; the patients subsequently donated these recordings for research. The scientists showed that brain regions relevant to articulation reliably display activity during the perception of spontaneous spoken language. The fact that these regions were not activated when the subjects heard non-speech sounds suggests that this activity may be specific to speech.
-end-
Original publication:

Olga Glanz (Iljina), Johanna Derix, Rajbir Kaur, Andreas Schulze-Bonhage, Peter Auer, Ad Aertsen, Tonio Ball (2018): Real-life speech production and perception have a shared premotor-cortical substrate. In: Scientific Reports.

https://rdcu.be/VDs2

Contact:

PD Dr. Tonio Ball
BrainLinks-BrainTools / Freiburg University Medical Center
Tel.: +49 761 270-93160
tonio.ball@uniklinik-freiburg.de

Olga Glanz
BrainLinks-BrainTools / Freiburg University Medical Center
olga.glanz@uniklinik-freiburg.de

University of Freiburg
