How do we follow the rhythm of language? The answer depends on our brain's pathways

March 04, 2019

How is our speech shaped by what we hear? The answer varies, depending on the make-up of our brain's pathways, a team of neuroscientists has found. The research, which maps how we synchronize our words with the rhythm of the sounds we hear, offers potential methods for diagnosing speech-related afflictions and evaluating cognitive-linguistic development in children.

"Some people spontaneously synchronize the pace of their speech to match the rhythm of the speech they are hearing, while others do not," explains Florencia Assaneo, a post-doctoral researcher in New York University's Department of Psychology and the lead author of the study, which appears in the journal Nature Neuroscience. "Whether you synchronize or not predicts functional and structural aspects of our language brain network as well as our ability to learn new words."

"These discoveries result from a novel behavioral test, which reveals how individual differences are predictive of audio-motor synchronization and neurophysiological function, among other phenomena," adds David Poeppel, a professor of psychology and neuroscience at NYU and director of the Max Planck Institute for Empirical Aesthetics in Frankfurt and the study's senior author. "The potency of such a test as a tool may lead to new discoveries in language research and perhaps help to spot afflictions such as Alzheimer's, Parkinson's, or multiple sclerosis."

Extensive research has been done on how we synchronize our body movements to sound input, such as tapping our foot to the rhythm of a song. But less understood is how our brain functions in a similar speech scenario, such as singing along to a favorite tune.

The question of whether the human ability to speak is tightly connected with our ability to synchronize to the world around us is a significant one. For example, it's known that preschoolers' proficiency in synchronizing their bodies to a beat predicts their language abilities.

However, scientists have not examined whether there is a direct link between speech production rhythms--i.e., the coordinated movements of the tongue, lips, and jaw that constitute speech--and the rhythms of the perceived audio signal.

"In other words, are our mouths coupled to our ears?" Assaneo asks.

To explore this question, the scientists, who also included researchers from the University of Barcelona and Catalan Institution for Research and Advanced Studies (ICREA), conducted a series of experiments in which the subjects listened to a rhythmic sequence of syllables (e.g., "lah," "di," "fum") and at the same time, were asked to continuously whisper the syllable "tah".

The findings, based on more than 300 test subjects, revealed an unexpected division in how we verbalize sounds in response to what we hear. Some spontaneously synchronized their whispering to the syllable sequence (high synchronizers) while others remained impervious to the external rhythm (low synchronizers).

This division raised additional questions, such as: Does the grouping based on this test tap into how people's brains are organized? And does it have any behavioral consequences with broader significance? To answer these, the researchers deployed a battery of additional techniques.

First, the researchers asked whether white matter pathways, which affect learning and coordinate communication among different parts of the brain, differ between groups. To do this, they studied MRI data from the subjects. Here, the team found that high synchronizers have more white matter volume in the pathways connecting speech-perception (listening) areas with speech-production (speaking) areas than do low synchronizers.

Second, they used magnetoencephalography (MEG), a technique that tracks neural dynamics, to record brain activity while participants passively listened to rhythmic syllable sequences. High synchronizers showed more brain-to-stimulus synchrony than did low synchronizers: their neural activity oscillated at the same frequency as the perceived syllable rate in the part of the brain linked to speech-motor planning.
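The "brain-to-stimulus synchrony" described here is typically quantified with a phase-locking measure: both the neural signal and the stimulus envelope are band-pass filtered around the syllable rate, and one checks how consistently their phase difference holds over time. The sketch below is an illustrative implementation of a standard phase-locking value (PLV), not the study's actual analysis pipeline; the ~4.5 Hz syllable rate and the signal names are assumptions for the example.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(signal_a, signal_b):
    """Phase-locking value (PLV) between two narrow-band signals.

    Both inputs are assumed to be band-pass filtered around the
    syllable rate (e.g. ~4.5 Hz). The Hilbert transform gives each
    signal's instantaneous phase; PLV measures how consistently the
    phase difference stays fixed over time (1 = perfect locking,
    0 = no systematic phase relationship).
    """
    phase_a = np.angle(hilbert(signal_a))
    phase_b = np.angle(hilbert(signal_b))
    # Average the unit phasors of the phase difference; a stable
    # difference keeps them aligned, so the mean vector is long.
    return np.abs(np.mean(np.exp(1j * (phase_a - phase_b))))

# Toy demonstration: a "high synchronizer" signal oscillating at the
# stimulus rate (with a fixed phase lag) versus a "low synchronizer"
# signal drifting at an unrelated frequency.
t = np.linspace(0, 10, 5000, endpoint=False)   # 10 s at 500 Hz
stimulus = np.sin(2 * np.pi * 4.5 * t)          # 4.5 Hz syllable rhythm
locked   = np.sin(2 * np.pi * 4.5 * t + 0.8)    # same rate, lagged phase
drifting = np.sin(2 * np.pi * 3.1 * t)          # unrelated rate

print(phase_locking_value(stimulus, locked))    # near 1
print(phase_locking_value(stimulus, drifting))  # near 0
```

Note that PLV is insensitive to the size of the phase lag, only to its stability, which is why a consistently delayed neural response still counts as strong synchrony.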

"This implies that areas related to speech production are also recruited during speech perception, which likely helps us track external speech rhythms," observes Assaneo.

Finally, the scientists tested if being a high or low synchronizer predicts how well people learn new words. Specifically, they studied the early stages of language learning: the ability to identify a sound as a word--even without knowing the meaning. The results showed that high synchronizers were better learners of these words than were low synchronizers.

"In everyday life, this could give the high synchronizers an advantage," notes Assaneo. "For example, when in a foreign country, they may more easily pick up words in an unfamiliar language from the people talking around them."
-end-
This research was supported by a grant from the National Institutes of Health (2R01-DC05660).

DOI: 10.1038/s41593-019-0353-z

New York University

