Research studies role native language plays in processing words in new languages

December 16, 2014

LAWRENCE - Research at the University of Kansas is exploring how a person's native language can influence the way the brain processes auditory words in a second language.

Because cues that signal the beginning and ending of words can differ from language to language, a person's native language can provide misleading information when learning to segment a second language into words. Annie Tremblay, an assistant professor of linguistics, is trying to better understand the kinds of cues second language learners listen for when recognizing words in continuous speech. She is also studying how adaptable adult learners are in acquiring these new speech cues.

Working with a group of international collaborators in the Netherlands, South Korea and France, Tremblay received a three-and-a-half-year, $259,000 National Science Foundation grant for the research.

"The moment we hear a new language, all of a sudden we hear a stream of sounds and don't know where the words begin or end," Tremblay said. "Even if we know words from the second language and can recognize them in isolation, we may not be able to locate these words in continuous speech, because a variety of processes affect how words are realized in context."

For second language learners, some cues are easier to pick up than others, such as which consonants commonly begin or end words. One example is the "z" sound, which often ends words in English but rarely begins them.
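
As a rough illustration of the distributional knowledge behind such a cue, the short Python sketch below counts how often a sound begins versus ends words. The word list is invented for the example, and a real analysis would use phonetic transcriptions rather than spelling; none of it comes from the study itself.

    # Toy word list invented for this illustration; English spelling and
    # sound diverge (the plural "s" in "dogs" is pronounced as a "z"),
    # so a real count would use phonetic transcriptions.
    words = ["zoo", "zebra", "zone", "buzz", "jazz", "fizz",
             "quiz", "whiz", "topaz", "waltz"]

    def edge_counts(word_list, sound):
        """Count how often `sound` occurs word-initially vs. word-finally."""
        starts = sum(w.startswith(sound) for w in word_list)
        ends = sum(w.endswith(sound) for w in word_list)
        return starts, ends

    starts, ends = edge_counts(words, "z")
    print(f"'z' begins {starts} of {len(words)} words and ends {ends}.")
    # Prints: 'z' begins 3 of 10 words and ends 7.
    # A listener tracking such statistics can treat a "z" sound as
    # evidence that a word is ending rather than beginning.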

Other cues, such as intonation, are harder to master and are more likely to be influenced by a speaker's native language. Tremblay points to English, where a stressed syllable is a strong indication that a new word is beginning. In French the opposite is true: prominent syllables tend to be at the end of words.

"This kind of information can't be memorized in a language such as French. It has to be computed. And this is where second language learners struggle," Tremblay said.

An example of this confusion is the French phrase for a cranky cat, "chat grincheux." For a split second, the phrase can sound like the English pronunciation of "chagrin," a word with French origins.

"If you hear the 'cha' syllable as being prominent, it cannot come from the word chagrin in French because the first syllable of chagrin will not be stressed in French," Tremblay said.

With her international collaborators, Tremblay manipulates intonation cues like those in the example above to test how listeners use them to recognize words. In one experiment, participants hear a sentence containing a phrase such as "chat grincheux," see four words on a computer screen, such as "chat," "chagrin" and two unrelated words, and are asked to click on the correct one. An eye-tracking device records when and for how long the participant looks at each word.

Another experiment has participants listen to an artificial, invented language for 20 minutes. They are then asked to identify words in that language.
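
The design of such segmentation studies can be sketched generically: invented words are strung together with no pauses, so the boundaries are only recoverable from the structure of the stream. The minimal Python sketch below uses a made-up vocabulary; the study's actual materials, which also manipulate intonation, differ.

    import random

    # Toy four-word vocabulary, invented for this sketch.
    LEXICON = ["tupiro", "golabu", "bidaku", "padoti"]

    def make_stream(n_words=50, seed=0):
        """String random words together with no spaces or pauses,
        leaving the listener to infer the boundaries."""
        rng = random.Random(seed)
        return "".join(rng.choice(LEXICON) for _ in range(n_words))

    print(make_stream()[:54])  # an unbroken run of syllables
    # Within a word, each syllable predicts the next ("tu" is always
    # followed by "pi," then "ro"); across a word boundary, the next
    # syllable is unpredictable. That drop in predictability, along with
    # cues such as intonation, is what lets listeners find the words.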

So far the research group has studied native English and Korean speakers who have learned French, and native French speakers who live in France or in the United States.

One of the more interesting findings is that when two languages are broadly similar but differ in small ways, it can be harder for second language learners to use the correct speech cues to identify words. In both French and Korean, for example, prominent syllables tend to be at the end of words. However, there is one small difference: Korean intonation drops before the next word begins, whereas French intonation drops during the first syllable of the next word.

"For English speakers, the differences between English stress and French prominence are so salient that it ought to be obvious and they ought to readjust their system," Tremblay said. "Whereas in Korean they think, 'Oh, this is just like Korean.' It sounds similar, and they don't readjust their use of this information."

Researchers also found that native French speakers who lived in France did better than native French speakers who lived in the United States at using French-like intonation cues to locate words in an artificial language. In fact, the longer a native French speaker lived in the United States, the worse they did at using the cues from their native language.

"This suggests that the speech processing system is extremely adaptive. Despite all the claims about the existence of a critical period for language learning, the speech processing system is actually very flexible; it might just take a long time to completely override the effects of the native language," Tremblay said.

The research group continues to collect data and plans to include native Dutch speakers who speak French.
-end-


University of Kansas
