Deaf infants more attuned to parent's visual cues

October 15, 2019

Eye gaze helps infants communicate. Through everyday interactions, eye gaze establishes a social connection between parent and child and is linked to early word learning.

But can learning experiences before a baby's first birthday prompt babies to pay more attention to their parent's eye gaze?

To test this, a research team led by the University of Washington's Institute for Learning & Brain Sciences (I-LABS) sought out Deaf infants raised by Deaf parents -- families who primarily use visual language and visual cues.

The result: early experiences do matter. Deaf infants exposed to American Sign Language demonstrated strong gaze-following behavior -- at a more advanced level than hearing infants. The study, published Oct. 15 in the journal Developmental Science, stems from broader research into early learning and finds that Deaf infants of Deaf parents may be more attuned than hearing infants to the social and visual signals of others.

"Children adapt to the people who communicate with them," said Rechele Brooks, a research scientist at I-LABS and lead author of the study. "Whatever your social context is, you're learning from the people around you. Children thrive through interactions with other people. This work shows that children tune into social cues in their environment starting from early infancy."

While gaze following in hearing infants has been studied, the behavior hasn't been formally examined in Deaf infants.

"Informal observations of Deaf infants interacting with their Deaf parents have suggested that these infants possess keen control over their eye-gaze behavior. To evaluate this, we set up a controlled experiment and tracked the gaze behavior of multiple infants," said Jenny Singleton, a linguistics professor at the University of Texas at Austin and co-author of the study. Fewer than 10% of Deaf infants have Deaf parents, thus the research team needed to recruit Deaf infants from across the country.

For this study, 12 Deaf infants participated, along with 60 hearing infants of the same age. Both groups had natural language experience with their families from birth -- the Deaf infants with a visual language (American Sign Language) and the hearing infants with spoken language.

During the study, each infant sat with a parent, facing a researcher across a table. The researcher set up the room with two objects, one on either side of the infant. Silently, the researcher then looked to one of the two objects, and a camera recorded the infant's response. Each trial was objectively "scored" based on where the infant directed their gaze.

Scores showed that the Deaf infants were nearly twice as likely as hearing infants to accurately follow the gaze of an adult. Younger Deaf infants (those between 7 and 14 months old) were even more likely than their hearing peers to do so.

The accelerated gaze following among Deaf infants could be related to their exposure to sign language. "A signed language environment creates a natural demand on young infants to shift their eye gaze between their parent (who is signing) and the world of interesting objects. Deaf infants may also have enhanced visual control as a result of their sole reliance on visual cues, and not auditory cues," said Singleton.

In the experiment, Deaf infants were also more apt to look back at the adult after following the adult's gaze. This "checking back" behavior is a form of communication, which can indicate that the infant is seeking more information from the adult. Hearing infants can learn from both what an adult looks at and what the adult verbally says about it; Deaf infants must rely on visual cues.

Andrew Meltzoff, co-director of I-LABS and a co-author of the study, added, "Deaf infants, like hearing infants, strive to communicate with others. They are raised with a visual language and become exquisitely attuned to the visual signals from adults."

There is a more general lesson about human nature, too, Meltzoff said: "The human mind and brain flexibly adapt to achieve our fundamental birthright -- connections to others."

The study was funded by the National Science Foundation, the National Institute of Child Health and Human Development, the Virginia Merrill Bloedel Hearing Research Center and the I-LABS Innovative Research Fund.

For more information, contact Brooks at recheleb@uw.edu.

University of Washington
