From scaffolding to screens: Understanding the developing brain for reading

May 04, 2020

In the debate about nature versus nurture for developing reading skills, cognitive neuroscientists have a clear message: both matter. From infancy, children have a neural scaffolding in place upon which environmental factors refine and build reading skills. In new work being presented today at the Cognitive Neuroscience Society (CNS) virtual meeting, scientists are reporting on these biological and environmental factors -- including early screen time -- as they uncover biomarkers that can identify children at risk for dyslexia and other reading acquisition disorders.

"Reading is a relatively new human invention. To read, our brains have to 'recycle' neural circuits originally used for other abilities such as visual and language processing, as well as attention and cognitive abilities," says Tzipi Horowitz-Kraus of The Technion in Israel and Cincinnati Children's Hospital, who is chairing the CNS symposium about the new work. "The fact that 5-10% of children worldwide, across cultures and genetic backgrounds, suffer from dyslexia suggests that this disability is not limited to a specific language."

Indeed, the research being presented by Horowitz-Kraus and others suggests a variety of biological precursors are present in children prior to school age across languages, and several environmental factors can help or hinder reading acquisition. The goal is to identify children at risk early, to provide the best possible interventions that will improve literacy.

The reading brain in infancy

One of the biggest insights to emerge in recent years in the study of reading acquisition is that most interventions to identify and treat dyslexia in school come too late. Over the past decade, longitudinal studies of young children from the lab of Nadine Gaab at Harvard Medical School and from labs around the world have shown that the brains of children who will develop dyslexia are already atypical even before they start kindergarten.

"We knew that the brain of someone with dyslexia was different from a control, but we didn't know if it was something that developed before the onset of formal reading instruction or if it developed in response to a daily failure to learn to read over a significant period of time," Gaab says. "Our work was the first time MRI imaging could show that some of the brain characteristics predate the onset of reading development."

And in new work being presented at the CNS meeting and available via preprint, Gaab's team has shown that, as a group, babies as young as 3 months old have an underlying infrastructure that helps predict success in reading years later.

As part of the BOLD (Boston Longitudinal Dyslexia) study, Gaab's team has scanned the brains of 140 infants who have a familial risk for dyslexia and then followed them over time to study changes in the structure and function of their brains. For the newest data, 45 of the once-infant subjects have now turned 5 or 6 years old, allowing the researchers to map their brain scans from infancy to their pre-reading skills.

"What our infant data suggest is that there is a structural brain scaffold in infancy that serves as a foundation," Gaab explains. "Language and reading may be a process that refines this pre-existing brain scaffold."

Studying the brains of young children in an MRI machine is far from simple, Gaab explains. With infants, the goal is to have the participants sleep in the scanner, so her lab looks like an elaborate daycare center -- with adaptable rocking chairs, swings, cribs, and other gear optimized for use with the scanner. While safely sleeping in the MRI, the babies hear stories read to them, allowing the researchers to capture not only structural information about their brains but also, surprisingly, functional data. "We were very surprised to see robust language networks activated while the infants sleep," Gaab says.

Returning to the lab as 5- and 6-year-olds, the children identify word sounds in games designed to test their pre-reading skills. As they get older, the children will do increasingly advanced tasks, such as reading in the scanner. This longitudinal work gives the researchers a big-picture view of reading development rather than just a snapshot.

Gaab's lab is next working to understand the co-occurrence of disorders such as ADHD and dyscalculia (a math learning disorder) with dyslexia. They also want to understand techniques children successfully use to compensate for dyslexia in the brain. "We now see children are not a clean slate for reading experience," Gaab says, and they want to not only better understand the determining factors but also inform policy-makers and the public.

The reading brain on screen

While studying neurobiochemistry for her master's program, Horowitz-Kraus worked on SAT preparation with her younger brother, who was struggling with reading despite his high intelligence in nonverbal tasks. "Observing my brother's frustration in executing a task that is very intuitive for individuals without dyslexia made me set the goal to seek neurobiological correlates for reading difficulties and to find ways to improve reading ability," she says. "This way, I thought, the difficulty could be diagnosed objectively, maybe even before reading is formally acquired, and proven without a doubt to be real."

Fifteen years later, Horowitz-Kraus has done just that and, in new research, is seeking to understand how day-to-day conditions affect the neurobiological foundation for reading in the brain. "Although dyslexia is a genetic disorder, the environment has an impact: it can reduce or increase reading challenges," she says. "The brain is extremely plastic at the pre-reading age, and hence negative stimuli, such as exposure to screens, may have an amplifying effect on a child's outcomes."

In a series of studies, Horowitz-Kraus and colleagues examined how the home literacy environment, including screen exposure, affects the brain circuits of children 3 to 5 years old, in particular executive functions, language, and visual processing. As published recently in JAMA Pediatrics, screen-based media use beyond American Academy of Pediatrics guidelines was associated with "lower microstructural integrity of brain white matter tracts supporting language and emergent literacy skills in prekindergarten children."

Earlier work using EEG had found reduced narrative comprehension in preschool children exposed to stories on screens compared with in-person reading. The team has also found that screen exposure engages different brain networks in children with dyslexia than in typical readers.

The results suggest, Horowitz-Kraus says, that listening to stories through screens is not equivalent to joint reading when it comes to nurturing the developing brain. "There is no replacement for joint storytelling in engaging neuronal circuits related to future reading," she says.

Such studies, enabled by modern neuroimaging, are allowing researchers for the first time to determine what infrastructure is needed to be able to read, to track the typical and atypical development of that infrastructure, and to develop appropriate early interventions.

Both Horowitz-Kraus and Gaab envision moving to a more preventative model for reading disorders. "This preventive model is something we embrace a lot in medicine but for some reason, we have not yet done so in education," Gaab says. She cites cholesterol screening to help identify those at risk for heart disease as a model that could work for dyslexia and other learning disorders.

Already, their research and that of others has led to new educational policies, including early dyslexia screening in 29 states to identify children at risk in kindergarten. "We and other cognitive neuroscientists hope to continue to contribute to that shift in this model," Gaab says.

The symposium "Moving from a Deficit-Oriented to a Preventive Model in Education: Examining Neural Correlates for Reading Development" is taking place at CNS 2020 Virtual, May 2-5.

CNS is committed to the development of mind and brain research aimed at investigating the psychological, computational, and neuroscientific bases of cognition. Since its founding in 1994, the Society has been dedicated to bringing its 2,000 members worldwide the latest research to facilitate public, professional, and scientific discourse.

Cognitive Neuroscience Society

