
Follow motion or light? How the brain deals with multiple visual inputs

03.05.26 | University of Konstanz


Imagine arriving at a busy location with people moving around and a multitude of visual and other sensory cues vying for your attention. How does the brain integrate such floods of sensory information and reach a decision on what to do or where to move next? A new study published in Nature Communications, led by Katja Slangewal and Professor Armin Bahl at the Centre for the Advanced Study of Collective Behaviour at the University of Konstanz, sheds light on this fundamental question, using larval zebrafish as a model to uncover the neural mechanisms behind visual integration and decision-making. The findings not only deepen our understanding of neural circuits but also offer insights for future research in areas like robotics, artificial intelligence, and human neuroscience.

An additive strategy resolves sensory conflicts
Animals, including humans, constantly perceive a complex flood of sensory information from their environment that guides their behavioural decisions. Sometimes these inputs conflict with each other, for example by drawing attention in different directions. How does the brain resolve such conflicts? To address this question, the team focused on two well-studied behaviours in larval zebrafish: the optomotor response, a reflexive movement that follows visual motion, such as a drifting pattern, and phototaxis, the tendency to move toward light, which helps zebrafish navigate their environment.

"A zebrafish might simultaneously perceive motion in one direction and light from another. Previous research suggested that the brain might solve that conflict by using either an additive strategy— which means combining inputs—or a 'winner-takes-all' approach—which means prioritizing the strongest cue," explains Armin Bahl. "But the neural mechanisms underlying these strategies remained unclear."
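The two candidate strategies described in the quote can be sketched in a few lines of Python. The cue values and decision rules below are purely illustrative assumptions, not the quantities measured in the study:

```python
# Two candidate strategies for resolving a cue conflict (illustrative).
# Positive values bias turning toward the motion cue, negative toward the light cue.

def additive(motion: float, light: float) -> float:
    """Combine both cues: the response reflects their sum."""
    return motion + light

def winner_takes_all(motion: float, light: float) -> float:
    """Follow only the stronger cue and ignore the weaker one."""
    return motion if abs(motion) >= abs(light) else light

# Conflicting cues of unequal strength:
motion_cue, light_cue = 0.7, -0.4
print(additive(motion_cue, light_cue))          # small net bias toward motion
print(winner_takes_all(motion_cue, light_cue))  # full commitment to motion
```

Under the additive rule, the weaker cue still tempers the response; under winner-takes-all, it is discarded entirely. This is the behavioural difference the experiments were designed to tease apart.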

The researchers presented larval zebrafish with conflicting motion and light cues. They discovered that the animals use an additive behavioural algorithm in their brains to integrate three key visual features of the presented cues: motion coherence (strength and direction of the motion cue patterns), luminance level (brightness of the light cue), and changes in luminance (i.e., sudden increases or decreases in the light cue). The discovered algorithm allows zebrafish to weigh and combine these features to make rapid, adaptive decisions. To identify the brain regions where the neural computation mechanism takes place, the team monitored neural activity across the entire brain using cutting-edge imaging techniques.
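A minimal sketch of such an additive rule over the three features follows. The weights, sign conventions, and feature values are hypothetical placeholders, not the fitted parameters from the paper:

```python
# Hypothetical weighted-sum integration of the three visual features.
# All weights below are illustrative assumptions.

def turn_drive(motion_coherence: float,
               luminance: float,
               luminance_change: float,
               w_motion: float = 1.0,
               w_lum: float = 0.5,
               w_dlum: float = 0.8) -> float:
    """Return a signed drive; its sign picks the turn direction."""
    return (w_motion * motion_coherence
            + w_lum * luminance
            + w_dlum * luminance_change)

# Motion points one way (+), the light cue pulls the other way (-):
drive = turn_drive(motion_coherence=0.6, luminance=-0.4, luminance_change=0.0)
direction = "toward motion" if drive > 0 else "toward light"
```

Because the features enter as a weighted sum, each cue shifts the decision in proportion to its strength rather than one cue vetoing the others.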

Parallel neural pathways in the hindbrain process multiple sensory inputs
"Our imaging revealed that the anterior hindbrain is a critical node where motion, luminance, and luminance change signals converge," says Katja Slangewal. "This region acts as a central hub for integrating visual cues. It processes these parallel inputs and combines them to guide behaviour."

The experiments revealed three parallel computational pathways in the brain, each corresponding to one of the visual features and converging in the anterior hindbrain, thus providing a direct link between brain activity and decision-making.

Based on these findings, the researchers developed an additive network model to simulate how zebrafish process sensory inputs. By fitting the model to experimental data, the team confirmed that the brain uses a weighted sum of motion, luminance, and luminance change signals to determine behaviour.

"Our model not only explains the behavioural data but also predicts how silencing specific pathways—such as motion or luminance processing—would disrupt decision-making," says Armin Bahl.
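In an additive model, "silencing" a pathway corresponds to zeroing its weight. The toy numbers below are assumptions chosen only to show how removing the motion channel can flip the predicted decision toward the light cue:

```python
def additive_drive(features, weights):
    """Weighted sum over (motion, luminance, luminance_change) features."""
    return sum(w * f for w, f in zip(weights, features))

features = (0.6, -0.4, 0.2)   # motion, luminance, luminance change (made up)
weights  = (1.0, 0.5, 0.8)    # intact circuit (illustrative values)

intact   = additive_drive(features, weights)            # positive: follow motion
silenced = additive_drive(features, (0.0, 0.5, 0.8))    # motion pathway silenced

# With the motion channel silenced, the sign of the drive flips:
# the model now predicts the fish follows the light cue instead.
```

This kind of sign flip is the sort of testable prediction a fitted additive model yields: knock out one channel and the remaining weighted inputs determine the behaviour.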

The study provides a novel, brain-wide account of how sensory information is integrated to drive actions, bridging the gap between behavioural algorithms and their neural implementation. The findings advance our understanding of how vertebrate brains transform sensory information from complex environments into actions. This may have broad implications for the fields of neuroscience, artificial intelligence, and human health.


Journal: Nature Communications

DOI: 10.1038/s41467-026-69633-4

Contact Information

Helena Dietz
University of Konstanz
kum@uni-konstanz.de

How to Cite This Article

APA:
University of Konstanz. (2026, March 5). Follow motion or light? How the brain deals with multiple visual inputs. Brightsurf News. https://www.brightsurf.com/news/1ZZGX271/follow-motion-or-light-how-the-brain-deals-with-multiple-visual-inputs.html
MLA:
"Follow motion or light? How the brain deals with multiple visual inputs." Brightsurf News, 5 Mar. 2026, https://www.brightsurf.com/news/1ZZGX271/follow-motion-or-light-how-the-brain-deals-with-multiple-visual-inputs.html.