
Let AI grow like the brain: Temporal development mechanisms enable cross-domain continual learning

04.10.26 | Science China Press



How does artificial intelligence continue to improve its capabilities? For a long time, expanding model size has been regarded as an important way to enhance the performance of artificial neural networks, but it has also led to rising energy consumption and growing computational costs. In contrast, during development the human brain does not simply increase connection density; instead, it continuously gains new cognitive abilities through selective pruning.

Inspired by these observations, the research team proposed a temporally developmental continual learning framework for spiking neural networks. By enabling the temporal establishment and reorganization of connections across different regions, the approach achieves continual learning from simple to complex across perception–motor–interaction tasks while network size is progressively reduced, offering a new pathway toward low-energy, sustainably evolving general cognitive intelligence.

Temporal-Development–Inspired Continual Learning Mechanism

Studies show that brain development follows clear temporal principles: neural connectivity first increases and then becomes refined, with cross-regional long-range connections gradually strengthening while local connections are selectively pruned. Primary brain regions mature earlier than those supporting higher cognition, and feedback from higher cognitive functions in turn optimizes lower-level structures. Along this developmental trajectory, infants progressively acquire cognitive functions from simple to complex. Building on these principles, the researchers proposed a temporal-development-inspired continual learning method.

The approach allows cognitive modules in spiking neural networks to grow progressively following the learning sequence of perception, motor control, and interaction, while evolving cross-regional long-range connections to promote knowledge reuse across tasks. At the same time, feedback mechanisms are introduced to inhibit and prune redundant local connections from earlier tasks, enabling the network to become increasingly compact as learning progresses.
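The grow-then-prune dynamic described above can be illustrated with a minimal NumPy sketch. This is not the paper's actual spiking-network implementation: the two "regions," the magnitude-based pruning rule, the 50% keep ratio, and the long-range reinforcement factor are all illustrative assumptions. The sketch only shows the structural idea — within-region (local) connections shrink after each task while cross-region (long-range) connections are retained and strengthened.

```python
import numpy as np

rng = np.random.default_rng(0)

def prune_local(weights, mask, keep_frac):
    """Zero out the weakest local connections, keeping keep_frac of the nonzero ones."""
    mags = np.abs(weights[mask])
    nonzero = mags[mags > 0]
    if nonzero.size == 0:
        return weights
    k = max(1, int(np.ceil(nonzero.size * keep_frac)))
    threshold = np.sort(nonzero)[::-1][k - 1]  # magnitude of the k-th strongest survivor
    pruned = weights.copy()
    pruned[mask & (np.abs(weights) < threshold)] = 0.0
    return pruned

# Two toy "regions" of 4 units each: block-diagonal entries are local
# connections, off-diagonal blocks are cross-regional long-range connections.
n = 8
W = rng.normal(size=(n, n))
local_mask = np.zeros((n, n), dtype=bool)
local_mask[:4, :4] = True
local_mask[4:, 4:] = True

# After each simulated task: prune half the surviving local connections,
# and (as a stand-in for long-range evolution) reinforce cross-region weights.
for task in range(3):
    W = prune_local(W, local_mask, keep_frac=0.5)
    W[~local_mask] *= 1.1  # hypothetical long-range reinforcement

density = np.count_nonzero(W[local_mask]) / local_mask.sum()
print(f"local density after 3 tasks: {density:.3f}")  # 32 -> 16 -> 8 -> 4 connections
```

The network thus becomes progressively sparser in its local wiring while the cross-regional pathways that would carry reusable knowledge remain fully connected, mirroring the compaction the article describes.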

Energy-Efficient Cross-Domain Continual Learning

The research team found that the proposed method demonstrates stable and strong continual learning performance across multiple cognitive domains, including perception, motor control, and interaction, and achieves leading results on several widely used continual learning benchmarks. Experimental results show that the model learns complex tasks progressively along a “simple-to-complex” trajectory, clearly outperforming direct training or direct pruning approaches. Even as the network scale is continuously reduced, the model effectively preserves memory of previously learned tasks, significantly mitigating catastrophic forgetting while continuing to acquire new cognitive capabilities.

Further analysis indicates that this performance gain arises from brain-like dynamic changes within the network. As learning progresses, local connections first grow rapidly and are then selectively inhibited and pruned, reducing interference from irrelevant or outdated knowledge, while cross-regional long-range connections are continuously strengthened to support the selective reuse of prior knowledge with shared structure and semantics. Importantly, this process does not rely on conventional continual learning strategies such as regularization, experience replay, or weight freezing. The researchers note that this brain-inspired developmental mechanism enhances learning and memory in an efficient, low-energy manner, highlighting the potential of brain developmental principles to drive the next generation of artificial intelligence.

National Science Review

10.1093/nsr/nwag066

Experimental study


Contact Information

Bei Yan
Science China Press
yanbei@scichina.com
