
How the brain charts emotion in a map-like way

03.10.26 | Emory University

It is well established in psychology that humans conceptualize emotions by features known as valence (the degree of pleasantness or unpleasantness) and arousal (the intensity of bodily reactions, such as rapid breathing or a racing heart).

If you think of “pleasantness” as longitude and “bodily reaction” as latitude, you can imagine a “mental map,” with nodes that “chart” knowledge of emotion.
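The longitude/latitude analogy can be made concrete with a toy sketch. The coordinates below are illustrative guesses, not values from the study: each emotion sits at a point whose x-coordinate is valence and whose y-coordinate is arousal, and "closeness" on the mental map falls out of ordinary geometry.

```python
# A toy "mental map" of emotion: x = valence (pleasantness),
# y = arousal (intensity of bodily reaction).
# Coordinates are illustrative, not from the study.
import math

emotion_map = {
    "excitement": (0.8, 0.8),    # pleasant, high arousal
    "calm":       (0.6, -0.6),   # pleasant, low arousal
    "fear":       (-0.7, 0.8),   # unpleasant, high arousal
    "sadness":    (-0.6, -0.5),  # unpleasant, low arousal
}

def distance(a, b):
    """Euclidean distance between two emotions on the map."""
    (x1, y1), (x2, y2) = emotion_map[a], emotion_map[b]
    return math.hypot(x1 - x2, y1 - y2)

# Emotions that share arousal sit closer than those that differ
# on both axes:
assert distance("fear", "excitement") < distance("fear", "calm")
```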

The neural mechanisms giving rise to this configuration, however, have remained unclear.

Now, a new study reveals that hippocampal-prefrontal circuits — neural structures implicated in forming other types of cognitive maps — could support the mental mapping of emotion.

Nature Communications published the research by neuroscientists at Emory University. The results show how the hippocampus represents emotion concepts in a structured hierarchy of “nodes” of pleasantness and bodily reaction, while the ventromedial prefrontal cortex tracks the relationships between these nodes, or how they are distributed on the mental map.

Pinpointing the neural mechanisms that produce such map-like representations may ultimately help in the treatment of some mental illnesses, says Philip Kragel, senior author of the research and Emory professor of psychology.

“Research has shown that individuals with depression and anxiety represent emotions in a more compressed, less differentiated way,” he explains. “And that people who represent emotion with more granularity and differentiation tend to have better health outcomes.”

The current paper combined human brain imaging data, pattern recognition and simulations using AI neural networks.

“People’s emotional experiences are subjective,” says Yumeng Ma, first author of the paper and a PhD student in psychology at Emory. “We’re using technology to understand the mechanisms underlying emotions in an objective, scientific way.”

Developing new approaches

“Emotions are central to human experience; they are not simply reactions to things,” Kragel says. “They are important to our success and to our well-being. They help us to communicate better, learn from our experiences, and empathize with others.”

And yet, he adds, emotions have been notoriously difficult to study scientifically.

Kragel is a leader in developing computational methods to study the nature of emotions. His Emotion Cognition and Computation Lab (ECCO Lab) works at the intersection of psychology, cognitive neuroscience and machine learning.

AI neural networks, modeled on the human brain, are one tool used by the lab.

Like the human brain, an artificial neural network must boil down complex data into its essence, a process known as “embedding,” so that vast amounts of knowledge can be stored in an organized and efficient manner.
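As a rough illustration of what “embedding” means here, the sketch below compresses synthetic 50-dimensional data (random numbers, not brain data) down to two dimensions with principal component analysis; the sizes and noise level are arbitrary choices for the demo.

```python
# Minimal sketch of "embedding": compress high-dimensional data into a
# few dimensions while preserving its essential structure.
# PCA via SVD; data are synthetic, not real brain recordings.
import numpy as np

rng = np.random.default_rng(0)
# 100 samples that secretly vary along only 2 latent directions, plus noise
latent = rng.normal(size=(100, 2))
mixing = rng.normal(size=(2, 50))
data = latent @ mixing + 0.05 * rng.normal(size=(100, 50))

centered = data - data.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
embedding = centered @ Vt[:2].T   # project onto the top-2 principal axes

# The 2-D embedding captures nearly all of the variance in the data
explained = (S[:2] ** 2).sum() / (S ** 2).sum()
print(f"variance explained by 2-D embedding: {explained:.3f}")
```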

“For the current paper, we wanted to probe how the human brain compresses emotion experiences,” Kragel says. “How do we embed these very complicated events? What are the relevant neural signals?”

Combining human neuroimaging and pattern recognition

The researchers began by tapping the multimodal dataset Emo-FilM (Emotion Research Using Films and fMRI), available on OpenNeuro, a free and open platform for validating and sharing neuroscience data.

The Emo-FilM dataset includes participants’ ratings of various emotions as they watch short, emotionally evocative film clips. These self-reports of emotional experience can be examined alongside the corresponding brain activity scans, narrowing the gap between psychological theory and empirical neuroscience. The dataset is designed to probe shared emotion processes rather than individual differences.

The researchers developed predictive models to analyze this dataset and found, as expected, that self-report measures of emotional experience could be decoded from fMRI patterns of hippocampal-prefrontal activity.
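A minimal sketch of what such a decoding model looks like, using simulated voxel patterns and ratings rather than the actual Emo-FilM data; the linear ridge-regression decoder below is a generic stand-in, not the authors’ exact model.

```python
# Sketch of a decoding model: learn a linear map from (simulated) voxel
# activity patterns to self-reported emotion ratings, then check it on
# held-out simulated trials. Ridge regression in closed form.
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_voxels = 200, 80
true_weights = rng.normal(size=n_voxels)
X = rng.normal(size=(n_trials, n_voxels))               # voxel patterns
y = X @ true_weights + 0.1 * rng.normal(size=n_trials)  # ratings

lam = 1.0  # ridge penalty
W = np.linalg.solve(X.T @ X + lam * np.eye(n_voxels), X.T @ y)

# Held-out check: predicted vs. actual ratings on fresh simulated trials
X_test = rng.normal(size=(50, n_voxels))
y_test = X_test @ true_weights + 0.1 * rng.normal(size=50)
r = np.corrcoef(X_test @ W, y_test)[0, 1]
print(f"decoding accuracy (correlation): {r:.2f}")
```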

The hippocampus is a seahorse-shaped structure in the temporal lobe that helps organize experiences into memories by linking information from across the brain. The ventromedial prefrontal cortex, or vmPFC, is a brain region in the frontal lobe involved in weighing information about goals, social cues and outcomes, helping people make decisions and evaluate risk and reward.

Analyzing the outputs of the predictive models revealed that these brain systems contain information consistent with a map-like representation.

“For example,” Ma explains, “occurrences of anger and fear are often closer together compared to those of happiness and excitement.”

The researchers tested the model’s ability to predict both emotion categories and the relations between them. The results showed more information about emotion categories in the hippocampus and more relational information in the vmPFC.

Tapping an artificial neural network

They further probed their framework using an artificial neural network known as the Tolman-Eichenbaum Machine, or TEM, which serves as a computational model of relational memory in the brain.

The researchers first created an artificial environment, represented as an abstract graph, based on emotion category ratings from the film-viewing data. TEM artificial agents, or virtual robots, were exposed to this environment so they could learn how emotion concepts relate to one another.

After this training, trajectories of the artificial agents were plotted as they “walked” through the environment and made their own predictions about what they would experience if they stayed put or moved up, down, to the right or to the left along the graph.
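The setup can be caricatured in a few lines: a hypothetical graph of emotion nodes (these particular nodes and edges are made up for illustration, not taken from the study) and an agent that walks it, with each step constrained to the graph’s edges.

```python
# Toy version of the agents' task: a graph whose nodes are emotion
# concepts and whose edges link related concepts (both invented here
# for illustration). An agent walks the graph; what it can "experience"
# next is constrained to the current node's neighbors.
import random

graph = {
    "joy":        ["excitement", "calm"],
    "excitement": ["joy", "fear"],       # both high arousal
    "calm":       ["joy", "sadness"],    # both low arousal
    "fear":       ["excitement", "sadness"],
    "sadness":    ["calm", "fear"],
}

def walk(start, steps, seed=0):
    """Random walk over the emotion graph, returning the trajectory."""
    rng = random.Random(seed)
    node, path = start, [start]
    for _ in range(steps):
        node = rng.choice(graph[node])  # next step: a neighboring concept
        path.append(node)
    return path

trajectory = walk("joy", 5)
# Every transition stays on a graph edge
assert all(b in graph[a] for a, b in zip(trajectory, trajectory[1:]))
```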

“The main takeaway,” Ma says, “is we found that the hierarchy of emotion categories is represented more broadly — for example, this is good, that is bad — in the anterior part of the hippocampus. And in the posterior region, the representations are more granular, finer-grained concepts.”

The results also showed that the vmPFC appears to track long-term transitions for broad rather than finer-grained emotion concepts.

Foundational work

The findings offer a neurocomputational explanation of how humans organize abstract emotion knowledge in a generalized, normative way.

The researchers hope to build on their findings by studying how this mental map may differ among those with mental health issues and across different cultures.

They also want to explore how this mental map for emotions develops over time.

“These are open questions,” Kragel says. “Are you born with the ability to form broad categories of emotion, such as good or bad, and then you gradually learn where to add more nuanced nodes on the graph? Or maybe you’re born with the ability to learn general relational structures. Do the emotions come first? Or is it the other way around?”

Article Information

Journal: Nature Communications
DOI: 10.1038/s41467-025-68240-z
Method of Research: Computational simulation/modeling
Subject of Research: Not applicable
Article Title: Map-like representations of emotion knowledge in hippocampal-prefrontal systems
Article Publication Date: 26-Jan-2026

Contact Information

Carol Clark
Emory University
carol.clark@emory.edu

How to Cite This Article

APA:
Emory University. (2026, March 10). How the brain charts emotion in a map-like way. Brightsurf News. https://www.brightsurf.com/news/LQ40WO58/how-the-brain-charts-emotion-in-a-map-like-way.html
MLA:
"How the brain charts emotion in a map-like way." Brightsurf News, 10 Mar. 2026, https://www.brightsurf.com/news/LQ40WO58/how-the-brain-charts-emotion-in-a-map-like-way.html.