
Face it. Our faces don't always reveal our true emotions

February 25, 2019

Actor James Franco looks sort of happy as he records a video diary in the movie "127 Hours." It's not until the camera zooms out, revealing his arm is crushed under a boulder, that it becomes clear his goofy smile belies his agony.

That's because when it comes to reading a person's state of mind, visual context -- as in background and action -- is just as important as facial expressions and body language, according to a new study from the University of California, Berkeley.

The findings, to appear online this week in the journal Proceedings of the National Academy of Sciences, challenge decades of research positing that emotional intelligence and recognition are based largely on the ability to read micro-expressions signaling happiness, sadness, anger, fear, surprise, disgust, contempt and other positive and negative moods and sentiments.

"Our study reveals that emotion recognition is, at its heart, an issue of context as much as it is about faces," said study lead author Zhimin Chen, a doctoral student in psychology at UC Berkeley.

Researchers blurred the faces and bodies of actors in dozens of muted clips from Hollywood movies and home videos. Despite the characters' virtual invisibility, hundreds of study participants were able to accurately read their emotions by examining the background and how they were interacting with their surroundings.

The "affective tracking" model that Chen created for the study allows researchers to track how people rate the moment-to-moment emotions of characters as they view videos.

Chen's method can collect large quantities of data in a short time, and could eventually be used to gauge how people with disorders like autism and schizophrenia read emotions in real time, potentially aiding diagnosis.

"Some people might have deficits in recognizing facial expressions, but can recognize emotion from the context," Chen said. "For others, it's the opposite."

Moreover, the findings, based on statistical analyses of the ratings collected, could inform the development of facial recognition technology.

"Right now, companies are developing machine learning algorithms for recognizing emotions, but they only train their models on cropped faces and those models can only read emotions from faces," Chen said. "Our research shows that faces don't reveal true emotions very accurately and that identifying a person's frame of mind should take into account context as well."

For the study, Chen and study senior author David Whitney, a UC Berkeley vision scientist, tested the emotion recognition abilities of nearly 400 young adults. The visual stimuli they used were video clips from various Hollywood movies as well as documentaries and home videos that showed emotional responses in more natural settings.

Study participants went online to view and rate the video clips. A rating grid was superimposed over each video so that researchers could track each participant's cursor as they moved it around the screen, rating the characters' emotions from moment to moment as the clip played.
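To make the procedure concrete, here is a minimal sketch of how logged cursor positions on such a rating grid could be turned into a moment-to-moment affect time series. It is not the study's analysis code; the grid dimensions, the axis conventions (valence left to right, arousal top to bottom) and the function names are assumptions for illustration.

```python
# Minimal sketch, not the published analysis code: convert logged cursor
# positions on a 2-D rating grid into a moment-to-moment affect time series.
# Grid size and axis conventions below are assumptions.
from dataclasses import dataclass

@dataclass
class CursorSample:
    t: float  # seconds into the clip
    x: int    # cursor position in pixels
    y: int

def to_affect_series(samples, grid_w=500, grid_h=500):
    """Map pixel coordinates to valence/arousal values in [-1, 1] per time point."""
    series = []
    for s in samples:
        valence = 2.0 * s.x / grid_w - 1.0   # left = negative, right = positive
        arousal = 1.0 - 2.0 * s.y / grid_h   # top = high arousal, bottom = low
        series.append((s.t, valence, arousal))
    return series

if __name__ == "__main__":
    log = [CursorSample(0.0, 250, 250), CursorSample(0.5, 400, 100), CursorSample(1.0, 80, 430)]
    for t, v, a in to_affect_series(log):
        print(f"t={t:.1f}s  valence={v:+.2f}  arousal={a:+.2f}")
```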

In the first of three experiments, 33 study participants viewed interactions in movie clips between two characters, one of whom was blurred, and rated the perceived emotions of the blurred character. The results showed that study participants inferred how the invisible character was feeling based not only on their interpersonal interactions, but also on what was happening in the background.

Next, approximately 200 study participants viewed video clips showing interactions under three different conditions: one in which everything was visible, another in which the characters were blurred, and another in which the context was blurred. The results showed that context was as important as facial expressions for decoding emotions.

In the final experiment, 75 study participants viewed clips from documentaries and home videos so that researchers could compare emotion recognition in more naturalistic settings. Again, context was as critical for inferring the emotions of the characters as were their facial expressions and gestures.

"Overall, the results suggest that context is not only sufficient to perceive emotion, but also necessary to perceive a person's emotion," said Whitney, a UC Berkeley psychology professor. "Face it, the face is not enough to perceive emotion."
-end-


University of California - Berkeley

