Augmented reality visor makes cake taste moister, more delicious

December 15, 2020

Researchers have developed an augmented reality (AR) visor system that lets them manipulate the light coming off food in a way that 'tricks' the person eating it into experiencing the food as more or less moist, watery, or even delicious. The findings not only reveal how human taste is experienced in a multisensory way -- through a combination of sight, smell and even sound -- but the technique could also be used in hospitals to improve the palatability of food, or as a design development tool in the food industry.

The findings were published in Scientific Reports on September 30, 2020.

It has long been known that taste is not only a product of a food's chemical composition, which directly shapes the experience of eating and drinking; a food's visual appearance also contributes to how we experience its taste.

For example, research has shown that humans tend to associate sour-tasting food and carbonated beverages with sharper shapes, and creamy food and still drinks with more rounded shapes. Underlying that visual experience is the way light bounces off an object -- or, to put it in more scientific terms, the distribution of luminance. Earlier research had shown that variation in this luminance distribution influences how fresh a cabbage appears to people viewing a series of still pictures of the vegetable. But pictures are not the same as the dynamic experience of actually eating a piece of food.

"So we wondered whether manipulating this luminance distribution while someone was eating something would produce a similar effect," said Katsunori Okajima, who specializes in vision and brain sciences at Yokohama National University in Japan.

The researchers developed an augmented reality (AR) system that allows them to manipulate the standard deviation of the luminance distribution -- a statistical term that describes how spread out a set of values is from its average. For example, the total amount of light bouncing off two different slices of cake might be the same, but if the standard deviation of the luminance distribution is small for the first slice, it appears smoother, while a large standard deviation gives the second slice a rougher appearance.
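To make the idea concrete, here is a minimal Python sketch (not the study's code) that rescales the luminance values of an image patch around their mean, widening or narrowing the distribution while leaving the average brightness unchanged; the function name and the example values are hypothetical.

import numpy as np

def scale_luminance_std(luminance, factor):
    # Rescale the spread of luminance values around their mean.
    # factor > 1 widens the distribution (a rougher, drier look);
    # factor < 1 narrows it (a smoother, moister look).
    # The mean luminance is preserved, so overall brightness is
    # unchanged (up to clipping at the 0-255 limits).
    lum = np.asarray(luminance, dtype=float)
    mean = lum.mean()
    adjusted = mean + factor * (lum - mean)
    return np.clip(adjusted, 0, 255)

# Doubling the factor doubles the standard deviation but keeps the mean.
patch = np.array([[100.0, 120.0], [140.0, 160.0]])
rough = scale_luminance_std(patch, 2.0)
print(patch.std(), rough.std())    # ~22.4  ->  ~44.7
print(patch.mean(), rough.mean())  # 130.0  ->  130.0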

They used their AR system in two experiments. In the first, people wore the AR visor while eating slices of Baumkuchen, a type of German cake widely available in Japan; in the second, they wore it while eating a spoonful of ketchup. The researchers were thus able to manipulate the food's appearance during consumption, going a step further than the still photographs of cabbage.

Upon interviewing the participants, the researchers found that manipulating the standard deviation of the luminance distribution (while keeping the color and the overall luminance constant) altered not only what the participants expected the food to taste like in terms of moistness, wateriness and deliciousness, but also the taste and texture they actually perceived when sampling it.
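For a color image, one way to achieve that kind of manipulation is to rescale only the luminance component while leaving each pixel's color ratios untouched. The following is a rough sketch under that assumption; the helper name and the use of BT.601 luma weights are illustrative, not the paper's method.

import numpy as np

def scale_color_image_luminance_spread(rgb, factor):
    # Widen or narrow the luminance distribution of an RGB image while
    # keeping each pixel's R:G:B ratio (its color) and, up to clipping,
    # the image's mean luminance unchanged.
    rgb = np.asarray(rgb, dtype=float)
    # ITU-R BT.601 luma as a simple stand-in for luminance.
    luma = rgb @ np.array([0.299, 0.587, 0.114])
    mean = luma.mean()
    target = np.clip(mean + factor * (luma - mean), 1e-6, 255.0)
    # Scale each pixel's RGB so its luma reaches the target value;
    # multiplying all three channels by the same gain preserves color.
    gain = target / np.maximum(luma, 1e-6)
    return np.clip(rgb * gain[..., None], 0, 255).astype(np.uint8)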

The AR manipulation was most effective for perceived moistness (of the cake) and wateriness (of the ketchup), while its effect on the perception of sweetness was relatively modest.

"This suggests that the association between visual texture and sweetness is weak," added Dr. Okajima.

The researchers now hope to develop new image-processing technology that can manipulate the appearance of any food in real time. Ultimately, they want to use these techniques to quantify all the ways that visual information affects our taste perception, and to describe the precise mechanisms of such processing within the brain.
-end-
Yokohama National University (YNU or Yokokoku) is a Japanese national university founded in 1949. YNU provides students with a practical education utilizing the wide expertise of its faculty and facilitates engagement with the global community. YNU's strength in the academic research of practical application sciences leads to high-impact publications and contributes to international scientific research and the global society. For more information, please see: https://www.ynu.ac.jp/english/

Yokohama National University
