UNN scientists are studying the problem of modeling the cognitive dissonance phenomenon

November 16, 2017

Lobachevsky University (UNN) scientists, Associate Professor of the History and Theory of International Relations Department Alexander Petukhov and Head of the Department of Psychophysiology Sofya Polevaya, are studying the modeling of the cognitive dissonance phenomenon. They rely on the theory of information images and a mathematical model developed on the basis of this theory. The proposed theory is based on the idea of a universal cognitive unit of information in the human mind, the so-called information image, and of the space where it exists, its topology and properties. Accordingly, the theory of information images is a way to describe information interactions of individuals, as well as a number of human cognitive functions.

According to Alexander Petukhov, the main statements of this theory have been formulated in the framework of the study.

"We have considered the hierarchy of information images in the mind of an individual, which determines the individual's real and virtual activities. We have also developed the algorithms for describing the transmission and distortion of information images by individuals during communication," notes Alexander Petukhov.

The theory of information images states that there is a certain limited space filled with a set of information images, which are in constant interaction governed by certain laws. "Heavier", inert information images sit at the center of this space, while "lighter", high-energy images lie closer to its edges. Mathematically, this can be described by diffusion equations (for example, the Langevin equation), in which information images are likened to particles interacting intensively within a limited region (the information image space). For experimental validation of the theory, the bilingual Stroop test was chosen: a classical test for detecting the effects of cognitive dissonance under several conflicting information disturbances, which makes it well suited for comparing the modeling results with experimental data.
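The qualitative claim above (inert images concentrate at the center, high-energy images drift toward the edges) can be illustrated with a minimal overdamped Langevin simulation. This is a sketch under assumed simplifications, not the authors' actual model: a harmonic potential stands in for the bounded information space, pairwise interactions are omitted, and the per-particle diffusion coefficients are illustrative values chosen to contrast "heavy" and "light" images.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n_steps=20000, dt=0.01, k=1.0):
    """Overdamped Langevin dynamics in a harmonic 'information space'.

    Each step applies dx = -k*x*dt + sqrt(2*D*dt)*xi, where xi is
    standard Gaussian noise. Each particle (information image) has its
    own diffusion coefficient D: 'heavy', inert images get a small D,
    'light', high-energy images a large D (values are illustrative).
    """
    D = np.array([0.05, 0.05, 1.0, 1.0])   # two inert, two energetic images
    x = np.zeros((len(D), 2))              # all start at the centre
    radii = np.zeros(len(D))
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x += -k * x * dt + np.sqrt(2 * D * dt)[:, None] * noise
        radii += np.linalg.norm(x, axis=1)
    return radii / n_steps                 # mean distance from the centre

mean_r = simulate()
# Inert images stay near the centre; energetic images wander outward.
print(mean_r)
```

In the stationary regime of this process the spread of each particle scales with its diffusion coefficient, so the low-D ("heavy") images end up with a much smaller mean distance from the center than the high-D ("light") ones, matching the picture described in the text.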

In the classical version of the Stroop test, the subject is asked to name a color based either on the meaning of a color word or on the color of the letters in which that word is printed. Reaction time and the number of errors are measured in four contexts. In the third and fourth contexts, a discrepancy arises between the information images activated by the verbal and the color visual stimuli. Such a cognitive conflict of information images manifests itself in an increased decision-making time (the interval between the presentation of the stimulus and the subject's response). One modification of the computerized Stroop test is a bilingual version in which the words are presented both in the subject's native language and in a foreign language.
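The interference effect described above can be scored by comparing mean reaction times on congruent trials (word and ink color match) with incongruent ones. The sketch below shows one way such trial data might be structured and scored; the trial values are made up for illustration and are not the authors' experimental data.

```python
from dataclasses import dataclass
import statistics

@dataclass
class Trial:
    word: str        # the colour word presented
    ink: str         # the colour of the letters
    rt_ms: float     # reaction time: stimulus onset -> response

    @property
    def congruent(self) -> bool:
        return self.word == self.ink

# Illustrative (made-up) data: incongruent trials show the longer
# decision times the article attributes to cognitive conflict.
trials = [
    Trial("red", "red", 520.0), Trial("green", "green", 540.0),
    Trial("red", "green", 690.0), Trial("green", "red", 710.0),
]

def mean_rt(ts):
    return statistics.mean(t.rt_ms for t in ts)

stroop_effect = (mean_rt([t for t in trials if not t.congruent])
                 - mean_rt([t for t in trials if t.congruent]))
print(f"Stroop interference: {stroop_effect:.0f} ms")  # prints "Stroop interference: 170 ms"
```

A bilingual version would simply add a language field to each trial, allowing the same interference score to be computed separately for native-language and foreign-language stimuli.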

According to Sofya Polevaya, the results of the test are interpreted with the help of the proposed theory and compared with the results of computer modeling based on this theory.

"It has been shown that with the help of information images one can explain a number of cognitive processes in the human mind and also predict their dynamics in some particular cases," Sofya Polevaya notes.

The simulation results show that the general characteristic pattern coincides with the experimental one for both the native and the foreign language, which confirms that the model is adequate for problems of this type and that the proposed interpretation is appropriate.

Lobachevsky University
