Trust in humans and robots: Economically similar but emotionally different

April 16, 2020

Orange, Calif. - In research published in the Journal of Economic Psychology, scientists explore whether people trust robots as they do fellow humans. These interactions are important to understand because trust-based interactions with robots are increasingly common in the marketplace and workplace, on the road, and in the home. Results show that people extend trust similarly to humans and robots, but that people's emotional reactions in trust-based interactions vary depending on partner type.

The study was led by Chapman University's Eric Schniter, Ph.D., and Timothy Shields, Ph.D., along with the University of Montreal's Daniel Sznycer, Ph.D.

Experiment

The researchers used an anonymous trust game experiment in which a human trustor decided how much of a $10 endowment to give to a trustee - a human, a robot, or a robot whose payoffs go to another human. The human trustor knew there were potential gains from the transfer and that the trustee would decide whether to reciprocate by transferring an amount back. Robots were programmed to mimic previously observed reciprocation by human trustees.

It is well established that in trust games like this, most people make decisions that lead to both trustor and trustee benefit. After the interaction, participants rated various positive and negative emotions.
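The payoff structure described above can be sketched in a few lines of code. This is a minimal illustration, not the study's actual procedure: the press release does not state the multiplier applied to the transfer, so the sketch assumes the tripling convention common in Berg-style trust games.

```python
def trust_game(send, return_fraction, endowment=10, multiplier=3):
    """Simulate one round of the trust game described above.

    Assumption (not stated in the release): the amount sent is
    tripled before reaching the trustee, the standard convention
    in Berg-style trust games.
    """
    assert 0 <= send <= endowment
    received = send * multiplier           # the "potential gains from the transfer"
    returned = received * return_fraction  # the trustee's reciprocation
    trustor_payoff = endowment - send + returned
    trustee_payoff = received - returned
    return trustor_payoff, trustee_payoff

# Full trust met with equal-split reciprocation benefits both parties,
# compared with keeping the endowment (payoffs 10 and 0):
print(trust_game(send=10, return_fraction=0.5))  # (15.0, 15.0)
```

Under these assumed parameters, mutual benefit requires both the trustor's transfer and the trustee's reciprocation, which is why the game measures trust.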

The experimental design allowed the researchers to examine two important aspects of trust in explainable robots: how much humans trust robots compared to fellow humans, and how humans react emotionally following interactions with robots versus other humans.

Results

The experiment shows that people extend similar levels of trust to humans and robots. This is not what we would find if humans either blindly trusted robots or refused to trust them. Nor would it be the outcome if people extended trust with the sole intention of improving other humans' welfare, since trusting a robot does not improve another person's welfare.

The result is consistent with the view that people extend trust for both monetary gain and to discover information about human behavioral propensities. Through their trust interactions with the robots, participants learned about the cooperativeness of fellow humans.

Social emotions are more than feelings - they regulate social behavior. More specifically, social emotions such as guilt, gratitude, anger, and pride affect how we treat others and influence how others treat us in trust-based interactions.

Participants in this experiment experienced social emotions differently depending on whether their partner was a robot or a human. A trustee's failure to reciprocate the trustor's investment triggered more anger when the trustee was a human than when it was a robot. Similarly, reciprocation triggered more gratitude when the trustee was a human than when it was a robot.

Further, participants' emotions finely discriminated among robot types. They reported feeling more intense pride and guilt when the robot trustee's payoff went to a human than when the robot acted alone.

Prospects and Implications

Given that initial trust did not differ across partner type, but social emotions did, a distinct possibility is that trust re-extension in repeated interactions will differ when the partner is a human, a robot, or a robot linked to a human beneficiary.

In the future, driving will present interaction opportunities where it will matter whether decisions are being made by humans or robots and if they serve humans or not. Some cars used for delivery or pickups may drive without human occupants, other cars will drive with passive human occupants and yet other cars will be driven by human drivers. Analogous interactions occur with automated or robotic check-in agents, bank tellers, surgeons, etc.

Partnerships with consistent reciprocators may consolidate into stronger, more productive partnerships when the reciprocators are fellow humans, because humans elicit more gratitude than robots do. Conversely, partnerships with inconsistent reciprocators may be more stable when the reciprocators are robots, because robots elicit less anger than humans do. Further, humans experienced pride and guilt more intensely in interactions where robots served a beneficiary, which suggests people will be more likely to re-extend trust to similar partners.

The human cognitive architecture evolved to have enough structure and content to promote our ancestors' survival and reproduction, while also having the flexibility to navigate novel challenges and opportunities. These features enable humans to design and rationally interact with artificial intelligence and robots. Still, interactions with automata, and science's ability to explain these interactions, are imperfect because automata lack the psychophysical cues we expect in an interaction and are often guided by decision logics that are unexplainable or unintuitive.
-end-
About Chapman University

Founded in 1861, Chapman University is a nationally-ranked private university located in Southern California. Chapman is categorized by the Carnegie Classification as an R2 "high research activity" institution and offers personalized education to more than 9,000 undergraduate and graduate students. The campus has produced a Rhodes Scholar, been named a top producer of Fulbright Scholars and hosts a chapter of Phi Beta Kappa, the nation's oldest and most prestigious honor society. Based in the City of Orange, Chapman also includes the Harry and Diane Rinker Health Science Campus in Irvine. In 2019, the university opened its 11th college, Fowler School of Engineering, in its newest facility, Keck Center for Science and Engineering. Learn more about Chapman University: http://www.chapman.edu.

