Researchers in Japan make android child's face strikingly more expressive

November 15, 2018

Osaka - Japan's affection for robots is no secret. But can the country's amazing androids return the sentiment? We may now be a step closer to androids whose faces are expressive enough to communicate with.

While robots have driven advances in healthcare, industry, and other fields in Japan, capturing humanlike expression in a robotic face remains an elusive challenge. Although androids' system properties have been generally addressed, their facial expressions have not been examined in detail. This is owing to factors such as the huge range and asymmetry of natural human facial movements, the restrictions of the materials used in android skin, and of course the intricate engineering and mathematics driving robots' movements.

A trio of researchers at Osaka University has now found a method for identifying and quantitatively evaluating facial movements on their android robot child head. Named Affetto, the android's first-generation model was reported in a 2011 publication. The researchers have now devised a system to make the second-generation Affetto more expressive. Their findings offer a path for androids to express greater ranges of emotion, and ultimately to have deeper interaction with humans.

The researchers reported their findings in the journal Frontiers in Robotics and AI.

"Surface deformations are a key issue in controlling android faces," study co-author Minoru Asada explains. "Movements of their soft facial skin create instability, and this is a big hardware problem we grapple with. We sought a better way to measure and control it."

The researchers investigated 116 different facial points on Affetto to measure its three-dimensional movement. Facial points were underpinned by so-called deformation units. Each unit comprises a set of mechanisms that create a distinctive facial contortion, such as lowering or raising of part of a lip or eyelid. Measurements from these were then subjected to a mathematical model to quantify their surface motion patterns.
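The press release does not give the authors' actual model, but the idea of quantifying how deformation units move measured facial points can be illustrated with a simple hedged sketch: assume, purely for illustration, that each tracked point's displacement is roughly a linear function of the deformation-unit activations, and recover that sensitivity map by least squares. All sizes and names here (`n_units`, `true_map`, etc.) are hypothetical, not taken from the study.

```python
import numpy as np

# Hypothetical illustration (NOT the authors' actual model): suppose each
# tracked facial point moves approximately linearly with the deformation-unit
# (DU) activations. Fitting that linear map from measurements quantifies how
# strongly each DU displaces each facial point.

rng = np.random.default_rng(0)

n_units = 5     # deformation units (the real android uses its own set)
n_points = 10   # tracked facial points (the study measured 116)
n_trials = 50   # activation patterns tested

# Ground-truth sensitivity matrix: point displacement per unit activation (mm)
true_map = rng.normal(size=(n_points, n_units))

# Commanded DU activations and the resulting (noisy) measured displacements
activations = rng.uniform(0.0, 1.0, size=(n_trials, n_units))
displacements = activations @ true_map.T + 0.01 * rng.normal(size=(n_trials, n_points))

# Recover the sensitivity matrix by least squares
est_map, *_ = np.linalg.lstsq(activations, displacements, rcond=None)
est_map = est_map.T

print("max fit error (mm):", float(np.abs(est_map - true_map).max()))
```

Once such a map is identified, it can be inverted to choose activations that produce a desired surface motion, which is the kind of precise control the study is after.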

While the researchers encountered challenges in balancing the applied force and in adjusting the synthetic skin, they were able to employ their system to adjust the deformation units for precise control of Affetto's facial surface motions.

"Android robot faces have persisted in being a black box problem: they have been implemented but have only been judged in vague and general terms," study first author Hisashi Ishihara says. "Our precise findings will let us effectively control android facial movements to introduce more nuanced expressions, such as smiling and frowning."
-end-
The article, "Identification and Evaluation of the Face System of a Child Android Robot Affetto for Surface Motion Design," was published in Frontiers in Robotics and AI at DOI: https://doi.org/10.3389/frobt.2018.00119.

Related videos can be viewed at the following links:

Various expressions of a child android Affetto
https://www.youtube.com/watch?time_continue=36&v=EKFc1DEoO6U

Child Android Affetto Head Version 2018
https://www.youtube.com/watch?time_continue=1&v=IxuluHiwGSk

Affetto Eyeblink&Neck motion
https://www.youtube.com/watch?time_continue=20&v=KbEIv9ONajs

About Osaka University

Osaka University was founded in 1931 as one of the seven imperial universities of Japan and has since grown into one of Japan's leading comprehensive universities. The University has now embarked on an open research revolution from its position as Japan's most innovative university and among the most innovative institutions in the world, according to Reuters 2015 Top 100 Innovative Universities and the Nature Index Innovation 2017. The university's ability to innovate from the stage of fundamental research through the creation of useful technology with economic impact stems from its broad disciplinary spectrum.

Website: http://resou.osaka-u.ac.jp/en/top

