Novel software can recognize eye contact in everyday situations

August 11, 2017

"Until now, if you were to hang an advertising poster in the pedestrian zone and wanted to know how many people actually looked at it, you would not have had a chance", explains Andreas Bulling, who leads the independent research group "Perceptual User Interfaces" at the Excellence Cluster at Saarland University and the Max Planck Institute for Informatics. Previously, researchers tried to capture this important information by measuring gaze direction. This required special eye-tracking equipment that needed minutes of calibration; what is more, every person had to wear such a tracker. Real-world studies, such as in a pedestrian zone, or even just studies with multiple people, were at best very complicated and at worst impossible.

Even when the camera was placed at the target object, for example the poster, and machine learning was used, that is, the computer was trained on a sufficient quantity of sample data, only glances at the camera itself could be recognized. Too often, the difference between the training data and the data in the target environment was too great. A universal eye contact detector, usable for both small and large target objects, in stationary and mobile situations, for one user or a whole group, and under changing lighting conditions, had hitherto been nearly impossible.

Together with his PhD student Xucong Zhang and his former postdoc Yusuke Sugano, now a professor at Osaka University, Bulling has developed a method [1] based on a new generation of algorithms for estimating gaze direction. These use deep neural networks, a machine-learning technique known as "deep learning" that is currently creating a sensation in many areas of industry and business. Bulling and his colleagues have been working on this approach for two years [2] and have advanced it step by step [3]. In the method they are now presenting, a so-called clustering of the estimated gaze directions is carried out first. With the same strategy one can, for example, also distinguish apples from pears according to various characteristics, without having to explicitly specify how the two differ. In a second step, the most likely clusters are identified, and the gaze direction estimates they contain are used to train a target-object-specific eye contact detector. A decisive advantage of this procedure is that it requires no involvement from the user, and the method can improve further the longer the camera remains next to the target object and records data. "In this way, our method turns normal cameras into eye contact detectors, without the size or position of the target object having to be known or specified in advance," explains Bulling.
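The two-step idea, cluster first and then train on the most likely cluster, can be sketched in a few lines. The following is a minimal illustration only, not the authors' implementation: it simulates gaze-direction estimates as (yaw, pitch) angles, clusters them without any labels, treats the largest cluster as "looking at the target", and trains a detector on the resulting pseudo-labels. All numbers, parameter values, and library choices (scikit-learn's DBSCAN and SVC) are illustrative assumptions.

```python
# Sketch of unsupervised gaze-target discovery (illustrative, not the paper's code):
# cluster estimated gaze directions, take the largest cluster as gazes at the
# target object, and train a target-object-specific eye contact detector on it.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Simulated gaze-direction estimates (yaw, pitch in degrees): a tight cluster
# around the target object plus scattered gazes elsewhere in the scene.
on_target = rng.normal(loc=(0.0, 0.0), scale=1.0, size=(200, 2))
elsewhere = rng.uniform(low=-30.0, high=30.0, size=(300, 2))
gaze = np.vstack([on_target, elsewhere])

# Step 1: cluster the gaze estimates without any labels.
labels = DBSCAN(eps=2.0, min_samples=10).fit_predict(gaze)

# Step 2: identify the most likely (here: largest) cluster as "eye contact" ...
cluster_ids, counts = np.unique(labels[labels >= 0], return_counts=True)
target_cluster = cluster_ids[np.argmax(counts)]
y = (labels == target_cluster).astype(int)

# ... and use its samples as pseudo-labels to train the detector.
detector = SVC().fit(gaze, y)
```

Because the pseudo-labels come from the clustering alone, no user ever has to annotate anything, and the detector can be retrained as the camera keeps collecting data next to the target object.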

The researchers have tested their method in two scenarios: in a workspace, the camera was mounted on the target object, and in an everyday situation, a user wore an on-body camera, so that it took on a first-person perspective. The result: Since the method works out the necessary knowledge for itself, it is robust, even when the number of people involved, the lighting conditions, the camera position, and the types and sizes of target objects vary.

However, Bulling notes that "we can in principle identify eye contact clusters on multiple target objects with only one camera, but the assignment of these clusters to the various objects is not yet possible. Our method currently assumes that the nearest cluster belongs to the target object, and ignores the other clusters. This limitation is what we will tackle next." He is nonetheless convinced that "the method we present is a great step forward. It paves the way not only for new user interfaces that automatically recognize eye contact and react to it, but also for measurements of eye contact in everyday situations, such as outdoor advertising, that were previously impossible."
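The nearest-cluster heuristic Bulling describes can be made concrete with a small, purely hypothetical sketch: given several gaze clusters discovered by one camera, the method keeps only the cluster whose centre lies closest to the camera direction (the origin in gaze-angle space) and ignores the rest. The cluster centres below are made-up values for illustration.

```python
# Illustrative sketch of the nearest-cluster assumption (not the paper's code):
# with multiple gaze clusters but a single camera, assume the cluster closest
# to the camera direction belongs to the target object; discard the others.
import numpy as np

# Hypothetical cluster centres in (yaw, pitch) degrees, one per gaze target.
cluster_centres = np.array([
    [1.2, -0.8],    # near the camera: assumed to be the target object
    [15.0, 4.0],    # some other object in the scene
    [-20.0, 10.0],  # yet another object
])

# Pick the centre with the smallest angular distance to the camera at (0, 0).
distances = np.linalg.norm(cluster_centres, axis=1)
target_centre = cluster_centres[np.argmin(distances)]
```

Assigning the remaining clusters to their actual objects, rather than discarding them, is exactly the open problem the group names as its next step.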

Further Information:
-end-
[1] Xucong Zhang, Yusuke Sugano and Andreas Bulling. Everyday Eye Contact Detection Using Unsupervised Gaze Target Discovery. Proc. ACM UIST 2017.

https://perceptual.mpi-inf.mpg.de/files/2017/05/zhang17_uist.pdf

[2] Xucong Zhang, Yusuke Sugano, Mario Fritz and Andreas Bulling. Appearance-Based Gaze Estimation in the Wild. Proc. IEEE CVPR 2015, 4511-4520.

https://perceptual.mpi-inf.mpg.de/files/2015/04/zhang_CVPR15.pdf

[3] Xucong Zhang, Yusuke Sugano, Mario Fritz and Andreas Bulling. It's Written All Over Your Face: Full-Face Appearance-Based Gaze Estimation. Proc. IEEE CVPRW 2017.

https://perceptual.mpi-inf.mpg.de/files/2017/05/zhang_cvprw2017.pdf

Demo Video:

https://www.youtube.com/watch?v=ccrS5XuhQpk

Questions can be directed to:

Dr. Andreas Bulling
Perceptual User Interfaces Group
Cluster of Excellence "Multimodal Computing and Interaction"
Saarland Informatics Campus
Tel.: +49 681 932-52128
E-Mail: bulling@mpi-inf.mpg.de

Editor:

Gordon Bolduan
Competence Center Computer Science Saarland
Saarland Informatics Campus
Phone: +49 681 302-70741
E-mail: bolduan@mmci.uni-saarland.de

Notice for Radio Journalists:

You can conduct telephone interviews in studio quality with scientists of Saarland University over the Radio Codec (IP connection with direct dialing or via ARD Sternpunkt 106813020001). Please send interview requests to the press office (0681/302-3610).

Saarland University
