Automated analysis of bee behavior may yield better robots

December 10, 2003

A new computer vision system for automated analysis of animal movement — honey bee activities, in particular — is expected to accelerate animal behavior research, which also has implications for biologically inspired design of robots and computers.

The animal movement analysis system is part of the BioTracking Project, an effort conducted by Georgia Institute of Technology robotics researchers led by Tucker Balch, an assistant professor of computing.

"We believe the language of behavior is common between robots and animals," Balch said. "That means, potentially, that we could videotape ants for a long period of time, learn their 'program' and run it on a robot."

Social insects, such as ants and bees, demonstrate that successful large-scale, robust behavior can emerge from the interaction of many simple individuals, Balch explained. Such behavior can offer ideas on how to organize a cooperating colony of robots capable of complex operations.

To expedite the understanding of such behavior, Balch's team developed a computer vision system that automates analysis of animal movement — once an arduous and time-consuming task. Researchers are using the system to analyze data on the sequential movements that encode information — for example in bees, the location of distant food sources, Balch said. He will present the research at the Second International Workshop on the Mathematics and Algorithms of Social Insects on Dec. 16-17 at Georgia Tech.

With an 81.5 percent accuracy rate, the system can automatically analyze bee movements and label them based on examples provided by human experts. This level of labeling accuracy is high enough to allow researchers to build a subsequent system to accurately determine the behavior of a bee from its sequence of motions, Balch explained.

For example, one sequence of motions bees commonly perform is the waggle dance, which consists of arcing to the right, waggling (walking in a generally straight line while oscillating left and right), arcing to the left, waggling, and so on. These motions encode the locations of distant food sources, according to Cornell University Professor of Biology Thomas Seeley, who has collaborated with Balch on this project. Balch is also working with Professor Deborah Gordon of Stanford University on related work with ants.
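Once individual motions carry labels, a behavior such as the dance can be recognized from the alternating pattern they form. The sketch below is purely illustrative: the label names and the matching rule are assumptions, not the team's actual code.

```python
# Hypothetical sketch: spotting a waggle dance in a sequence of motion
# labels ("arc_right", "waggle", "arc_left"). Label names and the rule
# for counting dance cycles are invented for illustration.

def looks_like_waggle_dance(motions, min_cycles=2):
    """Return True if the label sequence contains enough waggle-then-arc
    cycles to resemble the classic alternating dance pattern."""
    cycles = 0
    i = 0
    while i < len(motions) - 1:
        # One dance cycle: a waggle run followed by an arcing turn.
        if motions[i] == "waggle" and motions[i + 1] in ("arc_left", "arc_right"):
            cycles += 1
            i += 2
        else:
            i += 1
    return cycles >= min_cycles

dance = ["arc_right", "waggle", "arc_left", "waggle", "arc_right", "waggle"]
print(looks_like_waggle_dance(dance))  # True
```

In practice the labeled sequence would come from the vision system rather than being typed in by hand, and a real recognizer would have to tolerate mislabeled segments, since the labeling itself is only about 81.5 percent accurate.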

Balch's animal movement analysis system has several components. First, researchers shoot 15 minutes of videotape of bees — some of which are marked with bright-colored paint and returned to an observation hive. Then computer vision-based tracking software converts the video of the marked bees into x- and y-coordinate location information for each animal in each frame of the footage. Some segments of this data are hand-labeled by a researcher and then used as motion examples for the automated analysis system.
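The step from raw (x, y) trajectories to motion labels can be sketched roughly as follows. This assumes the tracker already outputs per-frame positions; the turning threshold and label names are invented for illustration and are not the project's actual method.

```python
import math

def heading(p, q):
    """Direction of travel, in radians, from point p to point q."""
    return math.atan2(q[1] - p[1], q[0] - p[0])

def label_segment(track, turn_threshold=0.2):
    """Classify a short (x, y) track by its mean per-step turning:
    a strong net left turn -> 'arc_left', a strong net right turn ->
    'arc_right', otherwise roughly straight -> 'waggle_or_straight'."""
    turns = []
    for i in range(len(track) - 2):
        h1 = heading(track[i], track[i + 1])
        h2 = heading(track[i + 1], track[i + 2])
        # Wrap the heading change into [-pi, pi] so turns near the
        # +/-pi boundary are measured correctly.
        d = math.atan2(math.sin(h2 - h1), math.cos(h2 - h1))
        turns.append(d)
    mean_turn = sum(turns) / len(turns)
    if mean_turn > turn_threshold:
        return "arc_left"
    if mean_turn < -turn_threshold:
        return "arc_right"
    return "waggle_or_straight"

# A track curving counter-clockwise should come out as an arc to the left.
circle = [(math.cos(t / 4), math.sin(t / 4)) for t in range(12)]
print(label_segment(circle))  # arc_left
```

A geometry-only rule like this cannot separate a waggle run from ordinary straight walking; that is precisely where the hand-labeled examples come in, letting the system learn distinctions a fixed threshold would miss.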

In future work, Balch and his colleagues will build a system that can learn executable models of these behaviors and then run the models in simulation. These simulations, Balch explained, would reveal the accuracy of the models. Researchers don't yet know if these models will yield better computer programming algorithms, though they are hopeful based on what previous research has revealed.
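One simple form such an executable model could take is a Markov chain over motion states, estimated from labeled sequences and then run forward to generate synthetic behavior. The sketch below works under that simplifying assumption; the state names and learning rule are illustrative, not the project's actual model.

```python
import random

def learn_transitions(sequences):
    """Count state-to-state transitions in labeled motion sequences and
    normalize the counts into transition probabilities."""
    counts = {}
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts.setdefault(a, {}).setdefault(b, 0)
            counts[a][b] += 1
    return {a: {b: n / sum(nxt.values()) for b, n in nxt.items()}
            for a, nxt in counts.items()}

def simulate(model, start, steps, seed=0):
    """Run the learned model forward, sampling each next state from the
    transition probabilities of the current one."""
    rng = random.Random(seed)
    state, out = start, [start]
    for _ in range(steps):
        nxt = model.get(state)
        if not nxt:  # no observed transitions out of this state
            break
        states, probs = zip(*nxt.items())
        state = rng.choices(states, weights=probs)[0]
        out.append(state)
    return out

observed = [["waggle", "arc_right", "waggle", "arc_left", "waggle"]]
model = learn_transitions(observed)
print(simulate(model, "waggle", 4))
```

Comparing sequences generated this way against fresh video of real bees is one concrete way a simulation could "reveal the accuracy" of a learned model.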

"Computer scientists have borrowed some of the algorithms discovered by biologists working with insects to challenging problems in computing," Balch said. "One example is network routing, which dictates the path data takes across the Internet. In this case the insect-based network routing algorithm, investigated by Marco Dorigo, is the best solution to date."

But challenges lie ahead for researchers. They will have to grapple with differences between the motor and sensory capabilities of robots and insects, Balch added.

Balch's research team members include graduate student Adam Feldman, Assistant Professor of Computing Frank Dellaert and researcher Zia Khan. The project is funded by a grant from the National Science Foundation.

In related research with Professor Kim Wallen at Emory University's Yerkes National Primate Research Center, Balch and Khan are also observing monkeys with a similar computer vision system. They hope these studies will yield behavior models that can be implemented in computer code.

The research team is learning about the spatial memory of and social interaction among monkeys. Already, they can track the movements of individual monkeys as they search for and find hidden treats in a large enclosure. Later, they want to observe a troop of 60 to 80 monkeys living together in a larger compound.

So far, researchers have learned that male and female monkeys have different spatial memories. Males apparently remember the physical distance to food, while females follow landmarks to find treats, Balch says.

"We're involved to measure precisely where the monkeys go and how long it takes them to find the food," Balch explains. "We use the information from experiments to test hypotheses on spatial memory. We're more interested in the social systems among these animals. But we need this basic capability to track monkeys in 3D. So this work is a first step in this direction."

Ultimately, Balch and his colleagues in the Georgia Tech College of Computing's "Borg Lab" — named after the Borg of "Star Trek" fame — want to use this animal behavior information to design robots that work effectively with people in dynamic, noisy and unknown environments such as those faced by military and law enforcement officials.

Balch will present his research team's findings on the bee movement system at the December workshop to prominent biologists, mathematicians, engineers and computer scientists who gather to share ideas about mathematical and algorithmic models of social insect behavior. Balch organized the workshop with Carl Anderson, a visiting assistant professor of natural systems in the Georgia Tech School of Industrial and Systems Engineering.

Media Contacts:
1. Jane M. Sanders, 404-894-2214
2. John Toon, 404-894-6986

For technical information, contact:
1. Tucker Balch, 404-385-2861

Georgia Institute of Technology Research News
