Speedy collision detector could make robots better human assistants

November 14, 2017

Electrical engineers at the University of California San Diego have developed a faster collision detection algorithm that uses machine learning to help robots avoid moving objects and weave through complex, rapidly changing environments in real time. The algorithm, dubbed "Fastron," runs up to 8 times faster than existing collision detection algorithms.

A team of engineers, led by Michael Yip, a professor of electrical and computer engineering and member of the Contextual Robotics Institute at UC San Diego, will present the new algorithm at the first annual Conference on Robot Learning Nov. 13 to 15 at Google headquarters in Mountain View, Calif. The conference brings together top machine learning scientists for an invitation-only event. Yip's team will deliver one of the long talks during the three-day conference.

The team envisions that Fastron will be broadly useful for robots that operate in human environments where they must be able to work with moving objects and people fluidly. One application they are exploring in particular is robot-assisted surgeries using the da Vinci Surgical System, in which a robotic arm would autonomously perform assistive tasks (suction, irrigation or pulling tissue back) without getting in the way of the surgeon-controlled arms or the patient's organs.

"This algorithm could help a robot assistant cooperate in surgery in a safe way," Yip said.

The team also envisions that Fastron can be used for robots that work at home for assisted living applications, as well as in computer graphics for the gaming and movie industries, where collision checking is often a computational bottleneck.

A problem with existing collision detection algorithms is that they are very computation-heavy. They spend a lot of time specifying all the points in a given space--the specific 3D geometries of the robot and obstacles--and performing collision checks on every single point to determine whether two bodies are intersecting at any given time. The computation gets even more demanding when obstacles are moving.
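As a rough illustration of that exhaustive style of checking (not code from the paper; the point-cloud representation and clearance threshold here are illustrative assumptions), a brute-force geometric check compares every point on the robot against every point on an obstacle at every query:

```python
import numpy as np

def geometric_collision_check(robot_points, obstacle_points, min_clearance=0.01):
    """Exhaustive check: compare every robot surface point against every
    obstacle surface point and flag a collision if any pair is too close.
    This per-query cost grows with the geometric detail of both bodies."""
    for p in robot_points:                       # every point on the robot body
        # distances from this robot point to all obstacle points
        d = np.linalg.norm(obstacle_points - p, axis=1)
        if np.any(d < min_clearance):
            return True                          # bodies intersect (or nearly do)
    return False                                 # no pair of points is in contact
```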

To lighten the computational load, Yip and his team in the Advanced Robotics and Controls Lab (ARClab) at UC San Diego developed a minimalistic approach to collision detection. The result was Fastron, an algorithm that uses machine learning strategies--which are traditionally used to classify objects--to classify collisions versus non-collisions in dynamic environments. "We actually don't need to know all the specific geometries and points. All we need to know is whether the robot's current position is in collision or not," said Nikhil Das, an electrical engineering Ph.D. student in Yip's group and the study's first author.
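In other words, the expensive geometric query is replaced by a cheap learned function of the robot's configuration. A minimal sketch of that interface (the names are ours, not from the paper):

```python
def proxy_collision_check(score_fn, q):
    """Proxy collision detection: instead of reasoning about 3D geometry,
    evaluate a learned scoring function of the joint configuration q and
    treat a positive score as a predicted collision."""
    return score_fn(q) > 0.0      # positive region = collision, negative = free
```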

The name Fastron comes from combining Fast and Perceptron, which is a machine learning technique for performing classification. An important feature of Fastron is that it updates its classification boundaries very quickly to accommodate moving scenes, something the machine learning community has generally found challenging.
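As a rough picture of how a perceptron-style classifier can score a configuration and correct itself when it is wrong, here is a generic kernel-perceptron sketch; the Gaussian kernel and mistake-driven update shown are a common textbook choice and are not necessarily the exact Fastron update rule:

```python
import numpy as np

def gaussian_kernel(a, b, gamma=10.0):
    """Similarity between two configurations: near 1 for nearby
    configurations, near 0 for distant ones."""
    return np.exp(-gamma * np.sum((a - b) ** 2))

class KernelPerceptron:
    """Minimal kernel perceptron: a weighted sum of kernels centered on a
    sparse set of stored configurations decides collision (+1) vs free (-1)."""
    def __init__(self):
        self.support = []   # stored configurations
        self.alphas = []    # signed weight for each stored configuration

    def score(self, q):
        return sum(a * gaussian_kernel(s, q)
                   for a, s in zip(self.alphas, self.support))

    def update(self, q, label):
        """label is +1 (collision) or -1 (free). Only store q when the
        current prediction is wrong, which keeps the model sparse."""
        if np.sign(self.score(q)) != label:
            self.support.append(np.asarray(q, dtype=float))
            self.alphas.append(float(label))
```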

Fastron's active learning strategy works using a feedback loop. It starts out by creating a model of the robot's configuration space, or C-space, which is the space of all possible positions the robot can attain. Fastron models the C-space using just a sparse set of points, consisting of a small number of so-called collision points and collision-free points. The algorithm then defines a classification boundary between the collision and collision-free points--this boundary is essentially a rough outline of where the obstacles lie in the abstract C-space. As obstacles move, the classification boundary changes. Rather than performing collision checks on every point in the C-space, as other algorithms do, Fastron intelligently selects checks near the boundary. Once it classifies the collisions and non-collisions, the algorithm updates its classifier and then continues the cycle.
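One way to picture that feedback loop in code, building on the kernel-perceptron sketch above: here ground_truth_check stands in for a slow geometric collision checker, and keeping only the candidates whose scores are closest to zero (i.e., nearest the boundary) is a simplification of the paper's active-learning heuristics, not a faithful reproduction of them.

```python
import numpy as np

def fastron_style_loop(model, ground_truth_check, joint_limits,
                       n_candidates=200, n_checks=20, n_cycles=5):
    """Sketch of the update cycle: sample candidate configurations, pick the
    ones nearest the current decision boundary, run the expensive geometric
    check only on those, and feed the results back into the classifier."""
    lo, hi = joint_limits                          # arrays of lower/upper joint bounds
    rng = np.random.default_rng(0)
    for _ in range(n_cycles):                      # repeat as obstacles move
        candidates = rng.uniform(lo, hi, size=(n_candidates, len(lo)))
        # smallest |score| = closest to the classification boundary
        scores = np.array([abs(model.score(q)) for q in candidates])
        for q in candidates[np.argsort(scores)[:n_checks]]:
            label = 1 if ground_truth_check(q) else -1   # slow, exact check
            model.update(q, label)                       # refine the boundary
    return model
```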

Because Fastron's models are simpler, the researchers set its collision checks to be more conservative. Since just a few points represent the entire space, Das explained, it's not always certain what's happening in the space between two points, so the team developed the algorithm to predict a collision in that space. "We leaned toward making a risk-averse model and essentially padded the workspace obstacles," Das said. This means the robot can be tuned to be more conservative in sensitive environments like surgery, or for robots that work at home for assisted living.
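A simple way to get that risk-averse behavior in a classifier like the one sketched above is to shift the decision threshold so that ambiguous configurations near the boundary are treated as collisions; the margin value here is an illustrative tuning knob, not a number from the paper.

```python
def conservative_collision_check(model, q, margin=0.5):
    """Risk-averse proxy check: any score above -margin is treated as a
    collision, effectively padding obstacles so the robot stays clear of
    regions the sparse model is unsure about."""
    return model.score(q) > -margin
```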

The team has so far demonstrated the algorithm in computer simulations of robots and obstacles. Moving forward, the team is working to further improve the speed and accuracy of Fastron. Their goal is to implement Fastron in robotic surgery and home-care robot settings.
-end-
Paper title: "Fastron: An Online Learning-Based Model and Active Learning Strategy for Proxy Collision Detection." Authors of the study are Nikhil Das, Naman Gupta and Michael Yip in the Advanced Robotics and Controls Lab (ARClab) at UC San Diego.

University of California - San Diego
