Closing the loop for robotic grasping

June 25, 2018

Roboticists at QUT have developed a faster and more accurate way for robots to grasp objects, including in cluttered and changing environments, which has the potential to improve their usefulness in both industrial and domestic settings.
  • The new approach lets a robot quickly scan its environment with a depth camera and map every pixel of the depth image to a grasp quality in a single pass.
  • Real-world tests achieved grasp success rates of up to 88% for dynamic grasping and up to 92% in static experiments.
  • The approach is based on a Generative Grasping Convolutional Neural Network (GG-CNN).


QUT's Dr Jürgen Leitner said while grasping and picking up an object was a basic task for humans, it had proved incredibly difficult for machines.

"We have been able to program robots, in very controlled environments, to pick up very specific items. However, one of the key shortcomings of current robotic grasping systems is the inability to quickly adapt to change, such as when an object gets moved," Dr Leitner said.

"The world is not predictable - things change and move and get mixed up and, often, that happens without warning - so robots need to be able to adapt and work in very unstructured environments if we want them to be effective," he said.

The new method, developed by PhD researcher Douglas Morrison, Dr Leitner and Distinguished Professor Peter Corke from QUT's Science and Engineering Faculty, is a real-time, object-independent grasp synthesis method for closed-loop grasping.

"The Generative Grasping Convolutional Neural Network approach works by predicting the quality and pose of a two-fingered grasp at every pixel. By mapping what is in front of it using a depth image in a single pass, the robot doesn't need to sample many different possible grasps before making a decision, avoiding long computing times," Mr Morrison said.
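The core idea in that quote — every pixel of the network's output carries a grasp quality and pose, so choosing a grasp is just a lookup at the best pixel — can be sketched as follows. This is a hypothetical minimal example: the three output maps (quality, angle, gripper width) are stand-ins generated with NumPy, not output from the authors' trained GG-CNN.

```python
import numpy as np

def best_grasp(quality, angle, width):
    """Pick the grasp at the pixel with the highest predicted quality.

    quality, angle, width: 2-D arrays with one value per depth-image
    pixel, as a GG-CNN-style network would produce in a single forward
    pass. Returns (row, col, grasp_angle, gripper_width).
    """
    row, col = np.unravel_index(np.argmax(quality), quality.shape)
    return row, col, angle[row, col], width[row, col]

# Stand-in maps for a 300x300 depth image (random, not real network output).
rng = np.random.default_rng(0)
quality_map = rng.random((300, 300))
angle_map = rng.uniform(-np.pi / 2, np.pi / 2, (300, 300))
width_map = rng.uniform(0.0, 150.0, (300, 300))

row, col, grasp_angle, gripper_width = best_grasp(quality_map, angle_map, width_map)
```

Because the network evaluates all pixels at once, this single argmax replaces the older sample-and-rank pipelines that score many grasp candidates one at a time.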

"In our real-world tests, we achieved an 83% grasp success rate on a set of previously unseen objects with adversarial geometry and 88% on a set of household objects that were moved during the grasp attempt. We also achieved 81% accuracy when grasping in dynamic clutter."

Dr Leitner said the approach overcame a number of limitations of current deep-learning grasping techniques.

"For example, in the Amazon Picking Challenge, which our team won in 2017, our robot CartMan would look into a bin of objects, make a decision on where the best place was to grasp an object and then blindly go in to try to pick it up," he said.

"Using this new method, we can process images of the objects that a robot views within about 20 milliseconds, which allows the robot to update its decision on where to grasp an object and then do so with much greater purpose. This is particularly important in cluttered spaces," he said.
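The closed-loop behaviour described here — re-planning the grasp from every new depth frame rather than committing to one pre-computed grasp — can be illustrated with a toy control loop. Everything below is a placeholder: the camera, network, and robot are represented by callables supplied by the caller, and no timing or real hardware is modelled.

```python
def closed_loop_grasp(get_depth_frame, predict_grasp, move_toward, at_target,
                      max_steps=50):
    """Toy closed-loop grasping: re-plan from the latest frame each cycle.

    get_depth_frame, predict_grasp, move_toward and at_target are
    placeholder callables standing in for the depth camera, the grasp
    network and the robot arm. Because the grasp is recomputed every
    cycle, the loop keeps tracking an object even if it moves mid-grasp.
    """
    for _ in range(max_steps):
        frame = get_depth_frame()       # latest depth image
        target = predict_grasp(frame)   # best grasp from a single pass
        if at_target(target):
            return target               # close the gripper here
        move_toward(target)             # small step; the scene may change
    return None                         # gave up within the step budget
```

Contrast this with the open-loop CartMan pipeline described above, where the grasp chosen from the first image is executed blindly even if the object has since moved.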

Dr Leitner said the improvements would be valuable for industrial automation and in domestic settings.

"This line of research enables us to use robotic systems not just in structured settings where the whole factory is built based on robotic capabilities. It also allows us to grasp objects in unstructured environments, where things are not perfectly planned and ordered, and robots are required to adapt to change.

"This has benefits for industry - from warehouses for online shopping and sorting, through to fruit picking. It could also be applied in the home, as more intelligent robots are developed to not just vacuum or mop a floor, but also to pick items up and put them away."
The team's paper, Closing the Loop for Robotic Grasping: A Real-time, Generative Grasp Synthesis Approach, will be presented this week at Robotics: Science and Systems, the most selective international robotics conference, being held at Carnegie Mellon University in Pittsburgh, USA.

The research was supported by the Australian Centre for Robotic Vision.

Queensland University of Technology
