Robot uses social feedback to fetch objects intelligently

March 06, 2017

PROVIDENCE, R.I. [Brown University] -- If someone asks you to hand them a wrench from a table full of different-sized wrenches, you'd probably pause and ask, "Which one?" Robotics researchers at Brown University have now developed an algorithm that lets robots do the same thing -- ask for clarification when they're not sure what a person wants.

The research, which will be presented this spring at the International Conference on Robotics and Automation in Singapore, comes from Brown's Humans to Robots Lab led by computer science professor Stefanie Tellex. Her work focuses on human-robot collaboration -- making robots that can be good helpers to people at home and in the workplace.

"Fetching objects is an important task that we want collaborative robots to be able to do," Tellex said. "But it's easy for the robot to make errors, either by misunderstanding what we want, or by being in situations where commands are ambiguous. So what we wanted to do here was come up with a way for the robot to ask a question when it's not sure."

Tellex's lab had previously developed an algorithm that enables robots to receive speech commands as well as information from human gestures. It's a form of interaction that people use all the time. When we ask someone for an object, we'll often point to it at the same time. Tellex and her team showed that when robots could combine the speech commands with gestures, they got better at correctly interpreting user commands.
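
To make the idea concrete, here is a minimal sketch of how speech and gesture cues might be fused into a single distribution over candidate objects. The scoring values and function names are illustrative assumptions, not the lab's published model.

```python
# Hypothetical sketch: fuse speech and gesture evidence into one belief
# over candidate objects. The scores below are illustrative assumptions,
# not the model from Tellex's lab.

def fuse_beliefs(speech_scores, gesture_scores):
    """Combine per-object likelihoods from two modalities (naive-Bayes style)."""
    objects = set(speech_scores) | set(gesture_scores)
    combined = {obj: speech_scores.get(obj, 1e-6) * gesture_scores.get(obj, 1e-6)
                for obj in objects}
    total = sum(combined.values())
    return {obj: score / total for obj, score in combined.items()}

# Example: "the wrench" matches both wrenches equally, but the pointing
# gesture favors the one nearer the user's hand.
speech = {"wrench_a": 0.5, "wrench_b": 0.5, "hammer": 0.01}
gesture = {"wrench_a": 0.7, "wrench_b": 0.25, "hammer": 0.05}
print(fuse_beliefs(speech, gesture))
```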

Still, the system isn't perfect. It runs into problems when there are lots of very similar objects in close proximity to each other. Take the workshop table, for example. Simply asking for "a wrench" isn't specific enough, and it might not be clear which one a person is pointing to if a number of wrenches are clustered close together.

"What we want in these situations is for the robot to be able to signal that it's confused and ask a question rather than just fetching the wrong object," Tellex said.

The new algorithm does exactly that. It lets the robot quantify how certain it is about what a user wants. When its certainty is high, the robot simply hands over the object as requested. When it's less certain, the robot makes its best guess about which object the person wants, then asks for confirmation by hovering its gripper over the object and asking, "This one?"
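
That decision rule can be pictured roughly as follows, under the assumption that the robot's certainty is simply the highest probability in its belief over candidate objects; the 0.9 threshold and the function names are hypothetical, not taken from the paper.

```python
# Hypothetical decision rule: act when confident, ask when not.
# The 0.9 threshold and these names are illustrative assumptions.

CONFIDENCE_THRESHOLD = 0.9

def decide(belief):
    """belief: dict mapping object name -> probability it is the target."""
    best_obj = max(belief, key=belief.get)
    if belief[best_obj] >= CONFIDENCE_THRESHOLD:
        return ("fetch", best_obj)    # hand the object over directly
    return ("confirm", best_obj)      # hover the gripper and ask "This one?"

print(decide({"wrench_a": 0.95, "wrench_b": 0.05}))  # -> ('fetch', 'wrench_a')
print(decide({"wrench_a": 0.55, "wrench_b": 0.45}))  # -> ('confirm', 'wrench_a')
```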

One of the important features of the system is that the robot doesn't ask questions with every interaction. It asks intelligently.

"When the robot is certain, we don't want it to ask a question because it just takes up time," said Eric Rosen, an undergraduate working in Tellex's lab and co-lead author of the research paper with graduate student David Whitney. "But when it is ambiguous, we want it to ask questions because mistakes can be more costly in terms of time."

And even though the system asks only a very simple question, "it's able to make important inferences based on the answer," Whitney said. For example, say a user asks for a wrench and there are two wrenches on a table. If the user tells the robot that its first guess was wrong, the algorithm deduces that the other wrench must be the one that the user wants. It will then hand that one over without asking another question. Those kinds of inferences, known as implicatures, make the algorithm more efficient.
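
The implicature can be pictured as a simple belief update: a "no" to the robot's guess rules that object out and renormalizes what remains, so with only two wrenches on the table the alternative becomes a near-certainty and the robot fetches it without asking again. This is a hedged sketch, not the paper's actual inference procedure.

```python
# Hypothetical implicature update: a "no" answer rules out the guessed object.
# Paired with the decision rule above, the robot then fetches the remaining
# wrench without asking another question. Purely illustrative.

def update_after_answer(belief, guessed_obj, answer_is_yes):
    if answer_is_yes:
        return {obj: (1.0 if obj == guessed_obj else 0.0) for obj in belief}
    remaining = {obj: p for obj, p in belief.items() if obj != guessed_obj}
    total = sum(remaining.values())
    return {obj: p / total for obj, p in remaining.items()}

belief = {"wrench_a": 0.55, "wrench_b": 0.45}
belief = update_after_answer(belief, "wrench_a", answer_is_yes=False)
print(belief)  # -> {'wrench_b': 1.0}: confident enough to hand it over
```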

To test their system, the researchers asked untrained participants to come into the lab and interact with Baxter, a popular industrial and research robot. Participants asked Baxter for objects under different conditions. The team could set the robot to never ask questions, ask a question every time, or to ask questions only when uncertain. The trials showed that asking questions intelligently using the new algorithm was significantly better in terms of accuracy and speed compared to the other two conditions.

The system worked so well, in fact, that participants thought the robot had capabilities it actually didn't have. For the purposes of the study, the researchers used a very simple language model -- one that only understood the names of objects. However, participants told the researchers they thought the robot could understand prepositional phrases like, "on the left" or "closest to me," which it could not. They also thought the robot might be tracking their eye-gaze, which it wasn't. All the system was doing was making smart inferences after asking a very simple question.

In future work, Tellex and her team would like to combine the algorithm with more robust speech recognition systems, which might further increase the system's accuracy and speed.

Ultimately, Tellex says, she hopes systems like this will help robots become useful collaborators both at home and at work.
-end-
The work was funded in part by grants from the Defense Advanced Research Projects Agency (W911NF-15-1-0503 and D15AP00102) and NASA.

Brown University

Covid has disrupted the most basic routines of our days and nights. But in the middle of a conversation about how to fight the virus, we find a place impervious to the stalled plans and frenetic demands of the outside world. It's a very different kind of front line, where urgent work means moving slow, and time is marked out in tiny pre-planned steps. Then, on a walk through the woods, we consider how the tempo of our lives affects our minds and discover how the beats of biology shape our bodies. This episode was produced with help from Molly Webster and Tracie Hunte. Support Radiolab today at Radiolab.org/donate.