NUS researchers give robots intelligent sensing abilities to carry out complex tasks

July 15, 2020

Picking up a can of soft drink may be a simple task for humans, but it is a complex one for a robot: it has to locate the object, deduce its shape, determine the right amount of force to use, and grasp the object without letting it slip. Most of today's robots operate solely based on visual processing, which limits their capabilities. In order to perform more complex tasks, robots have to be equipped with an exceptional sense of touch and the ability to process sensory information quickly and intelligently.

A team of computer scientists and materials engineers from the National University of Singapore (NUS) has recently demonstrated an exciting approach to make robots smarter. They developed a sensory integrated artificial brain system that mimics biological neural networks, which can run on a power-efficient neuromorphic processor, such as Intel's Loihi chip. This novel system integrates artificial skin and vision sensors, equipping robots with the ability to draw accurate conclusions about the objects they are grasping based on the data captured by the vision and touch sensors in real-time.

"The field of robotic manipulation has made great progress in recent years. However, fusing both vision and tactile information to provide a highly precise response in milliseconds remains a technology challenge. Our recent work combines our ultra-fast electronic skins and nervous systems with the latest innovations in vision sensing and AI for robots so that they can become smarter and more intuitive in physical interactions," said Assistant Professor Benjamin Tee from the NUS Department of Materials Science and Engineering. He co-leads this project with Assistant Professor Harold Soh from the Department of Computer Science at the NUS School of Computing.

The findings of this cross-disciplinary work were presented at the renowned Robotics: Science and Systems conference in July 2020.

Human-like sense of touch for robots

Enabling a human-like sense of touch in robotics could significantly improve current functionality, and even lead to new uses. For example, on the factory floor, robotic arms fitted with electronic skins could easily adapt to different items, using tactile sensing to identify and grip unfamiliar objects with the right amount of pressure to prevent slipping.

In the new robotic system, the NUS team applied an advanced artificial skin known as Asynchronous Coded Electronic Skin (ACES) developed by Asst Prof Tee and his team in 2019. This novel sensor detects touches more than 1,000 times faster than the human sensory nervous system. It can also identify the shape, texture and hardness of objects 10 times faster than the blink of an eye.
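The key idea behind an asynchronous electronic skin is that sensing is event-driven: instead of transmitting every taxel's pressure reading on every cycle, only the taxels whose readings change send out timestamped events. The sketch below is a simplified illustration of that principle, not ACES's actual encoding scheme (which uses asynchronous pulse signatures); all names and thresholds are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    taxel_id: int      # which tactile pixel fired
    timestamp_us: int  # microsecond timestamp of the change
    polarity: int      # +1 pressure increase, -1 pressure decrease

def encode_changes(prev, curr, t_us, threshold=0.05):
    """Emit events only for taxels whose pressure changed beyond a
    threshold, instead of transmitting the full frame every cycle."""
    events = []
    for i, (p0, p1) in enumerate(zip(prev, curr)):
        if p1 - p0 > threshold:
            events.append(TouchEvent(i, t_us, +1))
        elif p0 - p1 > threshold:
            events.append(TouchEvent(i, t_us, -1))
    return events

# Usage: only 2 of 5 taxels changed, so only 2 events are transmitted.
prev = [0.0, 0.2, 0.5, 0.1, 0.0]
curr = [0.0, 0.4, 0.3, 0.1, 0.0]
events = encode_changes(prev, curr, t_us=1000)
```

Because quiet taxels stay silent, the bandwidth and latency of the skin scale with how much is happening on its surface rather than with the number of sensors.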

"Making an ultra-fast artificial skin sensor solves about half the puzzle of making robots smarter. They also need an artificial brain that can ultimately achieve perception and learning as another critical piece in the puzzle," added Asst Prof Tee, who is also from the NUS Institute for Health Innovation & Technology.

A human-like brain for robots

To break new ground in robotic perception, the NUS team explored neuromorphic technology - an area of computing that emulates the neural structure and operation of the human brain - to process sensory data from the artificial skin. As Asst Prof Tee and Asst Prof Soh are members of the Intel Neuromorphic Research Community (INRC), it was a natural choice to use Intel's Loihi neuromorphic research chip for their new robotic system.

In their initial experiments, the researchers fitted a robotic hand with the artificial skin, and used it to read braille, passing the tactile data to Loihi via the cloud to convert the micro bumps felt by the hand into a semantic meaning. Loihi achieved over 92 per cent accuracy in classifying the Braille letters, while using 20 times less power than a normal microprocessor.
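A minimal way to picture braille decoding from tactile data is rate-based: raised dots under the fingertip drive their taxels to fire often, flat positions fire rarely, and the pattern of firing rates is matched against each letter's dot template. The toy sketch below illustrates that decoding idea only; the actual system used a spiking neural network on Loihi, and the rates and letter set here are made up.

```python
import random

# Braille cells have 6 dots; represent each letter as a binary dot pattern.
BRAILLE = {"a": (1, 0, 0, 0, 0, 0),
           "b": (1, 1, 0, 0, 0, 0),
           "c": (1, 0, 0, 1, 0, 0)}

def simulate_spikes(pattern, duration_ms=50, rate_hi=0.8, rate_lo=0.05):
    """Per-dot spike counts: raised dots fire at a high rate,
    flat positions at a low (noise) rate."""
    return [sum(random.random() < (rate_hi if dot else rate_lo)
                for _ in range(duration_ms)) for dot in pattern]

def classify(counts, duration_ms=50):
    """Nearest-template decoding on normalized firing rates."""
    rates = [c / duration_ms for c in counts]
    def dist(letter):
        return sum((r - d) ** 2 for r, d in zip(rates, BRAILLE[letter]))
    return min(BRAILLE, key=dist)

random.seed(0)
counts = simulate_spikes(BRAILLE["b"])
print(classify(counts))  # → b
```

On neuromorphic hardware like Loihi, this kind of decoding runs natively on spikes, which is where the reported power savings over a conventional microprocessor come from.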

Asst Prof Soh's team improved the robot's perception capabilities by combining both vision and touch data in a spiking neural network. In their experiments, the researchers tasked a robot equipped with both artificial skin and vision sensors to classify various opaque containers containing differing amounts of liquid. They also tested the system's ability to identify rotational slip, which is important for stable grasping.

In both tests, the spiking neural network that used both vision and touch data was able to classify objects and detect object slippage. The classification was 10 per cent more accurate than a system that used only vision. Moreover, using a technique developed by Asst Prof Soh's team, the neural networks could classify the sensory data while it was being accumulated, unlike the conventional approach where data is classified after it has been fully gathered. In addition, the researchers demonstrated the efficiency of neuromorphic technology: Loihi processed the sensory data 21 per cent faster than a top performing graphics processing unit (GPU), while using more than 45 times less power.
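The "classify while data is still accumulating" idea can be sketched as evidence accumulation: each new observation from each sensor adds a vote toward a class, and the current best guess can be read out after every step instead of waiting for the full recording. This is a conceptual sketch only; the class names, sensor accuracies, and voting scheme are invented for illustration and do not reflect the team's spiking-network method.

```python
import random

CLASSES = ["empty", "half", "full"]

def sensor_vote(true_class, accuracy):
    """One noisy observation: a unit vote biased toward the true class."""
    votes = [0.0, 0.0, 0.0]
    pick = (CLASSES.index(true_class) if random.random() < accuracy
            else random.randrange(3))
    votes[pick] += 1.0
    return votes

def anytime_classify(true_class, steps=30, seed=1):
    """Fuse vision and touch evidence step by step; a prediction is
    available after every step, not only after all data is gathered."""
    random.seed(seed)
    evidence = [0.0, 0.0, 0.0]
    history = []
    for _ in range(steps):
        for accuracy in (0.6, 0.7):   # hypothetical vision / touch accuracies
            for i, v in enumerate(sensor_vote(true_class, accuracy)):
                evidence[i] += v
        history.append(CLASSES[evidence.index(max(evidence))])
    return history

history = anytime_classify("half")
# Early predictions may waver; later ones settle on the true class.
```

Fusing two noisy modalities this way also shows why the combined system beats vision alone: each sensor's votes help average out the other's errors.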

Asst Prof Soh shared, "We're excited by these results. They show that a neuromorphic system is a promising piece of the puzzle for combining multiple sensors to improve robot perception. It's a step towards building power-efficient and trustworthy robots that can respond quickly and appropriately in unexpected situations."

"This research from the National University of Singapore provides a compelling glimpse to the future of robotics where information is both sensed and processed in an event-driven manner combining multiple modalities. The work adds to a growing body of results showing that neuromorphic computing can deliver significant gains in latency and power consumption once the entire system is re-engineered in an event-based paradigm spanning sensors, data formats, algorithms, and hardware architecture," said Mr Mike Davies, Director of Intel's Neuromorphic Computing Lab.

This research was supported by the National Robotics R&D Programme Office (NR2PO), which nurtures the robotics ecosystem in Singapore by funding research and development (R&D) to enhance the readiness of robotics technologies and solutions. Key considerations for NR2PO's R&D investments include the potential for impactful applications in the public sector, and the potential to create differentiated capabilities for our industry.

Next steps

Moving forward, Asst Prof Tee and Asst Prof Soh plan to further develop their novel robotic system for applications in the logistics and food manufacturing industries, where demand for robotic automation is high, especially in the post-COVID era.
-end-


National University of Singapore
