BabyBot takes first steps

May 02, 2006

The researchers used BabyBot to test a model of the human sense of 'presence', a combination of senses like sight, hearing and touch. The work could have enormous applications in robotics, artificial intelligence (AI) and machine perception. The research is being funded under the European Commission's FET (Future and Emerging Technologies) initiative of the IST programme, as part of the ADAPT project.

"Our sense of presence is essentially our consciousness," says Giorgio Metta, Assistant Professor at the Laboratory for Integrated Advanced Robotics at Italy's Genoa University and ADAPT project coordinator.

Modelling, or defining, consciousness remains one of the intractable problems of both science and philosophy. "The problem is duality: where does the brain end and the mind begin? The question is whether we need to consider them as two different aspects of reality," says Metta.

Neuroscientists tend to develop theories that fit the observed phenomena, but engineers take a practical approach: their objective is to make it work.

To that end, ADAPT first studied how the perception of self in the environment emerges during the early stages of human development. Developmental psychologists therefore tested 6- to 18-month-old infants. "We could control a lot of the parameters to see how young children perceive and interact with the world around them. What they do when interacting with their mothers or strangers, what they see, the objects they interact with, for example," says Metta.

From this work they developed a 'process' model of consciousness. This assumes that objects in the environment are not real physical objects as such; rather they are part of a process of perception.

The practical upshot is that, while other models describe consciousness as perception, then cognition, then action, the ADAPT model sees it as action, then cognition, then perception. And that is how babies act, too.

When a baby sees an object, that sight is not the final perception of it. The child will try to reach the object. If the child fails, the object is too far away, which teaches the child perspective. If the child does reach the object, he or she will try to grasp it, taste it or shake it. These actions all teach the child about the object and shape his or her perception of it. Perception is a cumulative process rather than a single act.
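The cumulative act-then-perceive loop described above can be sketched in a few lines of Python. Everything here — the class, the function names, the hard-coded arm length — is an illustrative assumption, not part of the ADAPT project's actual software.

```python
# Illustrative sketch of the action -> cognition -> perception loop:
# knowledge about an object accumulates from the outcomes of actions.
# All names and values are hypothetical.

class ObjectModel:
    """Accumulated knowledge about one object in the scene."""
    def __init__(self):
        self.reachable = None   # learned by trying to reach
        self.graspable = None   # learned by trying to grasp
        self.properties = []    # learned by shaking, tasting, ...

def explore(obj_distance_cm, arm_length_cm=30):
    model = ObjectModel()
    # Act first: trying to reach (and failing) teaches perspective.
    model.reachable = obj_distance_cm <= arm_length_cm
    if model.reachable:
        # Further actions refine the perception of the object.
        model.graspable = True
        model.properties.append("makes noise when shaken")
    return model
```

The point of the sketch is the ordering: the model starts empty, and every field is filled in only as a consequence of an action, not from a one-shot act of perception.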

Our expectations also have enormous influence on our perception. For example, if you believe an empty pot is full, you will lift the pot very quickly. Your muscles unconsciously prepare for the expected resistance, and put more force than is required into lifting; everyday proof that our expectations govern our relationship with the environment.
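The pot example can be made concrete with a little feedforward-control arithmetic. The masses, the force margin and the function names below are invented for illustration only.

```python
# Hypothetical numbers showing how an expectation of weight pre-sets
# the lifting force (feedforward control), producing an upward jerk
# when the pot turns out to be lighter than expected.

G = 9.81  # gravitational acceleration, m/s^2

def lifting_force(expected_mass_kg, margin=1.2):
    # Muscles prepare force for the expected load plus a small margin.
    return expected_mass_kg * G * margin

def initial_acceleration(actual_mass_kg, expected_mass_kg):
    # Net upward acceleration (m/s^2) at the moment the lift begins.
    return lifting_force(expected_mass_kg) / actual_mass_kg - G

# Believing an empty pot (0.5 kg) is full (3 kg) produces a far larger
# initial acceleration than lifting a genuinely full pot.
jerk = initial_acceleration(actual_mass_kg=0.5, expected_mass_kg=3.0)
```

The mismatch between expected and actual load, not the load itself, determines the jerk — which is the article's point about expectations governing our relationship with the environment.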

Or at least that's the model. "It's not validated. It's a starting point to understand the problem," says Metta.

The team used BabyBot to test it, providing a minimal set of instructions, just enough for BabyBot to act on the environment. For the senses, the team used sound, vision and touch, and focused on simple objects within the environment.

There were two experiments: one in which BabyBot could touch an object, and a second in which it could grasp it. This is more difficult than it sounds: when you look at a scene, you unconsciously segment it into separate elements.

Scene segmentation is a highly developed skill, but simply by interacting with the environment BabyBot did its engineering parents proud, demonstrating that it could learn to separate objects from the background.
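One way interaction can drive segmentation is image differencing: pixels that change when the robot pokes an object are likely to belong to that object. The toy one-dimensional "images" below are a minimal sketch of that idea, not the actual BabyBot code.

```python
# Minimal sketch of segmenting an object from the background by
# acting on it: pixels whose brightness changes when the object is
# poked are marked as belonging to the object. Toy 1-D "images".

def segment_by_poking(before, after, threshold=10):
    """Return a mask marking pixels that changed more than threshold."""
    return [abs(a - b) > threshold for a, b in zip(before, after)]

before = [5, 5, 200, 200, 5]   # bright object in the middle pixels
after  = [5, 5, 5, 5, 200]     # object shifted right after the poke

mask = segment_by_poking(before, after)
# mask -> [False, False, True, True, True]
```

Static background pixels cancel out in the difference, so the object is segmented without the robot knowing anything about its shape or colour in advance.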

Once the visual scene was segmented, the robot could start learning about specific properties of objects that are useful, for instance, for grasping them. Grasping opens up a wider world to the robot, and to young infants too.

The work was successful, but it was an early proof of principle for the approach. The sense of presence, or consciousness, is a huge problem, and ADAPT did not seek to solve it in one project. The team made a promising start, and many of the partners will take part in a new IST project called ROBOTCUB.

In ROBOTCUB the engineers will refine their robot so that it can see, hear and touch its environment. Eventually it will be able to crawl, too.

"Ultimately, this work will have a huge range of applications, from virtual reality, robotics and AI, to psychology and the development of robots as tools for neuro-scientific research," concludes Metta.
-end-
Contact:
Giorgio Metta
University of Genova
Genoa
Italy
Tel: +39-010-353-2791
Email: pasa@dist.unige.it

Source: Based on information from ADAPT

IST Results
