
Simulated cats and elephants with touch-based memory help usher in new age of robotics

03.03.26 | King's College London


A new approach to simulating biologically inspired robotics can cut the design and training of tactile robots from eighteen months to two weeks, new research suggests.

Published in Cyborg & Bionic Systems, the study applies lessons from some of nature’s most famous ‘sensors’, including cats’ paws and elephants’ trunks, to help create artificial sensors with a human-like sense of touch faster and more reliably than ever before.

Combined with recent work in Nature Communications on training these tactile sensors in a way that mirrors human tactile memory, the team led by King’s College London now believe they can dramatically slash the time and cost of producing next-generation robots.

Tactile robots are robots with a sense of touch, usually facilitated by an abundance of sensors in a device like a robotic hand. Aside from being more ‘human-like’, these robots have the advantage of much greater dexterity.

Dr Shan Luo, Reader in Robotics and AI at King’s College London and an author of the paper, explains: “While the human brain inherently knows the difference between grasping a strawberry and a baseball bat, and applies different levels of force accordingly, a robot does not. This leads to difficulty when handling the myriad complex shapes a robot may interact with.

“While these tactile robots offer distinct advantages, designing these sensors relies on trial and error, and the calibration processes across sensors are time-consuming. It can often take up to eighteen months to make one tactile robot prototype, without the guarantee it will work – a major bottleneck for everything from robotic pickers in factories to next-generation prosthetics.”

SimTac, the platform designed by the researchers, takes a different approach. By simulating these sensors and training them with generated data based on real-world objects in a virtual environment, the team can remove the time-consuming trial and error process and explore potential benefits of ‘bio-inspired’ sensor shapes.

Xuyang Zhang, PhD student at King’s and first author of SimTac, said: “Previous simulation-led approaches have only really created sensors that are flat surfaces, like the pad of a finger. But imagine trying to pick up a piece of paper on a table using only your finger pad – it’s almost impossible.

“Our work has learnt from the best of nature to create an abundance of prototypes and models capable of different tasks. We’ve created simulated cats’ paws, octopus tentacles and elephant trunks, all to give us a better idea of how different designs of tactile sensors can be used to create improved prototypes for use by people. It’s a powerful expansion of the design space and can be used to create physical tactile robots in a fraction of the time.”

By combining the new technology with GenForce, an AI model that mimics how the human brain learns to sense force and grasp objects, the team also hope to cut the cost of training tactile robots dramatically – helping industry roboticists make significant savings.

High-accuracy force/torque sensors can cost upwards of £10,000, and deploying tactile robots at scale means many are needed to train and test a complete prototype. By unifying different tactile sensing types, such as camera-based sensing or sensing that measures disturbances to electric fields, the framework allows the whole prototype to be trained with just one sensor.

Zhuo Chen, PhD student at King’s and first author of GenForce, said: “Wedding together different types of tactile sensors in a single robotic device has been well established to give higher levels of dexterity, but they’re expensive and take a lot of time to train. By borrowing from the way a human hand learns how to exert force on an object by touching it just once, we use AI to abstract this force onto a 2D image the rest of the device uses to calibrate the memory of force and touch.

“In a sense, this means you can train an entire hand how to sense and move by using just one finger. Depending on the size of the overall robot, this could deliver exponential savings for players in the automated manufacturing, handling and robotics sectors.”
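The calibration-transfer idea described above can be illustrated with a toy example. The sketch below is purely illustrative and does not reproduce GenForce’s actual architecture, which is not detailed in this article: it simply shows how a force model fitted on one labelled sensor (“finger A”, paired with the single expensive force/torque sensor) can be reused on a second, unlabelled sensor after aligning the two via a handful of shared touches. All variable names and the linear sensor model are assumptions for the demonstration.

```python
import numpy as np

# Toy illustration of calibration transfer, NOT the GenForce method itself.
rng = np.random.default_rng(0)

# "Finger A": raw tactile readings are a linear response to contact force, plus noise.
n_samples, n_taxels = 200, 16
response_a = rng.normal(size=n_taxels)              # unknown sensor response (assumed linear)
forces = rng.uniform(0.1, 5.0, size=n_samples)      # ground-truth contact forces in newtons
readings_a = np.outer(forces, response_a) + 0.01 * rng.normal(size=(n_samples, n_taxels))

# Calibrate finger A with labelled data from the one force/torque sensor.
w_a, *_ = np.linalg.lstsq(readings_a, forces, rcond=None)

# "Finger B" has a different response; rather than buying a second force/torque
# sensor, align B's readings to A's using a few shared touches.
response_b = rng.normal(size=n_taxels)
readings_b = np.outer(forces, response_b) + 0.01 * rng.normal(size=(n_samples, n_taxels))
align, *_ = np.linalg.lstsq(readings_b[:20], readings_a[:20], rcond=None)

# Predict force on finger B by mapping its readings into A's calibrated space.
pred = readings_b @ align @ w_a
mae = float(np.mean(np.abs(pred - forces)))          # mean absolute error in newtons
print(round(mae, 3))
```

In this sketch the alignment needs only paired raw readings, not force labels, which is the sense in which one calibrated finger can stand in for many.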

In the future, the team hope to completely fabricate bio-inspired tactile robots to push the design space of what tactile robotics can accomplish.


Contact Information

Joanna Dungate
King's College London
joanna.dungate@kcl.ac.uk
