A perception system for soft robots was developed that is inspired by the way humans process information about their own bodies in space and in relation to other objects and people. The system includes a motion capture system, soft sensors, a neural network, and a soft robotic finger. The goal is to build a system that can predict a robot’s movements and internal state without relying on external sensors, much like humans do every day. The work has applications in human-robot interaction and wearable robotics as well as soft devices to correct disorders affecting muscles and bones.

The system is meant to mimic the components humans use to navigate their environment: the motion capture system stands in for vision, the neural network for brain functions, the sensors for touch, and the finger for the body interacting with the outside world. The motion capture system exists only to train the neural network and can be discarded once training is complete.

The system can predict the complex motions and forces that the soft robot experiences, which is difficult with traditional methods, and it can be applied to multiple types of actuators and sensors. The method also includes redundant sensors, which improve the overall robustness of the predictions.

Researchers embedded soft strain sensors at arbitrary locations within the soft robotic finger, knowing that the sensors would respond to a wide variety of motions, and used machine learning techniques to interpret the sensors' signals. This enabled them to predict forces applied to, and movements of, the finger. The approach will allow researchers to develop models that predict the forces and deformations a soft robotic system experiences as it moves.
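The workflow described above can be sketched as a supervised-learning loop: during training, the motion capture system supplies ground-truth labels (fingertip position and applied force) for each set of strain-sensor readings; after training, the learned model predicts those quantities from the sensor signals alone. The sketch below is a minimal illustration under stated assumptions, not the researchers' actual implementation: the sensor count, network size, and synthetic data are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Synthetic stand-in for training data (all values hypothetical) ---
# X: readings from 8 arbitrarily placed strain sensors per sample.
# Y: ground truth available only during training, here standing in for
#    motion-capture labels: fingertip (x, y) position plus contact force.
n_samples, n_sensors, n_outputs = 2000, 8, 3
X = rng.uniform(-1.0, 1.0, (n_samples, n_sensors))
true_map = rng.normal(size=(n_sensors, n_outputs))
Y = np.tanh(X @ true_map) + 0.01 * rng.normal(size=(n_samples, n_outputs))

# --- A small one-hidden-layer network trained by gradient descent ---
hidden, lr = 32, 0.05
W1 = rng.normal(scale=0.3, size=(n_sensors, hidden))
b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.3, size=(hidden, n_outputs))
b2 = np.zeros(n_outputs)

for epoch in range(500):
    H = np.tanh(X @ W1 + b1)        # hidden activations
    pred = H @ W2 + b2              # predicted position + force
    err = pred - Y
    # Backpropagate the mean-squared-error gradient.
    gW2 = H.T @ err / n_samples
    gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H ** 2)
    gW1 = X.T @ dH / n_samples
    gb1 = dH.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# --- Deployment: the motion capture "vision" system is no longer needed ---
def predict_state(sensor_reading):
    """Estimate fingertip position and force from strain signals alone."""
    h = np.tanh(sensor_reading @ W1 + b1)
    return h @ W2 + b2

mse = np.mean((predict_state(X) - Y) ** 2)
```

The key design point the sketch illustrates is that the expensive external sensing (motion capture) is needed only to generate training labels; once the mapping from embedded sensors to body state is learned, `predict_state` runs on the sensor signals alone.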

Techniques traditionally used in robotics for processing sensor data can’t capture the complex deformations of soft systems. In addition, the information the sensors capture is equally complex. As a result, sensor design, placement, and fabrication in soft robots are difficult tasks that could be vastly improved if researchers had access to robust models.

For more information, contact Ioana Patringenaru at 858-822-0899.