Researchers have developed a new type of robotic finger with a sense of touch. The finger can localize touch with very high precision (<1 mm) over a large, multi-curved surface, much like its human counterpart.
Existing touch sensors have proven difficult to integrate into robot fingers: they struggle to cover multi-curved surfaces, require high wire counts, or cannot fit into small fingertips, which has prevented their use in dexterous hands. The new method instead uses overlapping signals from light emitters and receivers embedded in a transparent waveguide layer that covers the functional areas of the finger.
By measuring light transport between every emitter and receiver, the finger produces a rich data set that changes in response to deformation caused by touch. Purely data-driven deep learning methods can extract useful information from this data, including contact location and applied normal force, without the need for analytical models. The result is a fully integrated, sensorized robot finger with a low wire count, built using accessible manufacturing methods and designed for easy integration into dexterous hands.
Two aspects of the underlying technology combine to enable the new results. First, the researchers use light to sense touch. Under the “skin,” the finger has a layer of transparent silicone into which they shine light from more than 30 LEDs. The finger also has more than 30 photodiodes that measure how the light bounces around. Whenever the finger touches something, its skin deforms, shifting the light in the transparent layer underneath. By measuring how much light travels from every LED to every photodiode, the researchers obtain close to 1,000 signals, each containing some information about the contact that was made. Since light can also bounce around in a curved space, these signals can cover a complex 3D shape such as a fingertip.
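The scale of this signal set is easy to see with a short sketch. The snippet below builds the full emitter-receiver signal matrix for 30 LEDs and 30 photodiodes; `read_intensity` is a hypothetical driver call standing in for the real hardware interface, which the article does not describe.

```python
import numpy as np

NUM_LEDS = 30         # light emitters embedded in the transparent layer
NUM_PHOTODIODES = 30  # receivers measuring how the light bounces around

def read_signal_matrix(read_intensity):
    """Measure light transport from every LED to every photodiode.

    `read_intensity(led, diode)` is a hypothetical function that lights
    one LED and reads one photodiode; swap in the real driver call.
    """
    signals = np.zeros((NUM_LEDS, NUM_PHOTODIODES))
    for led in range(NUM_LEDS):
        for diode in range(NUM_PHOTODIODES):
            signals[led, diode] = read_intensity(led, diode)
    # Flatten to one feature vector: 30 x 30 = 900 values,
    # close to the ~1,000 signals described in the article.
    return signals.ravel()

# Stand-in sensor model for illustration only:
rng = np.random.default_rng(0)
x = read_signal_matrix(lambda led, diode: rng.random())
print(x.shape)  # (900,)
```

Each entry of this vector changes when the skin deforms, which is what makes the set informative about contact.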
Second, the team designed the system so that its data can be processed by machine learning algorithms. Because there are so many signals, all of them partially overlapping, the data is too complex for humans to interpret. Fortunately, current machine learning techniques can learn to extract the information researchers care about: where the finger is being touched, what is touching the finger, and how much force is being applied.
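In spirit, this is a supervised regression problem: map the ~900-dimensional signal vector to labels such as contact location and normal force. The sketch below uses ridge regression on synthetic stand-in data purely for illustration; the actual system uses deep networks trained on real touch data, and all data and dimensions here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in data: 500 touches, each a 900-dim signal vector,
# labeled with contact location (x, y, z) and applied normal force.
n_samples, n_signals, n_labels = 500, 900, 4
X = rng.normal(size=(n_samples, n_signals))
true_W = 0.01 * rng.normal(size=(n_signals, n_labels))
Y = X @ true_W + rng.normal(scale=1e-3, size=(n_samples, n_labels))

# Ridge regression: solve (X^T X + lam * I) W = X^T Y.
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(n_signals), X.T @ Y)

pred = X @ W
rmse = float(np.sqrt(np.mean((pred - Y) ** 2)))
print(f"training RMSE: {rmse:.4f}")
```

A deep network replaces the linear map `W` when the relationship between signals and contact is nonlinear, as it is for a deforming elastomer skin, but the input/output structure of the learning problem is the same.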
In addition, the team built the finger for easy integration into robotic hands. Although it collects almost 1,000 signals, the finger needs only a 14-wire cable to connect to the hand and requires no off-board electronics. The researchers are outfitting two dexterous hands (capable of grasping and manipulating objects) with these fingers: one hand has three fingers and the other has four. The team will use these hands to demonstrate dexterous manipulation abilities based on tactile and proprioceptive data.