(Image: University of Cambridge)

Scientists have developed a low-cost, durable, highly sensitive robotic ‘skin’ that can be added to robotic hands like a glove, enabling robots to detect information about their surroundings in a way that’s similar to humans.

The researchers, from the University of Cambridge and University College London (UCL), developed the flexible, conductive skin, which is easy to fabricate and can be melted down and formed into a wide range of complex shapes. The technology senses and processes a range of physical inputs, allowing robots to interact with the physical world in a more meaningful way.

Unlike other solutions for robotic touch, which typically work via sensors embedded in small areas and require different sensors to detect different types of touch, the entirety of the electronic skin developed by the Cambridge and UCL researchers is a sensor, bringing it closer to our own sensory system: our skin.

Although the robotic skin is not as sensitive as human skin, it can detect signals from over 860,000 tiny pathways in the material, enabling it to recognize different types of touch and pressure — like the tap of a finger, a hot or cold surface, damage caused by cutting or stabbing, or multiple points being touched at once — in a single material.

The researchers used a combination of physical tests and machine learning techniques to help the robotic skin ‘learn’ which of these pathways matter most, so it can sense different types of contact more efficiently.
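The idea of letting the skin 'learn' which pathways matter most can be illustrated with a simple feature-selection sketch. This is a hypothetical toy example, not the authors' actual pipeline: it simulates many redundant sensing channels, scores each one by how strongly it correlates with a stimulus label, and keeps only the most informative subset for downstream classification.

```python
import numpy as np

# Hypothetical illustration (not the researchers' code): many raw channels
# are redundant, so rank them by correlation with the stimulus and keep
# only the most informative ones.
rng = np.random.default_rng(0)

n_samples, n_channels, n_keep = 200, 1000, 20
stimulus = rng.integers(0, 2, size=n_samples)  # e.g. touch vs. no touch

# Only the first 50 channels actually respond to the stimulus;
# the rest carry noise, mimicking redundant pathways.
signals = rng.normal(size=(n_samples, n_channels))
signals[:, :50] += stimulus[:, None] * 2.0

# Score each channel by its absolute correlation with the stimulus label.
centered = signals - signals.mean(axis=0)
labels = stimulus - stimulus.mean()
scores = np.abs(centered.T @ labels) / (
    np.linalg.norm(centered, axis=0) * np.linalg.norm(labels) + 1e-12
)

# Keep the top-scoring channels as a compact input for a classifier.
keep = np.argsort(scores)[-n_keep:]
reduced = signals[:, keep]
print(reduced.shape)  # (200, 20)
```

In practice the researchers would use richer physical tests and learned models rather than a simple correlation score, but the principle is the same: a small subset of the 860,000-plus pathways carries most of the usable information.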

Here is an exclusive Tech Briefs interview, edited for length and clarity, with Co-Author Thomas George Thuruthel, Ph.D., from UCL.

Tech Briefs: What was the biggest technical challenge you faced while melting and forming the skin?

Thuruthel: There were a few small challenges. One of them was that this material doesn't flow very easily, so you can't create very complex shapes, and you might get things like gaps. But honestly, the biggest challenge was wiring this material to our electronics board. The material itself is soft and compliant, but at some point the wires have to be rigid. This interface between the soft material and the rigid wire is always a big challenge.

Tech Briefs: What's the process like for melting down and forming the shapes?

Thuruthel: We have a water bath. This material melts at around 50 to 60 °C, and it solidifies at around 30 to 40 °C. So, we heat it up into a liquid, and we have a mold with small openings through which you can pour in the material. There are also small openings so that air can escape. You pour in the material, you seal all the holes, and then you leave it for a few hours so that it sets. Then you open up the mold. Of course, the mold is easy to detach so that you can take out the formed shape.

Tech Briefs: The article I read says, “Although the robotic skin is not as sensitive as human skin, it can detect signals from over 860,000 tiny pathways in the material.” My question is: How many signals can the current commercial e-skin detect and how do those numbers compare to human touch?

Thuruthel: Although we said 860,000 channels, that doesn't necessarily mean that's the number of independent units of information you get. There's a lot of information, but much of it is redundant. We haven't quantified how much independent information you actually get, but roughly, for our setup, I would say it's around 2,000 to 3,000 units. For the human hand, that number would be around 15,000 units. A lot of the commercial sensors are very discrete; what you mostly see are in the order of tens or hundreds — I think even hundreds is very rare.

However, there is a technology called vision-based tactile sensors, which uses cameras embedded inside the hands. They theoretically would have a higher resolution, but, again, it's hard to quantify — you can't get a number for how many independent units you have.

Tech Briefs: Do you have any set plans for further research work? And if not, what are your next steps?

Thuruthel: We recently received a U.K. grant, and we're trying to develop this technology for more commercial applications. We haven't really tested how the skin would fare under repeated contact — let's say thousands or tens of thousands of interactions. We anticipate that this could be an issue, especially at the interface between the soft material and the wiring.

So we are looking at better ways of interfacing and also looking at different materials. What we use is a hydrogel, which is a decent material but not very robust or durable. We're looking at more robust synthetic materials, or natural materials like rubber, as alternatives.

And then we're looking at higher-level tasks. Right now, we are just estimating perception information — where the contact location is, for example. We want to close the loop: how do we use this information on a robotic hand or system so that it can perform real-world tasks that are quite useful?

Those are our next steps.



Transcript

00:00:02 Robots can now feel what they touch just like we do. Well, almost. Researchers at the University of Cambridge have created an artificial skin packed with ultra-sensitive sensors. These sensors don't just detect pressure; they read texture, temperature, even pain-like signals. The skin, which the researchers cast into the shape of a hand, is made from an

00:00:29 electrically conductive hydrogel with electrodes embedded around the wrist. Electrical fields generated across the skin detect different types of stimulation. The sensors monitor thousands of bits of information, which not only detect where the stimulation is but also the type of stimulation. The information is then transferred to the electrodes. The artificial skin can detect multiple

00:00:56 sensations at the same time, such as touch, moisture, temperature, and pain, and can fit over mechanical robot hands like a glove. This low-cost skin could revolutionize the fields of prosthetics, robotic surgery, the automotive industry, rehabilitation, and even space exploration. What does that feel like? The future just got a little more human.