The Human Touch in Robotic Form: GelPalm
Watch this video to learn how MIT researchers enhance robotic precision with sophisticated tactile sensors in the palm and agile fingers. The work aims to set the stage for improvements in human-robot interaction and prosthetic technology.
“We draw inspiration from human hands, which have rigid bones surrounded by soft, compliant tissue,” says recent MIT graduate Sandra Q. Liu SM ’20, PhD ’24, the lead designer of GelPalm, who developed the system as a CSAIL affiliate and PhD student in mechanical engineering. “By combining rigid structures with deformable, compliant materials, we can better achieve that same adaptive talent as our skillful hands. A major advantage is that we don't need extra motors or mechanisms to actuate the palm's deformation — the inherent compliance allows it to automatically conform around objects, just like our human palms do so dexterously.”
Transcript
00:00:01 (air whooshing) (gentle music) (air vibrating) - The focus of this research was actually making a compliant palm with high-resolution tactile sensing. And the reason this is important is that a lot of researchers actually focus on what might be a little bit cooler,
00:00:26 which is the actual dexterous finger. A lot of times when we actually use the human palm, it conforms to things. It has sensing itself, and so what we wanted to do was to actually bring this into robotics. And so what we do is we use this sort of flexible, low-cost LED strip, and we paint it with different colors so that it can illuminate in different colors.
00:00:46 And as a result, we're able to use this technology in a palm. What we can actually see is that our palm is completely lit up with these LEDs. And because of our novel compliant structure design, the palm compliantly deforms when you press an object into it, and the gel also provides additional compliance. And then if you look on the other side, you can put cameras here
00:01:09 so that you can get this high-resolution tactile sensing. A lot of previous work on robotic hands focuses mostly on the fingers and doesn't actually look at robotic palms, or any sort of palm at all. And even when it does, it only puts in low-resolution sensing or uses something that isn't really a compliant structure, but our human hands themselves are compliant structures.
00:01:32 And actually, when you press something against your palm, you'll feel that your palm itself is deforming a little bit. That's why we wanted to integrate both of these: the high-resolution tactile sensing and also this sort of structural compliance. Like human hands, we wanted to build a robotic hand that is able to better envelop or grasp an object without failing. And so one of the issues with rigid robotic hands is
00:01:57 that without a palm with any sort of compliance or actuation, a lot of times when you do an enveloping grasp, you actually can't grasp the object really well unless you're relying on the fingers. So here, where we have this sort of structural compliance and also this material compliance, we're able to better envelop and grasp an object. And then another potential benefit is that with our new illumination system,
00:02:20 we're actually able to integrate this high-resolution sensing into other types of structures. So in this case, we made a finger, which is roughly comparable to a human finger, and we're able to then use this technology to make things like this human-hand-inspired soft robotic gripper and also this 120-degree configuration gripper, which we use to grab some objects. Each of these structures
involves some sort of semi-rigid backbone printed out of a carbon-fiber-filled nylon. Then we have what we call our equivalent to human flesh, which is our silicone covering. And the third part is our flexible LED strip that we talked about. On this, we actually use a silicone-based paint to paint red, green, and blue, and this gives us our tri-color sensing region, which is inspired by GelSight sensing technology,
a type of high-resolution, camera-based tactile sensing. All of this technology, because it's so simple, is really low-cost and scalable and able to be easily integrated into other sorts of soft-rigid robotic components. So one of the major limitations of this hand is one that comes with using camera-based tactile sensors, which is that we need to use multiple cameras to be able to see along the entire palm
00:03:43 and also along the entire finger. We're currently working on developing some technologies to allow us to use just one camera to see the entire surface of the palm or the finger by itself. So some of the applications for this work could be developing soft-rigid robotic hands for use in factory or manufacturing settings, for safety. Another one that I'm personally very passionate about, although I'm sure this is a long way away,
is actually using these soft-rigid robotic hands to improve functionality for prosthetics or to help the elderly age with dignity as they get older. I'm really excited to be introducing the work that I did with the GelPalm and with the ROMEO fingers as well. And I hope to be able to continue working on this in the future so that it can be used for further bettering
and developing soft-rigid robotic hands.
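The tri-color illumination described in the transcript follows the general GelSight recipe: a camera watches the back of the painted gel, and the red, green, and blue light arriving from different directions changes in proportion to how the gel surface tilts under contact. As a rough illustration of that idea (not the GelPalm authors' code), the sketch below maps per-pixel color changes to surface gradients using a hypothetical calibration matrix, then integrates the gradients into a depth map with Frankot-Chellappa integration, one common choice for this step. All names, shapes, and numbers here are illustrative assumptions.

```python
# Illustrative sketch of GelSight-style tactile reconstruction (assumptions,
# not the GelPalm pipeline): RGB color shifts -> surface gradients -> depth.

import numpy as np


def rgb_to_gradients(frame, reference, calib):
    """Map per-pixel RGB changes to surface gradients (gx, gy).

    frame, reference : (H, W, 3) float arrays from the gel-facing camera.
    calib            : hypothetical (2, 3) matrix from a one-time calibration,
                       e.g. pressing a ball of known radius into the gel.
    """
    diff = frame.astype(np.float64) - reference.astype(np.float64)
    grads = diff @ calib.T          # each pixel's gradient is a linear map of its color shift
    return grads[..., 0], grads[..., 1]


def integrate_gradients(gx, gy):
    """Frankot-Chellappa integration: least-squares depth from a gradient field."""
    h, w = gx.shape
    fx = np.fft.fftfreq(w).reshape(1, w)
    fy = np.fft.fftfreq(h).reshape(h, 1)
    denom = (2j * np.pi * fx) ** 2 + (2j * np.pi * fy) ** 2
    denom[0, 0] = 1.0               # avoid dividing by zero at the DC term
    numer = (2j * np.pi * fx) * np.fft.fft2(gx) + (2j * np.pi * fy) * np.fft.fft2(gy)
    depth = np.real(np.fft.ifft2(numer / denom))
    depth[0, 0] = 0.0
    return depth - depth.min()      # shift so the undeformed surface sits at zero


if __name__ == "__main__":
    # Stand-in data: a flat reference image and a frame with a synthetic color change.
    reference = np.full((240, 320, 3), 128.0)
    frame = reference.copy()
    frame[100:140, 150:190, 0] += 20.0      # pretend the red channel brightened under contact
    calib = np.array([[0.01, 0.00, -0.01],
                      [0.00, 0.01, -0.01]])  # hypothetical calibration matrix
    gx, gy = rgb_to_gradients(frame, reference, calib)
    print("peak depth (arbitrary units):", integrate_gradients(gx, gy).max())
```

In practice, systems like this are calibrated against indenters of known geometry, and the linear color-to-gradient map is often replaced with a lookup table or a small learned model; the sketch only shows the overall shape of the pipeline.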