Smart Hand for Amputees Combines User and Robotic Control
EPFL scientists are developing a new approach to controlling robotic hands, particularly for amputees, that combines individual finger control with automation for improved grasping and manipulation. This interdisciplinary proof of concept between neuroengineering and robotics was successfully tested on three amputees and seven non-amputee subjects. The technology contributes to the emerging field of shared control in neuroprosthetics. One concept comes from neuroengineering and involves deciphering intended finger movement from muscular activity on the amputee’s stump, enabling individual finger control of the prosthetic hand. The other comes from robotics and allows the robotic hand to help take hold of objects and maintain contact with them for robust grasping.
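To make the shared-control idea concrete, the sketch below shows, in Python, how a decoded finger command (user intention) and a simple grasp-automation policy might be merged in one control loop. The decoder, thresholds, and names are illustrative assumptions, not the controller used in the study.

```python
import numpy as np

N_FINGERS = 5
OPEN_INTENT_THRESHOLD = 0.7  # assumed: how "open" the decoded command must be to signal release


def decode_finger_commands(emg_window: np.ndarray) -> np.ndarray:
    """Stand-in for the learned EMG decoder.

    Returns one value per finger in [0, 1] (0 = fully open, 1 = fully closed).
    A real decoder would be a trained machine-learning model, not this toy proxy.
    """
    activity = np.abs(emg_window).mean(axis=0)       # mean rectified EMG per channel
    return np.clip(activity[:N_FINGERS], 0.0, 1.0)   # map the first channels to fingers


def shared_control_step(emg_window, contact_sensed, auto_grasp):
    """One control tick: merge user intention with robotic grasp automation."""
    user_cmd = decode_finger_commands(emg_window)    # user intention

    if auto_grasp and (1.0 - user_cmd).mean() > OPEN_INTENT_THRESHOLD:
        auto_grasp = False    # user tries to open the hand: control goes back to the user
    elif not auto_grasp and contact_sensed:
        auto_grasp = True     # contact with an object: the robot closes and holds the grasp

    finger_cmd = np.ones(N_FINGERS) if auto_grasp else user_cmd
    return finger_cmd, auto_grasp
```

Called at every control cycle with the latest EMG window and a tactile contact flag, a step function of this kind returns the finger command that drives the prosthesis; switching between the two branches is what makes the control shared.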
Transcript
00:00:07 What we're doing is developing a very smart prosthetic hand that allows an amputee to control each finger individually and also benefit from robotic assistance to grasp more easily. Our research is a really good example of shared control, a concept we're very excited about in the field of robotics, which merges user intention, in this case that of an amputee, with robotic automation. Typically, when you hold an object and it starts slipping from your hand, you only have a few milliseconds to react. That's where this hand comes in: with all its tactile sensors, it can react within 400 milliseconds, repositioning the object and restabilizing it before the brain can
00:00:56 actually perceive that it's slipping. For an amputee, it's actually very hard to contract the muscles in many different ways to control all of the ways our fingers move. What we do is place these sensors on the remaining stump, record the muscle activity, and try to interpret what the intended movements are. Because these signals can be a bit noisy, we need a machine learning algorithm that extracts meaningful activity from those muscles and interprets it into movements. These movements are what control each finger of the robotic hand. On top of that, because these predictions of the finger movements may not be 100% accurate, we added robotic automation that allows the hand to automatically start closing around an object once
00:01:44 contact is made. Now, if the user wants to release an object, all he or she has to do is try to open the hand; the robotic controller turns off and all of the control goes back to the user. This is exactly an implementation of shared control, because what we have is user intention, basically the finger movements, and also the robotic automation, which closes the hand around an object and keeps it there if the user wants, so that the grasp is more robust. We did not design the arm or the sensors; we designed the combination of all of those and, primarily, the algorithm behind it: the algorithm that reacts very rapidly, senses that things are slipping, and decides how to place the fingers to restabilize the object. So we are, if
00:02:34 you want, the intelligence behind it, but the hardware is provided by an external party.
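To illustrate the decoding step described in the video, here is a minimal sketch: windowed EMG from the stump is reduced to standard amplitude features and regressed onto per-finger positions with an off-the-shelf model. The feature set, window handling, and the choice of ridge regression are assumptions for illustration; the decoder actually used in the study may differ.

```python
import numpy as np
from sklearn.linear_model import Ridge


def emg_features(window: np.ndarray) -> np.ndarray:
    """Classic per-channel EMG features for one window of shape (samples, channels)."""
    mav = np.abs(window).mean(axis=0)                  # mean absolute value
    wl = np.abs(np.diff(window, axis=0)).sum(axis=0)   # waveform length
    rms = np.sqrt((window ** 2).mean(axis=0))          # root mean square
    return np.concatenate([mav, wl, rms])


def fit_decoder(emg_windows, finger_positions):
    """Fit a regression decoder from EMG feature vectors to per-finger positions.

    emg_windows: list of (samples, channels) arrays recorded from the stump.
    finger_positions: array of shape (n_windows, n_fingers) with the intended movements.
    """
    X = np.stack([emg_features(w) for w in emg_windows])
    model = Ridge(alpha=1.0)   # simple regularized linear model as a stand-in
    model.fit(X, finger_positions)
    return model


# At run time, a new window is decoded and handed to the shared controller:
#   predicted = decoder.predict(emg_features(new_window)[None, :])[0]
```

A stronger nonlinear model could be swapped in without changing the surrounding control loop, which is one reason the decoding and the grasp automation are kept separate.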
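The fast slip reaction mentioned at the start of the video can likewise be sketched as a tight tactile loop: a sudden drop in the tactile reading is treated as slip and answered with a corrective increase in grip force, well inside the 400-millisecond budget. The sensor interface, threshold, and proportional gain below are hypothetical placeholders, not the actual reflex controller.

```python
import time

SLIP_THRESHOLD = 0.05      # assumed: drop in tactile signal that counts as slip
CONTROL_PERIOD_S = 0.01    # 10 ms loop period, far below the ~400 ms reaction window


def slip_reflex_loop(read_tactile, increase_grip, keep_running):
    """Fast reflex loop: detect slip from the tactile sensors and restabilize the grasp.

    read_tactile, increase_grip and keep_running are hypothetical hardware hooks
    (read a scalar contact signal, command extra grip force, check a stop flag).
    """
    previous = read_tactile()
    while keep_running():
        current = read_tactile()
        drop = previous - current              # a falling contact signal suggests the object is slipping
        if drop > SLIP_THRESHOLD:
            increase_grip(amount=5.0 * drop)   # proportional corrective squeeze (illustrative gain)
        previous = current
        time.sleep(CONTROL_PERIOD_S)
```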