
A tiny, soft, flexible robot that can crawl through earthquake rubble to find trapped victims or travel inside the human body to deliver medicine may seem like science fiction, but an international team led by researchers at Penn State is pioneering such adaptable robots by integrating flexible electronics with magnetically controlled motion.
Soft robots, unlike traditional rigid robots, are made from flexible materials that mimic the movement of living organisms. This flexibility makes them ideal for navigating tight spaces, such as debris in a disaster zone or the intricate pathways of the human body. However, integrating sensors and electronics into these flexible systems has posed a significant challenge, according to Huanyu “Larry” Cheng, James L. Henderson, Jr. Memorial Associate Professor of Engineering Science and Mechanics at Penn State.
A principal factor in making these robots smarter is the integration of flexible electronics, which enables their key sensing capabilities without compromising their flexibility.
Cheng and his team shot videos of the robots in action, capturing their dynamic behavior as they crawl and roll into a ball to move along a simple course. The robots move using hard magnetic materials embedded in their flexible structure, allowing them to respond predictably to an external magnetic field. By adjusting the field’s strength and direction, researchers can control the robots’ movements, such as bending, twisting or crawling, without onboard power or physical connections such as wires.
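The actuation principle described above can be sketched with a simple physical model: a hard-magnetic element with dipole moment m placed in an external field B experiences a torque τ = m × B, which tends to rotate the element (and the soft body it is embedded in) into alignment with the field. The short Python sketch below is an illustration of that textbook relationship only, not the team’s actual code:

```python
# Illustrative model: torque on a hard-magnetic element in an external field.
# tau = m x B, where m is the dipole moment (A*m^2) and B the field (T).

def cross(a, b):
    """Cross product of two 3D vectors given as (x, y, z) tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def magnetic_torque(m, B):
    """Torque (N*m) acting on a magnetic dipole m in field B."""
    return cross(m, B)

# A moment along +x in a field along +y yields a torque about +z,
# rotating the element toward alignment with the field.
print(magnetic_torque((1.0, 0.0, 0.0), (0.0, 0.5, 0.0)))  # (0.0, 0.0, 0.5)
```

Steering the field’s direction changes the torque axis, which is how bending or twisting can be commanded remotely, with no onboard power.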
Here is an exclusive Tech Briefs interview with Cheng, edited for length and clarity.
Tech Briefs: You’re quoted in the article as saying the biggest technical challenge really was to make it smart. How did the team decide to distribute the electronic components in a way that preserves the robot’s flexibility while maintaining robust performance? How did you settle on that?
Cheng: Traditional electronics are kind of rigid. So, when we try to integrate sensors into the soft robots, it restricts their motion. That's why we wanted to introduce soft sensors. But if you cluster everything in one place, it is still going to increase the rigidity and restrict the motion. That's why we came up with the idea to distribute the sensors in different places, to pose minimal restriction on the motion.
And, of course, soft robots can come in different forms. In this particular work, they are modulated by a magnetic field, so we had to make sure the sensors are not going to change the magnetic properties of this integrated system. So, we had to look at how the sensors in the soft robots respond to the magnetic field and how they affect the motion of the magnetically controlled robots.
Also, we wanted to minimize the mechanical restraints, as you mentioned, so the robot moves much as it would without electronics on the surface. So, it's really from these two sides: one is to have minimal interaction or interference between the electronics and the magnetic control; the other is to pose minimal restriction on the mechanical motion.
Tech Briefs: What was the catalyst for this project? How'd the work come about?
Cheng: We have had a long interest in making these soft robots smart. This is aligned with our long-term interest in integrating sensors on the soft surfaces of the human body. We have been mostly working on soft electronics on the skin surface to capture vital signs and predict disease conditions. Soft robots are a very similar class of objects. Of course, we're not trying to detect biomarkers with the soft robots, but we would like to detect the environment they are exposed to, in a way similar to what we do for humans. We would like to know when a human is in a dangerous situation, in extreme heat or in a toxic-gas environment, and it's quite similar for soft robots on a rescue mission. We want to make sure that the soft robots have an impact without the environment causing danger or damage to them. They're quite different from rigid robots: soft robots move more easily but can also be damaged more easily.
Tech Briefs: Do you have any set plans for further research, work, etc.? And, if not, what are your next steps?
Cheng: So, there are different directions for us, and, as you can see, we integrated a few different system modalities, but those are certainly not all the things people would like to have. Think about humans: we have touch, vision, smell, sound. But we are certainly missing a lot of other modalities that are very useful. So, we're working on a gas sensor and on transducers based on imaging and sensing modalities, and we also don't have a drug-delivery module.
So, it's really of high interest to further integrate all these very essential modules for different target applications. We don't have a particular one we would like to pursue at this moment, but we want to redesign this as a platform technology to enable further research, not only in our research group but also for others in the field. For researchers working on thrombosis, for example, we want to know if there's a particular module that can release the drug and clear the blood vessel.
Or for diabetes patients, we want to deliver insulin not only to the interstitial fluids just beneath the surface, but maybe to a target location. From what we understand, if you inject the insulin into one target location, the concentration at that injection site is going to be different from another location. If we want to mimic the natural process, we want to deliver the insulin to the target location and then control that process to create a concentration difference, as the human body does through blood vessels. It is really about the need.
When we get to a different need, we probably want to think about how to deliver the drug, how to control that process, and how to trigger it on demand. So, onboard electronics that can receive information about the surroundings and then further automate the process, triggering drug delivery on time and at different dosages, will be very essential. That means it's also of high interest to integrate a control module that is somewhat automated, and maybe we can leverage artificial intelligence. Then you can have a controller that takes in all this complex information, processes it onboard, and triggers the delivery on time as well.