NASA robotics engineer Sandeep Yayathi works on Robonaut 2, or R2, a humanoid robot designed and built at Johnson Space Center in Houston. As a robotics engineer, Yayathi is developing a battery-based power system that will allow Robonaut 2, now aboard the International Space Station, to move about freely without having to be plugged into the ISS power grid.

NASA Tech Briefs: What does the R2 look like? What kind of tools are on it? What is it made up of?


Sandeep Yayathi: The Robonaut is a humanoid robot, so it’s a robot that looks very much like a person. It has two arms, similar degrees of freedom, and some complex dexterous hands. The hands are also very similar to what we have on our arms. The goal is for the Robonaut to be able to interface with the same interfaces that the crew uses now, and be able to handle the same tools that they use in orbit. Currently we have an IVA (intra-vehicular activity) version of the Robonaut, so it’s inside the space station mounted to a stanchion that the crew’s been working with. Looking forward to the future, we are currently working on a battery-based power system, as well as a pair of legs. Not so much legs like you and I have, but similar to the arms, with specialized end effectors for grabbing on to fixtures, tracks, and hand rails available on the station. This will set the stage for eventually having a robot that goes EVA [extra-vehicular activity].

NTB: What are its functions? What tasks can it perform?

Yayathi: Right now we are constantly developing new tasks and augmenting his abilities as time goes on. Controls engineers down here on the ground are working on tasks that eventually get tested on orbit. Right now, on orbit, we have a task panel positioned in front of the robot that has a whole host of different switches, buttons, and connectors that are found all around the station. So we’re working on vision recognition and the robot’s ability to manipulate those objects.

In addition, we’ve been identifying various other tasks that the crew has to do that are sort of monotonous and take up time that could be otherwise spent doing science. One of those, more recently, was checking flow out of some of the air filters on the station. That’s something that the crew has to go and do with a meter, and we recently had the robot hold that meter in front of the air filters and pipe that video data back to the ground for analysis. So hopefully, eventually, the robot can be doing some of these tasks, moving around the station without the crew in the loop to basically offload those monotonous tasks that are really well suited for a robot.

NTB: Is the team impressed with the robot’s ability?

Yayathi: I think that the station program and NASA as a whole are pumped about the potential that the robot has in impacting how we do operations on station and how we utilize crew time appropriately. The astronauts only have so much time during the day, and there’s a whole lot of science that we want to be doing up there. The robot is very electromechanically capable. It’s a work in progress, but we do have high hopes for it. People that are interfacing with it on the ops side are excited about the potential it has. Every day, people are coming up with new tasks that they would love Robonaut to take over in the future.

NTB: How else can the Robonaut make it easier for the team?

Yayathi: Our goal is eventually to have a robot that goes EVA and can withstand the harsh environment of space, where it can really make a big impact. Astronauts, when they go EVA, usually have some main goals that require a human in the loop to manage these tasks and more complicated procedures and maneuvers. But in order to get to a work site, there are a lot of things that have to happen first. A lot of times, they have to spend a good portion of their EVA setting up foot restraints and fixtures to get out there, before they can actually do the real task that they set out to accomplish. Having a robot out there could be advantageous: you would be able to have it go and do all these setup tasks ahead of time and really optimize the time while the crew members are out there. We’re blessed with the brains that we have and the abilities that we have as humans, and we want to take advantage of those when we’re out there in the dangerous environment of space.

NTB: What is still challenging for the robot to accomplish? What needs the most tweaking, would you say?

Yayathi: Making it mobile is a big, big deal. When the robot is stationary, we can develop a lot of tasks. But there’s only so much you can do in one spot. We’re headed down the right path from that perspective.

There’s a lot of work that’s still to be done on vision processing and recognition. A lot of challenges with robotics, in general, today are with software — being able to take advantage of all the power that you have in front of you. That’s something that’s just going to continually evolve with time. We brought in quite a few researchers and PhDs to work on that end of development. We’re partnering up heavily with academia and pulling in the latest research, and trading off with researchers. We’re able to take advantage of their labs, their students, and their expertise, and give them an opportunity to work with an expensive, highly capable robot, which not every academic lab has the funding to support. This is a good symbiotic type of relationship that will benefit NASA and the world and researchers alike.

NTB: You’re developing a battery-based power system for the Robonaut. Can you take us through that process?

Yayathi: There’s a lot that goes into that power system: various DC/DC converters, [components] that provide power to the robot, interfacing to higher-level control on the robot, and power sequencing. The core of that is the actual battery itself, and that’s where we started. We knew we’d need a mobile power source if we were going to have the robot running around, and it couldn’t be tethered.

We explored lithium ion. It’s pretty much the premier in secondary cell technology. When I say secondary cell, I mean a battery cell that can be recharged. We did a survey of all the lithium ion technology that was out there when we started, and we identified a cell that we liked that had a really good energy density and also high cycle life. We took that cell and basically worked on optimizing the packaging for it. You’ll find this today with electric cars and various other devices: it’s all about how much energy can you pack into how small a space with how little weight. We’re faced with the same challenge. We want to be able to get the most out of our robot, without having too big of a battery, and without adding too much weight. So packaging is a big deal. We spent a lot of time developing a kind of modular cartridge design that we’d be able to utilize in Robonaut as well as other robotic mobility platforms that would be reconfigurable. We’re trying to have a battery solution that we’ll be able to use in multiple robots and various voltage and current capacities.
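The packaging tradeoff he describes, how much energy fits in how little mass and volume, can be illustrated with a rough sizing sketch. The cell figures below (3.6 V nominal, 3.0 Ah, 46 g) are generic 18650-class numbers chosen for illustration, not the actual cells NASA selected:

```python
# Hypothetical sizing sketch for a modular Li-ion battery cartridge.
# Cell figures are illustrative (roughly 18650-class), not Robonaut's cells.

def pack_stats(series, parallel, cell_v=3.6, cell_ah=3.0, cell_mass_kg=0.046):
    """Return (nominal voltage, energy in Wh, mass in kg) for an SxP pack."""
    voltage = series * cell_v            # cells in series add voltage
    capacity_ah = parallel * cell_ah     # cells in parallel add capacity
    energy_wh = voltage * capacity_ah
    mass_kg = series * parallel * cell_mass_kg
    return voltage, energy_wh, mass_kg

# Example: a 7S2P cartridge
v, wh, kg = pack_stats(series=7, parallel=2)
print(f"{v:.1f} V nominal, {wh:.0f} Wh, {kg:.2f} kg, {wh/kg:.0f} Wh/kg")
```

Varying `series` and `parallel` is what makes a modular cartridge reconfigurable for robots with different voltage and current needs, as described above.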

NTB: Why is a battery-based system so important?

Yayathi: The number one reason for a battery-based system is to allow mobility. It’ll allow us to freely move about IVA, freely move out EVA eventually, and to any location, and run for a long enough duration to accomplish major mission objectives before having to go recharge. Managing tethers is difficult. Even for people it’s difficult, and the astronauts have to do that for safety. Now managing a tether that’s connected to a power source is even more difficult because you need to have large enough cable bundles to provide all your power. To do that and move freely all around station would be pretty difficult.

NTB: Are there any other technical challenges with the battery-based system? You mentioned packaging.

Yayathi: With lithium ion technology, one of the other major design efforts for us, in addition to packaging, is actually monitoring and balancing cells. We have to be constantly monitoring temperatures, monitoring every cell voltage, and making sure the pack is in balance so when we charge it, we’re not accidentally charging one of the cells up too high. That involves a lot of specialized electronics. We’ve been developing a modular battery management system that can go along with these battery cartridges and change with the size of the robot. It can handle all of that, maintain the battery and operate it safely, independent of the rest of the robot.
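The monitoring-and-balancing loop described above can be sketched in a few lines. All of the thresholds here (4.2 V charge limit, 3.0 V cutoff, 45 °C, 50 mV balance band) are generic Li-ion figures used for illustration, not the actual limits in Robonaut's battery management system:

```python
# Minimal sketch of per-cell monitoring and passive balancing logic.
# Thresholds are illustrative Li-ion figures, not NASA's actual BMS limits.

CELL_V_MAX = 4.2     # typical Li-ion charge limit, volts
CELL_V_MIN = 3.0     # typical discharge cutoff, volts
TEMP_MAX_C = 45.0    # illustrative charge temperature limit
BALANCE_BAND = 0.05  # bleed cells more than 50 mV above the lowest cell

def check_pack(cell_volts, cell_temps):
    """Return (safe_to_charge, cells_to_bleed) for one monitoring cycle."""
    safe = all(CELL_V_MIN <= v <= CELL_V_MAX for v in cell_volts) and \
           all(t <= TEMP_MAX_C for t in cell_temps)
    lowest = min(cell_volts)
    # Passive balancing: flag cells sitting well above the lowest cell,
    # so charging doesn't push a high cell past its limit.
    bleed = [i for i, v in enumerate(cell_volts) if v - lowest > BALANCE_BAND]
    return safe, bleed

safe, bleed = check_pack([3.95, 4.05, 3.94, 3.96], [28.0, 29.5, 28.4, 28.1])
print(safe, bleed)  # here, charging is allowed and cell 1 is flagged
```

A real battery management system runs logic like this continuously in dedicated hardware, independent of the robot's main controller, which matches the "maintain the battery and operate it safely, independent of the rest of the robot" goal described above.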

Safety is a big concern with large batteries in general. That’s something we’re taking very seriously and has been a challenge. There’s a lot that goes into just building a battery, especially a high-power battery. We see lithium ion cells everywhere today, but something that a lot of people don’t realize is every time you have a lithium ion cell in your consumer electronic device, whether it’s your phone or laptop, there’s a [device] associated with that that makes sure that those cells are within limits. It’s a great energy source, but it needs to be treated with care.

NTB: Is the battery-based system what you’re working on currently? What is a typical day for you?

Yayathi: We started out focused on the batteries and as time went on, I’ve actually been doing more with the actual power distribution electronics: interfacing with multiple other engineers and making sure that we’re doing [parallel] developments and meeting our deadlines. Firmware and software. I actually do a lot of just electrical hardware design: making circuit boards that do all this, working on defining the whole system architecture, making sure that everything goes together right at the end.

It’s challenging, it’s interesting, and it’s nice to be able to come into work and basically start with a system design and actually be designing circuit boards, having them fabbed and brought back in-house, and testing them myself. You get to go through the entire engineering process here. It’s one of the unique things about our lab that’s really nice. Also, we work very closely between the electrical, software, and mechanical teams. We’re all sitting in the same room, and that’s how we achieve the type of packaging that we get in our robots.

NTB: Do you have any other projects in the pipeline, related to R2?

Yayathi: I’m mostly focused on the battery-pack system at the moment. The legs development is also happening, parallel to this, so you’ll see those showing up before too long. We have a prototype already. That’s pretty much absorbing a good chunk of our time at the moment, at least as far as R2 is concerned. Once we move on from that, we’ll be probably transitioning more to focus on EVA development. There are design challenges specific to EVA that need to be tackled. But it’s important to tackle your degrees of complexity in stages. We want to make sure we have a functioning robot that does the things we want it to do, and then take those lessons learned, in addition to the challenges of thermal control, etc. that are required for an EVA robot, and wrap them all into that unit.

NTB: How long have you been working with R2?

Yayathi: I started as a co-op back in early 2006, pretty much the beginning of the Robonaut project. I was fortunate enough to be here from the beginning, right when we were designing that first limb, and it came in stages. We designed and built the arm, and we had an arm just running by itself and a hand. That eventually graduated to doing a whole power system, the head, and the brainstem: the actual main computer that does most of our controls work. We’ve evolved quite a bit since the beginning, definitely with getting the buy-in from station, and being able to upgrade and fly our robot.

NTB: How often is Robonaut tested, and how does that work?

Yayathi: There are different stages. During development, we try to segment and test things in pieces before the robot goes together. That was one of my first major tasks as well during the co-op. I was building a simulator of all the power electronics in the robot body, a hardware simulator (not a software one), so we could plug in a joint or plug in a limb, and do a full check out on it. That involves plugging it in, hooking it up to our GUI, and our software control, and making sure it can do all the motions correctly. Then once we have the robot fully assembled, a lot of the sensing [data] is all piped back in through the GUI so we can basically see online if things are going wrong with the robot.

NTB: Are there problems with coordination when you’re adding new functions?

Yayathi: When you have a robot or any other system that’s already flown and is up there, we have to integrate these new features with a robot that exists on station. We’re concerned with how we can make this easy to put together so that the crew can assemble these new components without having to be too invasive. There are areas where, sure, if we built a new robot, we might be able to make something more integrated. We have to make sure that we interface with what we have. So that is a challenge sometimes, too, to make sure we have the right workarounds to get these [components] to connect up. It is highly advantageous for us to be able to do that. Since we have a robot up there, we have to utilize the resources that we have. Hardware is not something that you can just replace on the fly. We do spend a lot of time making sure that these systems are going to integrate.

We have a full cert unit on the ground that is identical to the flight Robonaut 2 that’s on ISS. We’re actually in the process of building a full hardware simulator of the robot so that we can plug in and test all these new components that we build. Eventually, once we’re confident that everything is working correctly, we can then take all the hardware and actually integrate it with that cert unit. We can do all our checkouts with that on the ground, as well as even the assembly procedures. Before anything ever flies, all of that checkout will be done ahead of time, so that when the crew assembles it in orbit, it’ll behave just as we expect.

NTB: How much of commercial industry had a hand in the makeup of the R2? Can you describe the types of partnerships with other industries?

Yayathi: We partnered with General Motors Corporation during the design and build of R2. A handful of their engineers embedded themselves in our lab to work alongside the NASA engineers in order to learn from our experience, as well as impart their knowledge of designing for robustness and reliability. Having GM on the team definitely influenced the design of the robot and the types of things we focused on, including our graphical control interface. We also utilized some of their resources and partner labs to develop custom sensors that are now inside the robot. The Space Act Agreement allows us to form commercial partnerships that benefit both NASA and our commercial partners.

NTB: What’s next for R2? When you look ahead to ten years or so, what do you see as some applications for R2 and similar technologies?

Yayathi: There are just so many things that you have to do when you’re EVA in space, that don’t necessarily require a person in the loop. There are a lot of things that do require a person in the loop, and you want to basically minimize the time people spend doing tasks that can be automated by a robot. Let the people do the things that people are good at. In the future, we’re talking about targeting an asteroid and other planetary surfaces, and that’s a great place for a robot like this. Even if it’s not a Robonaut per se and we take 90 percent of the technology in Robonaut, it can still be a robot that goes out and explores dangerous environments, or unknown environments, before a crew member is sent out there.

Human life is precious. If we lose a robot, we lose some money and some parts, but it’s much less valuable than a human being. If we’re exploring a surface and we want to go through some uncharted areas and be able to do a fairly high level of exploration, we can send a robot out there, and then determine that it’s safe and interesting enough for a person to go in after. The robot can also serve as an assistant out in the field as well.

NTB: What’s your favorite part of the job?

Yayathi: My favorite part of this job is honestly the hands-on part. I really like to work with my hands. I like to have tangible hardware. We’re lucky enough to be a part of the entire engineering process and go from design to physical [hardware] sitting in front of you, and it’s pretty rewarding when you spend all this time, thinking and designing things, working with a large team trying to make sure everything’s going to go together just right, and you actually get to have the hardware in your hand and put it together and see it work, and for us, luckily enough to see it fly.
