Terry Fong

The NASA Ames Intelligent Robotics Group (IRG) is dedicated to enabling humans and robots to explore and learn about extreme environments, remote locations, and uncharted worlds. IRG conducts applied research in a wide range of areas with an emphasis on robotics systems science and field testing. IRG expertise includes applied computer vision, human-robot interaction, mobile manipulation, interactive 3D visualization, and robot software architecture. Terry Fong is the Group Lead for the IRG.

NASA Tech Briefs: What are some of the projects the IRG is working on now?

Terry Fong: We are working on a broad range of things with the common theme of trying to better support exploration. In particular, a primary focus of our work right now is supporting exploration at specific lunar sites. As an example, this summer we conducted a field test in Haughton Crater in Canada where we used two robots to do systematic site surveys. This is very different from the kinds of operations you've seen with the Mars Exploration Rovers in that we are interested in learning as much as possible about a bounded area.

During the two-and-a-half weeks we were operating up at Haughton, our two robots drove 45 kilometers, which is three times the distance that the Mars rovers have traveled in the past three years. We're trying to characterize a bounded area or specific site, which means you need to cover it very densely and you need to do it systematically. We do terrain and sub-surface mapping, which requires driving a lot in an area. That percolates throughout the whole system in terms of how you deal with commanding the robots, what you use for visualizing what they've done, monitoring their state, the onboard autonomy, all kinds of things. There's a fundamental difference from past NASA planetary robotics work.
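Fong's point about dense, systematic coverage can be made concrete with a simple coverage-pattern calculation. Below is a minimal sketch of a lawnmower-style survey plan; the site dimensions match the "Drill Hill" figure quoted later in the interview, but the 10-meter track spacing and the function itself are illustrative assumptions, not IRG's actual planner.

```python
# Sketch of a boustrophedon ("lawnmower") coverage pattern of the kind
# used for systematic site surveys. Track spacing is an assumed value.

def coverage_waypoints(width_m, height_m, track_spacing_m):
    """Generate back-and-forth waypoints that densely cover a
    width_m x height_m rectangular site, one pass per survey track."""
    waypoints = []
    y, left_to_right = 0.0, True
    while y <= height_m:
        row = [(0.0, y), (width_m, y)]
        if not left_to_right:
            row.reverse()
        waypoints.extend(row)
        left_to_right = not left_to_right
        y += track_spacing_m
    return waypoints

# Example: a 700 x 700 meter site surveyed on 10-meter tracks.
plan = coverage_waypoints(700.0, 700.0, 10.0)
n_tracks = len(plan) // 2
print(f"{n_tracks} tracks, about {n_tracks * 700.0 / 1000:.0f} km of driving")
```

Even this toy plan shows why survey distances add up quickly: covering one 700 x 700 meter site on 10-meter tracks already requires roughly 50 kilometers of driving.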

NTB: The IRG has its own research facility that includes a Marscape outdoor rover test facility and a Moonscape indoor rover test facility. Why did you have to go to Canada to conduct these tests? What is it about that site that made it so desirable for the type of testing you were doing?

Fong: We do field tests all over the United States and in this case, Canada, because each analog site has a different value. There's no way you can have a perfect reproduction of the Moon, except by going to the Moon. So, depending on what you're testing, you would need to go test at different places.

It's true we have an outdoor test site here, the Marscape, but in reality it's pretty small. For example, "Drill Hill," which is one of the analog lunar sites we mapped at Haughton Crater, measures 700 x 700 meters. There's just no way to have anything of that scale at a NASA site, especially if you care about other properties: that it's non-vegetated, that it has the same surface composition in terms of rocks and minerals as the Moon, or at least similar physical characteristics such as grain size, lighting, and other kinds of things. There are all kinds of different characteristics that you care about duplicating if you're trying to test something. You just cannot do it all in a laboratory.

NTB: How do you locate these different sites around the country that are going to provide the type of results that are parallel to what you expect to find on other planets?

Fong: It depends on what you're trying to do. If you go back to the Apollo program, they actually took astronauts to many different sites in the Southwest desert region of the United States, and also to Hawaii, because they were interested in doing field geology. They were looking at areas that had similar basalts - things that were very similar to what they expected the astronauts to find on the Moon.

You think of the same sorts of things if you're trying to test robots. In particular, if you're going to test robots for doing lunar operations, the first thing you care about is sites that are non-vegetated - sites that have a mixture of basalts and perhaps sand as a simulant for regolith. There are a number of suitable places in the American Southwest, in places like Arizona, New Mexico, and California, and there are some other places in Utah and Colorado. Then there are places such as Haughton Crater in Canada.

Haughton is very interesting because it is a 20-kilometer-diameter impact crater, which is exactly the same scale as Shackleton Crater at the south pole of the Moon. It is completely non-vegetated. During the summer it has extremely strong UV, and it has continuous sunlight, so it's really good for a number of reasons as a lunar analog. For us it was very useful to test there because during the past 10 years, there's been a project called the Haughton Mars Project – it is led by Pascal Lee and Chris McKay here at Ames – and they have established a small camp up there where every summer they do a series of experiments. Those experiments range from how you would actually set up and run a greenhouse, to things that involve people walking around in suits to understand the physiological impacts of, say, doing a 10-kilometer walk back from an EVA, to our robot testing.

So for us it's good because there was some infrastructure that was up there that we could take advantage of. It's good from the standpoint that it's a really nice lunar analog in terms of an impact structure of the same scale as Shackleton which, of course, is one of the primary targets for NASA as we go back to the Moon.

NTB: What is meant by the term "Intelligent Robotics"? Don't most robots have a certain amount of intelligence programmed into them?

Fong: To some extent they do, although robotics these days – and we're taking a very general, broad view of the field of robotics – covers everything from a Roomba vacuum cleaner, to assembly-line equipment, to planetary robotics. We make the distinction that we're interested in things that are intelligent because they're self-sufficient - they can figure out where obstacles are in the world so they don't run into things, and they know how to operate even when they're disconnected and out of communications range. Those are the kinds of things that assembly-line robots don't do. Those are the kinds of things a Roomba vacuum cleaner, which has a very limited view of the world until it bumps into something, does not do. So we use the term "intelligent robotics" to really focus on robots that are much more self-sufficient.

Again, I refer to the field test that we did this summer. Our robots operated for probably 200 hours total. About 10 percent of that time they operated completely outside of communications range. We let them go because we knew that they would come back. They were smart enough to know where they were; they were smart enough to avoid obstacles while they were carrying out their survey work. So basically we let them go outside of our wireless communications range, let them finish their work, and then they would come back and we would talk to them again.
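The behavior Fong describes - keep surveying while out of range, then return and report - can be sketched as a simple onboard executive loop. The Robot interface below (next_survey_goal, drive_to, comms_ok, return_to_base) is entirely hypothetical, invented for illustration; it is not IRG's actual flight software.

```python
# Highly simplified sketch of an onboard executive that continues survey
# work outside communications range and reports back afterward. All of
# the Robot methods used here are hypothetical placeholders.

def survey_executive(robot):
    while (goal := robot.next_survey_goal()) is not None:
        # Localization and obstacle avoidance run onboard, so the robot
        # can navigate safely with or without a radio link.
        robot.drive_to(goal, avoid_obstacles=True)
        if robot.comms_ok():
            robot.send_status()   # telemetry is opportunistic, not required
    # Survey complete: drive back into wireless range and report.
    robot.return_to_base()
    robot.send_results()
```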

NTB: One of the projects you're currently working on, the Peer-to-Peer Human-Robot Interaction Project, is developing techniques to make it easier for humans to communicate with robots.

Fong: The Peer-to-Peer Human-Robot Interaction Project was a project that we actually finished almost two years ago. That project was a collaboration with NASA Johnson, the Naval Research Laboratory, the National Institute of Standards and Technology, Carnegie Mellon University, and MIT. It was a good group of people to work with on a topic that is fundamentally "how do you allow humans and robots to better communicate what they're doing and to ask for help from each other?"

One of the things you see these days is that NASA, and actually most organizations that use robots, typically view those robots purely as tools. That works great, except that it requires the human to always be aware of what the robot is doing - constantly monitoring, looking for problems, and trying to figure out when the robot is having a problem. That means the human can't do anything else; the human is stuck taking care of the robot. What this project was looking at was, "how do you make robots and humans more equal, in the sense of being partners as they're working on a task?"

As an example, if you and I are going to go out and build a fence, I'm not going to be standing there saying, "Okay, you need to pick up that beam and put it on there. Take the screwdriver and screw this on." We're going to talk at a higher level. We'll say, "Okay, you go to your end, you work on that side of the fence, and I'm going to work on my side, and when you have a problem, call me and tell me what the problem is and what sort of help you need, and then I can help." We're trying to find ways so that robots and humans can work more in that manner, so that robots will be able to ask humans for their advice when the robot determines that it has a problem.

NTB: When you talk about dialog between robots and humans, part of that involves speech recognition and speech synthesis technology. How sophisticated has that technology become in the last few years?

Fong: It's made huge advances. In this project, we really were trying to not only use speech recognition and speech synthesis, but we were focusing on "spatial dialogue." That's the idea that if a human speaks to a robot and says, "Hey, shine a light to my left," the robot not only knows what the human is talking about with a light, but it can actually figure out what "my left" means. So there has to be some spatial reasoning based on the robot understanding where it is and the relationship between the two so that it can point the light at the right place. When you say, "my left," what does "my left" mean? It depends. Are you facing me? Are you next to me? Are you behind me? Being able to reason in terms of spatial language, spatial location, and orientation is something that we think is certainly important for tasks that NASA is going to want to conduct on the lunar surface.
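The geometry behind "my left" reduces to a frame transformation: given the speaker's tracked position and heading, an offset expressed in the speaker's body frame is rotated into the robot's world frame. The sketch below assumes a 2D world with headings measured counterclockwise from the world x-axis; the function name and the 2-meter offset are illustrative assumptions.

```python
import math

# Sketch of resolving the spatial phrase "my left": rotate a body-frame
# offset into the world frame using the speaker's pose. Conventions
# (x forward, heading CCW in radians) are assumptions for illustration.

def resolve_my_left(speaker_xy, speaker_heading_rad, offset_m=2.0):
    """Return the world-frame point offset_m to the speaker's left."""
    sx, sy = speaker_xy
    left = speaker_heading_rad + math.pi / 2   # "left" is +90 degrees
    return (sx + offset_m * math.cos(left), sy + offset_m * math.sin(left))

# A speaker at the origin facing along +x: their left is along +y.
print(resolve_my_left((0.0, 0.0), 0.0))   # approximately (0.0, 2.0)
```

The hard part in practice is everything upstream of this arithmetic: recognizing the utterance, tracking the speaker's pose, and deciding whose frame of reference the words refer to.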

NTB: What types of machine vision systems are you designing into these robotic systems?

Fong: The Intelligent Robotics Group does a broad range of machine vision. We do vision that's onboard for looking at the world and determining where there are obstacles so the robot can avoid them. We also do vision from off-board, whether that's aerial or orbital imagery, for doing 3D modeling of the terrain so you can determine areas where you don't want to send the robot because it's too steep or too rough, as well as being able to reconstruct and build models you can use for things like lighting simulation and a wide variety of other purposes.
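The off-board terrain analysis Fong mentions can be illustrated with a few lines of array math: derive slope from a digital elevation model (DEM) and flag cells that are too steep to drive. The grid resolution and the 20-degree slope limit below are illustrative assumptions, not mission values.

```python
import numpy as np

# Sketch of terrain analysis from a digital elevation model: compute
# per-cell slope and mask cells too steep for a rover to traverse.

def too_steep(dem, cell_size_m, max_slope_deg=20.0):
    """Return a boolean mask of DEM cells steeper than max_slope_deg."""
    dz_dy, dz_dx = np.gradient(dem, cell_size_m)      # elevation gradients
    slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    return slope_deg > max_slope_deg

dem = np.random.rand(100, 100) * 5.0    # stand-in for an orbital-derived DEM
mask = too_steep(dem, cell_size_m=1.0)
print(f"{mask.mean():.0%} of cells flagged as too steep to drive")
```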

Ames has actually had a long history of developing computer vision tools. We recently released a software package called Vision Workbench, which is an open-source tool for doing computer vision, and it incorporates a broad range of things including stereo vision and texture matching. We use it for a variety of applications. One is the Ames Stereo Pipeline, which is used for building 3D models from orbital imagery. Another is a new piece of software called the Terrain Pipeline, which produces digital elevation models over a very wide range of scales. So if you want to produce a model of a site that's going to be used for a number of purposes, such as robot planning for science or illumination modeling, you start off with orbital images, process those into terrain models, and then provide essentially a server that can deliver that data to different users and clients.
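At the heart of any stereo pipeline of this kind is triangulation: converting a per-pixel disparity map into depth. The sketch below shows the textbook pinhole-stereo relation; it is not Vision Workbench's API, and the focal length and baseline values are illustrative assumptions.

```python
import numpy as np

# Sketch of the core of stereo reconstruction: depth from disparity via
# the pinhole-stereo relation Z = f * B / d. Values are illustrative.

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Per-pixel depth in meters from a disparity map in pixels."""
    d = np.where(disparity_px > 0, disparity_px, np.nan)   # mask invalid pixels
    return focal_px * baseline_m / d

disparity = np.full((512, 512), 8.0)    # stand-in disparity map
depth = disparity_to_depth(disparity, focal_px=1000.0, baseline_m=0.3)
print(depth[0, 0])   # 1000 * 0.3 / 8 = 37.5 meters
```

Downstream, such depth maps are merged into terrain models and, as Fong describes, served to planning, visualization, and illumination-modeling clients.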

NTB: Do you see any potential commercial applications for the robotic technology your group is developing?

Fong: Sure. I think that systematic survey – robotic survey – clearly is something that would be of interest to, say, the oil and gas industry. Our work right now is focused primarily on wheeled mobile robots and planetary rover type robots, but there's no reason the software couldn't be used by underwater robots if you're going to try to map and prospect for resources. So clearly, I think, it would be useful from a commercial standpoint.

I think the computer vision tools that we have been developing would also be of interest to organizations that care about doing 3D modeling, whether they're scientific organizations or anyone who needs to look at a site to better understand where they want to operate. The oil and gas industry is a perfect example, and perhaps construction. People who care about long-term environmental change might also be interested in being able to look at how the surface of an area changes over time.

For more information, contact Terry Fong at NASA Ames Research Center.