RoboSimian is a statically stable quadrupedal robot capable of both dexterous manipulation and versatile mobility in difficult terrain. It was built to compete in the Defense Advanced Research Projects Agency (DARPA) Robotics Challenge, a competitive effort to develop hardware and software for mobile manipulation platforms that assist humans in responding to natural and man-made disasters.

A completely new software architecture and framework were developed to communicate with and control RoboSimian. The software system was designed for semi-autonomous operation of the robot, enabling low-bandwidth, high-latency control from a standard laptop. Because the limbs are used for both mobility and manipulation, a single, unified mobile manipulation planner generates all autonomous behaviors, including walking, sitting, climbing, grasping, and manipulating. The remote operator interface is optimized to designate, parameterize, sequence, and preview behaviors, which are then executed by the robot.
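The designate-parameterize-sequence-preview-execute workflow described above can be sketched as a simple behavior queue. This is a minimal illustration only; the class and method names are assumptions, not RoboSimian's actual operator-interface API.

```python
from dataclasses import dataclass, field


@dataclass
class Behavior:
    # Hypothetical representation of one parameterized behavior request.
    name: str
    params: dict = field(default_factory=dict)


class BehaviorQueue:
    """Sketch of the operator workflow: designate and parameterize
    behaviors, sequence them, preview the sequence, then execute it.
    All names here are illustrative assumptions."""

    def __init__(self):
        self._sequence = []

    def designate(self, name, **params):
        # Add a parameterized behavior to the planned sequence.
        self._sequence.append(Behavior(name, params))
        return self  # allow chaining to build a sequence

    def preview(self):
        # The real interface shows a kinematic preview on the robot
        # model; here we just render the planned sequence as text.
        return [f"{i}: {b.name} {b.params}"
                for i, b in enumerate(self._sequence)]

    def execute(self, robot_send):
        # Once approved, each behavior is sent to the robot in order.
        for b in self._sequence:
            robot_send(b)


sent = []
q = (BehaviorQueue()
     .designate("walk", distance_m=2.0)
     .designate("grasp", target="valve"))
q.execute(sent.append)
```

The key design point conveyed by the article is that the operator sequences and previews whole behaviors rather than teleoperating joints, which is what makes low-bandwidth, high-latency operation workable.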

RoboSimian's software is model-based and data-driven. It consists of multiple processes and modules running simultaneously across two computers inside RoboSimian, as well as one remote operator machine. The two RoboSimian computers communicate over a Gigabit Ethernet link. Each RoboSimian computer, the high-brain and low-brain machines, runs Ubuntu 12.04 LTS on an Intel quad-core i7 with 16 GB of memory. The low-brain machine runs a low-latency (soft real-time) kernel and the EtherLab open-source real-time kernel module, which hosts low-level processes such as the limb and control processes. The high-brain machine is responsible for higher-level processes that require high throughput rather than real-time execution.

Processes communicate with each other via shared memory and/or inter-process communication (IPC) over TCP or UDP. With IPC, each module subscribes to other modules' messages, which are sent asynchronously. Messages and data that can be sent as a constant stream are sent via UDP, while one-off messages or messages requiring receipt confirmation are sent via TCP. Data products that are both large and streaming, such as imagery and stereo data, are sent via shared memory.
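The transport policy above can be summarized in a few lines of code. The function below is an illustrative sketch, not part of the RoboSimian software; the size threshold for "large" data products is an assumption.

```python
def select_transport(size_bytes, is_stream, needs_ack,
                     large_threshold=1_000_000):
    """Pick a transport following the policy described above.

    The 1 MB threshold is an assumed cutoff for 'large' data
    products; the article does not specify one.
    """
    if is_stream and size_bytes >= large_threshold:
        # Large, streaming products, e.g. imagery and stereo range data.
        return "shared-memory"
    if needs_ack or not is_stream:
        # One-off messages, or messages requiring receipt confirmation.
        return "tcp"
    # Constant streams of small messages, e.g. telemetry.
    return "udp"
```

For example, a stereo range image would map to shared memory, a one-shot behavior command requiring confirmation to TCP, and a joint-state telemetry stream to UDP.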

The mechanics modeling (model) module provides the infrastructure to create, modify, and query a model of the world and robot, and is used in almost all of RoboSimian's modules. The camera (cam) module acquires imagery from the stereo cameras, computes stereo range images, and performs visual odometry. The perception (prcp) module takes the image data and produces map and scene information. The plan module produces feasible mobility and manipulation plans. The control (ctrl) module executes the generated plans and behaviors by commanding the limbs through the limb modules. Lastly, the remote module is the remote user interface that displays robot data and commands the robot.
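Because the modules communicate by subscribing to each other's asynchronous messages, the cam-to-prcp-to-plan data flow can be sketched with a tiny publish/subscribe bus. The bus, topic names, and message contents below are illustrative assumptions, not the actual RoboSimian interfaces.

```python
from collections import defaultdict


class Bus:
    """Minimal publish/subscribe bus sketching how modules such as
    cam, prcp, and plan could be wired together. Topic and message
    names are assumptions for illustration only."""

    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        # A module registers interest in another module's messages.
        self._subs[topic].append(handler)

    def publish(self, topic, msg):
        # Deliver the message to every subscriber of the topic.
        for handler in self._subs[topic]:
            handler(msg)


bus = Bus()
maps = []

# prcp subscribes to cam's stereo output and publishes map data.
bus.subscribe("cam/stereo",
              lambda rng: bus.publish("prcp/map", {"cells": len(rng)}))
# plan (stood in for here by a list) subscribes to prcp's map output.
bus.subscribe("prcp/map", maps.append)

# cam publishes a (toy) stereo range image; it flows through to maps.
bus.publish("cam/stereo", [0.1, 0.2, 0.3])
```

In the real system delivery is asynchronous over UDP, TCP, or shared memory as described earlier; the synchronous callbacks here only illustrate the subscription topology.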

This work was done by Paul Hebert, Jeremy C. Ma, James W. Borders, Alper O. Aydemir, and Jason I. Reid of Caltech; and Max Bajracharya and Nicolas Hudson of Google for NASA's Jet Propulsion Laboratory.

This software is available for commercial licensing. Please contact Dan Broderick. Refer to NPO-49694.



This article first appeared in the March 2016 issue of NASA Tech Briefs Magazine.
