A small mobile robot equipped with a stereoscopic machine-vision system and two manipulator arms of limited degrees of freedom has been given the ability to perform moderately dexterous manipulation autonomously, under the control of an onboard computer. The approach taken in this development has been to formulate vision-based control software that uses the mobility of the vehicle to compensate for the limited dexterity of the manipulator arms. The goal is selected visually, but it is tracked onboard using information about its shape; in particular, the target is assumed to be a local elevation maximum (i.e., the highest point within a small patch of terrain).
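As a concrete illustration of this shape-based target model, the sketch below re-finds a target defined as the highest cell within a small patch of an elevation map. The array layout, patch size, and NumPy implementation are assumptions for illustration, not the rover's actual code.

```python
# Minimal sketch (not flight code): locate a target modeled as a local
# elevation maximum, i.e., the highest cell within a small patch of an
# elevation map.  Patch size and map layout are assumed values.
import numpy as np

def find_local_peak(elevation_map, center_rc, patch_radius=5):
    """Return the (row, col) of the highest cell within a square patch
    of the elevation map centered on an initial target estimate."""
    rows, cols = elevation_map.shape
    r0, c0 = center_rc
    r_lo, r_hi = max(0, r0 - patch_radius), min(rows, r0 + patch_radius + 1)
    c_lo, c_hi = max(0, c0 - patch_radius), min(cols, c0 + patch_radius + 1)
    patch = elevation_map[r_lo:r_hi, c_lo:c_hi]
    dr, dc = np.unravel_index(np.argmax(patch), patch.shape)
    return r_lo + dr, c_lo + dc

# Example: re-acquire the target on each newly built elevation map.
elev = np.random.rand(100, 100)        # stand-in for a stereo-derived map
row, col = find_local_peak(elev, (50, 50))
```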

Figure: A Six-Wheeled Robotic Vehicle Is Moved (in this case, backward along a series of circular arc segments) while visually tracking the target, so that the manipulator-arm workspace, initially at position A, can be moved robustly onto the target at position B.

The mobile robot in question is Rocky 7, a prototype "rover"-type vehicle used in research on robotic-vehicle concepts for the exploration of Mars. The mobility system of Rocky 7 is based on a six-wheel-drive rocker-bogie mechanism that includes two steerable front wheels and four nonsteerable back wheels. One of the manipulator arms is equipped with two independently actuated scoops for acquiring samples. Not counting the motions of the scoops, this arm has two degrees of freedom: shoulder roll and shoulder pitch. The other manipulator arm is a mast that carries a stereoscopic pair of video cameras and that can, if desired, be tipped with a scientific instrument. The mast has three degrees of freedom (shoulder pitch, shoulder roll, and elbow pitch) and can be used to position and orient its cameras and/or to place its tip instrument on a target object to acquire a sample or take a reading. Additional stereoscopic pairs of cameras are located at the front and rear ends of the main body of the vehicle.

At the beginning of an operation, one or more target objects a small distance away are selected, and the robot is then commanded to autonomously perform manipulations that involve those objects. Following the basic approach of using mobility to augment dexterity, the navigation and mobility-control subsystems of the vehicle maneuver the vehicle into a position and orientation in which the target lies within the range of one of the manipulators (see figure), and the manipulator-control subsystem then performs the remaining fine positioning and manipulation.
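The handoff from mobility to manipulation can be summarized as a simple reachability test in the rover frame. The following is a minimal sketch, not the actual Rocky 7 software; the annular-sector workspace model and all numerical limits are illustrative assumptions.

```python
# Hypothetical reachability test: the vehicle keeps maneuvering until the
# target falls inside the manipulator workspace, then the arm takes over
# the remaining fine positioning.  Workspace limits are assumed values.
import math

def target_in_workspace(target_xy, r_min=0.3, r_max=0.7, half_angle_deg=45.0):
    """True if the target (x forward, y left, meters, in the rover frame)
    lies within an annular sector ahead of the vehicle."""
    x, y = target_xy
    r = math.hypot(x, y)
    bearing_deg = math.degrees(math.atan2(y, x))
    return r_min <= r <= r_max and abs(bearing_deg) <= half_angle_deg
```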

The key to ensuring that the rover reaches its target is to move in small steps while remaining locked onto the target by automatically tracking its shape. The computer processes the image data from the stereoscopic camera pairs into an elevation map of the nearby terrain and locates the target on the elevation map. The computer plans the route of the vehicle across the terrain toward the target, using an approximate kinematic model (assuming flat terrain and no slippage of the wheels). At frequent intervals along the route, updated elevation maps are generated from newly acquired stereoscopic-image data, the target is identified on the updated maps, and the planned route is corrected accordingly. Because scale-invariant features (shape, elevation, and centroid) are tracked, the target can be followed even as its image grows dramatically in size during the final approach, a situation that often causes traditional visual-servoing techniques to fail. This process of iterative, vision-based refinement of the route continues until the vehicle arrives at the desired location near the target.
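The step-and-replan cycle can be illustrated with the same approximate kinematic model (flat terrain, no wheel slip). In this sketch the vision step is replaced by a perfect target estimate; on the rover, each iteration would first build a fresh elevation map and re-locate the target from its scale-invariant features. All numbers (step length, curvature limit, tolerance) are illustrative assumptions, not Rocky 7 parameters.

```python
# Sketch of moving in short arc segments toward a tracked target under a
# flat-terrain, no-slip kinematic model; the target estimate stands in for
# the vision system.  All numerical values are assumptions.
import math

def drive_arc_step(pose, target, step_len=0.10, max_turn=0.3):
    """Advance the rover pose (x, y, heading) a short distance along a
    circular arc that turns toward the target."""
    x, y, heading = pose
    bearing = math.atan2(target[1] - y, target[0] - x) - heading
    bearing = math.atan2(math.sin(bearing), math.cos(bearing))   # wrap to [-pi, pi]
    dtheta = max(-max_turn, min(max_turn, bearing))              # curvature limit
    if abs(dtheta) < 1e-6:                                       # nearly straight
        return (x + step_len * math.cos(heading),
                y + step_len * math.sin(heading), heading)
    radius = step_len / dtheta
    x += radius * (math.sin(heading + dtheta) - math.sin(heading))
    y -= radius * (math.cos(heading + dtheta) - math.cos(heading))
    return x, y, heading + dtheta

pose, target = (0.0, 0.0, 0.0), (1.5, 0.8)
for _ in range(200):                        # bounded number of short steps
    if math.hypot(target[0] - pose[0], target[1] - pose[1]) < 0.05:
        break                               # close enough for the arm to take over
    # On the rover, a fresh elevation map would be built here and the target
    # re-located from its scale-invariant features before each replanning.
    pose = drive_arc_step(pose, target)
```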

Once the vehicle is in the desired position and orientation relative to the target, the designated manipulator arm is lowered toward the target; tactile sensing is used to signal contact with the target or with the ground adjacent to the target. The manipulator arm is then commanded to perform the assigned manipulation. Manipulations that Rocky 7 has performed in demonstrations include grasping several small rocks that were initially at a distance of >1 m and placing an instrument on a boulder that was initially at a distance of >5 m.
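The guarded lowering of the arm can be sketched as a simple contact-terminated loop; the sensor and arm interfaces here are hypothetical placeholders, not the Rocky 7 control API.

```python
# Hedged sketch of a guarded move: lower the arm in small increments until
# a tactile (contact) sensor fires or a travel limit is reached.  The
# callables wrap whatever arm-motion and touch-sensor interfaces exist.
def lower_until_contact(step_arm_down, contact_sensed, max_steps=100):
    """Lower the arm step by step; stop on contact with the target or the
    adjacent ground.  Returns True if contact was detected."""
    for _ in range(max_steps):
        if contact_sensed():
            return True
        step_arm_down()
    return False       # travel limit reached without contact
```

In use, the two callables would be small wrappers around the actual arm-motion command and the tactile-sensor readout, so the same loop works regardless of which manipulator is designated.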

This work was done by Mark Maimone, Issa Nesnas, and Hari Das of Caltech for NASA's Jet Propulsion Laboratory. For further information, access the Technical Support Package (TSP) free on-line at www.nasatech.com/tsp under the Machinery/Automation category.

NPO-21011



This Brief includes a Technical Support Package (TSP), "Vision-Based Maneuvering and Manipulation by a Mobile Robot" (reference NPO-21011), which is available for download from the TSP library.
