A prototype of an advanced robotic vision system both (1) measures its own motion with respect to a stationary background and (2) detects other moving objects and estimates their motions, all by use of visual cues. Like some prior robotic and other optoelectronic vision systems, this system is based partly on concepts of optical flow and visual odometry. Whereas prior optoelectronic visual-odometry systems have been limited to frame rates of no more than 1 Hz, a visual-odometry subsystem that is part of this system operates at a frame rate of 60 to 200 Hz, given optical-flow estimates. The overall system operates at an effective frame rate of 12 Hz. Moreover, unlike prior machine-vision systems for detecting motions of external objects, this system need not remain stationary: it can detect such motions while it is moving (even vibrating).

The system includes a stereoscopic pair of cameras mounted on a moving robot. The outputs of the cameras are digitized, then processed to extract positions and velocities. The initial image-data-processing functions of this system are the same as those of some prior systems: Stereoscopy is used to compute three-dimensional (3D) positions for all pixels in the camera images. For each pixel of each image, optical flow between successive image frames is used to compute the two-dimensional (2D) apparent relative translational motion of the point transverse to the line of sight of the camera.
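The stereoscopic step described above amounts to triangulating each pixel's disparity into a 3D point. The sketch below illustrates this for a rectified pinhole-camera pair; the function name and parameters are hypothetical and not taken from the actual system.

```python
import numpy as np

def stereo_to_3d(u, v, disparity, f, b, cx, cy):
    """Triangulate pixel (u, v) with stereo disparity (in pixels) into a
    3D point in the left-camera frame, assuming a rectified pinhole pair
    with focal length f (pixels), baseline b (meters), and principal
    point (cx, cy)."""
    Z = f * b / disparity   # depth from disparity
    X = (u - cx) * Z / f    # lateral offset from the optical axis
    Y = (v - cy) * Z / f    # vertical offset from the optical axis
    return np.array([X, Y, Z])
```

For example, a pixel at the principal point with a 5-pixel disparity, f = 500 pixels, and b = 0.1 m triangulates to a point 10 m straight ahead.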

The challenge in designing this system was to provide for utilization of the 3D information from stereoscopy in conjunction with the 2D information from optical flow to distinguish between motion of the camera pair and motions of external objects, compute the motion of the camera pair in all six degrees of translational and rotational freedom, and robustly estimate the motions of external objects, all in real time. To meet this challenge, the system is designed to perform the following image-data-processing functions:

The visual-odometry subsystem (the subsystem that estimates the motion of the camera pair relative to the stationary background) utilizes the 3D information from stereoscopy and the 2D information from optical flow. It computes the relationship between the 3D and 2D motions and uses a least-mean-squares technique to estimate the motion parameters. The least-mean-squares technique is suitable for real-time implementation when the number of external-moving-object pixels is smaller than the number of stationary-background pixels.
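One standard way to set up such a least-squares ego-motion estimate — a sketch under stated assumptions, not the system's actual implementation — is the instantaneous pinhole motion-field model, which relates each pixel's optical flow linearly to the six motion parameters once depth is known from stereoscopy. Stacking two equations per pixel gives an overdetermined linear system solvable in closed form:

```python
import numpy as np

def egomotion_lstsq(xs, ys, Zs, flows):
    """Estimate 6-DOF camera motion [Tx, Ty, Tz, Wx, Wy, Wz] from optical
    flow measured at points (xs, ys) in normalized image coordinates with
    stereo depths Zs.  Uses the instantaneous pinhole motion-field model:
        u_dot = (x*Tz - Tx)/Z + x*y*Wx - (1 + x^2)*Wy + y*Wz
        v_dot = (y*Tz - Ty)/Z + (1 + y^2)*Wx - x*y*Wy - x*Wz
    and ordinary linear least squares (a stand-in for the brief's
    least-mean-squares technique)."""
    n = len(xs)
    A = np.zeros((2 * n, 6))
    b = np.asarray(flows, dtype=float).reshape(-1)  # [u0, v0, u1, v1, ...]
    for i, (x, y, Z) in enumerate(zip(xs, ys, Zs)):
        A[2 * i]     = [-1.0 / Z, 0.0, x / Z, x * y, -(1 + x * x), y]
        A[2 * i + 1] = [0.0, -1.0 / Z, y / Z, 1 + y * y, -x * y, -x]
    motion, *_ = np.linalg.lstsq(A, b, rcond=None)
    return motion
```

Because the system is linear in the six unknowns, a handful of background pixels with known depth suffices for an exact solve; with many pixels, least squares averages out flow noise, which is why the technique works best when stationary-background pixels dominate.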

In another subsystem, pixels representative of external transversely moving objects are detected by means of differences between (1) apparent transverse velocities computed from optical flow and (2) the corresponding relative transverse velocities estimated from visual odometry under the temporary assumption that all pixels belong to the stationary background.
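The detection logic in this paragraph can be sketched as a residual test: predict each pixel's flow under the stationary-background assumption using the ego-motion estimate, then flag pixels whose measured flow disagrees beyond a threshold. The motion-field model, function names, and threshold below are illustrative assumptions, not the system's actual code.

```python
import numpy as np

def predict_flow(x, y, Z, motion):
    """Flow of a static point at normalized coordinates (x, y), depth Z,
    under camera motion [Tx, Ty, Tz, Wx, Wy, Wz] (pinhole motion-field
    model; an assumed formulation, not the brief's own)."""
    Tx, Ty, Tz, Wx, Wy, Wz = motion
    u = (x * Tz - Tx) / Z + x * y * Wx - (1 + x * x) * Wy + y * Wz
    v = (y * Tz - Ty) / Z + (1 + y * y) * Wx - x * y * Wy - x * Wz
    return np.array([u, v])

def moving_pixel_mask(xs, ys, Zs, flows, motion, thresh):
    """Flag pixels whose measured flow deviates from the flow predicted
    under the temporary assumption that every pixel belongs to the
    stationary background."""
    residuals = np.array([
        np.linalg.norm(np.asarray(flows[i]) - predict_flow(xs[i], ys[i], Zs[i], motion))
        for i in range(len(xs))])
    return residuals > thresh
```

A pixel whose flow matches the ego-motion prediction passes as background; a pixel with excess transverse flow is flagged as an external moving object.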

In yet another subsystem, pixels representative of radially moving objects are detected by means of differences between (1) changes in radial distance estimated from changes in stereoscopic disparities between successive image frames and (2) the corresponding relative radial velocities estimated from visual odometry under the temporary assumption that all pixels belong to the stationary background. However, radial motion is more difficult to detect than transverse motion, especially at large distances. This difficulty is addressed by incorporating several additional processing features, including means to estimate rates of change of stereoscopic disparities, post-processing to prevent false alarms at low signal-to-noise ratios, and exploitation of the fact that radial-motion optical flow can sometimes be distinguished from transverse-motion optical flow at short distances.

This work was done by Ashit Talukder and Larry Matthies of Caltech for NASA's Jet Propulsion Laboratory.

NPO-40687


This Brief includes a Technical Support Package (TSP), "Vision System Measures Motions of Robot and External Objects" (reference NPO-40687), which is available for download from the TSP library.



NASA Tech Briefs Magazine

This article first appeared in the November 2008 issue of NASA Tech Briefs Magazine.
