A software library and set of programs largely automate the geometric calibration of video cameras. Developed especially for robotic vision systems, this software generates the information needed to determine the three-dimensional (3D) positions of objects that appear in two-dimensional (2D) video images. Typically, the software can perform 2D-to-3D mappings with a precision of 0.1 to 0.3 pixels.

The software enables the creation, manipulation, and application of geometric models of camera lenses. The models are constructed semiautomatically from images of known calibration targets, and they can be applied automatically to live images, enabling robots to generate the position information needed for such robotic operations as manipulation of objects, mapping, and navigation. The software supports three main types of models: (1) linear (ordinarily suitable for fields of view narrower than about 30°), (2) radial lens distortion (typically suitable for fields of view ranging from 15° to 110° wide), and (3) fisheye lens distortion (typically suitable for fields of view wider than 90°). Camera models generated by this software have enabled the development of real-time, vision-based control systems on a variety of advanced civilian and military robots.
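To illustrate the role such a camera model plays, the sketch below projects a 3D point (in the camera frame) to 2D pixel coordinates using a simple pinhole model, the kind of mapping a linear calibration captures. The function name and parameterization are hypothetical; the brief does not describe the JPL software's actual API or model form.

```python
def project_linear(point_3d, focal_px, center_px):
    # Pinhole (linear) projection from camera-frame 3D coordinates to
    # pixel coordinates. Illustrative sketch only; the JPL software's
    # actual parameterization is not given in this brief.
    x, y, z = point_3d
    if z <= 0:
        raise ValueError("point must lie in front of the camera")
    u = center_px[0] + focal_px * x / z
    v = center_px[1] + focal_px * y / z
    return u, v

# A point 2 m ahead and 0.1 m to the right, with a 500-pixel focal
# length and the image center at (320, 240):
u, v = project_linear((0.1, 0.0, 2.0), 500.0, (320.0, 240.0))
# u = 320 + 500 * 0.1 / 2.0 = 345.0, v = 240.0
```

Calibration is the inverse problem: given many image measurements of targets at known 3D positions, solve for the model parameters (here, the focal length and image center) so that mappings like this one hold to subpixel precision.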

The algorithms and software were developed by Don Gennery, Todd Litwin, Yalin Xiong, Mark Maimone, and Larry Matthies of Caltech for NASA’s Jet Propulsion Laboratory.

This software is available for commercial licensing. Please contact Don Hart of the California Institute of Technology at (818) 393-3425. Refer to NPO-21077.



This Brief includes a Technical Support Package (TSP).
The TSP "Software for Geometric Calibration of Video Cameras" (reference NPO-21077) is currently available for download from the TSP library.





This article first appeared in the February 2002 issue of Motion Control Tech Briefs Magazine (Vol. 26 No. 2).



Overview

The document discusses a software system developed for the geometric calibration of video cameras used in mobile platforms, such as robots. This calibration is crucial for enabling robots to accurately interpret their surroundings, which is essential for various operations, including 3D mapping, autonomous navigation, and object manipulation.

The software allows for the semi-automatic construction of camera models from images of known calibration targets. These models can then be applied to live images, enabling robots to reason about their environment effectively. The system supports three main types of camera models based on the lens distortion characteristics: linear, radial, and fisheye. The choice of model depends on the camera's field of view (FOV): linear models are suitable for small FOVs (less than 30 degrees), radial models for medium FOVs (15-110 degrees), and fisheye models for large FOVs (greater than 90 degrees).
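As a rough illustration of how a radial model extends the linear one, the sketch below applies a polynomial radial-distortion term to normalized image coordinates. This is a common textbook form, used here only as an assumption; the brief does not specify the exact distortion model the JPL software uses.

```python
def apply_radial_distortion(x_n, y_n, k1, k2):
    # Polynomial radial distortion on normalized image coordinates
    # (x_n, y_n): points are scaled radially as a function of their
    # squared distance from the optical axis. A common form, not
    # necessarily the JPL software's exact parameterization.
    r2 = x_n * x_n + y_n * y_n
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x_n * scale, y_n * scale

# With zero distortion coefficients the radial model reduces to the
# linear (pinhole) model:
xd, yd = apply_radial_distortion(0.2, -0.1, 0.0, 0.0)
# xd = 0.2, yd = -0.1
```

This dependence on distance from the optical axis is why the model choice tracks field of view: over a narrow FOV the distortion term stays near zero and a linear model suffices, while wide-angle and fisheye lenses need the higher-order terms (or a different projection altogether) to stay accurate toward the image edges.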

The calibration software has been successfully implemented in various real-time vision-based control systems across multiple robotic platforms. Notable applications include prototype Mars rovers such as Rocky 7, FIDO, and Athena, as well as the Army Research Lab’s unmanned ground vehicles (Demo III) and DARPA’s TMR robot (Urbie). Additionally, the software was utilized by the Mars Pathfinder ground support team to interpret stereo images transmitted from Mars.

The document is part of a technical support package prepared under the sponsorship of NASA and credits several inventors: Donald Gennery, Larry H. Matthies, Mark W. Maimone, Todd Litwin, and Yalin Xiong. It notes that the work was conducted at the Jet Propulsion Laboratory (JPL) under a contract with NASA, and it includes the standard notice that mention of commercial products does not constitute endorsement.

Overall, this document outlines a significant advancement in robotic vision technology, showcasing how geometric calibration of cameras enhances the capabilities of robots in understanding and interacting with their environments, thereby facilitating a wide range of applications in space exploration and beyond.