A system for indoor navigation of a mobile robot includes (1) modulated infrared beacons at known positions on the walls and ceiling of a room and (2) a camera-like sensor, comprising a wide-angle lens with a position-sensitive photodetector at the focal plane, mounted in a known position and orientation on the robot. The system also includes a computer running special-purpose software that processes the sensor readings to obtain the position and orientation of the robot in all six degrees of freedom in a coordinate system embedded in the room.

For a given beacon imaged on the focal plane, the output of the sensor comprises two parameters that depend in a known way on the characteristics of the lens and the direction to the beacon in a coordinate system attached to the sensor and robot. If at least three beacons are within the field of view of the sensor, then the sensor outputs from observations of all three beacons can be combined to obtain six parameters indicative of the directions to all three beacons. These directions, in combination with the known positions of the beacons, uniquely determine the position and orientation of the robot in the room. Equivalently, the six parameters constitute, in principle, sufficient data to locate the robot in all six degrees of freedom by solving the equations that express the applicable geometric relationships summarized above.
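The relationship between a beacon direction and the two sensor parameters can be illustrated with a short sketch. The brief does not specify the lens mapping, so the equidistant (fisheye) projection, the effective focal length f, and all function and variable names below are illustrative assumptions only; Python is used here although the original software language is not stated.

    import numpy as np

    def predict_reading(beacon_pos_room, robot_pos, R_room_to_sensor, f=1.0):
        # Direction from the sensor to the beacon, expressed in the sensor frame.
        d_room = beacon_pos_room - robot_pos
        d_sensor = R_room_to_sensor @ d_room
        d_sensor = d_sensor / np.linalg.norm(d_sensor)
        # Illustrative equidistant (fisheye) mapping: the image point lies at
        # radius f*theta from the optical axis, along the beacon's azimuth.
        theta = np.arccos(np.clip(d_sensor[2], -1.0, 1.0))  # angle off the optical axis
        phi = np.arctan2(d_sensor[1], d_sensor[0])          # azimuth about the axis
        return np.array([f * theta * np.cos(phi), f * theta * np.sin(phi)])

Each observed beacon contributes two such parameters, so three beacons supply the six numbers needed, in principle, to solve for the six pose unknowns.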

The nature of a position-sensitive photodetector is such that it is not possible to measure the centroids of two beacon images simultaneously. Therefore, it is necessary to illuminate the beacons in rapid succession and to provide a means for the image-data-processing software to recognize which beacon is under observation at a given instant. To satisfy this need, the beacons are turned on and off in a sequence that coincides with a predetermined code. The sensor subsystem accumulates beacon readings and their times until it begins to recognize the code sequence. Thereafter, the computer processes the readings from the recognized beacons within the field of view of the sensor.
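The brief does not describe the actual modulation code, so the following sketch shows only one plausible scheme: each beacon blinks with its own binary sequence, and the software matches the accumulated on/off history against the known sequences. The codes, names, and matching method below are hypothetical.

    import numpy as np

    # Hypothetical per-beacon on/off codes, one bit per illumination time slot
    # (the actual sequence used in the JPL system is not given in this brief).
    BEACON_CODES = {
        "A": np.array([1, 0, 1, 1, 0, 0, 1, 0]),
        "B": np.array([1, 1, 0, 0, 1, 0, 1, 1]),
        "C": np.array([0, 1, 1, 0, 1, 1, 0, 0]),
    }

    def identify_beacon(observed_bits):
        # Correlate the accumulated on/off history against each known code,
        # allowing for an unknown starting phase within the repeating sequence.
        # observed_bits may be shorter than one full code period.
        observed = np.asarray(observed_bits)
        best_id, best_score = None, -1
        for beacon_id, code in BEACON_CODES.items():
            for shift in range(len(code)):
                window = np.roll(code, -shift)[:len(observed)]
                score = int(np.sum(window == observed))
                if score > best_score:
                    best_id, best_score = beacon_id, score
        return best_id, best_score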

The equations for the geometric relationships are nonlinear. The software includes a module that solves these equations by means of an iterative optimization procedure, in which it strives to find a position and orientation that, when inserted in the equations, minimizes a measure of the difference between the actual sensor readings and the sensor readings predicted by the equations.
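One plausible way to implement such a procedure is sketched below, reusing the predict_reading sketch above and a general-purpose nonlinear least-squares solver; the pose parameterization (position plus Euler angles) and the use of SciPy are assumptions, not details taken from the JPL software.

    import numpy as np
    from scipy.optimize import least_squares
    from scipy.spatial.transform import Rotation

    def residuals(pose, beacon_positions, measured_readings):
        # pose = [x, y, z, roll, pitch, yaw]: the robot's position in the room
        # and the sensor's orientation as Euler angles (illustrative choice).
        position = pose[:3]
        R_sensor_in_room = Rotation.from_euler("xyz", pose[3:]).as_matrix()
        R_room_to_sensor = R_sensor_in_room.T  # maps room vectors into the sensor frame
        predicted = [predict_reading(b, position, R_room_to_sensor)
                     for b in beacon_positions]
        # Difference between predicted and actual sensor readings.
        return np.concatenate(predicted) - np.asarray(measured_readings).ravel()

    def solve_pose(initial_pose, beacon_positions, measured_readings):
        # Iteratively adjust the pose until the predicted readings match the
        # measured ones as closely as possible (in the least-squares sense).
        result = least_squares(residuals, initial_pose,
                               args=(beacon_positions, measured_readings))
        return result.x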

Another software module provides an initial guess of position and orientation to start the optimization procedure. Knowing which beacons are in view, this module evaluates the equations for a number of postulated robot poses and determines which pose, when inserted in the equations, yields the closest match to the sensor readings. The closest match becomes the initial guess for the optimization procedure.
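A minimal sketch of such a module, reusing the residuals function above, is a coarse search over candidate poses; the grid spacing, room dimensions, sensor height, and the assumption of a roughly level robot are illustrative choices, not details from the original software.

    import itertools
    import numpy as np

    def initial_guess(beacon_positions, measured_readings,
                      room_extent=(5.0, 5.0), height=0.3):
        # Coarsely sample candidate poses over the floor area and heading,
        # then keep the pose whose predicted readings best match the data.
        xs = np.linspace(0.5, room_extent[0] - 0.5, 4)
        ys = np.linspace(0.5, room_extent[1] - 0.5, 4)
        yaws = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
        best_pose, best_cost = None, np.inf
        for x, y, yaw in itertools.product(xs, ys, yaws):
            pose = np.array([x, y, height, 0.0, 0.0, yaw])
            cost = float(np.sum(residuals(pose, beacon_positions,
                                          measured_readings) ** 2))
            if cost < best_cost:
                best_pose, best_cost = pose, cost
        return best_pose

The returned pose would then be passed to the optimization procedure as its starting point.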

This work was done by Joel Shields and Muthu Jeganathan of Caltech for NASA's Jet Propulsion Laboratory. For further information, access the Technical Support Package (TSP) free on-line at www.techbriefs.com/tsp under the Electronics/Computers category. NPO-40730