A fine-pointing scheme that involves correlation of images and maximum-likelihood estimation has been proposed to enable tracking of optical sources. This scheme is intended for implementation in the pointing-control system of an imaging instrument (e.g., a telescope equipped with an image detector) to provide a capability for highly stable and accurate pointing to a specific area within a moving target, even under conditions that ordinarily give rise to pointing jitter. Such conditions include motion of the target relative to the instrument, instability of the platform that supports the instrument-aiming mechanism, and turbulence in the atmosphere or other optical medium. In the original intended application, the scheme would be implemented in a ground station for tracking a laser that would be part of an optical communication system aboard a distant spacecraft. Other potential applications include stabilization of images for video cameras and precise pointing of lasers in military, industrial, and surgical settings. This scheme is expected to make it possible to achieve subpixel resolution in a high-disturbance environment.
The scheme is based partly on the following assumptions:
- The image of the source occupies multiple pixels on the image detector. [For the purpose of this assumption, it does not matter whether (1) the source is larger than the equivalent of one pixel or (2) the source is smaller than the equivalent of one pixel but the image is smeared over more than one pixel because the telescope is intentionally defocused.]
- Data on a reference image of the source (e.g., a laser-intensity profile) are available for correlation with images on the detector.
- The spacecraft or other object to be tracked may be translating, but not rotating, relative to the tracking station.
- Differences or uncertainties between the reference image and the detected image at a given instant can be modeled as independent additive white Gaussian disturbances.
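The last assumption amounts to a simple signal model: the detected frame is a translated copy of the reference image plus additive white Gaussian noise. A minimal NumPy sketch of that model (the function names, the integer-pixel translation, and the wraparound boundary convention are illustrative assumptions, not details from the brief):

```python
import numpy as np

def shifted_reference(ref, dx, dy):
    """Translate a reference image by integer (dx, dy) pixels with wraparound."""
    return np.roll(np.roll(ref, dy, axis=0), dx, axis=1)

def detected_image(ref, dx, dy, sigma, rng):
    """Detected frame = translated reference + additive white Gaussian noise."""
    return shifted_reference(ref, dx, dy) + rng.normal(0.0, sigma, ref.shape)
```

Under this model, estimating the pointing error reduces to estimating the translation vector (dx, dy) from noisy frames.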
The proposed scheme utilizes discrete Fourier transforms (DFTs) of the image received at a sampling instant and of the reference image. Correlations between the received image and the original reference image are computed in the transform domain. The coordinates of the target in the image are estimated by an open-loop acquisition algorithm, then tracked by a closed-loop tracking algorithm. The open-loop acquisition algorithm involves the solution of two nonlinear equations that contain phase and amplitude correlation terms; the solution yields a maximum-likelihood estimate of the translation vector between the received and reference images.
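The brief does not reproduce the two nonlinear equations, but the transform-domain correlation at the heart of the acquisition step can be sketched with a standard DFT cross-correlation peak search. This is a stand-in for the full maximum-likelihood solution, limited to integer-pixel shifts:

```python
import numpy as np

def acquire_translation(received, reference):
    """Open-loop acquisition sketch: estimate the integer translation that
    best aligns `received` with `reference` by correlating in the DFT domain
    (correlation theorem), then locating the correlation peak."""
    R = np.fft.fft2(received)
    S = np.fft.fft2(reference)
    corr = np.fft.ifft2(R * np.conj(S)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peak indices to signed shifts under the wraparound convention.
    shifts = [p if p <= n // 2 else p - n for p, n in zip(peak, corr.shape)]
    return tuple(shifts)  # (dy, dx)
```

The subpixel resolution claimed for the scheme would come from the maximum-likelihood solution of the phase/amplitude correlation equations, which this integer-peak sketch omits.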
The closed-loop tracking algorithm (see figure) also involves a maximum-likelihood estimation and the computation of weighted transform-domain correlations between received and reference images. However, instead of the original reference image, translated versions (based on estimates of translation) of the original reference image are used in these correlations. The loop feedback signals are low-pass filtered to obtain the current estimate of the error in the maximum-likelihood estimate of the translation vector. The closed-loop tracking algorithm also involves the solution of two nonlinear equations that contain phase and amplitude correlation terms. However, assuming that phase errors remain small during tracking, these equations can be closely approximated by linear ones, yielding error terms that are used to update the maximum-likelihood estimate of the translation vector.
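The closed-loop structure can be sketched as follows, with two loud caveats: a simple first-order loop filter (gain `alpha`) stands in for the brief's low-pass filter, and an integer correlation-peak residual stands in for the linearized phase/amplitude error equations. All names and the loop gain are illustrative assumptions:

```python
import numpy as np

def track_step(received, reference, est, alpha=0.5):
    """One closed-loop iteration (sketch): correlate the received frame with
    the reference translated to the current estimate `est` = (dy, dx), read
    the residual misalignment off the correlation peak, and feed it back
    through a first-order low-pass loop filter with gain `alpha`."""
    # Translate the reference by the current (rounded) estimate.
    shifted = np.roll(np.roll(reference, int(round(est[0])), axis=0),
                      int(round(est[1])), axis=1)
    corr = np.fft.ifft2(np.fft.fft2(received) *
                        np.conj(np.fft.fft2(shifted))).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    err = [p if p <= n // 2 else p - n for p, n in zip(peak, corr.shape)]
    # Filtered error updates the translation estimate.
    return (est[0] + alpha * err[0], est[1] + alpha * err[1])
```

Iterating `track_step` on successive frames drives the estimate toward the true translation; in the proposed scheme the residual would instead come from the linearized maximum-likelihood equations, giving subpixel updates.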
This work was done by Haiping Tsou and Tsun-Yee Yan of Caltech for NASA’s Jet Propulsion Laboratory. For further information, access the Technical Support Package (TSP) free on-line at www.nasatech.com/tsp under the Information Sciences category.
This software is available for commercial licensing. Please contact Don Hart of the California Institute of Technology at (818) 393-3425. Refer to NPO-20698.
This Brief includes a Technical Support Package (TSP).
Maximum-Likelihood Scheme for Tracking an Optical Source
(reference NPO-20698) is currently available for download from the TSP library.