Nonlinear Estimation Approach to Real-Time Georegistration from Aerial Images
 Created: Saturday, 01 September 2012
This technology can be used for real-time search and rescue operations and surveillance applications using cameras mounted on aircraft or UAVs.
When taking aerial images, it is important to know the locations of specific points of interest in an Earth-centered coordinate system (latitude, longitude, height) (see figure). The correspondence between a pixel location in the image and its Earth coordinate is known as georegistration. Two main technical challenges arise in the intended application. The first is that no known features are assumed to be available in any of the images. The second is that the intended applications are real-time: images are taken at regular intervals (e.g., once per second), and decisions must be made in real time based on the geolocation of specific objects seen in the images as they arrive. This is in sharp contrast to most current methods for geolocation, which operate after the fact by processing, on the ground, a database of stored images using computationally intensive methods.
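The brief does not give the camera model, but the pixel-to-ground correspondence it describes is commonly expressed with an ideal pinhole projection. The sketch below is purely illustrative: the focal length, principal point, frame convention, and all numeric values are assumptions, not part of the original work.

```python
import numpy as np

def project_to_pixel(p_world, cam_pos, R_cam, f=1000.0, cx=640.0, cy=480.0):
    """Project a world point (local metres) into pixel coordinates with an
    ideal pinhole model. f, cx, cy are illustrative intrinsics."""
    p_cam = R_cam @ (p_world - cam_pos)   # world frame -> camera frame
    u = f * p_cam[0] / p_cam[2] + cx      # perspective division
    v = f * p_cam[1] / p_cam[2] + cy
    return np.array([u, v])

# Toy setup: camera 100 m above the origin, looking straight down.
# The sign flip on z puts the ground in front of the camera.
cam_pos = np.array([0.0, 0.0, 100.0])
R_cam = np.diag([1.0, 1.0, -1.0])
pixel = project_to_pixel(np.array([5.0, -3.0, 0.0]), cam_pos, R_cam)
```

Georegistration is the inverse problem: given `pixel` and the (imperfectly known) camera pose, recover the ground coordinate, which is why the pose and scale must be estimated jointly with the vehicle state.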
The solution is a nonlinear estimation algorithm that combines processed real-time camera images with vehicle position and attitude information obtained from an onboard GPS receiver. This approach provides accurate georegistration estimates (latitude, longitude, height) of arbitrary features and/or points of interest seen in the camera images, solving the georegistration problem at the modest cost of augmenting the camera information with a GPS receiver carried onboard the vehicle.

The nonlinear estimation algorithm is based on a linearized Kalman filter structure that carries 19 states in its current implementation. Six of the 19 states are calibration parameters associated with the initial camera pose. One state calibrates the scale factor associated with all camera-derived information. The remaining 12 states model the current kinematic state of the vehicle (position, velocity, acceleration, and attitude).
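The brief describes the 19-state layout but not the filter equations. As a rough sketch only, the following shows a generic linearized Kalman filter organized with that state layout; the dynamics model, noise values, and the GPS measurement matrix are all assumptions for illustration, not the authors' implementation.

```python
import numpy as np

# Assumed state layout following the brief (19 states total):
#   0:6   -- calibration of the initial camera pose
#   6     -- scale factor for all camera-derived information
#   7:19  -- vehicle kinematics: position, velocity, acceleration, attitude
N = 19

def predict(x, P, dt, q=1e-3):
    """Propagate with a constant-acceleration kinematic model; the
    calibration states (0:7) are treated as constants."""
    F = np.eye(N)
    F[7:10, 10:13] = dt * np.eye(3)           # pos += vel * dt
    F[7:10, 13:16] = 0.5 * dt**2 * np.eye(3)  # pos += 0.5 * acc * dt^2
    F[10:13, 13:16] = dt * np.eye(3)          # vel += acc * dt
    Q = q * dt * np.eye(N)                    # simple process noise
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, H, r=1.0):
    """Standard linearized Kalman measurement update, z ~ H @ x + noise."""
    R = r * np.eye(len(z))
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(N) - K @ H) @ P
    return x, P

# Hypothetical 1 Hz GPS position fix observing states 7:10 directly.
H_gps = np.zeros((3, N))
H_gps[:, 7:10] = np.eye(3)

x, P = np.zeros(N), 10.0 * np.eye(N)          # illustrative initial values
x, P = predict(x, P, dt=1.0)
x, P = update(x, P, z=np.array([10.0, 0.0, 4600.0]), H=H_gps)
```

In the actual system, camera-derived feature observations would enter through their own (nonlinear, then linearized) measurement models alongside the GPS fixes.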
The new georegistration approach was validated by computer simulation based on an aircraft flying at a speed of 70 m/s in a 3-km radius circle at an altitude of 15,000 ft (≈4,600 m), using a camera pointed at the ground toward the center of the circle. Results from using the nonlinear estimation algorithm, in combination with GPS and camera images taken once per second, indicate that after 20 minutes of operation, real-time georegistration errors are reduced to values of less than 2 m, 1 sigma, on the ground.
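The flight geometry in that simulation is simple to reproduce. The sketch below generates the circular trajectory from the numbers in the brief (the frame convention and the pointing calculation are assumptions):

```python
import numpy as np

# Flight geometry from the brief: 70 m/s around a 3 km radius circle
# at ~4,600 m altitude, imaging the circle centre at 1 Hz.
R_CIRCLE = 3000.0         # m
SPEED = 70.0              # m/s
ALT = 4600.0              # m
omega = SPEED / R_CIRCLE  # angular rate, rad/s

def vehicle_pose(t):
    """Aircraft position at time t and a unit boresight vector aimed
    at the circle centre on the ground (origin of this local frame)."""
    pos = np.array([R_CIRCLE * np.cos(omega * t),
                    R_CIRCLE * np.sin(omega * t),
                    ALT])
    look = -pos / np.linalg.norm(pos)  # toward the origin
    return pos, look

# One lap takes 2*pi*R/SPEED ~ 269 s, so the 20-minute run in the
# brief covers roughly 4.5 laps and about 1,200 images at 1 Hz.
pos, look = vehicle_pose(0.0)
```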
The new method is highly modular and cleanly separates computer vision functions from optimal estimation functions. This allows the vision and estimation functions to be developed separately, and leverages the power of modern estimation theory to fuse information in an optimal manner. It avoids heuristics, which are generally suboptimal, as well as methods that require human-in-the-loop intervention, ad hoc parameter weightings, and awkward stitching together of various types of data.
The work is applicable to any scientific or engineering application that requires finding the geolocation of specific objects seen in a sequence of camera images. For example, in a surveying application, the precise location and height of a mountain peak can be determined by having an airplane take aerial images while circling around it.
This work was done by David S. Bayard and Curtis W. Padgett of Caltech for NASA’s Jet Propulsion Laboratory.
This software is available for commercial licensing. Please contact Daniel Broderick of the California Institute of Technology. Refer to NPO-47255.
This Brief includes a Technical Support Package (TSP).
Nonlinear Estimation Approach to Real-Time Georegistration from Aerial Images (reference NPO-47255) is currently available for download from the TSP library.