Real-Time Web-Based Image Distribution Using an Airborne GPS/Inertial Image Server
- Monday, 01 September 2008
The use of geo-referenced imagery across the Internet is becoming prevalent thanks to the development of Web-based location servers such as Google Earth, TerraServer, and Yahoo Local. But users of these services are continually asking for more timely, higher-resolution data. Civil users such as firefighters, search and rescue teams, law enforcement, 911 emergency operations, border patrol operations, traffic monitoring systems, and geological survey crews — as well as the military — could benefit from a near-real-time, Web-based geospatial visualization capability.
To address this need for real-time geospatial awareness, NAVSYS Corp. — with funding from the National Geospatial-Intelligence Agency, Office of Naval Research, and the United States Marine Corps — has developed the GI-Eye product, which generates precision mensurated imagery directly on the aircraft collecting the data. The GI-Eye system integrates GPS (Global Positioning System), inertial, and digital camera data to register imagery autonomously, without requiring access to any ground control points (GCPs). The result is real-time, high-quality registered imagery at a 1-Hz rate.
The GI-Eye system has been combined with an Enterprise Server, termed the GeoReferenced Information Manager (GRIM), which uses this imagery to auto-generate mosaics as the data is being collected. With this approach, a near-real-time geospatial view of the environment can be generated in a format that can be viewed using current Web-based geospatial visualization tools.
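Since the article names Google Earth as one of the current Web-based viewers, one way such registered frames could be published is as KML ground overlays. The helper below is a hypothetical sketch, not part of the GRIM server: the function name and file references are illustrative, and the corner coordinates would come from each image's registration data.

```python
def ground_overlay_kml(name, image_href, north, south, east, west):
    """Build a minimal KML GroundOverlay draping one registered image
    onto the terrain between the given bounding latitudes/longitudes.
    All inputs are placeholders standing in for per-image metadata."""
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <GroundOverlay>
    <name>{name}</name>
    <Icon><href>{image_href}</href></Icon>
    <LatLonBox>
      <north>{north}</north><south>{south}</south>
      <east>{east}</east><west>{west}</west>
    </LatLonBox>
  </GroundOverlay>
</kml>"""
```

A server generating one such overlay per incoming frame (or per mosaic tile) would let any KML-aware viewer display the collection as it grows.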
GI-Eye System Design
The GI-Eye product is offered to sensor manufacturers and systems integrators to provide an embedded precision auto-registration capability for electro-optic (EO), infrared (IR), or other focal plane array (FPA)-type sensors. It has been integrated with a variety of digital cameras and sensors.
GI-Eye precisely time-marks each camera image and uses NAVSYS’ proprietary InterNav kinematic alignment algorithm to measure the precise position and attitude of the camera from GPS and inertial sensor data. The auto-registration capability delivers, with each image, the location and pointing angle of the sensor along with sensor calibration data. Combined with a Digital Elevation Model (DEM), this information is sufficient to derive ground coordinates for every pixel in the image; it can also be used to build a 3D DEM from multiple overlapping images through photogrammetry.
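The pixel-to-ground computation can be sketched as a ray intersection: the camera pose fixes the origin and direction of the ray through each pixel, and the terrain model fixes where it hits the ground. The sketch below is an illustrative simplification, not NAVSYS' algorithm — it substitutes a flat terrain plane for a full DEM lookup, works in a local east-north-up frame, and assumes a hypothetical nadir camera-to-body axis convention.

```python
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    # Body-to-local-level rotation from Euler angles (radians), z-y-x order.
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def pixel_to_ground(cam_pos, roll, pitch, yaw, pixel, center, focal_px,
                    ground_z=0.0):
    """Intersect the ray through one pixel with a flat terrain plane at
    height ground_z (a stand-in for interpolating a real DEM)."""
    # Ray in camera frame: x right, y down, z out the lens (assumed).
    u, v = pixel[0] - center[0], pixel[1] - center[1]
    ray_cam = np.array([u, v, focal_px], dtype=float)
    ray_cam /= np.linalg.norm(ray_cam)
    # Assumed mounting: nadir-looking camera, lens axis pointing down
    # in the local east-north-up frame.
    cam_to_body = np.array([[0, 1, 0], [1, 0, 0], [0, 0, -1]])
    ray_local = rotation_matrix(roll, pitch, yaw) @ cam_to_body @ ray_cam
    if ray_local[2] >= 0:
        raise ValueError("ray does not intersect the ground plane")
    t = (ground_z - cam_pos[2]) / ray_local[2]
    return cam_pos + t * ray_local
```

With a real DEM, the intersection would instead be found by stepping along the ray until it crosses the terrain surface; the geometry of the computation is otherwise the same.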
A self-calibration capability designed into the GI-Eye system estimates camera misalignment, focal length, and lens distortion parameters. This enables geo-registration accuracies of 1-2 meters when flying at an altitude of 1,000 feet, without requiring any ground truth.
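Lens distortion of the kind estimated by such a self-calibration is commonly modeled with radial polynomial terms. The article does not specify NAVSYS' distortion model, so as a hedged illustration the sketch below applies a two-coefficient radial (Brown-Conrady style) model and inverts it by fixed-point iteration to recover undistorted image coordinates:

```python
def undistort_normalized(x, y, k1, k2, iterations=10):
    """Invert a two-coefficient radial distortion model.
    (x, y) are observed (distorted) coordinates normalized by focal length;
    k1, k2 are calibrated radial coefficients. The forward model is
    x_d = x_u * (1 + k1*r^2 + k2*r^4), inverted here by fixed-point
    iteration, which converges quickly for mild distortion."""
    xu, yu = x, y
    for _ in range(iterations):
        r2 = xu * xu + yu * yu
        factor = 1.0 + k1 * r2 + k2 * r2 * r2
        xu, yu = x / factor, y / factor
    return xu, yu
```

Undistorted coordinates of this kind, together with the focal length and misalignment estimates, are what make the per-pixel geolocation step accurate.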
The GI-Eye has also been configured for use in an unmanned air vehicle (UAV) payload. The approach is to use a modular design that enables sensor upgrades over time, with only software configuration changes. The GI-Eye system comprises the digital camera, power converter board, IMU interface board, single-board computer, and the BAE Multisensor Inertial Measurement Unit (MIMU).
The hard drive is positioned under the single-board computer. Figure 1 shows these components assembled in the ARES UAV payload. ARES is a small UAV developed by the Research and Engineering Center for Unmanned Vehicles (RECUV) at the University of Colorado at Boulder for use in flight testing advanced unmanned air system concepts. The camera and IMU can be either fixed in the aircraft or mounted inside a gimbal to allow geo-pointing to targets of interest.