Generic Helicopter-Based Testbed for Surface Terrain Imaging Sensors
- Created: Tuesday, 01 January 2008
This flexible field test system is designed for sensors that require an aerial test platform.
To ensure that a candidate sensor system will perform as expected during missions, we have developed a field test system and have flown a helicopter-mounted sensor platform over desert terrain that simulates Lunar features. A key advantage of this approach is that different sensors can be tested and characterized in a flight-relevant environment before an actual mission. Testing the various sensors required the development of a field test system, including instrumentation to validate the “truth” of the sensor system under test. The field test system was designed to be flexible enough to cover the test needs of many sensors (lidar, radar, cameras) that require an aerial test platform, whether a helicopter, airplane, unmanned aerial vehicle (UAV), or balloon. To validate the performance of the sensor under test, the dynamics of the test platform must be known accurately enough to provide reliable models as input to algorithm development. The test system therefore provides support equipment that measures the dynamics of the field test sensor platform and allows computation of the “truth” position, velocity, attitude, and time.
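The idea of combining IMU dead-reckoning with GPS fixes to produce a “truth” trajectory can be illustrated with a minimal complementary filter. This is only a sketch of the general technique, not the project's actual estimator; the class name, gain `k`, and the assumption that gravity has already been removed from the accelerations are all illustrative.

```python
import numpy as np

class TruthEstimator:
    """Minimal GPS/IMU blend for a 'truth' trajectory (illustrative only).
    IMU accelerations dead-reckon position/velocity between fixes; each
    GPS fix pulls the position estimate back with gain k (a simple
    complementary filter)."""

    def __init__(self, pos0, vel0, k=0.2):
        self.pos = np.asarray(pos0, dtype=float)
        self.vel = np.asarray(vel0, dtype=float)
        self.k = k  # blend gain for GPS corrections (assumed value)

    def imu_update(self, accel, dt):
        # integrate acceleration (gravity assumed removed) at the IMU rate
        self.vel += np.asarray(accel, dtype=float) * dt
        self.pos += self.vel * dt

    def gps_update(self, gps_pos):
        # nudge the dead-reckoned position toward the GPS fix
        self.pos += self.k * (np.asarray(gps_pos, dtype=float) - self.pos)
```

A real system would also estimate attitude and time offsets; this sketch shows only the position/velocity blend.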
The first test of the field test system provided verification and truth measurements for the LAND (Lunar Access Navigation Device) laser radar, enabling comparison of the instrument data against “ground truth” measurements. The instrumentation includes a GPS (Global Positioning System) receiver, an inertial measurement unit (IMU), two visible cameras, a support video camera, and a data-collection and time-tagging system. These instruments are mounted on a gyro-stabilized gimbal platform attached to the nose of a helicopter. The gimbal is covered by a dome that reduces aerodynamic drag on the helicopter and has an observation window through which the instruments view the ground below. The gyro-stabilized platform operates in two modes: “nadir” mode, in which the sensors point at a fixed angle to the ground, and “geo” mode, in which the gimbal is directed at a fixed GPS location on the ground. The modes can be changed in flight by a ground team via radio remote control.
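The “geo” mode requires computing a pointing direction from the moving helicopter to a fixed ground coordinate. A minimal sketch of that geometry, assuming both positions are already expressed in a shared local East-North-Up frame (the function name and frame choice are illustrative, not the gimbal's actual interface):

```python
import math

def geo_pointing(heli_enu, target_enu):
    """Azimuth (radians clockwise from north) and depression angle to a
    fixed ground target, with both positions in a shared local
    East-North-Up frame (illustrative sketch)."""
    de = target_enu[0] - heli_enu[0]   # east offset
    dn = target_enu[1] - heli_enu[1]   # north offset
    du = target_enu[2] - heli_enu[2]   # up offset (negative: target below)
    azimuth = math.atan2(de, dn) % (2 * math.pi)
    ground_range = math.hypot(de, dn)
    depression = math.atan2(-du, ground_range)  # positive looking down
    return azimuth, depression
```

A gyro-stabilized gimbal would re-evaluate these angles continuously as the GPS/IMU solution updates.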
During an actual flight test, the flight verification equipment includes three computers for collecting data and controlling instruments. The first laptop performs the timing and synchronization of all equipment; it logs IMU and GPS data, records the synchronization pulses from the LAND system (this could be any other sensor), and provides the image trigger pulses to the cameras. These data are fed to the laptop through an interface box into a PCMCIA (Personal Computer Memory Card International Association) interface card that contains a field-programmable gate array (FPGA). This part of the system builds on heritage from a 2002 field test for the Descent Image Motion Estimation System (DIMES) on the Mars Exploration Rover project.
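The time-tagging scheme described above amounts to latching a free-running counter whenever a sync or trigger pulse arrives, so every event lands on one shared timebase. A minimal software model of that latch-on-pulse idea (the class, tick rate, and event names are illustrative, not the FPGA's actual design):

```python
class TimeTagger:
    """Sketch of latch-on-pulse time-tagging: a free-running counter
    (standing in for the FPGA clock) is latched whenever a pulse
    arrives, giving every event a timestamp on one shared timebase."""

    def __init__(self, tick_hz=1_000_000):
        self.tick_hz = tick_hz   # counter rate (assumed value)
        self.count = 0           # free-running counter
        self.events = []         # (source, time-in-seconds) records

    def tick(self, n=1):
        # advance the free-running counter by n clock cycles
        self.count += n

    def pulse(self, source):
        # latch the current counter value against the pulse source
        self.events.append((source, self.count / self.tick_hz))
```

In hardware the counter never stops and the latch is instantaneous; this model just shows how heterogeneous pulses (LAND sync, camera triggers) end up comparable on one clock.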
A second laptop runs a GUI that controls the LAND system. Commands are sent through an Ethernet interface to the LAND computer using the TCP/IP (Transmission Control Protocol/Internet Protocol) protocol. These commands start and stop the laser radar, set the number of lidar frames to gather in a single run, and provide estimated altitude measurements to the LAND system. A third computer acts as a digital video recorder (DVR), acquiring and time-tagging images from the two visible cameras.
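Commanding a sensor computer over TCP/IP can be sketched as a short request/reply exchange. The JSON message format, newline framing, and one-line reply below are assumptions for illustration; the actual LAND command protocol is not described in this article.

```python
import json
import socket

def send_land_command(host, port, command):
    """Send one JSON-encoded command dict to a sensor computer over TCP
    and return its one-line reply. The message format and framing here
    are illustrative, not the actual LAND protocol."""
    with socket.create_connection((host, port), timeout=5.0) as sock:
        sock.sendall(json.dumps(command).encode() + b"\n")
        return sock.makefile().readline().strip()  # read one reply line
```

A GUI like the one described would wrap calls such as `send_land_command(host, port, {"cmd": "start", "frames": 10})` behind its buttons.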
In summary, the architecture includes guidance and control instruments, data-collection equipment, flight and ground procedures, fixed ground reference targets, and data-analysis tools. The test system also processes the collected instrument data, including image motion compensation using the attitude/position instrumentation. The result is a system that has tested and validated an imaging lidar and that can test other types of surface terrain imaging sensors during aerial field tests. The task thus provides data and truth measurements for algorithm development in a variety of applications, including precision Lunar landing.
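Image motion compensation from attitude data can be illustrated in its simplest form: for a nadir-pointed camera and small attitude changes between frames, a rotation maps approximately to a pixel shift proportional to the focal length. This small-angle, integer-pixel sketch is an assumption for illustration only, not the project's actual compensation algorithm.

```python
import numpy as np

def predicted_shift(d_roll, d_pitch, focal_px):
    """Pixel shift predicted from a small attitude change between frames,
    small-angle approximation for a nadir-pointed camera (illustrative).
    Pitch moves the scene along rows; roll along columns."""
    return focal_px * d_pitch, focal_px * d_roll  # (dy, dx) in pixels

def compensate(image, d_roll, d_pitch, focal_px):
    """Shift a frame back by the attitude-predicted motion so consecutive
    frames register (integer-pixel version for simplicity)."""
    dy, dx = predicted_shift(d_roll, d_pitch, focal_px)
    shifted = np.roll(image, -round(dy), axis=0)
    return np.roll(shifted, -round(dx), axis=1)
```

A production pipeline would use subpixel warping and also account for translation of the platform; this sketch shows only the attitude term.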
This work was done by James Alexander, Hannah Goldberg, James Montgomery, Gary Spiers, Carl Liebe, Andrew Johnson, Konstantin Gromov, Edward Konefat, Raymond Lam, and Patrick Meras of Caltech for NASA’s Jet Propulsion Laboratory.
The software used in this innovation is available for commercial licensing. Please contact Karina of the California Institute of Technology at (626) 395-2322. Refer to NPO-44581.