The Advanced Cockpit Evaluation System (ACES) comprises communication, computing, and display subsystems, mounted in a van, that synthesize out-the-window views approximating what an operator would see from the cockpit of a crewed spacecraft or aircraft, or when remotely controlling a ground vehicle or unmanned aerial vehicle (UAV). The system includes five flat-panel display units arranged approximately in a semicircle around an operator, like cockpit windows; the scene displayed on each panel represents the view through the corresponding cockpit window. Each display unit is driven by a personal computer equipped with a video-capture card that accepts live input from any of a variety of sensors (typically, visible and/or infrared video cameras).

Software running in the computers blends the live video images with synthetic images that could be generated, for example, from head-up-display outputs, waypoints, corridors, or satellite photographs of the same geographic region. Data from a Global Positioning System (GPS) receiver and an inertial navigation system aboard the remote vehicle are used by the ACES software to keep the synthetic and live views in registration. If the live image were to fail, the synthetic scenes could still be displayed to maintain situational awareness.
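The blend-with-fallback behavior described above can be sketched roughly as follows. This is a minimal illustration only, assuming frames are NumPy image arrays that have already been registered using the vehicle's pose data; the function name, alpha-blend approach, and blend weight are hypothetical and are not drawn from the actual ACES software.

```python
import numpy as np

def compose_view(live_frame, synthetic_frame, alpha=0.6):
    """Blend a live sensor frame with a registered synthetic frame.

    If the live feed is unavailable (None), return the synthetic frame
    alone so that situational awareness is maintained.
    """
    if live_frame is None:
        return synthetic_frame.copy()
    # Simple per-pixel alpha blend; a real system would first bring the
    # frames into registration using GPS/INS pose data.
    blended = (alpha * live_frame.astype(np.float32)
               + (1.0 - alpha) * synthetic_frame.astype(np.float32))
    return blended.astype(np.uint8)

# Example: a gray live frame blended with a brighter synthetic overlay.
live = np.full((480, 640, 3), 100, dtype=np.uint8)
synthetic = np.full((480, 640, 3), 200, dtype=np.uint8)
view = compose_view(live, synthetic, alpha=0.5)        # mid-gray blend
fallback = compose_view(None, synthetic)               # live feed lost
```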

This work was done by Jeffrey Fox, Eric A. Boe, and Francisco Delgado of Johnson Space Center; James B. Secor II of Barrios Technology, Inc.; Michael R. Clark and Kevin D. Ehlinger of Jacobs Sverdrup; and Michael F. Abernathy of Rapid Imaging Software, Inc.

Rapid Imaging Software, Inc. has requested permission to assert copyright for the software code. MSC-24020-1

This article first appeared in the July 2009 issue of NASA Tech Briefs Magazine.
