2011

Fused Reality for Enhanced Flight Test Capabilities

Complex maneuvers can be accomplished without additional aircraft resources or risk.

The feasibility of using Fused Reality-based simulation technology to enhance flight test capabilities has been investigated. For piloted evaluation there remains no substitute for actual flight test, even considering the fidelity and effectiveness of modern ground-based simulators. In addition to real-world cueing (vestibular, visual, aural, environmental, etc.), flight tests provide subtle but key intangibles that cannot be duplicated in a ground-based simulator. The benefits of flight come at a cost, however, in budget, mission complexity, and safety, including the need for ground and control-room personnel, additional aircraft, etc.

A Fused Reality™ (FR) Flight system was developed that allows a virtual environment to be integrated with the test aircraft so that tasks such as aerial refueling, formation flying, or approach and landing can be accomplished without additional aircraft resources or the risk of operating in close proximity to the ground or other aircraft. Furthermore, the dynamic motions of the simulated objects can be directly correlated with the responses of the test aircraft. The FR Flight system allows real-time observation of, and manual interaction with, the cockpit environment, which serves as a frame for the virtual out-the-window scene.

FR is a mixed-reality approach that employs four technologies: live video capture, real-time video editing, machine vision, and virtual environment simulation. Video from the trainee's perspective is sent to a processor that preserves pixels in the near-space environment (i.e., the cockpit) and, using blue-screen imaging techniques, makes the far-space pixels (outside the cockpit windows) transparent. This bitmap is overlaid on a virtual environment and sent to the trainee's helmet-mounted display (HMD). The user can directly view and interact with the physical environment, while the simulated outside world serves as an interactive backdrop.
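
As a rough illustration of this keying-and-overlay step, the sketch below keeps the cockpit pixels of a camera frame and substitutes a virtual scene wherever a key color is detected. It is not drawn from the original system: it assumes an OpenCV/NumPy pipeline, a blue key color, and placeholder HSV thresholds.

    import cv2
    import numpy as np

    # Placeholder HSV bounds for a blue key color; real thresholds would
    # be calibrated for the aircraft's lighting environment.
    KEY_LO = np.array([100, 120, 80], dtype=np.uint8)
    KEY_HI = np.array([130, 255, 255], dtype=np.uint8)

    def fuse_frame(camera_bgr, virtual_bgr):
        """Keep near-space (cockpit) pixels; replace key-colored
        far-space pixels with the corresponding virtual-scene pixels."""
        hsv = cv2.cvtColor(camera_bgr, cv2.COLOR_BGR2HSV)
        far_mask = cv2.inRange(hsv, KEY_LO, KEY_HI)   # 255 where key color
        # Light morphological cleanup to suppress sensor noise in the mask.
        far_mask = cv2.morphologyEx(far_mask, cv2.MORPH_OPEN,
                                    np.ones((3, 3), np.uint8))
        near_mask = cv2.bitwise_not(far_mask)
        cockpit = cv2.bitwise_and(camera_bgr, camera_bgr, mask=near_mask)
        backdrop = cv2.bitwise_and(virtual_bgr, virtual_bgr, mask=far_mask)
        return cv2.add(cockpit, backdrop)

    if __name__ == "__main__":
        # Synthetic stand-ins: a gray "cockpit" with a blue "window"
        # patch, and a flat green "virtual scene".
        cam = np.full((240, 320, 3), 90, dtype=np.uint8)
        cam[60:180, 80:240] = (200, 60, 10)           # blue-ish window region
        virt = np.full((240, 320, 3), (30, 180, 30), dtype=np.uint8)
        cv2.imwrite("fused.png", fuse_frame(cam, virt))

Running the script writes fused.png, a composite in which the synthetic window patch has been replaced by the virtual backdrop while the surrounding cockpit pixels are untouched.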

The system employs a head-mounted camera and display assembly, where the camera captures live video from the user's perspective and sends it to a computer for processing. The window frames of the cockpit are bordered with colored tape; when these color-coded borders are sensed, the computer keys out the pixels lying within each window so that an underlying virtual scene appears in place of the window pixels. The virtual simulation reacts to the user's head motion and control inputs, and the two layers (processed video and virtual scene) are combined and viewed by the user through the HMD.
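
The brief does not detail the machine-vision step, but one plausible realization, sketched below in the same assumed OpenCV terms, is to threshold on the tape color, close each taped border into a connected loop, and fill the enclosed contours to obtain a per-window key-out mask:

    import cv2
    import numpy as np

    # Assumed HSV band for the colored border tape; the brief does not
    # name the tape color, so a magenta-like hue is used for illustration.
    TAPE_LO = np.array([140, 100, 100], dtype=np.uint8)
    TAPE_HI = np.array([170, 255, 255], dtype=np.uint8)

    def window_mask(camera_bgr):
        """Return a binary mask (255 = inside a taped window border)
        built by finding closed tape contours and filling them."""
        hsv = cv2.cvtColor(camera_bgr, cv2.COLOR_BGR2HSV)
        tape = cv2.inRange(hsv, TAPE_LO, TAPE_HI)
        # Close small gaps so each taped border forms a connected loop.
        tape = cv2.morphologyEx(tape, cv2.MORPH_CLOSE,
                                np.ones((7, 7), np.uint8))
        # OpenCV 4.x return signature: (contours, hierarchy).
        contours, _ = cv2.findContours(tape, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        mask = np.zeros(camera_bgr.shape[:2], dtype=np.uint8)
        for c in contours:
            if cv2.contourArea(c) > 500:   # ignore stray specks of tape color
                cv2.drawContours(mask, [c], -1, 255, thickness=cv2.FILLED)
        return mask

The resulting mask could stand in for the key-color mask in the compositing sketch above, with the virtual scene drawn wherever the mask is set.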

Critical hardware challenges included selecting the color and material of the window border tape and identifying a lens filter that permits machine color recognition in the presence of bright sunlight. Software challenges included accommodating every possible view of one or more window borders, balancing sensor noise-smoothing against precision loss, and creating a means for rapidly calibrating the color-sensing thresholds for a given lighting environment.
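
The rapid-calibration routine is likewise not described; a minimal sketch of one common approach, assuming the user samples a reference patch of the border color under the current lighting, derives HSV bounds as the patch mean plus or minus k standard deviations. The parameter k (hypothetical here) embodies the noise-smoothing versus precision trade-off noted above:

    import cv2
    import numpy as np

    def calibrate_thresholds(frame_bgr, roi, k=2.5):
        """Derive HSV keying bounds from a reference patch of the border
        color sampled in the current lighting. roi = (x, y, w, h); a
        larger k admits more sensor noise but risks keying cockpit
        pixels that merely resemble the border color."""
        x, y, w, h = roi
        patch = cv2.cvtColor(frame_bgr[y:y + h, x:x + w],
                             cv2.COLOR_BGR2HSV)
        pixels = patch.reshape(-1, 3).astype(np.float64)
        mean, std = pixels.mean(axis=0), pixels.std(axis=0)
        hsv_max = np.array([179.0, 255.0, 255.0])  # OpenCV HSV channel maxima
        lo = np.clip(mean - k * std, 0, hsv_max).astype(np.uint8)
        hi = np.clip(mean + k * std, 0, hsv_max).astype(np.uint8)
        return lo, hi

Bounds produced this way could feed directly into the inRange calls of the earlier sketches, letting the thresholds be re-derived in seconds whenever the lighting changes.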

This work was done by Ed Bachelder and David Klyde of Systems Technology, Inc. for Dryden Flight Research Center. DRC-010-033

This Brief includes a Technical Support Package (TSP).

Fused Reality for Enhanced Flight Test Capabilities (reference DRC-010-033) is currently available for download from the TSP library.
