About 2.5 seconds after the launch of STS-114, the first Return to Flight launch after the Columbia accident, a large bird struck the external tank and fell into the exhaust plume. Although this particular strike did not threaten the vehicle because the bird did not hit the orbiter, it raised awareness of the hazard posed by birds present at launch. In response, a Bird Radar system was developed to detect birds above the launch pad. Because of interference with launch systems, the radar could not scan closer than a few hundred feet above the pad, leaving a gap region that was not visible to the radar.

Figure: The 3D Model Plot Mode provides real-time rotation, zoom, and translation of a 3D view against a 3D CAD model background of a terrain map.
The Bird Vision System serves as both a backup to and a supplement for the Bird Radar system. It is a multi-camera photogrammetry software application that runs on a network of LabVIEW platforms and uses data from three pad ascent-tracking cameras. The system detects and locates, in three dimensions, birds above the shuttle launch pads to ensure clear launch conditions. Acquisition computers capture video from any number of possible streams [three SD-SDI (serial digital interface) tracking cameras in the shuttle configuration] and process the images to locate birds in each camera's field of view. A central server merges the data from the acquisition computers into a single dataset of triangulated bird tracks. The merged data can be displayed in 2D and 3D views, along with a "count" history of the number of birds detected. Remote viewing applications can access the data from the server to provide individually customizable views. Bird positions, 2D and 3D tracks, and synchronized video can be recorded for playback at a later date through specialized playback software.
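For illustration, the core geometric step the central server performs (recovering a bird's 3D position from simultaneous 2D detections in two calibrated cameras) can be sketched with a standard linear triangulation. The function below is a minimal sketch, not the system's actual LabVIEW code; the projection matrices and pixel coordinates are assumed inputs that would come from camera calibration.

```python
import numpy as np

def triangulate_point(P1, P2, uv1, uv2):
    """Estimate a 3D point from two camera views (linear DLT triangulation).

    P1, P2 : 3x4 camera projection matrices (intrinsics * extrinsics).
    uv1, uv2 : (u, v) pixel coordinates of the same bird detection
               in each camera's image.
    Returns the estimated 3D position in the shared world frame.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.vstack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # The least-squares solution is the right singular vector associated
    # with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # convert from homogeneous coordinates
```

In practice, the server would apply this step to every matched pair (or triple) of detections per frame and link the resulting 3D points over time to form the triangulated bird tracks described above.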

The Bird Vision System consists of four separate software applications: Bird Vision Overlay, Acquire and Process Images, Bird Vision Overlay Remote, and Bird Vision Playback. In its current implementation, the system runs on four networked computers: two are each dedicated to a single camera, one connects to a camera and also serves as the central server, and one is a remote viewing machine at the Kennedy Space Center Launch Control Center (LCC) room console.
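The division of labor between the acquisition computers and the central server can be sketched as a simple message flow. The snippet below is a hypothetical illustration only: the real system is LabVIEW-based, and the JSON-over-UDP message format, port number, and function names here are assumptions rather than details from the original description.

```python
import json
import socket
from collections import defaultdict

# Assumed address for the central server; not from the original article.
SERVER_ADDR = ("127.0.0.1", 50007)

def send_detection(sock, camera_id, timestamp, u, v):
    """Run on an acquisition computer after a bird is found in a frame."""
    msg = {"camera": camera_id, "t": timestamp, "uv": [u, v]}
    sock.sendto(json.dumps(msg).encode(), SERVER_ADDR)

def run_server():
    """Central server: collect detections from all cameras and group them by frame time."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(SERVER_ADDR)
    frames = defaultdict(dict)  # timestamp -> {camera_id: (u, v)}
    while True:
        data, _ = sock.recvfrom(4096)
        msg = json.loads(data)
        frames[msg["t"]][msg["camera"]] = tuple(msg["uv"])
        views = frames[msg["t"]]
        if len(views) >= 2:
            # Two or more cameras saw a bird at the same frame time; this is
            # where the merged dataset of triangulated tracks would be built.
            print("frame", msg["t"], "has views from cameras", sorted(views))
```

A remote viewer such as Bird Vision Overlay Remote would, in the same spirit, subscribe to the server's merged track data rather than to the raw camera streams.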

The system is much less expensive and less intrusive than traditional radar systems, can be adapted to virtually any viewing scenario, and can be used with virtually any video source. It operates semi-automatically; operators only need to supervise and ensure data integrity. Real-time results can be viewed from a virtually unlimited number of independent stations on the local network.

This work was done by Christopher Immer and John Lane of ASRC Aerospace Corporation for Kennedy Space Center. KSC-13513