On a December night in 1995, 159 passengers and crewmembers died when American Airlines Flight 965 flew into the side of a mountain while en route to Cali, Colombia. A key factor in the tragedy: The pilots had lost situational awareness in the dark, unfamiliar terrain. They had no idea the plane was approaching a mountain until the ground proximity warning system sounded an alarm only seconds before impact.
The accident was of the kind most common at the time: CFIT, or controlled flight into terrain, says Trey Arthur, research aerospace engineer in the Crew Systems and Aviation Operations Branch at NASA’s Langley Research Center. In bad weather, fog, or nighttime flight, pilots relied on airspeed, altitude, and other instrument readings to judge their location. Miscalculations or rapidly changing conditions could send a fully functioning, in-control airplane into the ground.
To improve aviation safety by enhancing pilots’ situational awareness even in poor visibility, NASA began exploring the possibilities of synthetic vision—creating a graphical display of the outside terrain on a screen inside the cockpit.
“How do you display a mountain in the cockpit? You have to have a graphics-powered computer, a terrain database you can render, and an accurate navigation solution,” says Arthur.
In the mid-1990s, emerging GPS technology offered a means of determining an aircraft’s position with high accuracy, Arthur explains. As the technologies needed to enable synthetic vision matured, NASA turned to an industry partner to develop the graphical terrain engine and database for creating the virtual rendering of the outside environment.
In 2003, Langley partnered with TerraMetrics Inc. of Littleton, Colorado, through the Small Business Innovation Research (SBIR) program to develop a 3D terrain rendering technology for flight-qualified synthetic vision systems. The company’s innovative solution, called TerraBlocks, rendered satellite imagery on top of terrain data to provide the pilot with a virtual view of the environment outside the cockpit window. This kind of rendering vastly improved on typical flat-earth displays by mapping the terrain in three dimensions on a model of the Earth’s sphere. The resulting visualization was not only more realistic, but also highly accurate.
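The difference between a flat-earth display and a spherical one comes down to how latitude, longitude, and elevation are mapped into 3D space. The sketch below is not TerraBlocks code; it is a minimal illustration, assuming the standard WGS-84 ellipsoid, of how a terrain post can be placed on a model of the Earth’s curved surface (Earth-centered, Earth-fixed coordinates) rather than on a flat plane.

```python
import math

# WGS-84 ellipsoid constants (standard geodetic values, not TerraBlocks internals)
A = 6378137.0                 # semi-major axis, meters
F = 1 / 298.257223563         # flattening
E2 = F * (2 - F)              # first eccentricity squared

def geodetic_to_ecef(lat_deg: float, lon_deg: float, h_m: float):
    """Map latitude/longitude/elevation to Earth-centered 3D coordinates,
    so terrain lands on the curved Earth rather than on a flat plane."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    # Prime vertical radius of curvature at this latitude
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)
    x = (n + h_m) * math.cos(lat) * math.cos(lon)
    y = (n + h_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + h_m) * math.sin(lat)
    return x, y, z
```

A renderer built this way draws distant peaks at their true positions on the curved Earth, which is what makes the visualization both realistic and accurate at aviation distances.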
To produce its visuals, TerraBlocks needed satellite imagery and terrain data. For the imagery, the company worked with the Scientific Data Purchase program at Stennis Space Center to locate what it needed, drawing on an archive from NASA’s Earth-observing Landsat 7 satellite that proved well suited to TerraBlocks. For the 3D element, TerraMetrics chose NASA’s Shuttle Radar Topography Mission (SRTM) terrain data.

Since then, NASA has used the TerraBlocks engine in multiple flight-simulator experiments for aircraft and even lunar lander vehicles, the latter using a graphical rendering of the Moon the company created. The lander tests demonstrated the potential for synthetic vision on spacecraft.
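SRTM elevation data is commonly distributed as simple `.hgt` tiles: a square grid of big-endian signed 16-bit heights in meters, stored row-major from the northwest corner. The sketch below is not TerraMetrics’ loader; it is a minimal, hedged example of decoding such a tile (assuming the 3-arc-second format, 1201 × 1201 samples per one-degree cell) and looking up an elevation.

```python
import struct

SRTM3_SIZE = 1201   # samples per side for a 3-arc-second one-degree tile
VOID = -32768       # SRTM marker for missing data (radar shadow, water, etc.)

def read_hgt(data: bytes, size: int = SRTM3_SIZE):
    """Decode an SRTM .hgt tile: big-endian signed 16-bit meters,
    rows ordered north to south, columns west to east."""
    expected = size * size * 2
    if len(data) != expected:
        raise ValueError(f"expected {expected} bytes, got {len(data)}")
    values = struct.unpack(f">{size * size}h", data)
    return [list(values[r * size:(r + 1) * size]) for r in range(size)]

def elevation_at(grid, lat_frac: float, lon_frac: float):
    """Nearest-neighbor lookup. lat_frac/lon_frac are positions within the
    tile in [0, 1], where 0 is the south/west edge and 1 the north/east edge.
    Returns None for void samples."""
    size = len(grid)
    row = round((1.0 - lat_frac) * (size - 1))  # rows run north -> south
    col = round(lon_frac * (size - 1))
    h = grid[row][col]
    return None if h == VOID else h
```

A real terrain engine would go further, interpolating between samples and stitching adjacent tiles, but the decode step itself is this simple.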
“If we do manned missions to asteroids or other destinations, all we need is the data and a good navigational system, and we can essentially draw that world for the pilot,” Arthur says.
Through its collaborations with the NASA centers, TerraMetrics has developed its NASA-derived innovations into products that help pilots navigate more safely in the skies and assist people in finding their way on the ground.