NASA Technology

When a plane overshoots the final approach for a landing, the pilot’s natural, and dangerous, instinct is often to pitch the aircraft up to slow down and land.

It’s one of the leading causes of accidents, especially with small airplanes: increasing the plane’s angle of attack at a slow, approach-to-landing speed can easily throw it into a stall and send it crashing.

But what if that same mistake happened at 5,000 feet instead?

Using virtual reality software designed and tested in part with NASA funding, pilots can practice such difficult maneuvers much more safely, even landing a plane on a virtual runway some 5,000 feet in the air.

A new head-mounted virtual reality tool, branded Fused Reality and developed in cooperation with NASA’s Armstrong Flight Research Center, can help military, commercial, and even hobbyist pilots train for such potentially dangerous scenarios in a real aircraft, in the air, but with far less risk.

Pilots have long used virtual reality on the ground for training, but fixed-base ground simulators have limitations. For one thing, “there’s not a lot of fear factor, because you’re not really doing it. You screw up, you hit the reset button, you try again,” says Bruce Cogan, an aeronautical engineer at Armstrong.

Also, he says, the simulator is only ever as good as its programming. “It’s challenging to replicate how the aircraft feels. You may be training a pilot to land, but if the dynamics of the simulation aren’t very good, it’s not going to be very useful and could even be harmful.”

The new simulator hooks into any airplane and layers a virtual reality scene over the real world outside the cockpit. “You actually get the dynamics of the exact airplane you’re flying,” Cogan says. That means external factors like crosswinds, as well as intrinsic ones like how the plane handles, are all real.

With a virtual runway created by the software, “you can train for this landing task at 5,000 feet, so if you mess up, you won’t hurt the airplane. You can go try again.”

“You can actually demonstrate that you will stall and roll over and it will look like you’re crashing into the runway,” adds David Landon, CEO of Systems Technology Inc. (STI), which built Fused Reality. “What you teach them is, you don’t want to do this. You just roll your wings level and increase your airspeed and go around” to try the approach again.

Technology Transfer

NASA’s initial interest in the Fused Reality platform wasn’t for training, Cogan explains, but for evaluating how well an aircraft flies.

“As part of flight controls and new aircraft designs, we have pilots evaluate aircraft handling qualities: When he puts in a control input, does the airplane do what he wants? Is it too slow? Too fast?”

Testing those maneuvers requires actually flying them, and that can be an expensive proposition. For example, aerial refueling not only requires a second airplane—an added cost—but also carries the risk of a collision, even with an experienced pilot.

With a virtual reality platform, however, the second plane can be simulated.

Hawthorne, California-based STI started working on Fused Reality under Small Business Innovation Research (SBIR) contracts with the Air Force and Navy, but in the early phases, it was designed to be used in a static vehicle on the ground.

“They wanted the crewman to wear a device that would let him see the interior of the cabin of the aircraft as it was—so he’d see his hands and the gun he was firing—but when he looked out the door of the helicopter, he would see a virtual world,” Landon explains.

Cogan’s team at Armstrong wanted to take the simulator up into the air. So in 2008, STI was awarded additional Phase I and Phase II SBIR contracts from NASA to develop the platform as a potential in-flight simulator.

The biggest challenge, Landon explains, was figuring out how to cue the technology on which parts of the view should stay real and which should be overlaid with the virtual scene. On the ground, the system keyed on color, much like the green screen behind a television weather map. But NASA wanted the pilot to be able to see the real view outside the windscreen, so STI had to develop a new system that cued off brightness and infrared light.
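
The article does not spell out STI’s actual keying algorithm, but the general idea of deciding, pixel by pixel, where the virtual scene should replace the camera view can be sketched with a simple brightness threshold. The snippet below is a minimal, hypothetical illustration of that kind of luminance keying; the function name, threshold value, and toy frames are assumptions for illustration, not details of Fused Reality.

```python
import numpy as np

# Hypothetical sketch of brightness keying; not STI's actual algorithm.
def composite_fused_frame(camera_rgb: np.ndarray,
                          virtual_rgb: np.ndarray,
                          brightness_threshold: float = 200.0) -> np.ndarray:
    """Overlay the virtual scene wherever the camera frame is bright
    (e.g., the view through the windscreen); keep the real cockpit elsewhere."""
    # Approximate per-pixel brightness (luma) from the RGB camera frame.
    luma = (0.299 * camera_rgb[..., 0]
            + 0.587 * camera_rgb[..., 1]
            + 0.114 * camera_rgb[..., 2])
    mask = luma > brightness_threshold      # True = treat as "outside world"
    fused = camera_rgb.copy()
    fused[mask] = virtual_rgb[mask]         # paint the virtual scene there
    return fused

# Toy example: a dark "cockpit" frame with a bright band standing in for the windscreen.
camera = np.full((480, 640, 3), 40, dtype=np.uint8)
camera[:200, :] = 230                       # bright region = view out the windscreen
virtual = np.zeros_like(camera)
virtual[..., 1] = 255                       # solid green stands in for the rendered scene
result = composite_fused_frame(camera, virtual)
```

An infrared-assisted version would refine the same mask with a second sensor channel, but the compositing step itself would not change.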

Three years later, Armstrong and STI worked together again, through center innovation funds, on additional development and test flights. Among other improvements, STI worked on making the system compatible with any type of aircraft, even one that doesn’t have a fully automated onboard computer to feed flight data into the virtual reality simulator.

STI adapted its system to work with an inexpensive, off-the-shelf inertial measurement unit, “basically a black box, about $5,000, that you put on the seat next to you. It takes external GPS signals and internal accelerometer data and feeds it into your laptop,” which tells the Fused Reality system what the aircraft is doing: where it is positioned, its speed, and its roll and pitch.
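
As a rough illustration of the kind of per-frame data such a box supplies, the sketch below assembles an aircraft state from GPS readings and a simple accelerometer-based roll and pitch estimate. It is a minimal sketch under assumed names, fields, and units, not STI’s software; a real unit would also blend in gyro data.

```python
import math
from dataclasses import dataclass

@dataclass
class AircraftState:
    """Per-frame aircraft state a cockpit VR overlay needs (hypothetical layout)."""
    latitude: float        # degrees, from GPS
    longitude: float       # degrees, from GPS
    altitude_ft: float     # feet, from GPS
    groundspeed_kt: float  # knots, from GPS
    roll_deg: float        # degrees, estimated from accelerometers
    pitch_deg: float       # degrees, estimated from accelerometers

def attitude_from_accel(ax: float, ay: float, az: float) -> tuple:
    """Rough roll/pitch (degrees) from body-axis accelerations in g.
    Only reasonable in near-unaccelerated flight; real units also use gyros."""
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    return roll, pitch

# Toy reading: roughly level flight with a slight right bank.
roll, pitch = attitude_from_accel(ax=0.02, ay=0.10, az=0.99)
state = AircraftState(latitude=34.95, longitude=-117.88, altitude_ft=5000.0,
                      groundspeed_kt=110.0, roll_deg=roll, pitch_deg=pitch)
print(state)  # a record like this would be handed to the virtual scene renderer each frame
```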

Benefits

The beauty of the Fused Reality platform, says Landon, is that “going forward, every aircraft can become its own simulator.” Pilots and airplane developers can test and practice maneuvers in real conditions with just one system that can be moved from one plane to another and even to a helicopter.

The system can be used to train pilots on difficult, potentially dangerous maneuvers and to test pilots and aircraft in extreme conditions that are typically too risky to try, he notes.

“Say I want to practice doing a crosswind landing in winds that are very close to the limits of how you could actually do a landing,” Landon says. “If I did it using that virtual runway, I can make an approach down to a virtual touchdown,” and since the plane is still far above the ground, the potential danger is much lower.

Fused Reality also reduces the need for additional equipment and personnel during training, whether it’s the tanker plane in aerial refueling or a person in the water in a helicopter rescue.

To keep costs down further, STI has designed its system to work almost entirely with interchangeable off-the-shelf components, from the virtual reality headset to a standard laptop. “Our secret sauce is in the software, the algorithms we developed. It’s not hardware-specific.”

With head-mounted displays coming down in price, the system is poised to become more affordable than ever, he adds.

The system is commercially available, and STI has been in talks with airplane manufacturers interested in using Fused Reality both to evaluate their planes and to market them, as well as to help design features like the head-up display, so tweaks can be tested without the cost of rebuilding the display every time.

Training academies are also interested, and the company has a new contract with Johnson Space Center to investigate use of the system on the International Space Station, where astronauts could use it to practice complicated maneuvers with the station’s robotic arm, among other tasks.

Looking forward, Cogan says, NASA sees applications for longer-duration voyages, such as to Mars, where the Fused Reality simulator could be adapted, for example, to overlay repair diagrams on broken hardware or to provide visual aids for surgery and other medical procedures.