NASA researchers want to prove that a virtual reality headset like the Oculus Rift is not just a toy. Josh Kinne, a deputy project manager at NASA Langley Research Center, sees a future role for virtual and augmented reality in the agency’s research and development, CAD design, and data visualization efforts.
Virtual vs. Augmented
With virtual reality (VR), headsets like the Oculus Rift and HTC Vive immerse a person in a completely different environment. Augmented reality (AR) devices overlay digital objects, such as 3D geometry or data visualizations, onto the user's existing surroundings. The Microsoft HoloLens, for example, is a self-contained holographic computer that currently displays three-dimensional objects for astronauts on the International Space Station. Mixed reality, an increasingly popular term, describes systems that combine virtual and augmented reality capabilities.
The commoditization of VR and AR products has lowered costs, providing opportunities for more widespread use of the technology. Facilities, including NASA Langley, no longer have to shell out millions of dollars for a room-sized virtual “cave” environment. Instead, organizations can take advantage of headsets and smaller, cheaper options.
Easing the Move from CAD to VR
Although the idea of an Oculus Rift for every NASA employee is far from a reality, Langley’s Engineering Design Studio now features both VR and AR workstations, available to anyone in the facility.
The workstations are preloaded with 3D development platforms and software like Unity 3D, SketchUp, and Visual Studio. Langley’s Office of the Chief Information Officer also has additional hardware available, including HoloLens headsets.
To build support for and interest in the emerging technologies, Kinne and his team compiled a series of examples of how VR could best be used in research and development environments. One basic and persuasive example, Kinne said, was the ability of a CAD designer to create a part and view it in three dimensions.
With the Langley workstations, an individual researcher can load virtual reality prototypes, don a VR headset, and see if the demonstration works as planned — a compelling use case that Kinne compares to the design transition from paper drawings to 2D and 3D CAD design.
“It’s even more compelling if somebody else can pop on a VR headset, see that part from the same perspective, and interact,” said Kinne. “That gives you a capability that you don’t necessarily get with a 2D screen.”
The visualization tools open the possibility of a virtual test facility, viewed entirely within a VR headset. Such a representation would let colleagues at different sites reference and interact with the same test articles.
“That’s another example of a really compelling use case, where you can start pulling expert resources from other centers and locations, and do real-time engineering,” said Kinne.
Although the 3D visualization of parts and assemblies offers perhaps the lowest barrier to entry for VR and AR use, Kinne sees data visualization as the area that will deliver an even greater impact.
The brain has an uncanny ability to recognize patterns, said Kinne, but people often struggle to visualize and comprehend large, complex data sets, such as computational fluid dynamics (CFD) results, especially on a two-dimensional screen. Kinne believes that VR and AR technologies will let researchers use that pattern-recognition ability to its full potential.
Multiple teams at Langley are experimenting with VR's ability to visualize data, including CFD code output, material analysis information, and test facility operation data. Langley's Lab 77 small satellite project team has also developed concept-of-operations visualizations for multiple CubeSat missions.
While the center's Advanced Concepts Lab investigates additional applications, animators and artists in Langley's Media Services have begun developing a VR walkthrough of the new Katherine G. Johnson Computational Research Facility, home of NASA Langley's consolidated data centers. NASA is also testing the idea of interactive virtual classroom sessions.
Kinne has a vision of VR and AR technologies creating “one big NASA,” where physical location is irrelevant. “The fact that you’re physically located at a particular center shouldn’t impact your ability to participate in a project,” said Kinne.
Mixed Reality Beyond Langley
In the meantime, as part of Langley’s Comprehensive Digital Transformation Program vision, the center seeks to break down policy and procedural barriers that prevent the use of virtual and augmented reality in the workplace. The Hampton, VA-based facility has also secured funding to set up a “Maker Space,” a design area that will integrate the VR and AR devices.
Virtual efforts have spread beyond Langley and into the agency’s other centers. NASA’s Jet Propulsion Laboratory, based in Pasadena, CA, and Microsoft have developed advanced VR software. The “OnSight” tool uses rover data (and the HoloLens) to build a 3D simulation of the Martian environment — a valuable virtual setting for scientists who want to discuss rover operations. NASA is also testing the use of Vive and Oculus devices to recreate the International Space Station and train astronauts.
As Oculus Rift and HTC Vive products explode onto the market, researchers like Kinne believe that virtual reality and augmented reality will be embedded in all parts of the research and development lifecycle. Having NASA and its employees as early adopters will help to push design innovation.
“We want to make sure that everybody at the center can take advantage of this technology if they have an application in mind,” Kinne said.