NASA Technology

Let’s say you are an academic researcher who needs to collect data over a long period of time at a site hundreds of miles from your base of operations. You set up a variety of sensors in the field to continuously measure temperature and soil moisture and to capture visual imagery. All of that data needs to be stored securely so that it can later be retrieved and analyzed.

But reliably acquiring and storing data is harder than it sounds. Sometimes sensors fail partway into a long-term experiment, and if they aren’t communicating beyond their local network, a researcher might not know they have failed until returning to retrieve the data, only to discover that the experiment is lost.

NASA, the US Forest Service, and the National Oceanic and Atmospheric Administration teamed up to equip an unmanned aerial vehicle with sensors for detecting the severity of forest fires in California. Dryden Flight Research Center was responsible for transmitting flight data to the ground and used DataTurbine for the job.

Obtaining timely and reliable data from another location is also a challenge. Some researchers want to set up equipment in a remote location solely to provide an alert when conditions are right—say, if a storm develops—so that a team can head out to the field only when it matters. Unreliable data feeds can ruin these plans.

In addition, many research setups incorporate sensors from several vendors, and it is often difficult to integrate heterogeneous proprietary software and hardware into one system. Without a uniform format, it can take extra time after data collection to organize a data set with appropriate tags and identifiers.
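
To make the integration problem concrete, here is a rough sketch of what a uniform, tagged record might look like. It is an illustration only, not a format prescribed by any of the systems discussed in this article, and the vendor field names in the `normalize` helper are hypothetical.

```python
from dataclasses import dataclass
import time

@dataclass
class Reading:
    """One normalized sample, regardless of which vendor's sensor produced it."""
    channel: str      # researcher-chosen identifier, e.g. "soil_moisture/plot_3"
    timestamp: float  # seconds since the epoch
    value: float      # the measurement itself
    units: str        # e.g. "percent_vwc" or "deg_C"

def normalize(vendor_record: dict) -> Reading:
    # Hypothetical vendor payload; real field names differ by manufacturer,
    # which is exactly why a normalization step is needed.
    return Reading(
        channel=vendor_record["sensor_id"],
        timestamp=vendor_record.get("time", time.time()),
        value=float(vendor_record["raw_value"]),
        units=vendor_record.get("units", "unknown"),
    )
```

Agreeing on such a record up front means the tagging and identification work happens as data arrives rather than after the experiment ends.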

All of these problems and more are encountered not just in academic science but in hugely diverse settings, from structural analysis to manufacturing to health monitoring. Tony Fountain, director of the Cyberinfrastructure Lab for Environmental Observing Systems at the California Institute of Telecommunications and Information Technology, University of California, San Diego (UCSD), has long worked with researchers on securing good data. As he says, “The question is always: how do you move data between data sources, data repositories, and applications that use it? And how do you do that in near-real time, reliably, and over networks that may have various performance problems?”

At NASA’s Dryden Flight Research Center, researchers have long faced the same questions. Dryden is NASA’s primary center for research into every aspect of atmospheric flight and aircraft design. Many of the world’s most advanced aircraft have graced the skies over the Center’s location in the California desert, often with sensors attached to collect critical information.

Researchers getting data in real time

A research team based out of San Diego releases a buoy equipped with sensors into Crystal Lake, Wisconsin. The hardware is running a version of DataTurbine ported to the Android platform, and the team has successfully streamed its real-time data across the country.

Images courtesy of Peter Shin.

In the mid-1990s, Matthew Miller was brought into the Dryden facility to discuss sensor data analysis software he had been developing. Miller was working as an engineer with Creare Inc. and became the principal investigator on a Phase I Small Business Innovation Research (SBIR) contract to apply wavelet-based mathematical analysis techniques to aircraft vibrations.

As Miller was discussing the project with the NASA technical monitor on the contract, Larry Freudinger, the two realized they had both spent time in data acquisition using devices manufactured by Hewlett-Packard (HP). “Larry happened to have some HP equipment, so we worked together to set up a demo of the software I had been working on,” says Miller. “One of its key features was that you could take data from one place and stream it live over a network to another place.” Having prepped a demo, the pair was able to get live data streaming from a test flight directly into the control room.

Technology Transfer

Both Freudinger and Miller were enthusiastic about the potential of the technology in more general applications beyond wavelet analysis and the particular equipment they had been using. Their continued efforts to develop the software resulted in a technology Miller named the Ring Buffered Network Bus (RBNB) (Spinoff 2000). The co-inventors filed a patent, and Dryden has since used the tool in many research flights.

The network streaming feature of RBNB was increasingly recognized as a significant enabling technology for distributed test and measurement applications. From its initial test run on specific machinery, RBNB soon grew into an open-ended tool that specialized in integrating input from any number of sources, in many kinds of formats, and moving that data to any number of end applications. Special emphasis was placed on making RBNB reliable for data storage and synchronization, even when the sensor network feeding it became unreliable.
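
The following is a minimal sketch of that ring-buffered pattern, written purely for illustration; it is not DataTurbine’s actual interface. Sources push timestamped samples into a bounded per-channel buffer, and sinks poll for anything newer than the last sample they saw, so a slow or disconnected consumer never stalls the producers.

```python
import threading
from collections import deque

class RingBufferedBus:
    """Illustrative ring-buffered bus: many sources in, many sinks out.
    This is a sketch of the general idea, not DataTurbine's real API."""

    def __init__(self, capacity=1000):
        self._channels = {}          # channel name -> deque of (timestamp, value)
        self._capacity = capacity    # samples retained per channel
        self._lock = threading.Lock()

    def put(self, channel, timestamp, value):
        # Sources are never blocked: when a buffer fills, the oldest sample is
        # dropped, so memory use stays bounded no matter how slow the sinks are.
        with self._lock:
            buf = self._channels.setdefault(channel, deque(maxlen=self._capacity))
            buf.append((timestamp, value))

    def fetch(self, channel, since=0.0):
        # Sinks ask for everything newer than the last timestamp they saw,
        # which lets them catch up after a network outage.
        with self._lock:
            buf = self._channels.get(channel, ())
            return [(t, v) for (t, v) in buf if t > since]
```

Provided the buffer is deep enough to span an outage, a sink that reconnects after losing its link can simply request everything it missed since its last successful fetch.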

Miller, who is also a part-time contractor working for Dryden, used the partnership as a launchpad for further projects. He refined the technology through work with the Air Force, the National Institutes of Health, and the National Science Foundation (NSF), among others. All along, he had an eye on how the software could be commercialized, but he soon discovered that the tool could sometimes be a hard sell.

The challenge Miller faced was RBNB’s status as “middleware”—that is, software that is by design agnostic about its inputs and outputs and does not dictate how it must be used or what product it produces. Miller points out that many enterprises prefer dedicated, one-of-a-kind systems to tackle a given challenge rather than investing in reusable, general-purpose tools. “If it had been just a self-contained, off-the-shelf product, it might have been easier to sell. As middleware, it took a lot of high-tech sales effort for the payoff. Companies were reluctant to purchase it.”

Seeing “the writing on the wall,” as he says, Miller began looking into how Creare might continue to benefit financially from the product other than by selling it as proprietary software. “I thought: In five years, who are you going to call for support if the software remains a privately owned commercial product that isn’t actively being sold?”

One phenomenon Miller had noticed was that the software attracted attention from groups interested in open source tools. For example, the Network for Earthquake Engineering Simulation (NEES), a large NSF-created project, was building structures several stories high on giant shaker tables to simulate earthquakes and planned to use RBNB to capture its data from sensors and cameras. NEES expressed interest in being able to use RBNB as open source software.

Another proponent for going open source was Fountain, who at the time was working with a number of academic researchers at UCSD. “I’m a computer scientist, but I work with ecologists, marine biologists, and others like that,” he says. “They needed environmental monitoring, and I was given the responsibility of finding the right software to do that systems integration. Our evaluation came down to two products, one of which was Creare’s.”

The Great Barrier Reef seen from space
The Great Barrier Reef, here shown as imaged from space, is the world’s largest coral reef system and a subject of intense scientific study. One group in Australia is using DataTurbine to collect data gathered by sensor-laden buoys and send it to computers on shore.

Miller saw that Creare might have a better commercial opportunity in providing consulting services for users of the software rather than selling the software itself. Going open source would lower RBNB’s cost, which could increase its user base—especially among the kinds of academic researchers Fountain was working with. That could in turn create demand for support services.

Miller worked with Creare to release the software under an open source license. Fountain, meanwhile, secured a grant from NSF to help fund the software’s transition from proprietary to open source and to create a website where the project could be hosted and organized. In 2007, the technology was officially rebranded as the Open Source DataTurbine Initiative; DataTurbine was an unofficial name RBNB had gone by for years.