Edward Chow leads the development of AUDREY, the Assistant for Understanding Data through Reasoning, Extraction, and sYnthesis. The artificial-intelligence system captures a variety of sensor data, including gases, temperature, and location signals. By sending alerts through a mobile device or head-mounted display, AUDREY could soon be used to guide first responders through dangerous conditions.
NASA Tech Briefs: What is AUDREY?
Edward Chow: AUDREY is our attempt at creating a next-generation artificial-intelligence system that thinks like a human. In our work, we combine human-like cognitive software [capabilities] with some of the latest machine learning techniques. AUDREY is a system that can look at a massive amount of data, reason and learn from it, and then apply findings right back into processing data. It becomes a sort of positive-feedback loop.
NTB: What kinds of data are being collected?
Chow: It depends on the application. For the Next Generation First Responder (NGFR) program [part of a five-year commitment from the Department of Homeland Security Science and Technology Directorate], responders actually have sensors on their body: temperature sensors, heartbeat sensors, and gas sensors. They can pick up sensory information from the Internet of Things (IoT). You see more and more sensors now going into buildings, and more intelligent monitors on networks. We pick up information from a variety of different sources. The key is that AUDREY has to be smart enough to know what context, what conditions, are absolutely necessary for a first responder to see. Firefighters are busy fighting fires. Police officers are busy doing their jobs. We only want to provide necessary insight.
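The context-aware filtering Chow describes, surfacing only the alerts a particular responder needs, might be sketched roughly as follows. This is a minimal illustration, not AUDREY's actual implementation: the sensor names, thresholds, and role rules here are all hypothetical placeholders (a real system would learn these from data rather than hard-code them).

```python
from dataclasses import dataclass

# Hypothetical thresholds for illustration only.
GAS_PPM_LIMIT = 35.0    # e.g. a carbon monoxide exposure limit
TEMP_C_LIMIT = 60.0
HEART_BPM_LIMIT = 180.0

@dataclass
class Reading:
    sensor: str   # "gas", "temperature", or "heartbeat"
    value: float

def alerts_for(role: str, readings: list[Reading]) -> list[str]:
    """Return only the alerts relevant for this responder's context."""
    alerts = []
    for r in readings:
        if r.sensor == "gas" and r.value > GAS_PPM_LIMIT:
            alerts.append(f"Gas at {r.value} ppm exceeds {GAS_PPM_LIMIT} ppm")
        elif r.sensor == "temperature" and r.value > TEMP_C_LIMIT:
            alerts.append(f"Temperature {r.value} C exceeds {TEMP_C_LIMIT} C")
        elif r.sensor == "heartbeat" and r.value > HEART_BPM_LIMIT:
            alerts.append(f"Heart rate {r.value} bpm exceeds {HEART_BPM_LIMIT} bpm")
    # A firefighter mid-incident gets only the single most urgent alert,
    # mirroring the "only necessary insight" principle above.
    if role == "firefighter" and len(alerts) > 1:
        return alerts[:1]
    return alerts
```

For example, a police officer monitoring a scene might receive every threshold crossing, while a firefighter actively inside a burning structure would see just one prioritized warning.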
NTB: How does AUDREY make a decision, given the massive amount of data being collected?
Chow: AUDREY runs in the cloud. [Critical] information is based on the location and orientation of the sensor. The variety of environmental information provides the context necessary for AUDREY to make smart decisions.
NTB: How does a user receive alerts?
Chow: When the technology makes a smart decision, an alert message appears on the display of an Android cellphone, and a warning sound plays from the [device]. We are also working on virtual-reality/augmented-reality glasses that provide users with voice prompts and visual guidance. Imagine fighting a fire in a building: with augmented-reality glasses, we can provide the right pointer information to firefighters, who can then follow directions safely out of the building.
NTB: What is most exciting about this technology?
Chow: With AUDREY, we can capture human knowledge and process more data than a human has the energy to do. My dream is to one day have a personal AUDREY helping me process all the data around me, to save me time so I can do the innovative things that I’m designed to do.