Dr. David Lary of the University of Texas at Dallas is leading a research group that has developed a team of autonomous robotic devices that can be used at hazardous or difficult-to-reach sites to make surveys and collect data — providing more and faster insights than human beings are able to deliver.

Tech Briefs: What inspired you to use multiple autonomous devices to collect holistic sets of environmental data?

Dr. David Lary: Well, there are two parts to that journey. The first is the passion that drives me. I deeply desire to have comprehensive holistic sensing to keep people out of harm's way so that there can be the appropriate actionable insights to make timely decisions. That is my motivation, but the actual journey for me to this point started a few years ago — well, nearly 30 years ago.

When I did my PhD at Cambridge, the person who was really close by, who discovered the ozone hole, was a guy called Joe Farman. So, for my PhD I developed the first 3-dimensional global model for ozone depletion. It was a chemical module that was a plug-in to the global model used by the European Centre for Medium-Range Weather Forecasts. With my plug-in I could do global simulations for ozone-related chemistry. And so, the obvious question I wanted to ask was: how good is this model? In order to verify it I had to bring together as many data sources as I could: satellites, aircraft, ground-based sensors, and balloons.

One of the pernicious things that I faced was inter-instrument bias. So, I was looking for a way to help deal with these biases. Although this was 30 years ago, I quite by chance came across machine learning. This was before it hit the big-time adoption that it has today, and I found it did a really good job. That started me looking into what else we could do with it. Along the way, we were the first people to develop chemical data assimilation, which is now used by agencies around the world as part of their air quality forecasting systems.

Aerial robots developed by UT Dallas researchers can carry several cameras, an array of onboard sensors and a downwelling irradiance spectrometer — all used to gather data for mapping and to learn the characteristics of environments. (Image Credit: Lakitha Wijeratne/UT Dallas physics doctoral student)

One of the things we do in data assimilation is to pay a lot of attention to uncertainties. Part of my work at NASA was creating global remote-sensing data products. The way that works is, you use the remote sensing information to create a data product, say on the composition of the atmosphere or the land surface or underwater composition, say for global oceans. You get the remote sensing data from the satellite and compare it to the in-situ ground truth. Typically collecting the training data to do that could take up to a decade or so. It is quite a task because you want to be able to sample as many different conditions and contexts as you are likely to encounter globally.

Our autonomous robotic team, in about 15 minutes, collected the same data volume as is typically used to create those satellite-based remote sensing data products, albeit for one location. So, it could go into a new environment, one that it's not seen before and rapidly make precisely coordinated observations. In this case, the team was a boat and an aerial vehicle. We chose the boat because that's a bit more challenging than a ground-based measurement because of the access issues.

This type of paradigm is useful not only for rapidly creating new products and for the calibration and validation of satellite observations; it is also useful for helping to keep people out of harm's way. If you have a very contaminated environment, or an environment in which there are threats to the humans who enter it, the robot team can go there and in a coordinated way collect the appropriate data.

The study in our paper used an aerial vehicle with a hyperspectral imager that collects a huge volume of data very rapidly. So, even our fastest data pipes right now, maybe 5G cellular communication, are not fast enough to deal with the required bandwidth for streaming hyperspectral images. We address that problem with on-board processing that lets us create these data products on the fly, using on-board machine learning, and then stream them. The final data products, say the abundance of a contaminant, are a much smaller amount of data that we can easily stream in real time.
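
To make that bandwidth point concrete, here is a minimal sketch (Python with NumPy; the array sizes and the simple linear per-pixel model are illustrative assumptions, not the actual UT Dallas pipeline) of the kind of on-board reduction described above: a pre-trained model turns a large hyperspectral cube into a single-band product map that is small enough to stream.

```python
import numpy as np

def reduce_cube(cube: np.ndarray, weights: np.ndarray, bias: float) -> np.ndarray:
    """Apply an (assumed pre-trained) linear per-pixel model to a hyperspectral cube."""
    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands)        # one spectrum per pixel
    product = pixels @ weights + bias       # e.g. estimated contaminant abundance
    return product.reshape(rows, cols).astype(np.float32)

# Simulated flight line: 512 x 512 pixels, 462 spectral bands (values are placeholders).
rng = np.random.default_rng(0)
cube = rng.random((512, 512, 462), dtype=np.float32)
weights = rng.normal(size=462)              # stand-in for trained model coefficients
product = reduce_cube(cube, weights, bias=0.1)

raw_mb = cube.nbytes / 1e6                  # full cube that would otherwise be streamed
product_mb = product.nbytes / 1e6           # derived map that is actually streamed
print(f"raw cube: {raw_mb:.0f} MB, streamed product: {product_mb:.0f} MB")
```

Even with this toy linear model, the streamed product is hundreds of times smaller than the raw cube, which is the point of doing the inference at the edge.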

So, it's really the ability to quickly gather comprehensive data that could be used to keep people out of harm's way, to characterize ecosystems, be part of an emergency response activity, say after a hurricane that has caused inundation of a neighborhood near a chemical plant, or any number of such applications: harmful algal blooms, oil spills, or various agricultural applications.

It's designed to be a flexible set of components. Just like now we're used to having an app store on our phone, or Tesla cars having over-the-air updates — these are software-defined sensors with their own app store that can be updated to improve their capabilities with time.

Tech Briefs: You have flying sensors and ground sensors and you send the information back to where? How is all that processed? Where is it processed?

Dr. Lary: Think of this as an ensemble of smart sensors. There's a set of things: first there's a software-defined sensor. The software-defined sensor would be a smart sensing package, which combines a physical sensing system, say a hyperspectral camera, a thermal camera, or a mass spectrometer. It could be any sensing device with some software/machine learning wrapped around it, which then provides calibrated and/or derived data products. Most sensors will need calibration of some kind.

By coupling the sensor with a software/machine learning wrapper we can do a complicated calibration that lets us have a much more flexible system. So, that software-defined sensor can also have an app store of its own. One or more of these software-defined sensors can be on a platform that provides the sensor with power and time and location stamps for all the data it produces, and also communication connectivity, and maybe where relevant, mobility.
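
As a rough illustration of that pairing (a sketch only; the class, the field names, and the choice of a random-forest calibration model are assumptions, not the group's actual software), a software-defined sensor can be pictured as a thin wrapper that couples the raw reading with a learned calibration and stamps every record with time and position:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Callable, List, Tuple

from sklearn.ensemble import RandomForestRegressor  # example choice of calibration model

@dataclass
class Record:
    timestamp: str        # UTC time stamp supplied by the platform
    lat: float            # location stamp supplied by the platform
    lon: float
    raw: List[float]      # raw channel values from the physical sensor
    calibrated: float     # derived, calibrated data product

class SoftwareDefinedSensor:
    """Hypothetical wrapper: physical sensor + ML calibration + time/location stamps."""

    def __init__(self,
                 read_raw: Callable[[], List[float]],
                 read_position: Callable[[], Tuple[float, float]]):
        self.read_raw = read_raw              # driver for the physical sensing system
        self.read_position = read_position    # e.g. the platform's GNSS receiver
        self.calibration = RandomForestRegressor(n_estimators=200)

    def train_calibration(self, raw_samples, reference_values) -> None:
        # Fit against co-located readings from an expensive reference instrument.
        self.calibration.fit(raw_samples, reference_values)

    def sample(self) -> Record:
        # Assumes train_calibration() has been called or a saved model has been loaded.
        raw = self.read_raw()
        lat, lon = self.read_position()
        value = float(self.calibration.predict([raw])[0])
        return Record(datetime.now(timezone.utc).isoformat(), lat, lon, raw, value)
```

Swapping or updating the calibration model over the air is what would make such a sensor "software-defined" in the sense described above.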

Tech Briefs: So, this is a physical platform?

Dr. Lary: Yes — in this example we had two platforms. We had the robotic aerial vehicle with the hyperspectral camera and the thermal camera and a few other sensors on board. And then the second platform was the robotic boat, which had a whole suite of sensors in the water underneath it, including sonar and various composition sensors and on the top of it, an ultrasonic weather station.

The software-defined sensor plus the platform forms a sentinel. This sentinel is something that can typically move around, make measurements, process data and/or stream it.

Researchers at UT Dallas have developed an autonomous team of robotic devices that can be used at hazardous or difficult-to-reach sites. Working together, the robots can make surveys and collect thousands of data records in just a few minutes. (Image Credit: Lakitha Wijeratne/UT Dallas physics doctoral student)

Multiple sentinels working together form a robot team that can cooperate to provide more capabilities than any of them has on its own. In this case, the aerial robot with its sensors is cooperating with the water robot — the robotic boat and its sensors. Since they’re on the same network, the aerial robot, by design, glides over the same path as the boat. The boat measures what's in the water, while the aerial robot is looking down at the water from above with its remote hyperspectral camera and, using machine learning, learns the mapping from what the hyperspectral camera sees to the composition of the water. Once it's learned that mapping, it can rapidly fly over a much wider area and provide for us, say, a wide-area composition map of the oil concentration, the chlorophyll abundance, dissolved organic carbon, or whatever component of the water we’re interested in.

We can do that having never seen that environment before — the robot team cooperates to gather this training data. The training data is used by the machine learning to create new data products like the wide-area composition map. Once that model has been trained, the hyperspectral sensing can be done just from the aerial measurements; it can be processed on board the aerial vehicle and the results streamed in real time. Normally, that is such a huge volume of data that it can take a substantial time to do the number crunching. Since it's too large to stream in real time, being able to edge-process it on board and then stream the results not only gives you a new capability, it also reduces the latency, the delay in even being able to do such a task.
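
A minimal sketch of that train-then-map step (synthetic data and an off-the-shelf random-forest regressor are assumptions for illustration; the group's actual models and data shapes are not specified here): the coincident boat measurements and the hyperspectral pixels directly above them form the training pairs, and once the model is fit it is applied to every pixel of the wider scene.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Assumed training set: one 462-band spectrum per boat/aerial coincidence,
# paired with the in-situ value the boat measured at that moment.
n_coincidences, n_bands = 4000, 462
spectra = rng.random((n_coincidences, n_bands))   # stand-in hyperspectral pixels
in_situ = rng.random(n_coincidences)              # e.g. dissolved organic carbon

X_train, X_test, y_train, y_test = train_test_split(spectra, in_situ, test_size=0.2)
model = RandomForestRegressor(n_estimators=200, n_jobs=-1).fit(X_train, y_train)
print("held-out score:", round(model.score(X_test, y_test), 3))

# Wide-area step: apply the trained model to every pixel of a larger survey scene.
scene = rng.random((256, 256, n_bands))
composition_map = model.predict(scene.reshape(-1, n_bands)).reshape(256, 256)
print("composition map shape:", composition_map.shape)
```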

Tech Briefs: It's like what they talk about with sensors for edge processing to reduce the amount of data you have to send.

Dr. Lary: Exactly.

Tech Briefs: Would you consider the work that you’ve just done to be prototyping?

Dr. Lary: Yes, we have to start somewhere, so this is our first step.

Tech Briefs: How would you foresee it being used practically — say there's a disaster and the authorities want to use your system, what would they do?

Dr. Lary: This prototype is just one instance of a much more comprehensive vision; it's the minimal implementation of what could be a multi-robot team. Here we just had two robots, the aerial one and the robotic boat. We chose those two because sensing water has access challenges. But this team could easily have many more members, say a ground-walking robot, or an amphibious vehicle that could carry the entire robot team into a potentially hazardous environment to be deployed remotely.

It could respond to an oil spill like Deepwater Horizon, where we saw the pictures of the wildlife impacts, fisheries being impacted, and so on — and oil spills are happening all the time. Or there could be chemical spills. For example, when Hurricane Harvey hit Houston, with its large number of processing plants, there was a heavy inundation of those facilities and some nearby neighborhoods were surrounded on three sides by contaminated water. Volatile organic compounds in the water ended up out-gassing, which caused severe respiratory issues — people didn't know what they were breathing, but they knew it was affecting them. Workers going in to clean up were also being affected by the contaminated water.

With our sensing system, you’d know exactly what you’re dealing with so you could tailor your response appropriately. But it could just as well apply to other cases like harmful algal blooms. Or even if there's no disaster, this type of capability can be used to characterize ecosystems and do surveys of infrastructure, say roads, railways, and bridges, where the autonomous robots with their sensing can rapidly take detailed measurements.

Now imagine a different scenario. Say you have the aerial robots much like we had in this example, with the hyperspectral, the thermal, and, say, a synthetic aperture radar looking at the texture of a surface. That could be coupled with a ground robot that has a ground-penetrating radar looking for voids or other faults. Whether it's a tunnel or a road, cavities form with use and weathering. There are also many different scenarios you could use this for in agriculture. It's designed to be comprehensive sensing that, like Lego blocks, you can put together, plug and play. You would rapidly be able to use it for a whole variety of real-life use cases, where real-time data-driven decisions lead to more transparency, keeping people out of harm's way.

Tech Briefs: If someone wants to use this system, would they need to have custom-built robots and drones, or would you have a package that you could mount on an existing device? How do you envision this becoming practical?

Dr. Lary: I’ve had to struggle for many years with getting things to work together. It's one thing to buy the equipment; it's another thing for the components to work together. Everything we bought was off the shelf, because our effort went into, for want of a better word, the smarts, like the software integration.

Having said that, a key step for these software-defined sensors, for which we're using the machine learning, is to calibrate against a reference or learn on the fly. We're using the same type of idea for air quality, distributing low-cost sensors across cities that have been calibrated against really expensive reference sensors. We can deploy sensors on a neighborhood scale, which previously would have been prohibitive in cost.

By being able to calibrate the low-cost sensors against a reference, in much the same way as this robot team calibrated the hyperspectral measurements made by the remote sensing capability against the in-situ composition of the water, you can achieve things that otherwise would be very, very challenging.
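
A minimal sketch of that calibration idea for the air-quality case (all values synthetic; using temperature and humidity as extra inputs and a gradient-boosting regressor are assumptions, not a description of the deployed system): co-locate the low-cost unit with the reference instrument for a while, learn the correction, then apply it once the unit is deployed across the neighborhood.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n = 5000  # co-location samples taken alongside the expensive reference instrument

# Synthetic co-location data: low-cost reading plus local meteorology vs. reference value.
low_cost = rng.uniform(0, 50, n)       # e.g. a particulate channel (arbitrary units)
temperature = rng.uniform(0, 35, n)    # degrees C
humidity = rng.uniform(20, 100, n)     # percent
reference = 0.8 * low_cost + 0.05 * humidity + rng.normal(0, 1.0, n)

X = np.column_stack([low_cost, temperature, humidity])
calibration = GradientBoostingRegressor().fit(X, reference)

# After neighborhood-scale deployment: correct a new raw reading in the field.
corrected = calibration.predict([[32.0, 21.5, 60.0]])[0]
print(f"calibrated estimate: {corrected:.1f}")
```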

It's really the network of sensors, the network of autonomous sentinels working together cooperatively using machine learning, that lets you do far more than any of those components could do on their own.

Tech Briefs: Do you foresee this being commercialized, built by private companies or do you see the government getting involved with it? What do you see going forward?

Dr. Lary: My dream going forward is to have a store where individuals, municipalities, or companies, can have ready access to these types of capabilities, and not just get the sensors but also the back-end services. So that when you plug and play this stuff together, it just works, and you don't have to go through a long development. The National Science Foundation categorizes this as a cyber-physical system. Cyber-physical systems are basically sensing systems coupled with algorithms to help you make better, timely decisions.

So, my dream for all of this, and what several of us are working towards — and we welcome partners of all kinds — is to have a cyber-physical social observatory. It would have to be a national facility just like an astronomical observatory with a giant telescope, because no one else can afford to do something on that scale.

Imagine now you have a set of sensing capabilities with multiple components, which in our project are aerial robots and a robotic boat. But our system actually has nine sentinel types for various types of situations. We can use remote sensing from satellites and weather radars. Besides aerial vehicles, we have street-level sensors streaming air quality, light intensity, ionizing radiation, and so on 24/7. We have walking robots, we have electric ground vehicles and robotic boats, and then also we have wearable sensors.

We also want to be able to have multiscale sensing from the global big picture, from a satellite. So, say we now go back to the Hurricane Harvey example. Long before Hurricane Harvey made landfall, we could see it with the satellites and then, as it got closer to landfall, with weather radar. But the minute it makes landfall, the micro-environment details become critical. The exact height of particular streams could make a really big difference to the local environment. So, we want to have information on both the global large scale and the hyper-local scale because you and I live at that very localized scale. To be able to simultaneously sense both large scale and local scale, we really need multiple sentinels.

But then wearable sensing is also really important. For example, in some of the parallel work we're doing, you see news headlines that say poor air quality makes you “dumb.” But how dumb is dumb? What pollutants can make us dumber than the rest? So, in one of our studies, we're using comprehensive biometric sensing and measuring over 16,000 parameters a second, together with comprehensive environmental sensing of about 2,000 environmental parameters to see how the environmental state affects our autonomic response.

All of this is designed both to be the holistic sensing that keeps people out of harm's way and to find that unseen elephant in the room that can be impacting our health. Once we realize what that is and we can quantify it, there's normally an obvious path of data-driven decisions to make things better and then to take the appropriate next steps to monitor our progress.

That is really my dream — to be a catalyst for this holistic sensing to keep people out of harm's way: sensing in service of society. We have many prototypes that we are trying to bring to the point that they can be usable. So, we always welcome partnerships to help expedite that — from governments, from individuals, local municipalities, community groups, companies. We are working with all of those types of entities.

Tech Briefs: Sounds like you're inventing a whole brand-new kind of infrastructure.

Dr. Lary: We're trying — it's basically driven by needs. Holistic information can make such a big difference by giving us information to make appropriate decisions. It would be non-trivial to do that without the appropriate infrastructure.

Tech Briefs: It sounds wonderful, I just hope it can be implemented someday.

Dr. Lary: Me too, me too. We've come a long way. I think we're making the first step.

The other bit, which I didn't get to, is outside the physical: you can have things like school absenteeism, which then leads to poor learning outcomes. But it often turns out that the absenteeism can be due to things like asthma. The asthma is due to high pollen or air pollution, and it's actually a cascade of effects — the social is interacting with the environmental. We don't want this to be just a one-way thing. We want the triggers for the observations to be both what we see from the sensing directly and societal issues like clusters of health outcomes or algal blooms, which can take out fisheries, or an oil spill — it's a two-way interaction.

Having a data platform that can bring together all these environmental parameter layers with the societal layers like mortality trends, absenteeism, cancer incidence, etc. can make the decision-making process to help individuals so much more transparent and effective.

Tech Briefs: What are you working on now? What's your very next step?

Dr. Lary: The very next step for this particular robot team with the boat plus the aerial vehicle would be to enhance various aspects of that autonomous team. And then we want to extend it to have more members of the team. For example, to have an amphibious ground vehicle that can maybe carry both the boat and the aerial vehicle into a contaminated environment and then deploy them, while also making measurements on its own. Also, to make the robots part of the remediation.

It's the different components working together. The same type of team can also be used for looking at maintenance of infrastructure, whether it's roads or rails or bridges, and also for other aspects of environmental quality — air quality, water quality. So really, this proof of concept was just a prototype to show: “Hey, this really can be done, and now we would like to scale it to so many different applications.”

Also, these can be prototypes for satellite missions. You can imagine a pipeline where you have a proof of concept on an aerial vehicle. Then it could transition to other platforms like a CubeSat, for example. This can also be part of the validation process, collecting data for satellite missions, as well as collecting data for any of the different purposes I’ve mentioned.

An edited version of this interview appeared in the June 2021 issue of Tech Briefs.





Transcript

00:00:00 [Music] My name is Professor David Lary and I'm here with a team from the University of Texas at Dallas in the Hanson Center for Space Sciences. Our group is called MINTS, which stands for Multi-scale Intelligent Interactive and Integrated Sensing.

00:00:31 What we try and do is comprehensive holistic sensing of the environment, from micro scales to the global scale, using a whole array of sentinels: satellites, robotic vehicles, and sensors across cities recording 24/7. So we try to comprehensively characterize the environment. Today our exercise was to demonstrate an autonomous robot system

00:00:57 that can gather data rapidly, much quicker than the existing systems that we know of, and then use machine learning to provide comprehensive maps of an environment that maybe you've never seen before. Our robot team here had two robots. One part of that was the aerial robot. The aerial robot was carrying underneath

00:01:21 it both a hyperspectral camera measuring 462 wavelengths from the ultraviolet to the near infrared, a thermal camera, which can be useful for a whole range of things, and also a visible camera. We had pre-programmed, with our autonomous control software,

00:01:43 exactly the area we wanted the vehicle to survey, and in fact in just over three minutes it surveyed that entire area. It took off automatically, it flew automatically, and it can also land automatically. While it was flying it was capturing the hyperspectral imagery, the thermal imagery, and the regular

00:02:06 visible imagery. Now, the reason it's significant that we had a regular visible camera is that we're also investigating something called super resolution. You're probably familiar with how, in a CSI movie, there's an obscured car tag you can't see, and they automagically zoom in.

00:02:28 We're working on that same type of technology, but instead of just doing it for enhancing the spatial resolution, we're investigating going from a regular visible camera to the much more detailed information of a hyperspectral camera. The second part of the team that operated today was the robotic boat.

00:02:52 You can think of the robotic boat as giving us the ground truth. We can rapidly survey a large area with the aerial vehicle and capture imagery, but we really need to know what that imagery corresponds to, what those spectra actually mean. So the boat was laden with a set of sensors, about 30 sensors in fact, a whole

00:03:18 array: a sonar giving us the depth, the bottom type, and the type of vegetation growth, and then a whole set of composition sensors, for things like algae and oils, and we can also have a mass spectrometer that can measure all the components of the water and the air. So the boat goes

00:03:39 out and sails directly underneath where the aerial vehicle was flying. That precisely coordinated element is critical because it allows us to rapidly get the relevant coincidences. When NASA does this, and I was part of doing that when I worked there, you actually have these happenstance

00:04:02 overpasses: you have a satellite in a fixed orbit that happens to overfly your scene of interest, your scientific cruise, or whatever else is making the measurements. Here we very rapidly get a massive data volume because each robot in the team knows what the other robot is doing

00:04:22 and, in a precisely coordinated way, measures at just the locations required. So this was a proof of concept that could easily be scaled to have many more robots, an amphibious ground vehicle, walking robots, that could be part of collecting the ground truth on the one hand. But then once you've got that ground truth

00:04:47 and you've created your large-area maps, they can also be part of verification. If it seems there's a hot spot of some contaminant at this location, before anyone goes near it you then know where to send your walking robot, your ground robot, or your boat to verify whether there really is a contaminant there.

00:05:08 So you don't just get a map; you can have some confidence that you really are seeing what you thought you saw. Obviously safety is a prime concern for us, and today we were very fortunate to be able to operate on a ranch. To keep things safe, we used, as an example

00:05:27 release, a dye that has been used for a couple of decades for hydrological experiments. It rapidly disperses, but what is significant for our use here is that it has a very characteristic spectrum that we can see from the aerial vehicle, so it allows us to exercise the entire process. Step one was

00:05:51 we did our training over the whole wide area; we used our machine learning to learn the mapping from what's in the water to what the aerial vehicle sees. Then, to exercise that team and check that it really works, we release a known contaminant and fly over with the aerial vehicle:

00:06:11 do we actually see it, can we map it? Yes, we can see it; yes, we can map it. Then we send out our boat to the location of the contaminant to verify: okay, yes, we really did see it, and this is how much we measure. We were not only taking the real-time measurements, we were also doing underwater

00:06:35 video of that as part of the verification. We chose this particular case study in an aquatic environment for a couple of reasons. One, because the water is a little less accessible for making measurements, so it's a slightly more challenging task; it's not your easiest task.

00:06:55 And number two, the maritime environment, say around ports, is hugely significant for deployments of all kinds. So much material moves through ports, and there are a lot of people who can be there, so if a contaminant has been released into that environment it could have a massive impact,

00:07:16 not just on those who directly encounter the water but on those who are in the vicinity of it. I'll give you a specific example of that with the recent Hurricane Harvey. Hurricane Harvey made landfall at Rockport and then inundated a lot of Houston.

00:07:36 In Houston we had these huge oil refineries and other Superfund sites. Many of those chemicals went directly into the water as a result of the hurricane. Several neighborhoods were surrounded on three sides by that heavily contaminated water. Those chemicals were outgassing, and people were having

00:07:56 severe respiratory issues, and they had no idea what was in the water that was being released. And then the cleanup crews were going unprotected into the water, causing further issues. That type of issue could be repeated many times, especially as the bulk of the megacities on the planet, with more and more

00:08:19 people moving to megacities, happen to be coastal, near water. One of the things we mapped today was dissolved organic carbon in this lake. In our survey, which took about 15 minutes, we gathered close to 4,000 data points completely autonomously, so you can see that you can have a very rapid

00:08:42 collection. Once you've done that data collection, machine learning can create a map from what the aerial camera sees over a large area. So the new things in the system are: it is completely autonomous; you may never have seen the components that you are trying to map

00:09:05 before; and it's also extensible. We just had a very small team this time, one aerial vehicle and one robotic boat, but that could easily be extended to multiple aerial vehicles, for example extending the wavelength range into the infrared, or to a synthetic aperture radar, which

00:09:26 measures the texture. If you have an oily release, for example, that will change the wave heights and the texture you see. It can also be used over land to look for buried explosive devices and things of that nature. One of the paradigms that we've operated on is that we want everything we do to be scalable.

00:09:47 We don't want to just have a one-off; we want to be able to build systems that we can roll out at large scale. So it's not just one unit: supposing now it's standard issue for units in a given context, they can have this capability, all packed up in packing cases, and out it goes.

00:10:07 So far, what we used was really quite a high-end camera that was off the shelf, but because of its capabilities it's rather heavy, and so we needed to use a large aerial vehicle. It performed very well, it was very smooth, it's made for movie making, and it did an excellent job; it's easy to operate, but it's a little large.

00:10:29 You could easily imagine that we might want a smaller system, where the aerial vehicle can go in a backpack or something like that. There are now cameras that weigh just 30 grams, which would allow us to use a much smaller aerial vehicle. It doesn't have quite the wavelength range,

00:10:48 but it could be very effective nonetheless. So this was a proof of concept with off-the-shelf items, but it's very tractable to go to the next step, where we have a much more miniaturized solution that an individual operator could use, not only for the aerial vehicle but also

00:11:09 for the robotic boat. The robotic boat we used is a really beautiful boat: it can go unmanned for 20 hours and it's very easy to operate, but still you could imagine you might want a smaller aquatic vehicle. So we built a demonstration hovercraft, which is essentially based on a small

00:11:28 surfboard and is controlled by the same software that controls our aerial vehicle. Both robots can be controlled from the same software, and one individual could then deploy both of them. It's much smaller and much lighter. One of the things I'm really passionate

00:11:48 about is making things usable to provide actionable insights, and a key part of something being usable for actionable insights is that if we can use it in many contexts it becomes even more valuable. So if our system is not just used for, say, the surveys we were talking about today,

00:12:06 but we can use it for other purposes that could also be relevant to keeping people out of harm's way, that is very valuable. Take for example this hyperspectral camera that today we were using to look at the composition of the water and map the release of our contaminant. That same camera we could be using to

00:12:27 look at the healing of wounds, and in that context you can use that hyperspectral camera to see things like the depth of the calluses, the oxygen in the skin, and a whole range of other things that very usefully map the progress of the wound healing. We mentioned that we want to provide

00:12:49 actionable insights so that either individuals or their commanders can make informed decisions. Some of these are actually rather basic questions that right now are very hard for us to understand or to characterize, like: is the area safe? Does an area we're about to move into

00:13:09 have some contaminants or some issues that we should really know about before we go into it? We've seen we can start to address that question, is the area safe, by the comprehensive and autonomous sensing rapidly producing these large-area maps that we've spoken about. Another

00:13:30 reason that might be important is: do we need protective clothing, and if so, what protective clothing should we have? And finally, sometimes we know a contaminant may have been released there, but we don't know the best sampling pattern to rapidly identify it. Typically there are

00:13:49 basic star patterns or other patterns that are used to do those large-area surveys, but they can take a while to get us to the right location. In the paradigm with this autonomous team, what we do is the large-area survey, say with just the aerial vehicle, and then we use something called

00:14:09 unsupervised learning. Essentially what that is, is we split up our environment into a set of different regions, where each element in a region is similar to the other elements in that region. Doing it that way, supposing we split up this whole ranch into a thousand classes,

00:14:30 and supposing someone had come in here and released a contaminant, that contaminant would have a characteristic spectral signature, so it would fall into, say, one or two of our thousand classes. So we form the wide-area map, and then once we know that this class contains a contaminant, we now know every other location where

00:14:53 that contaminant is present. So being able to rapidly do a survey and classify the area you've just surveyed can help you do much smarter real-life truth checking on the ground for where the contaminants are. It can quickly guide you to the contaminants rather than just stumbling across them by

00:15:17 using the usual approach.
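
The unsupervised-learning step described above could be sketched roughly as follows (a simple k-means illustration with synthetic data; the thousand classes come from the example in the transcript, everything else is an assumption): cluster every pixel's spectrum, note which cluster the known release point falls into, and flag every other pixel in that cluster as a candidate location to verify on the ground.

```python
import numpy as np
from sklearn.cluster import MiniBatchKMeans

rng = np.random.default_rng(0)

# Assumed wide-area survey: rows x cols pixels, each with a 462-band spectrum.
rows, cols, bands = 256, 256, 462
scene = rng.random((rows, cols, bands), dtype=np.float32)
pixels = scene.reshape(-1, bands)

# Split the scene into spectrally similar regions (1,000 classes, as in the example).
kmeans = MiniBatchKMeans(n_clusters=1000, batch_size=4096, random_state=0).fit(pixels)
labels = kmeans.labels_.reshape(rows, cols)

# Hypothetical known release point; every other pixel sharing its spectral class
# is a candidate location to send the boat or ground robot for verification.
release_row, release_col = 100, 200
contaminant_class = labels[release_row, release_col]
candidate_mask = labels == contaminant_class
print("candidate pixels to verify:", int(candidate_mask.sum()))
```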