When female butterflies make their mating decisions, they don’t rely on just one aspect of the male; they rely on two things: the wing color of the prospective mate and the pheromone released by the male. (Image: Penn State University)
Tech Briefs: What started you thinking about butterflies and AI?

Saptarshi Das: In general, our research is directed toward smart electronic devices. You need to make them low power because AI consumes a huge amount of energy, and that can very soon become unsustainable. So, one of the major thrusts of our research is developing sustainable electronic devices, many of which are smart sensors that do computing at the edge, close to what they are sensing rather than at a server or in the cloud. Our challenge is to make them as energy efficient as possible.

In nature, animals thrive based on their sensory skills and decision-making, whether trying to hide from a predator or to find prey. This relies on special sensory organs that allow them to detect things like very small amounts of chemicals or gases, or small perturbations in mechanical forces, organs they have developed over millions of years of evolution. These are natural low-power sensors: the animals that carry them consume very little food and thrive in remote locations with hardly any of the resources we humans have; they're not that privileged.

That motivated us to look into how animals sense and make decisions in the wilderness and other remote, resource-constrained locations. Our group has been working on this for the last seven or eight years. We did not start with the butterfly; we started our research with the locust.

Locusts all move together; there can be a million of them in a swarm. But although they all fly together, they never collide with each other, which means they have an amazing collision-avoidance mechanism. So, we tried to discover what in the brain of locusts allows them to avoid collisions, because that is real natural intelligence. We found that there's a special neuron in their brains that enables it. Then, we mimicked the functioning of that neuron using solid-state devices and showed that if we adopted that principle for collision avoidance, it could be done at extremely low power. And collision avoidance is very important for autonomous vehicles, flying drones, and even robots.

Next, we tried to mimic barn owls. The barn owl is very good at locating its prey in complete darkness, which means that it doesn't rely on vision. It is aided instead by its ability to hear very small sounds and to localize very precisely which direction a sound is coming from. So, we investigated what in the neural architecture of the barn owl's brain allows it to do that. Based on that, we published work showing that nanoscale electronic devices can also be very precise at locating sound or other kinds of waves.

However, in all our previous investigations, we were looking into a single sensory input. Then we came across butterflies. When female butterflies make their mating decisions, they don't rely on just one aspect of the male butterfly; they rely on two things. One is the wing color of the prospective mate, which is optical information. The other is the pheromone released by the male, which is chemical information. And both must match for the particular species. Here it is not just one sense but two senses combined to make a decision, and that is something current-day AI doesn't do. So, we started thinking about a next generation of AI that makes decisions by combining information from multiple sensory sources.
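
As a software caricature of that rule (the cue labels and expected values below are invented for illustration; the actual chip encodes this logic in hardware, not code), the decision is essentially a two-input match:

```python
# A minimal sketch of the butterfly's two-cue decision rule. The cue
# labels and expected values are hypothetical, not from the study.

def accept_mate(wing_color: str, pheromone: str) -> bool:
    """Both the visual cue and the chemical cue must match the species."""
    expected_color = "iridescent-blue"   # hypothetical species signature
    expected_pheromone = "blend-A"       # hypothetical pheromone label
    return wing_color == expected_color and pheromone == expected_pheromone

print(accept_mate("iridescent-blue", "blend-A"))  # True: both cues match
print(accept_mate("iridescent-blue", "blend-B"))  # False: color alone fails
```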

We are thinking about how to utilize this type of decision-making for real-life applications. We're working on a lot of projects with the military. For example, let's say an army unit is in a forest trying to figure out whether there is some activity nearby; they have to gather information from multiple sources. Is there movement, a spark of light, or a certain kind of chemical?

To implement this, the next question is: we understand now what algorithms we need to use to make the decision, but how are we going to build the hardware? We then had to decide whether it would be energy efficient to simply combine existing sensors, and we concluded that it would not; all these sensors have been developed separately, without talking to each other. But when an animal evolves, its sensory capabilities are intertwined.

So, we went back to the drawing board and started thinking about how we could redesign things. We developed the idea of using new, more energy-efficient materials. We ultimately chose molybdenum disulfide (MoS2), a very thin layer of material that is nonetheless very good at detecting light, and we chose graphene as our chemical sensor. We then put them together on the same chip. The result is a compact device that is not only very small but also very efficient: because the two types of sensors are embedded together on a single chip, they don't have to be connected by external wires and external hardware.

Tech Briefs: I imagine there also has to be some electronics so you can change the weighting of the two inputs.

Das: Right, all the needed circuitry is also integrated into the chip rather than using bulky external components, so the device has a reduced overall footprint.

Tech Briefs: How do you determine the desired ratio between the inputs from the two different sensors?

Das: We can program the relative importance of the two on-chip; we call it reconfigurability. Let's say, for example, you are walking in a forest in daylight. You really don't care too much about the different sounds because you can see everything very clearly, so you rely more on your vision than on your hearing. But now imagine walking in the same forest at nighttime. Then your ears become more active, because you need to know whether something is chasing you from behind. Animal senses have developed in a way that lets them prioritize one vs. the other, and that is precisely what we show in this butterfly-inspired chip. With butterflies, there are near species and there are far species. So, if a female is in a forest and cannot find a male with the right pheromones, she has to relax her choices, perhaps to a different color, just to make sure her genes are passed on; that's what animals do all the time. And that's what we also do with our chip. All our devices can be programmed to prioritize one sense vs. the other, or to give them equal importance.
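
A minimal software sketch of that reconfigurability, assuming sensor readings normalized to [0, 1] (the weights and readings below are invented; on the chip this is done by programming the devices themselves, not in software):

```python
# Programmable weighting of two normalized sensor readings. Weights and
# readings are illustrative placeholders, not measured device values.

def fused_score(optical: float, chemical: float, w_optical: float = 0.5) -> float:
    """w_optical = 0.5 gives both senses equal importance."""
    return w_optical * optical + (1.0 - w_optical) * chemical

# "Daylight": prioritize the optical channel.
print(fused_score(optical=0.8, chemical=0.3, w_optical=0.9))  # 0.75
# "Nighttime": shift priority to the chemical channel.
print(fused_score(optical=0.1, chemical=0.6, w_optical=0.2))  # 0.50
```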

That’s a unique feature that currently available sensors cannot offer: they cannot self-adapt, but ours can, so you don't have to go back and change them every time.

Tech Briefs: How do they self-adapt?

Das: MoS2 is a very good photodetector, and graphene is a very good chemical sensor. When, for example, the light intensity is very low, the response of the photodetector is going to be very low. So, we can set a threshold level such that if the light intensity falls below the threshold, the chip prioritizes chemical sensing. That can be done with a very simple integrated logic circuit. Likewise, if the chemical signal falls below its threshold, the chip naturally prioritizes the light information. When both fall below their thresholds, you can combine them to make the sensing better. There are existing logic gates for these functions.
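
In software terms, the adaptation Das describes reduces to a few threshold comparisons. The thresholds and signal values below are made-up numbers; on the chip this is a small logic circuit rather than code:

```python
# Threshold-based prioritization of the two sensing channels. All
# numbers are illustrative stand-ins for on-chip logic levels.

def select_mode(optical: float, chemical: float,
                t_opt: float = 0.2, t_chem: float = 0.2) -> str:
    if optical < t_opt and chemical < t_chem:
        return "combine"               # both weak: fuse to boost sensitivity
    if optical < t_opt:
        return "prioritize-chemical"   # light too dim: lean on graphene
    if chemical < t_chem:
        return "prioritize-optical"    # trace too faint: lean on MoS2
    return "either"                    # both strong: either channel works

print(select_mode(optical=0.05, chemical=0.60))  # prioritize-chemical
print(select_mode(optical=0.05, chemical=0.10))  # combine
```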

So, we don't need to abandon human developments completely. We are embracing both: the detection algorithms animals use and the power of modern microprocessors and computing algorithms.

Tech Briefs: If both sensor inputs are above the thresholds, how do you decide which should predominate?

Das: If the signals from both sensors are very low, you can combine them to make the signal stronger. And when both signals are very large, as long as the combined strength is high enough, you will make the right decision. This is something quite remarkable in nature. Generally, one plus one is two, but that rule does not apply when you integrate multiple senses. What happens with animals is that when both signals are very small, let's say you're adding 0.1 plus 0.1, the result is not 0.2; it can actually be 0.5 or even 10. One sense aids the other. But when you have one plus one, the total result will still be one; the response kind of saturates. And this is something we can also design into the integrated circuit: the response is not going to overshoot when both sensor signals are very high.
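
One simple function with exactly this behavior, superadditive for weak inputs and saturating for strong ones, is a squashing nonlinearity applied to the sum. The gain below is an arbitrary choice that happens to reproduce Das's example numbers; it is not a parameter of the actual circuit:

```python
import math

# Saturating fusion: superadditive for weak signals, capped for strong
# ones. The gain of 2.5 is an illustrative pick, not a circuit value.

def fuse(a: float, b: float, gain: float = 2.5) -> float:
    return math.tanh(gain * (a + b))

print(fuse(0.1, 0.1))  # ~0.46: well above the linear sum of 0.2
print(fuse(1.0, 1.0))  # ~1.0: the combined response saturates, no overshoot
```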

Tech Briefs: What happens if the first signal is one and the other is two?

Das: It depends upon your ultimate goal. If they are at one and two, does it matter when you're making the decision? You design your circuit accordingly; you can change your threshold. You can make it, for example, 1.5; then it will prioritize the two over the one. We make sure that all these devices have a tunable threshold, so that when you are operating in an environment where all signals are weak, you use a lower threshold of detection, while if the signals in your operating environment are strong, you use a higher threshold of detection. We call it a programmable integrated circuit, similar to an FPGA chip.
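
As a software analogy of that tunability (the threshold values are invented for illustration), the same detector can be reprogrammed for a weak-signal or a strong-signal environment:

```python
# Environment-dependent detection threshold; values are illustrative.

THRESHOLDS = {"weak-signal": 0.1, "strong-signal": 0.5}

def detect(signal: float, environment: str) -> bool:
    return signal >= THRESHOLDS[environment]

print(detect(0.3, "weak-signal"))    # True: low bar when signals are weak
print(detect(0.3, "strong-signal"))  # False: same reading rejected at the higher bar
```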

Tech Briefs: So, what you're coming up with is a very versatile sensor that could be programmed for any particular application.

Das: Yes, absolutely.

Tech Briefs: Would you have the gate circuits on the same chip as the sensor and then output that to the AI?

Das: Absolutely. The good part about these 2D materials is that they can be used as sensors as well as for building logic circuits, so you don't need to combine different sets of materials. You can actually do this more efficiently because both the integrated circuit and the sensors are on the same chip. This is a new paradigm that we call near-sensor or in-sensor compute.

Tech Briefs: Do you have any particular applications you're going to be working on first?

Das: We are trying to develop different edge sensors; we don't restrict ourselves to only chemical or light sensing. If you think about VR/AR and all the other emerging technologies, these are all about different senses. Human and animal experience is not about just one sense; it is about mixing all the senses together and prioritizing one vs. another. That's why we are trying to develop very application-specific low-power sensors.

So, for example, this kind of sensor, which combines chemical and optical information, could be very important for many defense applications. But, at the end of the day, it is not that the same sensor can be used for multiple purposes; you develop sensors for specific applications. For example, we have developed collision detectors, which are very good for autonomous vehicles. We have developed sensors based on auditory capabilities, which would be good for navigation in underwater environments. Sensors that combine chemical and optical information are good for space missions; we work with NASA because they're looking for extraterrestrial life, which means looking for small traces of certain types of chemicals. We are pursuing multiple applications for these edge sensors, all inspired by nature.




This article first appeared in the October 2024 issue of Tech Briefs Magazine (Vol. 48 No. 10).
